MB1-Meta: Comparison of the ManyBabies 1 results to meta-analytic data
Meta-analyses are often considered the most reliable source of evidence for deciding whether a phenomenon is real and how strong the effect is. However, large-scale collaborations such as ManyBabies often yield different results than published meta-analyses. To better understand how these two ways of collecting and analyzing large datasets relate to each other (or fail to), we update the meta-analysis on infant-directed speech preference and subject it to a joint analysis with the ManyBabies 1 data.
Project Leads
- Martin Zettersten, Princeton University, United States [email]
- Christopher Cox, Aarhus University, Denmark [email]
- Christina Bergmann, University of Applied Sciences, Germany & Max Planck Institute for Psycholinguistics, Netherlands [email]
- Maya Mathur, Stanford University, United States [email]
Status
- Manuscript in press at Open Mind
Links
MB1-Demo: Analysis of supplemental demographic variables
The ManyBabies 1 project provides a unique opportunity not only to take stock of the field and discover how our methods and approaches differ, but also to begin to understand the factors that make these effects so difficult to measure. In this ongoing exploratory project, we plan to analyze additional variables collected alongside the main MB1 project, consisting of a wide range of ‘lab factors’ that researchers believe may affect either whether a baby fusses out of a study (e.g., the research assistant having a beard) or whether the baby truly attends to the stimuli (and thus produces the expected effect in the study).
Project Lead
Melissa Kline, Massachusetts Institute of Technology, United States [email]
Links
Contributors
We encourage everyone who is interested in the project to contact the Project Lead (see above) or fill out the MB Sign-Up Form.
Please note that access to infants or an infant lab is NOT a prerequisite.