ABSTRACT
Data collection in psychology increasingly relies on "open populations" of participants recruited online, which presents both opportunities and challenges for replication. Reduced costs and the possibility of accessing the same populations allow for more informative replications. However, researchers should ensure the directness of their replications by addressing the threats of participant nonnaiveté and selection effects.
Subjects
Researchers, Data Collection

ABSTRACT
Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature (three each from the domains of perception/action, memory, and language) and found that they are highly reproducible. Not only can they be reproduced in online environments, but they can also be reproduced with nonnaïve participants with no reduction in effect size. Apparently, some cognitive tasks are so constraining that they encapsulate behavior from external influences, such as the testing situation and recent prior experience with the experiment, yielding highly robust effects.
Subjects
Cognitive Science/standards, Executive Function/physiology, Psychological Inhibition, Language, Memory/physiology, Perception/physiology, Psychology/standards, Psychomotor Performance/physiology, Reproducibility of Results, Adult, Female, Humans, Male, Young Adult

ABSTRACT
Crowdsourcing data collection from research participants recruited from online labor markets is now common in cognitive science. We review who is in the crowd and who can be reached by the average laboratory. We discuss reproducibility and review some recent methodological innovations for online experiments. We consider the design of research studies and the ethical issues that arise. We review how to code experiments for the web, what is known about video and audio presentation, and the measurement of reaction times. We close with comments about the high levels of experience of many participants and an emerging tragedy of the commons.
Subjects
Cognitive Science, Crowdsourcing, Data Collection/methods, Humans, Reproducibility of Results

ABSTRACT
Although researchers often assume their participants are naive to experimental materials, this is not always the case. We investigated how prior exposure to a task affects subsequent experimental results. Participants in this study completed the same set of 12 experimental tasks at two points in time, first as part of the Many Labs replication project and again a few days, a week, or a month later. Effect sizes were markedly lower in the second wave than in the first. The reduction was most pronounced when participants were assigned to a different condition in the second wave. We discuss the methodological implications of these findings.
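The wave-to-wave comparison of standardized effect sizes described above can be sketched as follows. This is a minimal illustration using pooled-SD Cohen's d; the data are synthetic and purely illustrative, not values from the study.

```python
import statistics

def cohens_d(group_a, group_b):
    """Standardized mean difference (pooled-SD Cohen's d)."""
    na, nb = len(group_a), len(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

# Synthetic scores (hypothetical, for illustration only): the gap between
# conditions shrinks from wave 1 to wave 2, so d drops accordingly.
wave1_treatment = [5.1, 5.4, 5.9, 6.2, 5.7, 6.0]
wave1_control   = [4.0, 4.3, 4.1, 4.6, 4.2, 4.5]
wave2_treatment = [5.0, 5.2, 5.1, 5.4, 5.3, 5.2]
wave2_control   = [4.7, 4.9, 4.8, 5.0, 4.6, 4.9]

d_wave1 = cohens_d(wave1_treatment, wave1_control)
d_wave2 = cohens_d(wave2_treatment, wave2_control)
```

Comparing `d_wave1` and `d_wave2` across waves is one simple way to quantify the attenuation that repeated participation can produce.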
Subjects
Patient Participation/methods, Patient Selection, Adult, Female, Humans, Male

ABSTRACT
Crowdsourcing services, particularly Amazon Mechanical Turk, have made it easy for behavioral scientists to recruit research participants. However, researchers have overlooked crucial differences between crowdsourcing and traditional recruitment methods that provide unique opportunities and challenges. We show that crowdsourced workers are likely to participate across multiple related experiments and that researchers are overzealous in the exclusion of research participants. We describe how both of these problems can be avoided using advanced interface features that also allow prescreening and longitudinal data collection. Using these techniques can minimize the effects of previously ignored drawbacks and expand the scope of crowdsourcing as a tool for psychological research.
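One common way to prevent participation across related experiments is Mechanical Turk's qualification system: workers who complete one study are tagged with a qualification, and later studies require that the qualification not exist. The sketch below assumes the boto3 MTurk client; the qualification-type ID is hypothetical, and in practice you would create one with `create_qualification_type`.

```python
# Hypothetical qualification-type ID marking workers who already took part
# in a related study (created once via create_qualification_type).
PRIOR_PARTICIPANT_QUAL = "3XAMPLEQUALID"

def exclusion_requirement(qual_id):
    """Qualification requirement that hides a HIT from workers who
    already hold the 'prior participant' qualification."""
    return {
        "QualificationTypeId": qual_id,
        "Comparator": "DoesNotExist",
        # Tagged workers cannot discover, preview, or accept the task.
        "ActionsGuarded": "DiscoverPreviewAndAccept",
    }

requirement = exclusion_requirement(PRIOR_PARTICIPANT_QUAL)

# Usage with a boto3 MTurk client (requires AWS credentials; sketch only):
# import boto3
# mturk = boto3.client("mturk", region_name="us-east-1")
# mturk.create_hit(
#     Title="Follow-up study",
#     QualificationRequirements=[requirement],
#     # ... remaining HIT parameters ...
# )
# After each study, tag completers so later studies exclude them:
# mturk.associate_qualification_with_worker(
#     QualificationTypeId=PRIOR_PARTICIPANT_QUAL,
#     WorkerId=worker_id,
#     IntegerValue=1,
#     SendNotification=False,
# )
```

The same qualification mechanism supports the prescreening and longitudinal designs mentioned above: assigning a qualification to a fixed panel of workers lets you re-invite exactly that panel in later waves.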