News from the Accelerator – January 2019

It has been an exciting month for the PSA! You can also stay up to date by subscribing to the blog, following us on Twitter, signing up for our email list, or joining our Slack workspace. For January, we have updates on peer review decisions for our projects, in-progress studies, and new ways to communicate with PSA leadership and stay informed on our activities.

Favorable peer review decisions!

We have just received an in principle acceptance for “Investigating object orientation effects across 14 languages” at Psychonomic Bulletin and Review. Congratulations to Sau-Chin Chen, Anna Szabelska, and all of our co-authors! We will now plan for data collection and share updates soon.


We also recently received a favorable revise and resubmit decision for “Accelerated CREP: RRR: Turri, Buckwalter, & Blouw (2015)” at Advances in Methods and Practices in Psychological Science. Braeden Hall and Jordan Wagge have been leading the revision efforts and will resubmit the manuscript soon.

Lab recruitment for moral thinking study

We are recruiting labs for “On the universality of moral thinking: Cross-cultural variation in the influence of personal force and intention in moral dilemma judgments.” The current list of contributing labs is up to 105 sites representing 42 countries! This is a great start, but we still welcome additional labs to join the team. We will likely submit this project as a Registered Report, so the size and diversity of our team will strengthen our Stage 1 submission and chances of securing an in principle acceptance. 

Data collection

Data collection continues for “To which world regions does the valence-dominance model apply?” We have now collected data from over 2,000 participants across 48 labs. More labs and more languages of collection are going “live” every day, and we will continue the project through September. You can see demo versions of all active languages here.

Staying up to date on PSA activities

We have started an “on-deck” document that summarizes the various active projects and work of the PSA. If you are wondering what activities could use extra person-power or what the leadership team is working on, this document is a great place to look! We will try to make it more detailed and helpful in the coming days, as well as keep it as up to date as possible.

Chris will be hosting monthly Q&A sessions for PSA members via Google Hangouts. If you’ve been on the “sidelines”, are not sure how to get more actively involved in the PSA, or have any questions about our policies, processes, or current studies, please join an upcoming call to have your questions answered! The meeting times will vary considerably from month to month so that members across all of our time zones have opportunities to join. The first hangout will be Monday, February 25th, 17:00 UTC. You can join the call at this link (which we will also share via Twitter just prior to the meeting).

One last random note: in the PSA members’ roster spreadsheet, we recently switched all lab IDs to 3-digit ISO codes. Lab IDs that were being used for the faces study are being held constant in that specific tracking sheet and data collection process, but we will use the new ISO codes going forward.

Thank you for all of the continued efforts you have put into making the PSA a success!

Chris

 

 

News from the Accelerator – December 2018

Starting with this post, the PSA will be releasing monthly updates on the blog to keep everyone, PSA members and interested observers alike, informed of PSA activities. You can stay up to date by subscribing to the blog, following us on Twitter, signing up for our email list, or joining our Slack workspace. For December, we have updates on study selection, PSA policy creation, in-progress studies, and activity reports from several of our committees.

A New Study!

We have selected “On the universality of moral thinking: Cross-cultural variation in the influence of personal force and intention in moral dilemma judgments” as our next official PSA study! This proposal was submitted by Bence Bago (Toulouse School of Economics, France) and Balazs Aczel (Eotvos Lorand University, Hungary). Congratulations! You can read more about it in this just-published blog post.

PSA Policies

We are making progress on a wide range of PSA policies. This month, we focus on three:

  • Our code of conduct was approved by vote, and is now official PSA policy. You can read it here.
  • We now call for a vote on our membership policy document. PSA members will receive the draft policy via email and have until January 15th, 2019 to vote.
  • Finally, we welcome feedback on our draft Personnel, Appointments, and Elections Policy Document. Please provide your comments and suggested edits by January 15th, 2019.

Updates on Current Studies

We have 5 studies in various stages of preparation or implementation:

  • The face perception study has been accepted in principle at Nature Human Behaviour, we have translated materials into 19 languages, and data collection has begun in dozens of labs. You can check out a test page of the study in each language here.
  • The object orientation study is under review at Psychonomic Bulletin and Review as a Stage 1 Registered Report.
  • The gendered nature of prejudice study is still in the drafting phase as a Stage 1 Registered Report.
  • The Accelerated CREP study is under review at Advances in Methods and Practices in Psychological Science as a Stage 1 Registered Report.
  • The stereotype threat study has recruited 38 data collection labs and is in the planning stage.

Updates from Our Committees

Each month, we will also use the newsletter to highlight updates from at least a few PSA committees in the hopes that all members will have a better sense of ongoing PSA activities they are not directly involved in. This month we have updates from Project Management, Data and Methods, and Ethics.

  • The D&M Committee has been very active developing their own set of committee-specific policies and procedures. Recently, they have:

    (1) voted on and passed the section of the D&M bylaws laying out the formal roles on the D&M committee (i.e., what the Assistant Directors do, what the standing committee members do, and what ad-hoc members do);

    (2) revised our data management policy with an eye toward compliance with the Psych-DS project. Our hope is that the PSA can promote Psych-DS and serve as a testing ground for how well it works, helping Psych-DS gain traction as a discipline-wide standard for psychology datasets;

    (3) discussed ways for people to propose meta-science projects through D&M (we have put this on the back burner until we have solidified more of our policies);

    (4) worked on a formal document to identify the needs of PSA-approved projects; and

    (5) worked with Community Building to start building a database of lab-specific metadata.

  • The Project Management Committee has started to manage several ongoing projects and is in the process of making our policies official. Nick Coles and Chris are now project managers for the face perception study, Hannah Moshontz is the project manager for the stereotype threat study, and the committee is working to identify individuals interested in taking on a project management role for the object orientation study, the gendered nature of prejudice study, and the trolley problems study! (Email hmoshontz@gmail.com if you are interested!)
  • The Ethics Committee has drafted ethics review guidelines that we will use during the study selection process and will share them with the PSA network for feedback soon. The Ethics Committee has also begun pairing ethics committee members with each PSA study to serve as a dedicated contributor focused on IRB and ethical issues through the full life-cycle of each project.

Wrapping Up 2018

In 2017 we introduced the idea of the PSA, recruited nearly 200 labs to the network, and selected our first studies. In 2018 we built on our initial momentum, made substantial headway formalizing how the PSA works, and made exciting progress on several of our studies. A few highlights include publishing our introductory paper, securing our first in principle acceptance for the face perception study, submitting two more registered reports (still under review :)), selecting 2 new studies, formalizing our leadership team and committee structure, and more than doubling the size of the laboratory network. Thank you for the countless, and at times thankless, hours of work that all of you have contributed to make the PSA a success in 2018.

Here’s to an even more productive and exciting 2019 full of Accelerating Psychological Science!

Chris


New PSA Study! On the universality of moral thinking: Cross-cultural variation in the influence of personal force and intention in moral dilemma judgments.

We are excited to announce that we have selected “On the universality of moral thinking: Cross-cultural variation in the influence of personal force and intention in moral dilemma judgments” as our next official PSA study! This proposal was submitted by Bence Bago (Toulouse School of Economics, France) and Balazs Aczel (Eotvos Lorand University, Hungary). Congratulations!

The project focuses on people’s reasoning in moral dilemmas in which deontological perspectives (emphasizing individual rights) conflict with consequentialist reasoning (following the greater good). In an important and central experiment, Greene et al. (2009) investigated whether people are sensitive to the intention of agents when evaluating moral actions. They found that this is only the case when the action requires applying personal force. Yet their work could not explore the effect of a potentially important component: culture. Therefore, the goal of the present project is to empirically test the universality of the effect of intention on utilitarian and deontological responding by directly replicating Greene et al.’s experiments in non-WEIRD samples as well.

You can read the accepted proposal here: https://osf.io/d2ptq/

Stay tuned for more information on lab recruitment and project planning soon!

If you already know you would like to join this project, please email us (psysciaccelerator@gmail.com) to express your interest.


A Large-Scale, Multi-Site Examination of Stereotype Threat Across Varying Operationalizations. A Newly Selected Accelerator Study.

We are excited to announce that we have selected “A Large-Scale, Multi-Site Examination of Stereotype Threat Across Varying Operationalizations” as our next official PSA study! This proposal was submitted by Patrick S. Forscher (University of Arkansas), Valerie Jones Taylor (Lehigh University), Neil A. Lewis, Jr. (Cornell University), and Daniel Cavagnaro (California State University, Fullerton).

This project will test a core proposition of stereotype threat theory, which predicts that the possibility of confirming a negative stereotype can cause people to underperform on the very tasks on which they are stereotyped. Despite its immense theoretical and practical implications, the studies supporting stereotype threat theory suffer from small samples and varying operational definitions of the focal stereotype threat effect.

This project aims to resolve the first problem by leveraging the PSA’s large network of US-based labs to gather a large sample of African American students, larger than would be easily manageable with a single lab’s resources. For the second problem, the project proposes to use an adaptive design to find, among four procedures each for increasing and decreasing stereotype threat, the comparison that provides the best evidence for a performance difference (see the sketch below).
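To make the adaptive piece concrete, here is a minimal simulation sketch in Python of one way such a design can work, using Thompson-style sampling to steer participants toward the currently most diagnostic increase/decrease comparison. The condition labels, effect sizes, priors, and allocation rule are illustrative assumptions for exposition, not the procedure from the actual proposal.

    import numpy as np

    rng = np.random.default_rng(2019)

    # Hypothetical setup: four threat-increasing and four threat-decreasing procedures.
    labels = [f"increase_{i}" for i in range(1, 5)] + [f"decrease_{i}" for i in range(1, 5)]
    true_means = rng.normal(0.0, 0.3, size=8)  # unknown "true" effects on test performance

    counts = np.zeros(8)  # participants assigned to each procedure so far
    sums = np.zeros(8)    # summed (standardized) test scores per procedure

    def sample_posterior():
        """Draw one plausible mean per procedure: N(0, 1) prior, unit-variance outcomes."""
        post_mean = np.where(counts > 0, sums / np.maximum(counts, 1), 0.0)
        post_sd = 1.0 / np.sqrt(counts + 1.0)
        return rng.normal(post_mean, post_sd)

    for participant in range(2000):
        draws = sample_posterior()
        # Target the pair currently showing the largest performance gap ...
        inc = int(np.argmin(draws[:4]))      # procedure sampled to hurt performance most
        dec = 4 + int(np.argmax(draws[4:]))  # procedure sampled to help performance most
        # ... and randomize this participant to one arm of that comparison.
        arm = inc if participant % 2 == 0 else dec
        counts[arm] += 1
        sums[arm] += rng.normal(true_means[arm], 1.0)  # simulated test score

    means = sums / np.maximum(counts, 1)
    print("Most diagnostic comparison:",
          labels[int(np.argmin(means[:4]))], "vs", labels[4 + int(np.argmax(means[4:]))])

The appeal of this family of designs is that participants concentrate in the most informative comparison as evidence accumulates, rather than being split evenly across all eight procedures.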

You can read more about the selected project at its OSF project page, where we’ve deposited a PDF of the project proposal. If you’re interested in joining the project, fill out this form, where Patrick, Valerie, Neil, and Dan are already recruiting collaborating labs.

Congratulations to Patrick, Valerie, Neil, and Daniel! We can’t wait to work closely with them on this exciting project.

Update on the Accelerated CREP: Registered Replication Report Submission Coming Soon. More Labs Welcome to Join!

On October 22nd, we will submit a Stage 1 Registered Replication Report (peer review prior to data collection) manuscript to Advances in Methods and Practices in Psychological Science for our Accelerated CREP project, an RRR of Turri, Buckwalter, and Blouw (2015). You can read a preprint of the manuscript here.

We welcome public feedback on the manuscript, as well as new contributors who would like to join the project prior to our submission. Remember that all CREP projects are partly focused on training students, so we invite teams of graduate or undergraduate students, or their supervisors, to sign up here!

To be included in this Stage 1 submission, contributors will need to sign up, review the manuscript, suggest any necessary edits, commit to data collection for the project in 2019, and approve of the manuscript for submission to AMPPS by October 22nd. While teams can still join after the Stage 1 submission and earn co-authorship on the final manuscript, the more sites we have involved by the Stage 1 submission, the stronger our project will be, and the more likely we are to secure an in principle acceptance!

You can also read an updated version of the original blog post announcing the Accelerated CREP project below. Updates appear in bold and italics.

-The Accelerated CREP Team

Plato suggested knowledge was “justified true belief.”

 

(updated original post below)

The Accelerated CREP

The Collaborative Replication and Education Project (CREP) and the Psychological Science Accelerator are partnering on a project for the 2018-2019 replication season. The mission of the Accelerator is to accelerate the accumulation of reliable and generalizable evidence in psychological science. The mission of the CREP is to improve undergraduate training through crowdsourced replication. We think these two missions can be pursued in tandem.

The CREP (http://osf.io/wfc6u) is a crowdsourced replication project designed for undergraduate researchers. We invite students to replicate a study from a pre-selected list of studies chosen because they are both highly cited and feasible for undergraduates to complete. Contributors receive CREP approval by demonstrating research transparency at both the pre- and post-data-collection stages, maximizing the value of their replication data for future meta-analyses. Once there are enough samples to draw meaningful conclusions from the data, all contributors are encouraged to collaborate on a research paper. Since launching in 2013, over 350 students have started over 100 replication projects. We have one manuscript in press (Leighton et al., 2018), two more nearing completion (Wagge et al.; Ghelfi et al.), and several more that still need more data, listed here.

The Psychological Science Accelerator is a more recent crowdsourced project, and though it is similar to the CREP in many ways, it is also more advanced in development and scope (Moshontz et al., 2018). It is a network of 364 laboratories that democratically select studies and then conduct them on a global scale. The major difference is that the CREP is specifically focused on involving undergraduates in the educational component of replication science, while the Accelerator is primarily focused on accelerating psychological science more generally, though educators can certainly work with undergraduates on all Accelerator projects.

The CREP and Accelerator have decided to coordinate a pilot test of an “Accelerated CREP” study. This pilot will evaluate the feasibility of the Accelerator formally adding an undergraduate education component, via the CREP, on a more regular basis. It is also an opportunity for the CREP to extend its contributor pool beyond its historical audience and to complete data collection for one CREP study much more quickly than normal. Among the Accelerator’s 364 active laboratories, we imagine that a subset of PIs would like to either implement the CREP as part of their research methods courses or work with undergraduate researchers on the Accelerated CREP who would benefit from taking “ownership” of a project and contributing to a large-scale collaboration outside of the typical Accelerator workflow.

For this partnership, Accelerator members, and one or more undergraduate researchers they supervise, are invited and encouraged to work through the CREP process for a single study between January 1, 2019 and January 1, 2020. While the CREP typically runs studies through the US academic year, doing so in this case would prevent many non-US labs from participating equally. Where possible, we recommend that contributing teams assign a student as the contact person to interact with the CREP Review team, so that students experience all aspects of the research process.

After submitting a Registered Replication Report (RRR) proposal to Advances in Methods and Practices in Psychological Science (AMPPS), we have been invited to submit this collaborative replication of Experiment 1 of Turri, Buckwalter, and Blouw (2015) as a Stage 1 RRR. AMPPS also suggested some methodological changes to broaden the generality of our findings, which we have since incorporated into our protocol. If the Stage 1 RRR is accepted, AMPPS will also call for additional contributors through APS.

Here we give an overview of the full process for the first “Accelerated CREP” study, which differs in a few notable ways from the standard operating procedures of the Accelerator and CREP.

Phase 1 (Submission and Evaluation; complete). The CREP team conducted their normal study selection process for the 2018/2019 academic year (now the 2019 calendar year). Each year, the CREP team selects one to three new studies to add to the list of available studies. They identify the top three or four cited articles in the top journal in nine sub-disciplines, then code those studies for feasibility and potential student interest. This year they selected one new study: Turri, Buckwalter, & Blouw (2015), “Knowledge and Luck”, Experiment 1 (https://osf.io/n5b3w/), with a target N = 100 for each local site.

Phase 2 (Preparation). The CREP invites Direct (or close) replications from contributors. As such, the protocol should match that of the published article. If AMPPS accepts the Stage 1 RRR manuscript, it will serve as the primary protocol by which contributing labs design their independent replications.

For advanced students, the CREP invites Direct+Plus replications (i.e., direct replications “plus” an extension), which involve the addition of one or more independent or dependent variables that are collected after (or independently of) the original study.

Experiment 1 of the Turri et al. study is exciting to replicate because the methods are fully available in the publication and can be administered via computer. Further, there may be interesting moderator variables that could be tested across multiple labs (e.g., education, perception of luck vs. ability, culture).

The CREP asks contributors to create and edit an OSF page (https://osf.io/srh4k/) and provide their materials, ethics approval, and a video of their procedure. For the Accelerated CREP, we hope to recruit undergraduates from 50 or more locations to join the project. We already have 35 institutions signed up to contribute, and we expect more!

Phase 3 (Implementation). CREP contributors submit their project page for review by the CREP team twice: once before and once after data collection. The pre-data review verifies that the contributor is meeting CREP standards for ethics, materials, and procedure. For the post-data review, the CREP team reviews the presentation of the data and results to verify that the data are usable in the aggregate multilevel analyses (a sketch of what such an analysis can look like appears below).
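As a rough illustration of what that aggregate analysis can look like once the local data sets are collated, here is a minimal sketch in Python using statsmodels. The file name and column names (lab, condition, knowledge_attribution) are hypothetical placeholders, not the project’s actual codebook or analysis plan.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical collated file: one row per participant, pooled across all labs.
    data = pd.read_csv("accelerated_crep_collated.csv")

    # A random intercept for each contributing lab lets the condition effect be
    # estimated while accounting for between-site variation in baseline responses.
    model = smf.mixedlm("knowledge_attribution ~ condition",
                        data=data, groups=data["lab"])
    print(model.fit().summary())

One reason the post-data review matters is that a pooled model like this one assumes the collated rows line up across sites, which requires consistent variable naming and coding in every local data set.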

Review teams, each including two faculty members and the student administrative advisor, David Redman, will be led by one of the CREP’s experienced Executive Reviewers, Dr. Jordan Wagge. Faculty on contributing teams will be invited to serve as reviewers on other CREP contributors’ projects in order to ensure high-quality replications.

Phase 4 (Dissemination). Because the CREP is decentralized, the local data sets are posted publicly in order to go through the post-data review. Contributors are encouraged to present their findings at conferences, but because no single replication provides definitive results, the data are collated for the drafting and submission of a manuscript reporting the combined findings. In contrast to normal CREP procedure, we invite potential authors to indicate their interest in authorship at the beginning rather than the end of the project. Braeden Hall will act as first author and coordinating author for the RRR, under the guidance of graduate advisor and lead Executive Reviewer Jordan Wagge.

The organizers of this partnership consider this a tentative relationship, which we will re-evaluate for possible future implementation. In the meantime, come along and join us as we Accelerate a CREP study.

Jon Grahe (CREP Project Leader & Pacific Lutheran University)

Christopher R. Chartier (Accelerator Director & Ashland University)




Additional pronunciation guidance and derivation from Jon Grahe:

“Why CREP rhymes with Grapes”

The grape metaphor for replication science

A bunch of grapes, all from the same DNA, is a wonderful metaphor for replication science. Grapes from the same bunch, or different bunches from the same vine, all share DNA, but they are rarely, if ever, identical. They differ from each other in a number of ways, much as replications differ from each other and from an original study. Because of growing conditions and contextual differences, grapes can differ in size, color, and ripeness. All the same DNA, but still different.

Like grapes, replications also differ in size, color, and ripeness. Size is the easiest metaphor to recognize: some researchers have more access or resources to collect data from more participants. Color reflects all the diversity in the application of a replication, not just the diversity of participants and experimenters, but also the time of day, the decorations of the lab, the educational culture on campus, and all the other variables that make research laboratories unique. Finally, ripeness reflects age and experience, certainly applicable in research, as replications are conducted both by experimenters exploring the task for the first time and by professionals who have completed more studies than they can remember.