News from the Accelerator – January 2020

Happy New Year from the PSA! The end of 2019 was full of progress on many exciting fronts, and we have big plans for the first quarter of 2020. Here we summarize the most important and potentially actionable items for PSA members or other observers.

2019 By the Numbers

Our network now includes 760 researchers, representing 548 labs, in 72 countries. Our website received 37,242 visits in 2019 (66,334 total since October 2017), and our preprints have now been downloaded 4,954 times collectively. Check out this great new map from Nicholas Coles:

[World map of the PSA network]

Recent Coverage

The PSA has been covered in a few media outlets in the past several months. Check out this excellent article, which first appeared in UnDark and was later picked up by NPR.

Additionally, this short radio blurb has quotes from Chris Chartier, Jessica Flake, and Eric Hehman, and McGill University recently featured Jessica Flake’s new grant award and planned measurement invariance project through the PSA.

2019 Project Summaries and Status Updates

Our main focus in 2019 was making progress on all 6 of our selected studies. Each project made big strides.

  • 001 Face Perception: In 2019, we completed the first PSA project (PSA001), which involved 214 authors and 11,481 participants from 11 world regions and 41 countries. The Stage 2 Registered Report is currently under review at Nature Human Behaviour. For this project, we also launched the Secondary Analysis Challenge, which grants 10 awards of $200 to research teams that create and execute a pre-registered re-analysis of the project data. So far, this secondary challenge has received 8 submissions, all of which have been checked for computational reproducibility.
  • 002 Object Orientation and 003 Gendered Prejudice: 2019 was a busy year for the 002 & 003 team. We translated our materials into 16 different languages and implemented our procedure across 19 countries. We have completed data collection at 23 different sites, with more sites continuing their data collection into the spring. We hope to have the Stage 2 Registered Report manuscript under review at Psychonomic Bulletin & Review in 2020.
  • 004 True Belief: The Accelerated CREP collaboration is really picking up steam – 39 teams are collecting data (and more are welcome to join in here — there is information at the top of the form with helpful links). We plan to wrap up data collection around June and work on the final manuscript shortly thereafter, to be reviewed as a Stage 2 Registered Report at Advances in Methods and Practices in Psychological Science. There are still plenty of opportunities to get involved. Contact jordan.wagge@avila.edu or crep.psych@gmail.com for more information.
  • 005 Stereotype Threat: In 2019, we drafted the initial submission of our Stage 1 Registered Report for Nature Human Behaviour and received a strong revise and resubmit. We also submitted a revised version of the manuscript and are still waiting to hear back about the revision’s status. Finally, we recruited 27 collaborating labs to join the project, all of whom have secured IRB approval. If you wish to join too, you can read more about the project, and sign up as a collaborator, here.
  • 006 Moral Thinking: In 2019, this project’s team also drafted their initial submission of a Stage 1 Registered Report and submitted it to Nature Human Behaviour, received a positive revise and resubmit, submitted a revision, received another small R&R :), and are completing the new round of revisions now. We have 147 labs signed up to collect data from 18,637 potential participants. Most labs are ready to test the link and collect data! If you are interested in joining, you can sign up by filling out this form. Every lab is welcome, but we are specifically searching for collaborators who could collect data from at least 100 participants in India, South Korea, Japan, or Thailand. The experiment will be run online, so collaborators simply have to send out a link; participants are not required to come to the lab.

Ratified Policy

The Analysis Plan Approval Policy is now ratified by vote of the network. This policy was initiated by the Data and Methods Committee, and its drafting and editing were led by Peder Isager. Thank you for the great work on this!

Funding Search Update

  • Synergy grant submitted. Patrick Forscher and Hans IJzerman led the drafting of a Synergy Grant, a large, €10 million grant administered by the European Research Council. The grant seeks to greatly expand team science in the social sciences by establishing three Synergy Centers: the Evidence Synthesis Center led by Denny Borsboom, the Tools and Standards Center led by Lisa DeBruine, and the De-WEIRDing Center led by Hans IJzerman. The grant would deeply involve the PSA and provide it a substantial sum of discretionary money to be used as the PSA sees fit. Synergy Grants have three stages of review. Hans and Patrick will hear the results of the first stage in April.
  • National Science Foundation grant to be submitted this week. Chris Chartier, Neil Lewis, Jr., Heather Urry, Charlie Ebersole, and Hannah Moshontz have drafted a proposal to the NSF (our third try 🙂 at this one) that will be submitted on the 15th. If funded, the grant would support the hiring of a dedicated PSA project manager, allowing expanded data collection and a more efficient overall workflow and study completion.
  • John Templeton Foundation grant to be submitted this week. Charlie Ebersole and Chris Chartier have drafted a proposal to the JTF that will be submitted on the 17th. If funded, the grant would support the hiring of several dedicated PSA staff members to focus on collecting non-WEIRD samples for studies addressing questions of interest to the JTF’s human sciences division.

A Discussion on the PSA and Meta-Research

Peder M. Isager and Marcel van Assen hosted a discussion session titled How can meta-research improve the Psychological Science Accelerator (PSA) and how can the PSA improve meta-research? at the 2019 Meta-research day in Tilburg (https://bit.ly/2sjzU3b). The majority of the session was devoted to discussing intersections between the meta-research field and the PSA. The discussion is summarized in this blog post.

Actionable Items to Kick Off 2020

Study Selection

We are on the verge of making selection decisions for 8 submissions to the PSA. All members who have created login credentials at our membership site can now access the pdf copies of these submissions and provide their ratings and feedback. These forms will only be open for a week (until midnight on the 20th in the last time zone on earth) to allow quick conclusion of this round of study selection. We think you’ll have fun looking at and evaluating these excellent submissions!

We also received a very interesting but quite atypical submission that we are collating feedback on. In response to our last call for studies, a research team submitted a proposal that is not so much a specific study as an intriguing way in which we might select and develop a future study. The SSC found that it had promise, but that it didn’t fit our typical model for submission review and selection. So, the SSC Assistant Directors decided to pull the submission from the standard review track and instead begin a PSA-wide conversation to consider and eventually decide by consensus (or perhaps vote) whether we should implement the proposal. Members, please provide your initial feedback by joining the conversation currently ongoing in our general Slack channel (scroll up to the thread beginning on November 27th).

Resource Capacity Draft Policy Ready for Feedback

We (led by Patrick Forscher) have drafted a policy that lays out how the PSA thinks about the resources that affect its capacity to run new studies. The development of this policy was inspired in part by questions from potential funders as to whether the PSA would be willing to run studies about specific topics. The PSA does not, at present, have guidelines for these decisions. The policy therefore also seeks to lay out these guidelines. You can find a draft of the policy here.

Elections and Appointments

One Associate Director position and several Assistant Director positions will be up for election or appointment in the first quarter of 2020. We will first hold a full network vote for the Associate Director seat. Subsequently, the new line-up of Associate Directors and the Director will vote to appoint the new Assistant Directors. For now, you can consider and prepare for 3 things.

  1. Nominations for running for Associate Director: Any PSA member may run for this seat. We will be sending out a nomination form where people can nominate themselves or nominate someone else. We will confirm with all nominees as to whether or not they’d like to run.
  2. Volunteers to be Election Tellers: Each PSA election is overseen by one Associate Director and three Election Tellers. Tellers will observe all of the actions of the Associate Director (in this election, Charlie Ebersole) to confirm that the election is being run fairly and accurately. We will be seeking volunteers to serve as Tellers for the upcoming election; any PSA member can volunteer.
  3. Prepare to Vote: consider the performance of the current leadership team, and think about what you would most want to see out of new or returning members of this team in 2020 and beyond.

Onward in 2020!

Thank you for all that you do. You can stay informed and in the conversation on Slack and by checking out events on the PSA Google Calendar. As always, we are overflowing with gratitude for all that you’ve collectively given to the PSA and excitement for what we can achieve together in 2020 and beyond.

Chris

(We also wanted to pass along this cool collaboration opportunity below, being organized and led by PSA members)

The Transparent Psi Project is looking for collaborators for data collection

Zoltan Kekecs and Balazs Aczel (members of the PSA Methods Committee) are leading this project, an expert-consensus-based replication of one of Bem’s 2011 precognition studies. The project features state-of-the-art methods to maximize transparency and study integrity.

The study involves a computerized experiment taking about 20-30 minutes per session. Group testing in a computer lab is possible; no specialized equipment is needed. Labs are expected to recruit at least 100 participants. Participants will be exposed to images with explicit erotic/sexual content during the experiment. No financial compensation is required for participants.

Data collection is expected to take place in the spring 2020 semester and, if needed, the fall 2020 semester. All materials for ethics/IRB submissions and data collection are provided in English (collaborators may need to translate the materials).

The study is pre-registered, and the manuscript has been accepted in principle for publication (IPA) in the journal Royal Society Open Science. Collaborators in data collection receive authorship on the paper.

Sign up here

Preprint of the Stage 1 Registered Report here

With questions contact the lead PI: kekecs.zoltan@gmail.com

 

News from the Accelerator – October 2019

Hi all,

This month we have progress to report on our latest round of study selection, our 6 current studies, a policy document up for a vote, and an invitation to join a new PSA press team!

Vote on the Analysis Plan Approval Policy

We now call for a vote on our analysis plan approval policy. All members will receive an email asking them to vote yes or no on this proposed policy.

Study Selection Update

We are now reviewing the 11 submissions we received in response to our 2019 call for studies. Initial feasibility checks are happening now (and have actually been completed for several submissions). Peer review requests will be made in the coming weeks. To participate in this process in any way (viewing, rating, or reviewing submissions), you will need to become an official PSA member through our new member website here: https://member.psysciacc.org/. The more the merrier!

Progress on All 6 Current Studies

PSA 001 is nearing completion and we will submit the stage 2 manuscript to Nature Human Behaviour, release exploratory data, and update our preprint all on October 31.

PSA 002/003 are moving quickly now with lots of recent activity on both translation and data collection. Several teams have even completed collection for these studies!

PSA 004 is also in a period of rapid progress with many teams finalizing their materials and a handful of sites already collecting data.

PSA 005 recently received some wonderful news, with a very favorable revise and resubmit decision from Nature Human Behaviour on the stage 1 registered report manuscript. The lead team is working on the revisions now.

PSA 006 is back under review as a stage 1 registered report at Nature Human Behaviour after its own revise and resubmit decision. We hope to be hearing some good news soon!

Would You Like to Join PSA’s Press Team?

The PSA is looking for people who would like to help with press issues. This is a great opportunity to either join the network or become more involved.
Potential tasks of the press team include:

  • Draft press releases
  • Contact bloggers, journalists, and media outlets with PSA updates
  • Organize contacts in PSA member university press offices
  • Write plain language summaries of PSA projects (e.g., for In-Mind, The Conversation, Psych Today, and other outlets that accept contributed articles)
  • Write Twitter threads, Facebook posts, etc.
  • Eventually coordinate TED-style talks

We will always provide source information about the PSA and its current studies, so even people who are unfamiliar with the PSA’s structure and goals, or who are not sure what is going on in the PSA, are welcome to volunteer! If you’re interested, please send an email to psysciaccelerator@gmail.com.

News from the Accelerator – September 2019

August was a very productive and exciting month for the PSA. Here we summarize last month’s progress and activities, some exciting new things coming in the near future, and make a few small requests of all PSA members.

PSA Member Site

We have been organizing information about PSA members in a shared Google Sheet until now. This worked OK for a while, but as our network grew it became disorganized, unwieldy, and error-prone. Lucky for us, Erin Buchanan has been working hard on a new membership website (complete with member database), and it is now ready for all PSA members to register and provide some information. Eventually, we hope the site can become a “one stop shop” where each PSA member can find information about the status and next steps of any PSA studies they have joined, and about any PSA tasks that need new contributors.

For now, we are asking all members of the PSA to create an account on the site, log in, and complete the member information form. Erin also put together this awesome tutorial video to walk you through the process!

2 Policy Documents are Open for PSA-wide Feedback

Vision Statement

We have drafted and released a PSA vision statement. This document is a precursor to a more complete 5-year strategic plan to be drafted in 2020. We welcome feedback this month before editing and ultimately calling for a vote to make this statement a PSA policy.

Analysis Plan Approval Policy

In future projects, the Data and Methods (DM) committee intends to review the project analysis plan before the project is registered and/or submitted for peer review. This process is meant to ensure that all necessary components (sample size justification, analysis scripts, etc.) are present in the analysis plan when it is submitted for external review. Furthermore, the approval process should provide some minimum reassurance that major methodological issues and flaws in the analysis plan have been addressed before the launch of data collection, in the very rare cases that such flaws escape the awareness of the co-author team. In line with the proposed guidelines for PSA policy proposals, the DM committee has sought and incorporated feedback from the PSA director and associate directors. We now invite all members of the PSA to provide feedback on the document, which can be found here.

The document will be open for feedback for one week from the posting of this newsletter. After this period, the DM committee will incorporate feedback and then submit the policy for approval by a vote of the PSA Directors and all PSA members.

Policy Document up for a Vote

We now call for a vote on our “meta-policy” document! It describes how current PSA policies can be amended and how members can propose new policies. All members will receive a separate email asking them to vote yes or no on this proposed policy document.

The PSA Just Turned 2!

August 26th marked two years of accelerating psychological science. To celebrate, we had a flurry of productive hackathons and published 6 blog posts on exciting new developments and future plans for the PSA.

Seeking Additional Assistant Director of the Project Monitoring Committee

We are calling for applications for a second Assistant Director of the Project Monitoring Committee, to serve alongside Hannah Moshontz. You can read more and apply here.

2 PSA Members Suggest a Theory Committee

Peder Isager and Nick Coles have made a quite interesting suggestion – that the PSA could use a theory committee – in a recent blogpost. Have a read and let them know what you think!

Introducing the PSA001 Secondary Analysis Challenge

We are offering up to 10 awards of $200 to research teams that use data from our first study (PSA001) to follow an analysis pipeline meeting our specifications. We describe the background, rationale, and details below. 

Background

Psychology datasets contain a wealth of information, including dozens, hundreds, and sometimes even thousands of variables. Datasets that are well-documented can be even richer, as appropriate documentation can allow these datasets to be merged with secondary information (or meta-data), exponentially expanding the universe of possible analyses.

Although some researchers use publicly posted data in their research, we believe the potential of secondary analyses is, as yet, untapped. Some of this untapped potential may result from the typical structure of a psychology dataset release. In the best case, the dataset is described in an article in a journal (such as the Journal of Open Psychology Data or Scientific Data). In the worst, the dataset is undocumented and only available on request (if at all). We believe we can do better to make our datasets maximally informative.

Phased dataset release (with incentives)

Our test case for improving the data release process is PSA001, a project to test whether the valence-dominance model of face perception generalizes across world regions. The primary dataset contains ratings from over 11,000 participants across 11 world regions, 48 countries, and 28 languages. Each participant rated 120 faces twice on one of 13 traits. In addition to these ratings, we have access to datasets containing various meta-data. These include datasets of participant characteristics (such as race and gender, available for some locations only), site characteristics (such as world region and institutional affiliation), and characteristics of the faces that were rated (such as the gender of the face, picture luminance, and the size of various facial features).
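
To make this structure concrete, here is a minimal sketch of how the ratings could be joined with the face-level meta-data in R (the software we encourage below); all file and column names (`ratings.csv`, `face_metadata.csv`, `face_id`, and so on) are hypothetical placeholders rather than the actual release files.

```r
library(dplyr)

# Hypothetical stand-ins for the curated PSA001 release files
ratings   <- read.csv("ratings.csv")        # one row per rating: participant_id, lab, face_id, trait, rating
face_meta <- read.csv("face_metadata.csv")  # one row per face: face_id, face_gender, luminance, ...

# Attach the stimulus characteristics to every rating so they can serve
# as predictors or moderators in a secondary analysis
ratings_with_meta <- left_join(ratings, face_meta, by = "face_id")
```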

In our release of this dataset, we are following the lead of other high quality data releases by carefully curating and documenting our datasets.  However, we are adding an extra innovation: we are structuring the release in a way that we think will maximize the value of the resulting secondary analyses.  Specifically, we are releasing separate exploratory and confirmatory segments of the data and incentivizing the use of these separate segments by offering up to 10 awards of $200 to research teams who complete the analysis pipeline of exploring with the exploratory segment, confirming with the confirmatory segment, and sharing the results on PsyArXiv.

The details

The data release plan for this project consists of three phases: release of a simulated dataset (to allow people not directly involved in the project time to understand the variables we collected), release of an exploratory segment (⅓ of the full dataset), and release of a confirmatory segment (the full dataset). We will stratify by lab when creating our exploratory and confirmatory segments; in other words, we will randomly sample ⅓ of the participants within each lab that contributed data to create the exploratory segment. The full dataset will demarcate the exploratory and confirmatory segments. All data drops will occur at randomly selected UTC times between 12am and 11pm.
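
To illustrate what stratifying by lab means in practice, here is a minimal sketch of how a one-third-per-lab exploratory segment could be drawn in R; the object and column names (`full_data`, `lab`, `participant_id`) are hypothetical stand-ins, and the actual split is of course made by the PSA001 team.

```r
library(dplyr)

set.seed(2019)  # hypothetical seed; fixing it makes the split reproducible

# full_data: hypothetical data frame with one row per rating and columns
# `lab` and `participant_id` identifying where each participant was tested
exploratory_ids <- full_data %>%
  distinct(lab, participant_id) %>%  # one row per participant
  group_by(lab) %>%
  sample_frac(1/3) %>%               # draw 1/3 of participants within each lab
  ungroup()

# The exploratory segment keeps only the sampled participants' ratings;
# the later confirmatory release is the full dataset
exploratory_segment <- semi_join(full_data, exploratory_ids,
                                 by = c("lab", "participant_id"))
```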

We will provide up to 10 awards of $200 each for research teams that make secondary contributions using the exploratory and confirmatory datasets. If more than 10 teams submit contributions, the winners will be chosen at random. To be eligible, a research team must:

  • Write a computationally reproducible script that analyzes the exploratory dataset. The script may be written in any data analysis software, but we strongly encourage the use of open-source software such as R (a minimal sketch of one possible script structure follows this list).
  • Post the script to a project on the Open Science Framework and create a date-stamped preregistration of the script using OSF preregistrations.  The proposing teams can use a preregistration template, such as this one for secondary data analysis, or they can use an open-ended preregistration that only contains the script that the team will use to analyze the confirmatory segment and a date stamp.  At the top of the script, the proposing team should write their names and the following text: “I commit to analyzing the confirmatory segment of PSA001 Social Faces using this script upon the project’s release”. The date stamp of the preregistration must be before 12pm UTC, November 30, 2019, which is the point at which the confirmatory segment will be released.  The script will be checked for computational reproducibility by a member of the PSA’s Data and Methods Committee.
  • After the release of the confirmatory segment, post a preprint to PsyArXiv detailing the results of the analyses of the exploratory and confirmatory segments. To be eligible for the award, the preprint must be date-stamped by 12pm UTC, January 31, 2020. For the purposes of winning the award, the preprint may be very brief: tables or figures illustrating the results, along with some descriptive text, are sufficient. However, if the research team wishes, the preprint may be more detailed. The PsyArXiv preprint should be tagged with the study code for this project: “PSA001”.
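
For concreteness, here is a minimal sketch of what a computationally reproducible submission script might look like; the file name, variable names, and the analysis itself are hypothetical, and teams should of course substitute their own preregistered question.

```r
# I commit to analyzing the confirmatory segment of PSA001 Social Faces
# using this script upon the project's release.  -- [proposing team names]
library(dplyr)

set.seed(1)  # fix any randomness so the script reproduces exactly

# Hypothetical file name for the exploratory (and later confirmatory) segment
ratings <- read.csv("psa001_exploratory_segment.csv")

# Hypothetical secondary question: do mean trait ratings differ across
# world regions?
results <- ratings %>%
  group_by(trait, world_region) %>%
  summarise(mean_rating = mean(rating, na.rm = TRUE),
            n_ratings   = n())

write.csv(results, "secondary_analysis_results.csv", row.names = FALSE)
```

Running the same script unchanged on the confirmatory segment and writing up both sets of results in a brief preprint tagged “PSA001” would then complete the pipeline described above.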

Before issuing the awards, members of the Data and Methods Committee will verify that these steps have been followed.

Below are the key dates of this data release plan:

  1. The simulated dataset, along with a codebook, will be released (posted on OSF, tweeted, Facebooked, and blogged) on August 31, 2019, 24:00 UTC, 8pm EST (so it’s available today!!). It, along with detailed documentation of the dataset, is available at this OSF page.
  2. The exploratory segment can be found here, and was posted on October 31, 2019, concurrent with the submission of this project’s Stage 2 Registered Report.
  3. The preregistered analyses should be submitted via this form by November 30, 2019, 12pm UTC.
  4. The confirmatory segment will be released concurrently with the publication of the Stage 2 paper at Nature Human Behaviour.
  5. The preprint should be posted within one month of the release of the confirmatory segment.  Once posted, the preprint can be submitted to the PSA001 team via this form.

If you have questions about this process, or the data that we have available, contact the PSA001 data manager (Patrick S. Forscher) at schnarrd@gmail.com.

Conclusion

We hope this project can serve as an exemplar of how the details of data release can add value to the scientific knowledge generated from a particular dataset. We hope you consider participating in our Secondary Analysis Challenge so we can see if this is indeed the case.

The PSA’s Draft Vision Document is Open for Feedback

We are releasing our draft vision document for feedback from anyone. You do not need to be a PSA member to comment, and we welcome your input via comments in the Google Doc or via email (psysciaccelerator@gmail.com).

Soon after founding the PSA, we collectively established our mission statement and core values, selected our first studies, and drafted initial policies for how we conduct our projects. Then we got to work! We focused on the most pressing needs of planning and conducting our studies and setting up our organizational and governance structures. We now also need to focus on long-term planning to ensure the PSA is a sustainable project that works best for psychological science and its members.

The current document represents an early step in our formal planning process in two ways: 1) it will undergo several rounds of feedback from PSA members prior to being voted on as possible PSA policy, and 2) if it is ratified it will form just one portion of a more detailed strategic plan to be drafted in 2020 and then reassessed annually thereafter.

Before we can draft a full strategic plan that includes more specific activities, goals, funding strategy, and a projected budget, we need broad community buy-in on the ideas and basic plans outlined in the vision document. To that end, the draft is now open for feedback from anyone (PSA member or not) during the entire month of September 2019. The PSA Directors will edit the document in response to this feedback during October, and we will put the revised document up for a vote by the PSA network during November.

If the vision statement is not ratified during that vote, we will continue to iteratively solicit feedback, edit, and vote until a version has been ratified. If it is ratified during the November vote, it will be published on the PSA website in December. We will then begin the more detailed strategic planning process. The 5-year strategic plan will follow the same timeline in 2020 as this vision document is following in 2019. The Directors will draft the plan January through August, we will solicit feedback in September, we will edit in October, and all PSA members will vote in November. The process of assessing, revising, seeking feedback, and ultimately voting on the strategic plan will occur annually thereafter following the same month-by-month schedule.

Thank you in advance for any feedback you provide!