News from the Accelerator – October 2019

Hi all,

This month we have progress to report on our latest round of study selection, our 6 current studies, a policy document up for a vote, and an invitation to join a new PSA press team!

Vote on the Analysis Plan Approval Policy

We now call for a vote on our analysis plan approval policy. All members will receive an email asking them to vote yes or no on this proposed policy.

Study Selection Update

We are now reviewing the 11 submissions we received in response to our 2019 call for studies. Initial feasibility checks are happening now (and have actually been completed for several submissions). Peer review requests will be made in the coming weeks. To participate in this process in any way (viewing, rating, or reviewing submissions), you will need to become an official PSA member through our new member website here: https://member.psysciacc.org/. The more the merrier!

Progress on All 6 Current Studies

PSA 001 is nearing completion and we will submit the stage 2 manuscript to Nature Human Behaviour, release exploratory data, and update our preprint, all on October 31.

PSA 002/003 are moving quickly now with lots of recent activity on both translation and data collection. Several teams have even completed collection for these studies!

PSA 004 is also in a period of rapid progress with many teams finalizing their materials and a handful of sites already collecting data.

PSA 005 recently received some wonderful news, with a very favorable revise and resubmit decision from Nature Human Behaviour on the stage 1 registered report manuscript. The lead team is working on the revisions now.

PSA 006 is back under review as a stage 1 registered report at Nature Human Behaviour after its own revise and resubmit decision. We hope to be hearing some good news soon!

Would You Like to Join PSA’s Press Team?

The PSA is looking for people who would like to help with press issues. This is a great opportunity to either join the network or become more involved.
Potential tasks of the press team include:

  • Draft press releases
  • Contact bloggers, journalists, and media outlets with PSA updates
  • Organize contacts in PSA member university press offices
  • Write plain language summaries of PSA projects (e.g., for In-Mind, The Conversation, Psych Today, and other outlets that accept contributed articles)
  • Write Twitter threads, Facebook posts, etc.
  • Eventually coordinate TED-style talks

We will always provide source information about the PSA and its current studies, so even people who are unfamiliar with the PSA’s structure and goals, or are not quite sure what is going on in the PSA, are welcome to volunteer! If you’re interested, please send an email to psysciaccelerator@gmail.com.

News from the Accelerator – September 2019

August was a very productive and exciting month for the PSA. Here we summarize last month’s progress and activities, preview some exciting things coming in the near future, and make a few small requests of all PSA members.

PSA Member Site

Until now, we have been organizing information about PSA members in a shared Google Sheet. This worked OK for a while, but as our network grew it became disorganized, unwieldy, and error-prone. Luckily for us, Erin Buchanan has been working hard on a new membership website (complete with a member database), and it is now ready for all PSA members to register and provide some information. Eventually, we hope the site can become a “one stop shop” where each PSA member can find information about the status and next steps of any PSA studies they have joined, and any PSA tasks that need new contributors.

For now, we are asking all members of the PSA to create an account on the site, log in, and complete the member information form. Erin also put together this awesome tutorial video to walk you through the process!

2 Policy Documents are Open for PSA-wide Feedback

Vision Statement

We have drafted and released a PSA vision statement. This document is a precursor to a more complete 5-year strategic plan to be drafted in 2020. We welcome feedback this month before editing and ultimately calling for a vote to make this statement a PSA policy.

Analysis Plan Approval Policy

In future projects, the Data and Methods (DM) committee intends to review the project analysis plan before the project is registered and/or submitted for peer review. This process is meant to ensure that all necessary components (sample size justification, analysis scripts, etc.) are present in the analysis plan when it is submitted for external review. Furthermore, the approval process should provide some minimum reassurance that major methodological issues and flaws in the analysis plan have been addressed before the launch of data collection, in the very rare cases that such flaws escape the awareness of the co-author team. In line with the proposed guidelines for PSA policy proposals, the DM committee has sought and incorporated feedback from the PSA director and associate directors. We now invite all members of the PSA to provide feedback on the document, which can be found here.

The document will be open for feedback for one week from the posting of this newsletter. After this period, the DM committee will incorporate the feedback and then submit the policy for approval by a vote of the PSA Directors and all members of the PSA.

Policy Document up for a Vote

We now call for a vote on our “meta-policy” document! It describes how current PSA policies can be amended and how members can propose new policies. All members will receive a separate email asking them to vote yes or no on this proposed policy document.

The PSA Just Turned 2!

August 26th marked two years of accelerating psychological science. To celebrate, we had a flurry of productive hackathons and published 6 blog posts on exciting new developments and future plans for the PSA.

Seeking Additional Assistant Director of the Project Monitoring Committee

We are calling for applications for a second Assistant Director of the Project Monitoring Committee, to serve alongside Hannah Moshontz. You can read more and apply here.

2 PSA Members Suggest a Theory Committee

Peder Isager and Nick Coles have made an intriguing suggestion – that the PSA could use a theory committee – in a recent blog post. Have a read and let them know what you think!

Introducing the PSA001 Secondary Analysis Challenge

We are offering up to 10 awards of $200 to research teams that use data from our first study (PSA001) to follow an analysis pipeline meeting our specifications. We describe the background, rationale, and details below. 

Background

Psychology datasets contain a wealth of information, including dozens, hundreds, and sometimes even thousands of variables. Datasets that are well-documented can be even richer, as appropriate documentation can allow these datasets to be merged with secondary information (or meta-data), exponentially expanding the universe of possible analyses.

Although some researchers use publicly posted data in their research, we believe the potential of secondary analyses is, as yet, largely untapped. Some of this untapped potential may result from the typical structure of a psychology dataset release. In the best case, the dataset is described in an article in a journal (such as the Journal of Open Psychology Data or Scientific Data). In the worst case, the dataset is undocumented and available only on request (if at all). We believe we can do better to make our datasets maximally informative.

Phased dataset release (with incentives)

Our test case for improving the data release process is PSA001, a project testing whether the valence-dominance model of face perception generalizes across world regions. The primary dataset contains ratings from over 11,000 participants across 11 world regions, 48 countries, and 28 languages. Each participant rated 120 faces twice on one of 13 traits. In addition to these ratings, we have access to datasets containing various meta-data. These include datasets of participant characteristics (such as race and gender, for some locations only), site characteristics (such as world region and institutional affiliation), and characteristics of the faces that were rated (such as the gender of the face, picture luminance, and the size of various facial features).
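To make the merging idea concrete, here is a minimal sketch in R. The column names and toy values below are illustrative assumptions, not the project’s actual codebook:

```r
# Illustrative only: the column names and values are assumptions,
# not the PSA001 codebook.
library(dplyr)

# Toy stand-ins for the primary ratings and the face-level meta-data.
ratings <- tibble::tribble(
  ~lab_id, ~participant_id, ~face_id, ~trait,        ~rating,
  "lab01", "p001",          "f001",   "trustworthy", 5,
  "lab01", "p001",          "f002",   "trustworthy", 3,
  "lab02", "p002",          "f001",   "dominant",    6
)

face_metadata <- tibble::tribble(
  ~face_id, ~face_gender, ~luminance,
  "f001",   "female",     0.62,
  "f002",   "male",       0.48
)

# Because both tables share a documented key (face_id), the ratings
# can be enriched with stimulus-level meta-data in a single join.
ratings_enriched <- left_join(ratings, face_metadata, by = "face_id")
print(ratings_enriched)
```

The same pattern extends to the participant- and site-level meta-data, each joined on its own documented key.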

In our release of this dataset, we are following the lead of other high-quality data releases by carefully curating and documenting our datasets. However, we are adding an extra innovation: we are structuring the release in a way that we think will maximize the value of the resulting secondary analyses. Specifically, we are releasing separate exploratory and confirmatory segments of the data and incentivizing the use of these separate segments by offering up to 10 awards of $200 to research teams who complete the analysis pipeline of exploring with the exploratory segment, confirming with the confirmatory segment, and sharing the results on PsyArXiv.

The details

The data release plan for this project consists of three phases: release of a simulated dataset (to give people not directly involved in the project time to understand the variables we collected), release of an exploratory segment (⅓ of the full dataset), and release of a confirmatory segment (the full dataset). We will stratify by lab when creating our exploratory and confirmatory segments; in other words, we will randomly sample ⅓ of the participants within each lab that contributed data to create the exploratory segment. The full dataset will demarcate the exploratory and confirmatory segments. All data drops will occur at randomly selected UTC times between 12am and 11pm.
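As a rough illustration of this stratified split (a sketch only, not the actual PSA001 pipeline), the sampling could look like the following in R, continuing with the toy `ratings` table from the sketch above:

```r
# A sketch of the stratified split described above; the real pipeline
# may differ. Assumes a data frame `ratings` with columns `lab_id`
# and `participant_id`, as in the earlier toy example.
library(dplyr)

set.seed(20190831)  # a fixed seed makes the split reproducible

# Sample 1/3 of the participants *within each lab* for the
# exploratory segment.
exploratory_ids <- ratings %>%
  distinct(lab_id, participant_id) %>%
  group_by(lab_id) %>%
  slice_sample(prop = 1 / 3) %>%
  ungroup()

# All rows belonging to the sampled participants.
exploratory_segment <- ratings %>%
  semi_join(exploratory_ids, by = c("lab_id", "participant_id"))

# The full dataset, released later, labels each row's segment so the
# exploratory and confirmatory portions stay demarcated.
full_dataset <- ratings %>%
  left_join(mutate(exploratory_ids, segment = "exploratory"),
            by = c("lab_id", "participant_id")) %>%
  mutate(segment = coalesce(segment, "confirmatory"))
```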

We will provide up to 10 awards of $200 each for research teams that make secondary contributions using the exploratory and confirmatory datasets. If more than 10 teams submit contributions, the winners will be chosen at random. To be eligible, a research team must:

  • Write a computationally reproducible script that analyzes the exploratory dataset. The script may be written in any data analysis software, but we strongly encourage the use of open-source software such as R. (A minimal skeleton of such a script appears after this list.)
  • Post the script to a project on the Open Science Framework and create a date-stamped preregistration of the script using OSF preregistrations. The proposing teams can use a preregistration template, such as this one for secondary data analysis, or they can use an open-ended preregistration that contains only the script that the team will use to analyze the confirmatory segment and a date stamp. At the top of the script, the proposing team should write their names and the following text: “I commit to analyzing the confirmatory segment of PSA001 Social Faces using this script upon the project’s release”. The date stamp of the preregistration must be before 12pm UTC, November 30, 2019, the deadline for submitting preregistered analyses (see the key dates below). The script will be checked for computational reproducibility by a member of the PSA’s Data and Methods Committee.
  • After the release of the confirmatory segment, post a preprint to PsyArXiv detailing the results of the analyses of the exploratory and confirmatory segments. To be eligible for the award, the preprint must be date-stamped by 12pm UTC, January 31, 2020. For the purposes of winning the award, the preprint may be very brief – tables or figures illustrating the results, along with some descriptive text, are sufficient. However, if the research team wishes, the preprint may be more detailed. The PsyArXiv preprint should be tagged with the study code for this project: “PSA001”.
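To give a feel for these requirements, here is a minimal, hypothetical skeleton of such a script. The team names and file name are placeholders; only the commitment text is prescribed above:

```r
# PSA001 Secondary Analysis Challenge -- analysis script (hypothetical skeleton)
# Team: A. Author, B. Author  (placeholder names)
#
# "I commit to analyzing the confirmatory segment of PSA001 Social Faces
#  using this script upon the project's release"

library(dplyr)

# Placeholder file name; the released exploratory segment lives on OSF.
exploratory <- read.csv("psa001_exploratory_segment.csv")

# ... the team's preregistered analysis goes here ...

# Recording the session details makes the reproducibility check easier.
sessionInfo()
```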

Before issuing the awards, members of the Data and Methods Committee will verify that these steps have been followed.

Below are the key dates of this data release plan:

  1. The simulated dataset, along with a codebook, will be released (posted on OSF, tweeted, Facebooked, and blogged) on August 31, 2019, 24:00 UTC, 8pm EDT (so it’s available today!!). It, along with detailed documentation of the dataset, is available at this OSF page.
  2. The exploratory segment can be found here, and was posted on October 31, 2019, concurrent with the submission of this project’s Stage 2 Registered Report.
  3. The preregistered analyses should be submitted via this form by November 30, 2019, 12pm UTC.
  4. The confirmatory segment will be released concurrently with the publication of the Stage 2 paper at Nature Human Behaviour.
  5. The preprint should be posted within one month of the release of the confirmatory segment.  Once posted, the preprint can be submitted to the PSA001 team via this form.

If you have questions about this process, or the data that we have available, contact the PSA001 data manager (Patrick S. Forscher) at schnarrd@gmail.com.

Conclusion

We hope this project can serve as an exemplar of how the details of data release can add value to the scientific knowledge generated from a particular dataset. We hope you consider participating in our Secondary Analysis Challenge so we can see if this is indeed the case.

The PSA’s Draft Vision Document is Open for Feedback

We are releasing our draft vision document for feedback from anyone. You do not need to be a PSA member to comment, and we welcome your input via comments in the Google Doc or via email (psysciaccelerator@gmail.com).

Soon after founding the PSA, we collectively established our mission statement and core values, selected our first studies, and drafted initial policies for how we conduct our projects. Then we got to work! We focused on the most pressing needs of planning and conducting our studies and setting up our organizational and governance structures. We now need to also focus on long-term planning to ensure the PSA is a sustainable project that works best for psychological science and its members.

The current document represents an early step in our formal planning process in two ways: 1) it will undergo several rounds of feedback from PSA members prior to being voted on as possible PSA policy, and 2) if it is ratified it will form just one portion of a more detailed strategic plan to be drafted in 2020 and then reassessed annually thereafter.

Before we can draft a full strategic plan that includes more specific activities, goals, funding strategy, and a projected budget, we need broad community buy-in on the ideas and basic plans outlined in the vision document. To that end, the draft is now open for feedback from anyone (PSA member or not) during the entire month of September 2019. The PSA Directors will edit the document in response to this feedback during October, and we will put the revised document up for a vote by the PSA network during November.

If the vision statement is not ratified during that vote, we will continue to iteratively solicit feedback, edit, and vote until a version has been ratified. If it is ratified during the November vote, it will be published on the PSA website in December. We will then begin the more detailed strategic planning process. The 5-year strategic plan will follow the same timeline in 2020 as this vision document is following in 2019. The Directors will draft the plan January through August, we will solicit feedback in September, we will edit in October, and all PSA members will vote in November. The process of assessing, revising, seeking feedback, and ultimately voting on the strategic plan will occur annually thereafter following the same month-by-month schedule.

Thank you in advance for any feedback you provide!

Recognizing Invisible Labor in the PSA

One of the biggest threats to the sustainability of large-scale collaborations is the rewards system for scholarly labor. Scholarly labor is rewarded with credit for scholarly products. Typically, people scan authorship lists to decide whom to credit, with the lion’s share going to the first- and last-listed authors.

This disproportionate assignment of credit serves as a disincentive for collaborating on papers with long author lists: if the bulk of the credit goes to the first and last authors, middle authorship is only worthwhile when earned through a minimal investment of effort. This disincentive weighs especially heavily on multi-site projects, which require dozens, if not hundreds, of people to pull off successfully.

The problem is still harder in a standing organization like the PSA.  Maintaining an organization requires a lot of administrative labor – labor that is often invisible if there aren’t mechanisms in place to surface it.  In academia, we see this invisible labor problem on a small scale in the typical treatment of academic lab managers: these people are necessary to produce the science that graces academic journals, but are seldom, if ever, properly credited.

Given that the success of the PSA depends on ongoing administrative labor, we would like to find ways to disrupt the perverse incentive structure that inhibits standing collaborations in academia.  The PSA already endorses a contributorship model of awarding credit for scholarly products and uses the CRediT taxonomy to show project contributions.  We will also be unveiling the results of some ongoing initiatives to implement CRediT when we share the results of our first study.  

One additional initiative that we’ve started is to award stipends for the administrative roles that people fill for PSA studies.  For now, we will only be giving six of these stipends, and the stipends themselves are relatively small – $400 – but, if we are able to secure regular sources of funding for the PSA, paying our staff is one of our highest priorities.  You can help us increase the size and number of these stipends by donating to our Patreon (and you can read more about our ongoing Patreon campaign here).

Finally, we want to make some of the invisible contributions to the PSA more visible by highlighting some of the contributors.  Below, we profile five of these people, each at a different place in their scientific career, all of whom have made outstanding contributions to the PSA and its projects.  Only by recognizing the value of contributions like theirs will we subvert the system that heaps credit on the few at the expense of the many.

Profiles of five major contributors to the PSA

Nicholas Coles

Nicholas Coles is a fifth year Social Psychology PhD student at the University of Tennessee.  He’s a member of both the Project Monitoring and Community Building committees at the PSA.


Nick has been a rockstar Project Monitor for PSA001 (face perception). As the first project monitor for any PSA project, he has helped define what project monitors do and has made a myriad of tracking sheets, forms, and other materials that have served as templates for similar tools in other PSA projects. As the PSA001 project monitor, Nick has also served as the primary point of contact for an author list of nearly 200 people. Nick’s efforts have helped produce a data collection effort spanning 11,000 participants, 48 countries, and 28 languages. As if that weren’t enough, Nick also developed a web application that displays an interactive map of the PSA network.

If you want to find out more about Nick and his work, check out his website.

Anna Szabelska

Anna finished her PhD in Cognition at Queen’s University Belfast and is looking for her next adventure. She was recently accepted to the NASA Datanauts project, a program that applies data science methods to NASA datasets. This program, along with her work for the PSA, is the coolest thing that’s happened in her professional life.


Anna has been an exceptionally active member of the PSA from the very start. She was a founding member of the Data and Methods Committee and helped draft its bylaws. She has also co-led a project to create a standard psychology dataset format (Psych-DS), which would enable projects ranging from automated meta-analysis to standardized data analysis tools. She has also played a critical role on PSA002 (object orientation), and in that capacity, she has helped define what methodologists do in PSA studies. One of the more interesting outcomes from that work is a novel meta-science initiative – prediction markets to determine how accurate experts are in predicting whether the object orientation effect replicates in a given language. Finally, Anna has shared her enthusiasm for the PSA with anyone who is willing to listen, giving official talks on the PSA for RLadies Dublin, Women Who Code, and Google Women Techmakers, and co-organizing a PSA workshop and unconference.

You can find more about Anna (including her CV) at her LinkedIn profile.

Jeremy Miller

Jeremy Miller is a Professor of Psychology at Willamette University in Salem, OR. Jeremy is on sabbatical and has generously devoted a chunk of his sabbatical time to working with the PSA.


Jeremy is the Project Monitor for two PSA projects, PSA002 (object orientation) and PSA003 (gendered prejudice), which have been bundled together for the purpose of efficient data collection. Jeremy coordinates communications between the many, many parties involved in pulling off a project involving 49 labs, 16 languages, and two separate projects, with an eye toward ensuring all parties adhere to the PSA’s ever-evolving policies. His effectiveness in this role is informed by his experience as the head of a data collection lab for PSA001 (face perception). Jeremy also serves on the Project Monitor committee, working with Project Monitors across PSA projects to ensure that the PSA uses its resources effectively and efficiently. 

You can read more about Jeremy and his lab at his lab website.

Marton Kovacs

Marton is a second-year master’s student at Eötvös Loránd University, Hungary, and is a huge fan of both the PSA and the broader movement to improve psychological research. He is planning to apply to PhD programs to do meta-scientific research, focusing specifically on ways to increase research efficiency by minimizing human error.


Even though Marton is relatively early in his scientific career, he has played a critical role in swiftly moving PSA006 (moral dilemmas) from conception to a submitted Registered Report. As Data Manager, he drafted the data management plan that will ensure that the data from more than 130 participating labs are credible, transparently shared, and efficiently collected. He is also helping develop tools to solve some of the unique problems that large-scale collaborations face. This includes the problem of properly crediting contributions mentioned in this post: Marton is developing a Shiny app that helps authors create human- and machine-readable contributorship information, so that the “invisible contributions” to multi-author projects are made visible.
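To illustrate the underlying idea (a sketch only, not Marton’s actual app), a single tidy table of contributions can be rendered both for humans and for machines; the author names below are placeholders:

```r
# Sketch of machine- and human-readable contributorship records; the
# author names are placeholders and the roles come from the CRediT taxonomy.
library(dplyr)

contributions <- tibble::tribble(
  ~author,     ~credit_role,
  "Author A",  "Conceptualization",
  "Author A",  "Writing - original draft",
  "Author B",  "Data curation",
  "Author B",  "Formal analysis"
)

# Human-readable contributorship statement, one line per author.
contributions %>%
  group_by(author) %>%
  summarise(roles = paste(credit_role, collapse = ", "), .groups = "drop") %>%
  mutate(line = paste0(author, ": ", roles)) %>%
  pull(line) %>%
  cat(sep = "\n")

# Machine-readable export of the same information.
write.csv(contributions, "contributorship.csv", row.names = FALSE)
```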

You can find out more about Marton on his website.

Sophia Christin Weissgerber

Sophia C. Weissgerber recently started a postdoc position in cognitive psychology at the University of Kassel in Germany. She is the survey manager for PSA004 (true belief), a partnership with the Collaborative Replication and Education Project (CREP), which uses multi-site replications to provide training and professional growth experiences for students and instructors. Since Sophia is a huge fan of the CREP project, she likes to give her third-semester students real-world, hands-on research experience, for example through a replication of Griskevicius et al. (2010).


Sophia performed a feat of programming wizardry for PSA004, implementing an experimental design in the online platform SocSciSurvey for a project involving 54 sites (and counting), 26 countries, and 12 languages. Each site requires a unique survey and survey link, so Sophia also coordinates extensively with the participating sites to personalize each survey. This labor is critical for the project’s success. Sophia also enjoys participating in other PSA projects, e.g., PSA001 (face perception). She is really excited about the PSA community and its work (“super-awesome,” in her words), and together with Hans IJzerman, Rick Klein, and Anna van ‘t Veer, she is currently working on a non-technical primer on how to conduct code review in psychological science.

You can read more about Sophia and her work on her website.