News from the Accelerator – August & September 2020

Today’s newsletter is double-sized, summarizing PSA activities from both August and September. It was a fun and busy 61 days, with an election, some exciting news on publications, our first conference, and much more, all detailed below. As always, thank you for everything you do to accelerate psychological science. Happy reading!!

Associate Director Election Results

We had two excellent candidates in our election for an open Associate Director position: Peter Mallik and John Protzko. Please join us in congratulating John Protzko as our new Associate Director following an extremely close network vote. We look forward to seeing the vision and energy that John will bring to this new role. We also thank Peter for running, and are thrilled to continue working closely with him in other PSA roles! 

Get to know John Protzko: He is a metascientist at the University of California, Santa Barbara. He participated in data collection for PSA001 and is the PM on PSA006. His research interest is in causality as it applies to people and the implications for theory in fields such as cognitive development, social perception, and metascience. He. Loves. Food. And will travel the world for it.

PSA 2020 Conference

The first ever PSA conference is in the books, and we had tons of fun! Thank you so much to all of the organizers, presenters, moderators, hackers, and attendees for making this first meeting a success. Over the span of three days, presenters shared studies and research on a wide variety of topics, and attendees from all over the world tuned in to listen. If you weren’t able to make it this year, or had to miss a session, the talks were recorded and are available to watch here. All videos that were successfully recorded have been uploaded (although some have been lost 😢 ).

Whether or not you attended the conference, we would like your feedback via this short survey! Each person in our network has a valuable perspective on the conference, and we would like to learn more about it as we consider the possibility of hosting another one in the future.

Study Updates

  • PSA 001: The Stage 2 (final) paper was accepted at Nature Human Behaviour with only minor formatting edits required before publication!! This marks the true completion of our first-ever study at the PSA.
  • PSA 002/003: The 002 and 003 teams are in the final stages of moving our experiments from OpenSesame/Qualtrics to lab.js. Thanks to the hard work of Merle Schuckart, we have drafts of the new programs and are in the process of ensuring that the web versions of the studies retain fidelity to the original protocols and troubleshooting the studies. The new version of the experiment should fully integrate both the 002 and 003 procedures into one online protocol, greatly simplifying future data collection.
  • PSA 004: Data collection is ongoing and will wrap up at the end of the year! We do, however, need more CREP reviewers as we close out the project. CREP reviewers look over each lab’s OSF page to ensure that each lab is running the study as planned. If you or someone you know is interested, please fill out this form.
  • PSA 005: We have finished our latest revision of the Stage 1 manuscript based on our communications with the editor. This should be the last revision before Stage 1 acceptance. The newest revision was submitted on Wednesday!
  • PSA 006: 006 will continue data collection through the end of 2020. We have collected lots of great data from all of our members and are excited to see this study through to the end.
  • PSA 007: The PSA’s newest project, Semantic Priming Across Many Languages (SPAML), is just getting started. Learn more by checking out the presentation from the conference: click here! The words2manylanguages part of the project is currently recruiting all kinds of help. We are trying to create a repository of datasets to use as the criterion data for the modeling section of this project. We have downloaded a number of them, but we need to clean them into a consistent format (file naming, synchronized CSV column names, etc.); a sketch of what that kind of cleanup looks like appears after this list. Anyone who is interested in helping will be credited via CRediT (this task, plus reviewing the paper, would count toward authorship). You can email buchananlab@gmail.com. Join Slack or email us to be kept updated on the two other pre-projects; more announcements on those are coming soon. All three will need to be completed as part of the design for the full 007 project.
  • PSACR 001/002/003: Data collection is complete for PSACR 001 and 003, and PSACR 002 data collection is ending on October 23rd. A summary of data collected is available here http://formr.psysciacc.org/shiny/shiny_app/!
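To make the words2manylanguages cleanup task mentioned above more concrete, here is a minimal, purely illustrative sketch of standardizing downloaded CSV files to one shared column schema. The file names, column names, and target schema below are invented; the actual datasets and conventions are defined by the 007 team.

```python
# Purely illustrative: standardize a folder of downloaded CSV datasets to one
# shared column schema. File names and column mappings here are invented; the
# real datasets and target schema are defined by the PSA 007 team.
from pathlib import Path
import pandas as pd

# Hypothetical per-dataset mappings from original column names to a shared schema.
COLUMN_MAPS = {
    "dataset_a.csv": {"word": "cue", "RT": "rt_ms", "subj": "participant_id"},
    "dataset_b.csv": {"CUE": "cue", "latency": "rt_ms", "id": "participant_id"},
}

def standardize(raw_dir: str, out_dir: str) -> None:
    """Rename each dataset's columns to the shared schema and write cleaned copies."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for name, mapping in COLUMN_MAPS.items():
        df = pd.read_csv(Path(raw_dir) / name)
        df = df.rename(columns=mapping)
        # Keep only the shared columns so every cleaned file has the same layout.
        df = df[list(dict.fromkeys(mapping.values()))]
        df.to_csv(out / name, index=False)

if __name__ == "__main__":
    standardize("raw_datasets", "cleaned_datasets")
```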

Director’s Statement Installments 

Chris is releasing a Director’s Statement in individual posts/installments to solicit feedback and ultimately guide priorities and activities for the remainder of his Directorship with the PSA. You can read the opening letter here and see the planned section titles with tentative release dates here!

Community Building Update

At the end of December 2020, Crystal Steltenpohl will be stepping down as an Assistant Director of the Community Building and Network Expansion Committee. Her announcement reads:

“I look forward to seeing what new energy the next assistant director will bring. I have really enjoyed working with Natalia Dutra on this committee, whose goals of diversity, inclusion, and community building in the PSA are close to my heart. Of course, even as a non-assistant director, I will still be involved and support the CBNEC. The director group is so awesome and open to feedback. I don’t usually look forward to committee meetings but the all hands meetings are fun and productive and informative. The general PSA meetings remind me of why I got into science. We have an awesome, collaborative community full of talented people from across the globe.

While I have no control over who takes my place, I really hope folks from non-US/CA/EU areas consider going for the position. Having someone from these geographic regions in a leadership role, especially for this committee, is crucial for PSA’s growth and development. I will miss being CBNEC assistant director but I am SO excited about seeing where new leadership can take this committee and the PSA more generally.

If you think you might be even kinda sorta interested in this position, please DM me on Slack or email me at cnsteltenp@usi.edu! I’d love to chat about the group and brainstorm on where it can go from here!”  

Data and Methods Update

We have some pretty exciting things happening in the Data and Methods Committee! We have added a new member to our ranks, Will McAuliffe, and we are ecstatic to get to work more with him! We’ve started up a subcommittee on metascience that our very own Nick Fox is leading. Lastly, we are in the midst of a flurry of activity and progress on the measurement study related to PSA 006.

Secondary Analysis Challenge

A year after the Secondary Analysis Challenge was announced, we have posted the results and submissions! The Challenge was a success: we received eight submissions and learned something new from each of them. Each submission script was checked for computational reproducibility by two members of our team (Abigail Noyce and Patrick Forscher). To read more, please visit the blog post!

IT’S THE MAP!!!

Nicholas Coles has updated our membership map and it looks sick! We continue to grow as a network day by day, but it had been some time since we last updated the map. The previous map had us at 760 researchers; we are now up to 1,021 and counting. Head over to the website to take a gander at how much we have grown!

Cool articles: 

Incentivizing Discovery through the PSA001 Secondary Analysis Challenge

Patrick S. Forscher, Abigail Noyce, Lisa M. DeBruine, Benedict C. Jones, 

Jessica K. Flake, Nicholas A. Coles, Christopher R. Chartier

Please cite this blog post as:

Forscher, P. S., Noyce, A., DeBruine, L. M., Jones, B., Flake, J. K., Coles, N. A., & Chartier, C. R. (2020, May 26). Incentivizing Discovery through the PSA001 Secondary Analysis Challenge.  https://psysciacc.org/2020/09/14/incentivizing-discovery-through-the-psa001-secondary-analysis-challenge/

On September 1, 2019, we announced a new initiative to promote the re-use of a large dataset we collected for the PSA’s first study. Called the Secondary Analysis Challenge, this initiative started from the premise that much of the potential richness in psychological datasets is currently untapped.

The Challenge aimed to tap that richness using two methods. The first method was relatively standard: we created extensive documentation of the project data and meta-data. The second method was a bit more unusual: we provided incentives to research teams who were willing to follow an analysis pipeline that we thought would minimize the chance of false positives and maximize the chance of true discoveries.

Specifically, with the help of a generous donation out of Hans Rocha IJzerman’s book income, we were able to offer $200 awards to up to 10 teams that:

  • Submitted a preregistered script that analyzed an exploratory segment of our data
  • Revised the script in response to a computational reproducibility check
  • Used the revised script to analyze a confirmatory segment of the data that was separate from the exploratory data
  • Posted a preprint to PsyArXiv with a brief description of the results

What happened?

We received a total of eight submissions to the challenge. Each submission consisted of a link to a preregistered script that conducted an analysis on the exploratory segment of the PSA001 data.

Two members of our team (Abigail Noyce and Patrick Forscher) checked the submitted scripts for computational reproducibility. These checks involved running the submitted analysis script on one of our computers. We focused primarily on ensuring that the scripts ran without error, but we also sometimes commented on other issues, such as unusual parameter estimates and possible bugs. We did not comment at all on the validity or theoretical grounding of the analyses; in most cases we did not even have access to this information.
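For readers curious what such a check can look like in practice, here is a rough, hypothetical sketch. The folder layout, the assumption of R scripts run via Rscript, and the timeout are all illustrative assumptions, not the team's actual procedure.

```python
# Hypothetical sketch of a computational reproducibility check: run each submitted
# analysis script in a fresh process and record whether it finishes without error.
# The folder name and the assumption of R scripts run via Rscript are illustrative.
import subprocess
from pathlib import Path

SUBMISSIONS = Path("submissions")  # hypothetical folder of submitted scripts

def check_script(script: Path) -> dict:
    """Run one analysis script and capture its exit status and error output."""
    result = subprocess.run(
        ["Rscript", str(script)],          # swap in the right interpreter per script
        capture_output=True, text=True, timeout=60 * 60,
    )
    return {
        "script": script.name,
        "ran_without_error": result.returncode == 0,
        "stderr_tail": result.stderr[-2000:],  # keep the end of any error output
    }

if __name__ == "__main__":
    for report in (check_script(s) for s in sorted(SUBMISSIONS.glob("*.R"))):
        status = "OK" if report["ran_without_error"] else "FAILED"
        print(f"{report['script']}: {status}")
        if status == "FAILED":
            print(report["stderr_tail"])
```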

If the computational reproducibility check led to changes in the code, we asked the submitting teams to add a comment to the top of the revised script listing the changes. These revised scripts were uploaded to the project OSF page.

After these computational reproducibility checks, we released the confirmatory segment of the data. We asked the teams to use the revised scripts to analyze the confirmatory segment. To receive the award, the proposing team also had to write a brief preprint reporting the results and post it to PsyArXiv. We did not impose any strong requirements on the contents of these preprints; we merely asked that they be made public.

Carlota Batres, one of our Challenge participants, mentioned specifically that the Challenge structure provided her with opportunities she would not otherwise have had. “I hope there will be more initiatives like this one which leverage the collaborative power of the PSA,” she commented.

What are the preprints like?

Most, but not all, of the preprints rely on connecting the PSA001 face rating data to some other secondary dataset, such as the Chicago Face Database, the Project Implicit Demo Website datasets, or the UN Migration database. Aside from that similarity, the preprints varied substantially in structure, format, and content. Some are fairly brief descriptions of results. Some are submission-ready articles with publication-quality figures. The topics range from an investigation of the “halo effect” to the possible link between face perception and race and gender IAT scores averaged across regions. The Challenge seems to have elicited a diverse set of submissions bearing on a wide range of scientific topics.

We think the preprints cover many interesting topics. However, no one within the PSA reviewed the preprints themselves for scientific veracity. You can help ensure that we advance science through this exercise by looking through the preprints yourself and providing comments and feedback to the authors. To facilitate this process, we have included links and descriptions of the preprints at the end of this post.

Lessons for Future Challenges

We think this Challenge was a success. It also holds some lessons for people who wish to facilitate their own similar initiatives:

  • Communicate a clear schedule. Clarity of communication helps keep the Challenge process predictable for the participants.
  • Conduct computational reproducibility checks. The computational reproducibility checks required real effort; I estimate that I spent at least an hour and a half per submission, including running code, debugging, and communicating with submitters. However, the checks uncovered multiple issues and sometimes led to substantive revisions to the analysis plan. They were effortful but worthwhile.
  • Enrich the target dataset with meta-data. Meta-data are data about the data — in other words, the time of data collection, information about the stimuli, and other details about the data collection process. These meta-data are important for primary studies, but are crucial for secondary analyses. In this Challenge, we archived a huge amount of meta-data alongside our primary data and documented these in our data management plan. These meta-data greatly facilitated the task of generating interesting exploratory analyses.
  • Build pipelines to other datasets. The extensive documentation of the PSA001 data made it easy to merge our dataset with other interesting datasets. In fact, we included Chicago Face Database ID variables in the data that we released and explicitly noted in our data management plan how to merge our data with the Chicago Face Database (see the illustrative sketch after this list). This is another step that, I think, allowed the Challenge to be generative for exploratory analysis.
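As a concrete illustration of that last point, the sketch below shows the kind of join a secondary analyst could perform. The file and column names are placeholders, not the released variable names; those are documented in the data management plan.

```python
# Illustrative only: join released face-rating data to the Chicago Face Database
# using a shared stimulus identifier. File and column names are placeholders.
import pandas as pd

ratings = pd.read_csv("psa001_ratings.csv")      # hypothetical released ratings file
cfd = pd.read_csv("chicago_face_database.csv")   # hypothetical CFD metadata file

merged = ratings.merge(
    cfd,
    on="cfd_target_id",        # placeholder name for the shared face ID variable
    how="left",
    validate="many_to_one",    # many ratings per face, one CFD row per face
)
print(merged.shape)
```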

Our Challenge format should not be viewed as a substitute for other forms of peer review. By design, the Challenge did not evaluate the merits of the theoretical logic, the analysis plan, or criteria for inference. Hopefully, these are issues that peer reviewers evaluate if and when these proposals are submitted for consideration at scientific journals.

Overall, we view the Challenge format as a promising supplement that can enhance the scientific value of large datasets. We look forward to observing other innovations people adopt to enhance the value of psychological data.

Appendix: Secondary Analysis Challenge Preprints

Preprint 1: PSA001 Secondary Analysis: Examining the “attractiveness halo effect”

Authors: Carlota Batres

Abstract: Research has found that attractiveness has a positive “halo effect”, where physically attractive individuals are ascribed with socially desirable personality traits. Most of the research on this “attractiveness halo effect”, however, has been conducted using Western samples. Therefore, this report aims to examine the “attractiveness halo effect” across eleven world regions using thirteen ratings on faces, including attractiveness, that the Psychological Science Accelerator network collected. We found that for both male and female faces, attractiveness generally correlated positively with the socially desirable traits and negatively with the socially undesirable traits. More specifically, across all eleven world regions, individuals rated as more attractive were rated as more confident, emotionally stable, intelligent, responsible, sociable, and trustworthy as well as less weird. These results replicate previous findings of the “attractiveness halo effect” in Western samples and suggest that the positive effect of attractiveness can be found cross-culturally.

Preprint 2: Is facial width-to-height ratio reliably associated with social inferences? A large cross-national examination

Authors: Patrick Durkee and Jessica Ayers

Abstract: Previous research suggests that facial width-to-height ratio (fWHR) may be associated with behavioral tendencies and social judgments. Mounting evidence against behavioral links, however, has led some researchers to invoke evolutionary mismatch to explain fWHR-based inferences. To examine whether such an explanation is needed, we leveraged a large cross-national dataset containing ratings of 120 faces on 13 fundamental social traits by raters across 11 world regions (N = 11,481). In the results of our preregistered analyses, we found mixed evidence for fWHR-based social judgments. Men’s fWHR was not reliably linked to raters’ judgments for any of the 13 trait inferences. In contrast, women’s fWHR was reliably negatively associated with raters’ judgments of how dominant, trustworthy, sociable, emotionally stable, responsible, confident, attractive, and intelligent women appeared, and positively associated with how weird women appeared. Because these findings do not follow from assumptions and theory guiding fWHR research, the underlying theoretical framework may need revising.

Preprint 3: Variance & Homogeneity of Facial Trait Space Across World Regions [PSA001 Secondary Data Analysis]

Authors: Sally Xie and Eric Hehman

Abstract: This preregistration is part of the PSA secondary analysis challenge. We investigate how the facial ‘trait space’ shifts across countries and world regions, using the PSA_001 dataset shared by the Psychological Science Accelerator. The facial trait space refers to the interrelationships between many of the trait impressions that people infer from faces. Here, we examine whether this trait space is more homogeneous (or less differentiated) in some cultures than others.

Preprint 4: Hester PSA001 Preregistration Preprint—Region- and Language-Level ICCs for Judgments of Faces

Authors: Neil Hester and Eric Hehman

Abstract: We report the results of preregistered analyses of the PSA001 face perception data. We tested whether the target-level intra-class correlations (ICCs) would be higher in specific regions (i.e., more culturally homogeneous samples) than in the global data set (i.e., a less culturally homogeneous sample). We also report perceiver-level ICCs as well as by-trait perceiver- and target-level ICCs.

Preprint 5: Do regional gender and racial biases predict gender and racial biases in social face judgments?

Authors: DongWon Oh and Alexander Todorov

Abstract: Trait impressions from faces are more simplified for women than men. This bias stems from gender stereotypes; when strong stereotypes exist for a group of faces (e.g., of women’s or Blacks’), they are evaluated more positively/negatively when they fit/violate the stereotypes, making the impressions simpler (i.e., more one-dimensional). In this preregistered study, using trait impression ratings of faces collected from various world regions (+11,000 participants in 48 countries), scores of implicit associations (+18,000 and +212,000 participants in +200 countries), and mixed-effects models, we ask (1) whether simplified facial impressions are found for women and Blacks across regions and (2) whether the regional level of stereotypes about genders and races is correlated with the level of simplification in the face-based impressions of women and Blacks, respectively. The results were not coherent across analyses. The interpretation of the results and the limitations of the study are discussed.

Preprint 6: Hierarchical Modelling of Facial Perceptions: A Secondary Analysis of Aggressiveness Ratings

Authors: Mark Adkins, Nataly Beribisky, Stefan Bonfield, and Linda Farmus

Abstract: The Psychological Science Accelerator’s (PSA) primary project tested for latent structure using exploratory factor analysis and confirmatory factor analysis but we decided to diverge from this approach and model individual traits separately. Our interest mainly was in examining the interplay between “stimulus ethnicity” and “stimulus sex” to discover how differing levels of these criterion differ across region, country, lab etc. While the necessary and prerequisite hierarchical structural information about each trait could certainly be found within the primary project’s dataset, we did not assume that any specific factor structure from the PSA’s primary analysis would necessarily hold, therefore we based our decision to model the data from each trait separately using a mixed model framework.

Preprint 7: Population diversity is associated with trustworthiness impressions from faces

Authors: Jared Martin, Adrienne Wood, and DongWon Oh

Abstract: People infer a number of traits about others’ based simply on facial appearance. Even when inaccurate, face-based trait impressions can have important behavioral consequences including voting behavior and criminal sentencing. Thus, understanding how perceivers infer others’ traits is an important social and psychological issue. Recent evidence suggests that face-based trait impressions may vary by culture. In the present work, we attempt to understand cultural differences in face-based trait impressions. As part of the Psychological Science Accelerator’s Secondary Data Analysis Challenge, we report a set of pre-registered analyses testing how cultural differences in present-day diversity relate to a) 13 face-based trait impressions, b) sensitivity to physical features of the face, c) and the mental structure underlying trait impressions. We find that greater present-day diversity might be related to lower trustworthiness ratings, in particular. We discuss this finding in the context of other recent work and suggest further analysis of the mental structure of face-based trait impressions across cultures.

Preprint 8: The Facial Width-to-Height Ratio (fWHR) and Perceived Dominance and Trustworthiness: Moderating Role of Social Identity Cues (Gender and Race) and Ecological Factor (Pathogen Prevalence)

Authors: Subramanya Prasad Chandrashekar

Abstract: People effortlessly form trait impressions from faces, and these impressions can affect a variety of important social and economic outcomes. Trait impressions based on facial features can be approximated to distinct dimensions: trustworthiness and dominance (Oosterhof & Todorov, 2008). One of the facial features, the facial width-to-height ratio (face ratio) is associated with the trait impressions. I tested whether social category (gender, race) of the target being perceived shapes the relationship between face ratio and perception of dominance and trustworthiness. In this preregistered study, using trait impression ratings of faces collected from 8800 participants across 44 countries, I employ mixed-effects analysis and report results on (1) the direct influence of social categories (gender and race) of the target on perceived dominance and trustworthiness, (2) the moderating role of social categories (gender and race) on the direct relationships between face ratio and perceived dominance and trustworthiness, and (3) the moderating role of pathogen prevalence on the direct relationships between face ratio and perceived dominance and trustworthiness.

News from the Accelerator- July 2020

July has been extremely active in the PSA, and this newsletter contains updates on many projects, but we’d like to highlight two things up front:

First, the PSA has taken a very big and important step as an organization this month by initiating our first truly democratic election for an open Associate Director position! You can read the candidates’ statements at the links below to prepare for the vote.

Second, we are continuing data collection on the PSA-Covid-Rapid studies as long as we can afford to, which is likely through the end of August. The team could use some more data collection help to make sure we clear our pre-registered N targets!  Email psacovid@gmail.com if you are interested in joining or have questions about joining.

Election

The two candidates in our election for an open Associate Director position, Peter Mallik and John Protzko, have provided written statements describing why they’ve chosen to run, and describing what they’d like to achieve in the Associate Director role if elected. 

You can read each statement by clicking the candidate’s name here: Mallik and Protzko. Additionally, we have posted each statement in dedicated Slack channels. I encourage all members to read them, consider them carefully, and ask the candidates any questions you have! You can ask questions by joining the PSA Slack workspace at this link, and then joining the #associatedirectorelection_protzko and #associatedirectorelection_mallik channels. The Slack discussion closes today, but you can read the questions and answers throughout the election. Soon, we will email each PSA member with instructions about how to vote.

Conference

Approximately ⅔ of the available seats are full, so make sure to register now to claim your spot if you’d like to attend. We have had a recent surge of attendees requesting free admission, and would very much like to be able to fill all requests, so we would be very grateful if a few additional members were able and willing to pay the $60 admission fee to open up more free spots. Also, the program is slowly but surely filling up. If you have work or ideas you would like to present at the conference, be sure to fill out our new submission form! We’ve started a general info page that highlights a few presentations already set and links out to the current draft program. We hope to have the full program set in late August and will share an update with the full network when it is ready to go!

Current Studies

  • PSA 001- The Stage 2 Registered Report has been resubmitted to Nature Human Behaviour. Now we are waiting for the final decision from the editor.
  • PSA 002/003- The 002 and 003 teams are in the final stages of moving our experiments from OpenSesame/Qualtrics to lab.js. Thanks to the hard work of Merle Schuckart, we have drafts of the new programs and are in the process of ensuring that the web versions of the studies retain fidelity to the original protocols and troubleshooting the studies. The new version of the experiment should fully integrate both the 002 and 003 procedures into one online protocol, greatly simplifying future data collection.
  • PSA 004- Data collection has slowed down as many members’ universities have concluded their semesters. We do, however, need more CREP reviewers as we wrap up this project at the end of the year. CREP reviewers look over each lab’s OSF page to ensure that each lab is performing the study as planned. If you or someone you know is interested, please fill out this form.
  • PSA 005- 005 is waiting on a decision on our Stage 1 submission at Nature Human Behaviour.
  • PSA 006- 006 is continuing data collection and is slated to finish sometime between October and December. We continue to have new labs join and are extremely excited to finish up this year!
  • PSACR 001/002/003- The PSACR bundle is still collecting data. We now have over 19,000 participants (i.e., people who have completed the general questions about the pandemic and participated in one or more of the study surveys). The survey is offered in 38 (!!) languages and dialects, with a handful more to be implemented very soon. We are still planning to collect data as long as we can afford to, which we expect to be through August. We are also welcoming new labs to join data collection if they use one of the languages we’ve already translated (see here for a list of implemented and soon-to-be implemented languages: http://formr.psysciacc.org/shiny/shiny_app/). Email psacovid@gmail.com if you are interested in joining or have questions about joining.

A Chapter on the PSA, QRPS, and Clinical Psych

A group of PSA members, led by Julie Beshears, is beginning work on an invited chapter about the PSA and how the PSA’s design can affect the presence of questionable research practices (QRPs) in clinical psychology research. Ideally, they are seeking individuals with clinical experience, but other contributions are welcome. If this project interests you, you may join the Slack channel (proj_clinical_chapter) or email Julie at jebeshears@eagles.usi.edu.

Committee Updates

  • Ethics Committee- We would like to congratulate our newest Assistant Director of the Ethics Committee!! Mike Mensink was just appointed to the position last week and seems very eager to start in his new role. Welcome, Mike, and we look forward to collaborating with you soon!
  • Translation and Cultural Diversity Committee- The Translation and Cultural Diversity Committee (TCDC) would like to collect data on the translation capacity of the PSA for future projects: in which countries and for which languages do we have adequate potential translators? This knowledge will help us determine our potential capacity, or lack thereof, in many possible data collection languages, and thus make better plans for future projects. Please answer this very short, 1-minute survey to help us. Thank you!!
  • Project Monitoring Committee- The member website is receiving an update soon. A big shout out to the fantastic Erin Buchanan for helping the team find a way to manage projects through Canvas. Erin will be going over this update and much more at the conference in September, so make sure you check that out for more information!

Recorded Talk for the Chinese Open Science Network Available Online

Chris Chartier recently gave a talk to the Chinese Open Science Network. They recorded the presentation and have posted it here, so anyone can watch and reach out with any questions or feedback (cchartie@ashland.edu).

Thank you for all that you have done, and continue to do, on behalf of the PSA. Onward!

Savannah and Chris

News from the Accelerator- June 2020

Hello all you amazingly talented and dedicated Accelerators! We’ve made some really tremendous progress in June, and this newsletter will provide updates on PSA personnel, the upcoming PSA conference, all of our current studies and more.

Research Coordinator

Thanks to seed funding from Ashland University and some personal donations from PSA leadership, the PSA has hired Savannah Lewis to serve as Research Coordinator. She will help to organize our conference (see below), provide administrative support for all of our studies, and fundraise to better support our members. Some of you have already worked with Savannah on Slack or Zoom and can attest to how lucky we are to have her working on the PSA (Chris speaking here :)! Shoot her a Slack DM to welcome her and connect on any initiatives or projects you’d like help with.


Translation Capacity Assessment

The Translation and Cultural Diversity Committee (TCDC) wants to assess the translation capacity of the PSA: across the many possible languages of translation, how many potential translators do we have? This knowledge will help us determine where we have translation capacity and where it is scarce, and thus make better plans for future projects. Please answer this very short, 1-minute survey to help us.

Conference

We are now in the process of lining up specific sessions, moderators, and presenters for the conference. As of right now we have 120 confirmed attendees (with a cap of 300)! This means we still have spots open if you have not yet registered, which you can do here. We have also started distributing receipts to all registered attendees who were able to pay the $60 fee (thank you, your payment also created 2 free spots for others at the conference). Please be on the lookout for those receipts in your inbox, and let us know if you need any additional documentation to facilitate reimbursement from your institution. 

SIPS 

Several members gave a really nice presentation about the PSA as an “unconference” session at SIPS. Thanks to Erin Buchanan, Jordan Wagge, Max Primbs, Crystal N. Steltenpohl, and Jeremy Miller for representing us there. The session was recorded, so we also now have an overview video link that we can share with folks who are curious about the PSA. Feel free to pass it along to colleagues! 

Current Studies

  • PSA 001- The Stage 2 Registered Report has been resubmitted to Nature Human Behaviour. Now we are waiting for the final decision from the editor.
  • PSA 002/003- Thanks to the amazing collaboration between Merle Schuckart, Alessandra Souza, and Erin Buchanan, we have been able to start getting this study bundle online. If your lab is going to run the study online, it will need to provide an information text to participants at sign-up explaining the goals, risks, etc.
  • PSA 004- Data collection has slowed down as many members’ universities have concluded their semesters. We do, however, need more CREP reviewers as we wrap up this project at the end of the year. CREP reviewers look over each lab’s OSF page to ensure that each lab is performing the study as planned. If you or someone you know is interested, please fill out this form.
  • PSA 005- The PSA005 team completed its feasibility pilot of the technical implementation of the study. The leadership team then finished its revision to the manuscript and resubmitted this revision to Nature Human Behaviour! COVID-19 has disrupted the data collection timeline, but the leadership has decided to monitor the situation at the start of each semester to determine the safety and feasibility of in-person data collection.
  • PSA 006- The PSA 006 project is currently collecting data. So far, the ~150 participating labs have collected data from over 8,000 participants. Following the registered plan, the team is conducting sequential data analyses in three cultural clusters. Further rounds of data collection are expected later this year.
  • PSACR 001/002/003- The PSACR bundle is collecting data in 19 languages, and the data collection deadline has been extended through mid-July or longer (we will continue collecting data as long as we can afford to). We are collecting data through PSA member labs and also through several panels. Over 10,000 people have participated so far, meaning that they completed the general questions about the pandemic and one or more of the study surveys. 

Thank you for all that you have done, and continue to do, on behalf of the PSA. Let’s keep accelerating psychological science together!

-Chris and Savannah

News from the Accelerator – May 2020

This month, we have updates on myriad fronts, including but not limited to, a spike in new PSA membership sign-ups, progress on studies PSA 001 through PSA 006, information about our recently launched PSA Covid Rapid Studies, a call for new members on two PSA working groups, and several interesting opportunities to collaborate with other PSA members on research projects.

MEMBERSHIP

Our network has officially grown to over 1,000 members. Welcome, new accelerators!! We have no doubt that you will do great things in our network. In just the couple of months since our last newsletter, we have added hundreds of members. We are stoked to see our network continue to grow.

STUDY UPDATES

  • PSA 001- The Stage 2 Registered Report received a revise-and-resubmit from Nature Human Behaviour. The lead authors have made the requested changes, and the participating labs are currently reviewing the updated manuscript. After the lead authors address those comments, they will resubmit the manuscript.
  • PSA 002/003- As a result of the pandemic, face-to-face data collection for the 002/003 bundle has been suspended until further notice. We are currently testing the feasibility of moving the procedure online, in case there are no opportunities for in-person data collection for the remainder of the year. Our goal is to complete data collection and submit 002’s Stage 2 registered report by the end of 2020. Thus far, we have collected data from 2,560 participants across 16 languages!
  • PSA 004- The Accelerated CREP was originally scheduled to end data collection right about now, but it has been extended through the end of 2020 because of coronavirus-related delays. You can still join the project to collect data (onboarding form here)! All data collection can happen online.
  • PSA 005- The lead team is working on finishing up its revisions of the Stage 1 Registered Report. Our editor at Nature Human Behaviour asked us to do some pilot testing of our technical implementation of the adaptive algorithm that governs condition assignment. With Erin Buchanan’s help, we implemented all aspects of the project in formr and conducted a variety of tests to ensure that the condition assignment changes as appropriate in light of different kinds of evidence. This whole effort went slower than expected due to the outbreak of the coronavirus, but we received an extension from our editor. We’ve also finished a first draft of our paper revisions and our response to reviewers. We expect to submit the revision within 1-2 weeks.
  • PSA 006- The PSA 006 project is currently collecting data. So far, the ~150 participating labs have collected data from 8,450 participants. Following the registered plan, the team is conducting sequential data analyses in three cultural clusters. Further rounds of data collection are expected later this year.
  • PSACR 1/2/3- PSA Covid Rapid (PSACR) is a bundle of three studies running together along with some general questions. PSACR 001 and 003 will be submitted for publication after data collection is complete. PSACR 002 was accepted as a Stage 1 Registered Report at Nature Human Behaviour. Data collection is live for all three studies in English, Hungarian, and Dutch, and many more languages are in the translation and implementation pipeline. Over 5,000 people have participated so far. Many people have been working hard to pull off this project; in particular, Heather Urry, Max Primbs, and Erin Buchanan have done tremendous work to keep the project afloat, catch errors, and help move translation and implementation along.
  • Upcoming New Studies- Resulting from our 2019 call for studies, one proposal has been accepted pending minor revisions, two are being revised and resubmitted with possible acceptance in 2020, and several more are being revised in preparation for future calls for studies.

PSA SERVER PURCHASED

Thanks to a generous donation from Erin Buchanan, we were able to purchase new server equipment to support the increasing complexity and data collection load of our current and future PSA studies! The new server has been blazing fast in our initial tests, and it should be ready for us to start using in earnest soon. This also means we are going to be able to overhaul our member website to give it a functioning dashboard that will be more effective and user-friendly for members.

PLANNING THE PSA’S FIRST CONFERENCE

We are hoping to host a virtual meeting in 2020 and are exploring possible dates. Our preliminary plan is to hold the conference in September (on the 1st, 2nd, and 3rd, or on the 8th, 9th, and 10th), with sessions spread across 12 hours each day to help accommodate the vast time zone differences among PSA members. We are hoping to get feedback on which set of days works best for our members (starting on the 1st of September or starting on the 8th of September). We are planning to have a range of session types, including update presentations on current PSA studies, workshops on how to get involved in our studies, study submission tips, as well as presentations on non-PSA research that our members are currently working on. We hope to have further details for y’all soon. Please reach out to slewis16@ashland.edu to give feedback on preferred dates or if you would like to be added to our email list of possible attendees.

DATA AND METHODS COMMITTEE SEEKS NEW MEMBERS

The PSA Data and Methods Committee is seeking a new standing member of our committee. The Data and Methods Committee (DM) is responsible for operations and policy development related to methodology, metascience, and data management within the PSA. Those interested in joining the committee should read our bylaws here: https://osf.io/p65qe/. This position is a two-year commitment, starting in June 2020, with the following responsibilities:

  • Attend bi-weekly Data and Methods committee meetings
  • Assist with the committee’s operation and research responsibilities
  • Advise on matters of committee policy
  • Vote on research projects and amendments to the committee’s bylaws
  • Fulfill assigned responsibilities (as delegated by the Assistant Directors)

Some examples of recent work carried out by the committee are:

  • Writing, revising, and ratifying the analysis plan approval process
  • Coordinating with the larger PSA leadership to develop a needs assessment
  • Selecting methodological reviewers for proposals
  • Recruiting methodologists and data managers for accepted projects
  • Developing a metascience study submission process
  • Conducting and contributing to DM research projects undertaken by DM members

We expect Standing Committee Members will devote approximately 8-16 hours per month of time to their work. These are not set limits and may fluctuate from month to month. Any members of the PSA are eligible to apply. Previous service experience with the PSA is desired but not required. Application materials to submit:

  1. A statement of interest outlining your interest in working with the DM and your relevant experience (short; max 500 words)
  2. Relevant documentation that supports the information described in the statement of interest (a CV, personal website, blog, OSF page, etc.). For example, if you describe experience managing open data, you might include links to OSF pages on your CV, or link to a GitHub profile or blog; any document that provides background for your experience is fine.

Applicants who move on to the next step of recruitment will be asked to attend a DM committee meeting so that they can become acquainted with our processes and procedures. Review of applications will begin within one week and will continue on a rolling basis until the position is filled. Please send application materials to: dataandmethods@gmail.com.

PSA METASCIENCE SUBCOMMITTEE SEEKS NEW MEMBERS

The PSA Metascience Subcommittee is looking for up to 5 new members to help facilitate metascience research projects within the PSA. The role of the subcommittee will be to review proposals for work that uses the PSA to research the process of science, as well as to develop policy to support metascience research. Responsibilities of subcommittee members will include reviewing proposals, interfacing with members of the Study Selection, Data and Methods, and Data Management Committees when appropriate, and meeting monthly as a group to review progress.

If you’re interested, send your information to Nick Fox (nfox423@gmail.com) by Friday, May 29th. Include a short paragraph on why you’d like to join the subcommittee, what you’re looking to get out of being a subcommittee member, and any metascience or methodological research experience you’ve had (experience is not required, but it will help us maximize the diversity of experiences!).

OTHER OPPORTUNITIES TO COLLABORATE WITH PSA MEMBERS

African Many Labs

  • We are starting the ManyLabs: Africa project, which is a doctoral thesis on the replicability and generalizability of effects across African and Western populations. The project is led by Adeyemi Adetula, currently at Université Grenoble Alpes (France), and is supervised by Drs. Hans IJzerman, Dana Basnight-Brown, and Patrick Forscher.

    We call for researchers from Africa to serve as collaborators in a pilot study (using the CREP framework for a one-study project) and researchers from Africa, Europe, and North America for the main study (ManyLabs: Africa). As part of this project, we will provide online training / webinar courses on open science access and tools to African researchers. 

    Collaborators can earn authorship for data collection, translation, and coordination of their research unit. Other activities could include consultation on study/effects nomination and evaluation.

    Interested researchers can sign up here. For further information, please contact Adeyemi Adetula at adeyemiadetula1@gmail.com.

Two Opportunities from Balazs Aczel: 

  • Balazs Aczel’s lab is advertising a collaboration opportunity to find out why people violate lockdowns. Join the project if you can organize online data collection in your country to extend the multinational datasets. Collaborators are expected to collect data (a minimum of 350 participants) from a country with some level of current lockdown. More info: https://tinyurl.com/ConfinementNonadherence
  • Research on how you cope with working from home

Our team would like to explore how researchers cope with working from home, both now and in general. This is a great opportunity for the research community to change old routines and optimize the time they spend in the office and at home.

From our survey results, we will create recommendations for institutions on how to support researchers’ efficiency and work-life balance, given their options and conditions for remote working. After responding, you can sign up to win a 100 USD Amazon voucher (or a voucher of your choice). If your region was recently in lockdown, please share your experience of working from home in this ~4-minute anonymous survey. Follow this link to the survey.

We appreciate it if you share this survey with your colleagues. Thank you for your support.

Balazs Aczel, ELTE, Hungary