News from the Accelerator – July 2019

We have a range of updates and small requests for help this month, including paper submissions, an upcoming data release, elections, hackathons, and a crowdfunding campaign.

Current Studies

  • 001 Face Perception. We are nearly done with data collection in all world regions! We will release a simulated data set in August, complete data collection in October, release ~1/3 of the data set in late October as an exploratory segment, and then release the remaining ~2/3 of the data set as a confirmatory segment concurrently with the release of the Stage 2 manuscript at Nature Human Behaviour.
  • 002 Object Orientation & 003 Gendered Prejudice. Data collection is now rolling out in English-speaking labs and will expand to the other study languages soon if all goes smoothly with the early collection sessions.
  • 004 True Belief. Data collection is also currently rolling out for this project as labs test their lab-specific links and get a final check on all materials.
  • 005 Stereotype Threat. The Stage 1 Registered Report manuscript is now under review at Nature Human Behaviour. We still welcome new labs to join the project if they are based in the USA and can recruit black college students.
  • 006 Trolley Problem. The Stage 1 Registered Report manuscript received a favorable “revise and resubmit” from Nature Human Behaviour. The lead team is currently working on the revisions and will reach out to all co-authors for feedback and suggested edits soon.
  • 007 and beyond? Remember that the deadline for submissions to this year’s round of study selection is September 15th!

Funding

PSA members are actively working on a range of grant submissions to the US National Science Foundation, the John Templeton Foundation, the European Research Council Synergy program, and Canada’s National Research Council, but there are probably other opportunities for us to pursue. If you know of a funding mechanism that would be a good fit for the PSA, please pass along any information you have, and let us know if you’d like to lead the grant writing effort or collaborate on a submission!

Patreon

Remember that you can also support the PSA via our Patreon page. Even just $1 a month can be extremely helpful and provide meaningful funding if we get a large base of supporters. I’ve given myself the challenge of getting the PSA one new Patreon supporter every workday. I’ve been successful for the past week, and I hope to keep the streak going for a very long time! Perhaps you could find a new supporter each year? Each month? Each week? Every little bit helps.

Hackathons

We have been holding weekly PSA hackathons for the last three weeks and plan to continue them for the foreseeable future. They’ve been extremely fun, productive, and energizing! The three previous hack topics were “brainstorming funding sources,” “editing our needs assessment process for new studies,” and “updating and organizing our file management system.” This Friday, August 2nd, at 14:00 UTC, we will host a hackathon on improving the Get Involved page on psysciacc.org. We will announce all future hackathons on Twitter and on the new #hackathons Slack channel. Let us know if you have any ideas or requests for future sessions.

Meta-Policy

We have drafted a new PSA policy defining the process by which the PSA creates and approves new policies :). We now welcome feedback on this policy before we put it up for a vote. Please add your comments directly in the draft document within the next two weeks.

Elections

PSA elections for Assistant Directors, Associate Directors, and the Director will begin in January 2020 and then be held each subsequent July and January. During each election, we will select 3 Assistant Directors (who serve 3-year terms). Most elections will also select an Associate Director (who serves a 4-year term). We will elect a new Director every 5 years. You can see more details about the upcoming elections, including the initial schedule (which was selected randomly), towards the bottom of this policy document. More details on election specifics are coming soon!

 

That’s all for this month. Be on the lookout for some exciting posts and updates during August as we celebrate 2 years of accelerating psychological science (our anniversary is August 26th). I’ll leave you with two pictures of PSA members meeting up (one online and one in person) and making each other smile about the future of psychological science!

Chris

[Photos: PSA members meeting up online and in person]

News from the Accelerator – June 2019

June was an action-packed month for the Accelerator. This news roundup includes a new call for studies, exciting updates on several current studies, info on the PSA’s first hire (!), and tons of links to other great things PSA members are up to.

2019 Call for Studies

We welcome new study submissions now through September 15th. A detailed description can be found in the blog post announcing the call. Please do not hesitate to respond to this email with any questions you have about the submission, review, and selection process.

The PSA’s First Hire

Patrick Forscher has accepted a postdoc in Hans IJzerman’s lab at Université Grenoble Alpes, where he will coordinate and write grants to fund the PSA. The first grant they’re working on is a Synergy Grant, which could provide big-time funding for PSA activities. Receiving the grant would have major implications for the PSA, so you’ll very probably be hearing more about the grant preparations in the coming months. Congrats, Patrick!

002 Object Orientation and 003 Gendered Prejudice: Ready for Data Collection

This bundled project will now begin to roll out data collection! 003’s preregistration has been finalized, we’ve recorded a demo video of a mock data collection session, and we’ve established the collaboration agreement describing how contributions to both studies will be credited. If you’re a member lab for this study, be on the lookout for a detailed update email on next steps very soon!

004 True Belief: Accepted in Principle at AMPPS and Ready for Data Collection

The Accelerated CREP collaboration is also ready to begin data collection, after receiving an in-principle acceptance at Advances in Methods and Practices in Psychological Science. You can read more about the project, and find detailed instructions for how to get involved, in this highly informative blog post from project leader Braeden Hall.

Planning for PSA Activities at SIPS

Many Accelerators will be attending this year’s meeting of the Society for the Improvement of Psychological Science in Rotterdam. Sau-chin Chen has put together a Google Sheet highlighting sessions led by PSA members. There will also be 5 PSA-specific work meetings at the conference. Follow the conversation on our Slack workspace to see the time and location for those as they are determined.

Project Monitoring Committee

The Project Management Committee has changed its name to the Project Monitoring Committee. The initials and role will remain the same: Project Monitors (PMs) will monitor, encourage, and track progress on projects. They’ll still be the person on the team who maintains a big-picture view, keeps an eye on adherence to PSA policies, and helps submitting authors anticipate and overcome issues. The new name captures this role better and makes clearer that the PM isn’t responsible for making progress on a project or leading it.

T-Shirts and Stickers

We are selling PSA t-shirts and stickers. You can reply to this email with order details (sizes and quantities) and your shipping address, and we will ship to you. Short-sleeve shirts are $15, long-sleeve shirts are $20, and stickers are $5. You can see pictures of each here.

PSA Talks and Shout-outs

[Photo: Neil Lewis Jr., Jess Flake, and Hans IJzerman in Estonia]

The PSA has been presented or mentioned in many recent presentations, workshops, and blog posts. Pictured above are Neil Lewis Jr. (Assistant Director of Funding), Jess Flake (Assistant Director of Methods), and Hans IJzerman (Associate Director) in Estonia, where they are providing open science training at the Empirical Methods in Cognitive Linguistics meeting. Here are more links to other pictures and articles from Twitter.

Other Collaboration Opportunities

  • Several members of the PSA have started a MetaResearch Hub. Anyone interested in meta-research should check it out. You can join a project or share your ideas to find collaborators: bit.do/MetaResearchHub
  • Rick Klein, Olivier Dujols, and Hans IJzerman are looking for collaborators on a new (non-PSA) cross-national project to assess whether individual differences exist in risk distribution, social thermoregulation, and food sharing. The project home is on the OSF, the current proposal is in Google Docs, and the current collaborator list is in Google Sheets. Interested researchers can contact them directly.

Update on the Accelerated CREP: A Collaborative CREP Replication Project Powered by the PSA

Post authored by Braeden Hall

Last year, the Collaborative Replications and Education Project (CREP; pronounced like grape) and the Psychological Science Accelerator (PSA) partnered on a project, now referred to as the Accelerated CREP (or PSA 004 Justified True Belief). The mission of the Accelerator is to accelerate the accumulation of reliable and generalizable evidence in psychological science, while the mission of the CREP is to improve undergraduate training through crowdsourced replication. Over the course of the last year, the CREP and PSA networks have worked together to fulfill these two parallel missions by putting together a registered replication report (RRR), now provisionally accepted at Advances in Methods and Practices in Psychological Science (AMPPS; preprint here: psyarxiv.com/zeux9). The purpose of the Accelerated CREP is to harness the power of undergraduate research projects across five continents to address the field’s need for replication. Having students perform replications as part of their psychological science education serves both as a pedagogical tool for teaching junior researchers about open science and as a rigorous method for collecting high-quality, transparent data for a field in need of corroboration.

When we first began this project, we didn’t know whether we would simply preregister the study or attempt to publish it as a registered report. However, after writing a short proposal to AMPPS for the project, we were invited in June of 2018 to submit our study plan as a full Stage 1 registered replication report (RRR). After several months of writing, planning, and coding multilevel data simulations and power analyses in R, we submitted our collaborative study plan in November 2018. In January 2019, AMPPS requested that we resubmit our paper with revisions that considerably improved the study plan. After making these revisions and other improvements, our team resubmitted the paper to AMPPS for review in February 2019. In April 2019, AMPPS requested a few additional changes, which were resubmitted the following month. And in June 2019, the Accelerated CREP Stage 1 RRR was provisionally accepted at AMPPS.

A lot of changes have been made to the Accelerated CREP project in the year since its inception, and many of those changes arose from our decision to publish our plan as a registered report. We originally planned a very simple direct replication, but the AMPPS editor and reviewers made suggestions for how we could improve our design, including testing similar stimuli as a random factor to improve our test’s generalizability, adjusting our cross-cultural and covariate analysis plans, and improving our theoretical precision. One of the original authors also suggested that we transition from a binary response to a scaled response for a more sensitive measure of people’s attributions of knowledge. When making all of these changes, it was important for us to weigh student feasibility against the other goals of the study. And, while publishing a registered report considerably delayed our original study timeline, it has no doubt resulted in a much more rigorous, much more fine-tuned study from which students can learn.

Due to the delays in our study timeline, we decided to allow some sites that had written their curriculum around the replicated Turri, Buckwalter, and Blouw (2015) study to participate in a traditional CREP preregistered direct replication with their students; these projects can all be found forked off the Accelerated CREP’s OSF page. The data from these direct replication studies will not be used for our primary planned multilevel analyses, but they may serve as an interesting comparison to the rest of the study.

The first phase of this joint study took a lot of collaboration and planning, so we’re eager to begin data collection! See below for more information about how to participate, using SoSciSurvey (and alternatives), CREP procedures, translation procedures, and some open science teaching resources. 

Survey is Now Live on SoSciSurvey

As of July 27th, the universal English version of our main online survey is now live! We will have it open for one week for testing/debugging across all 50+ labs before resetting the data so that sites can begin collecting data from participants.

Our survey monitor, Sophia Weissgerber, has worked hard to code a universal survey in SoSciSurvey, which is now open for contributors to start testing so they can request site-specific versions if they need one. Teams not testing in English will use the PSA procedures to translate the study materials into their local language (all posted to the OSF), and then Sophia will create a language-specific version of the study with the help of our translation coordinator, Jan Philipp Röer. By using SoSciSurvey, we are able to provide a boilerplate study experience to all interested labs that is both free and accessible globally (via internet access).

We will also work with labs to create specific versions of the survey for those who have more unique needs or who want to run an extension of the study. We will allow sites to use other survey programs/applications; however, all data must conform to our universal data template, which our data monitor, Daniel Dunleavy, will share on the OSF after contributors have finished testing the current version. Each lab’s data collection plan must be cleared by the CREP review team prior to collecting data.

Steps to Participate 

If you and/or your students are interested in participating in our collaborative, student-led project, you will need to take the following steps (this is an abbreviated list; see the CREP step-by-step for more detailed information):

  • Step 1: Become a member of the PSA
    • The PSA comprises 760 researchers representing 548 laboratories from 72 countries across 6 continents (see map here).
    • You can join easily by providing your contact info here.
  • Step 2: Sign up for the Accelerated CREP
    • First, provide us with your information.
      • If your site will have more than one team of researchers collecting data concurrently, you should submit just one OSF page for review. We ask that you describe the multiple teams in your OSF page wiki, and clearly identify the materials, methods, data, and results from each team in your files.
    • Then, someone from the CREP will contact you with more instructions and a project number within two business days.
      • We will be tracking some details related to each site’s project here.
    • In the meantime, please feel free to start working through the CREP step-by-step instructions provided here.
  • Step 3: Get Ethics/IRB Approval 
    • Once you are familiar with the CREP procedures and you have been assigned a project number, your next step is to get ethics approval from your institution. 
      • To help contributors expedite their IRB/Ethics application process, we have provided a template for you to use to fill out your own institution’s ethics application, which can be found here.
  • Step 4: Submitting Protocol for Review 
    • After you have received your institutional ethics approval, you and/or your students will submit your lab’s protocol for review to the CREP review team. 
      • To submit your protocol for review, you will fork your study off the Accelerated CREP parent OSF page to create your lab’s preregistration on the Open Science Framework (OSF). Your lab’s preregistration will include:
        • A written protocol that matches all protocols and procedures set forth in the Stage 1 RRR.
        • A video of the methods used for the study.
        • Your institutional ethics approval.
      • The CREP review team will work with your group to approve your protocol. The goal is approval; no contributors will be denied outright. The review process is just to ensure that your lab understands the study plan entirely.
  • Step 5: Collect Data
    • After the CREP review team has approved your protocol, you will be cleared to collect data. 
    • Overall data collection will end on June 1st, 2020. 
      • Labs that turn in their data report after this date may not have their data represented in the Stage 2 paper.
  • Step 6: Submit Data 
    • After data collection ends or your team reaches its target sample size, your team may submit your data, analysis report, and other information to the CREP for a final review. 
    • Once approved, we will collate your data into the aggregate dataset for analysis. 
      • Teams will also post their site level analyses to their preregistered OSF pages.

Authorship Requirements and Current Contributions

When you sign up to contribute to this study, you must agree to our Collaboration Agreement. If you have any questions or concerns about this agreement, please email me at HallBF@hendrix.edu. Contributors who collect a larger sample (N > 100) and/or collect non-university participants will be given more advanced authorship. Authors can be added to a Registered Replication Report at different stages. Details about Stage 1 author contributions can be found in the “Author Contributions” section of the manuscript (preprint: https://psyarxiv.com/zeux9).

What constitutes authorship on the final manuscript?

All authors must contribute to writing – review and editing (which involves approval of the submission) and must also contribute in at least one other category of the CRediT taxonomy (see table below). “Writing – Review and Editing” minimally requires closely reading the full manuscript, providing any relevant feedback and suggested edits, and confirming approval of the submission. Most authors will contribute to “Investigation” and “Writing – Review & Editing.”

PIs of contributing labs are responsible for their students and staff working on the project. This includes: honestly reporting the contributions of their lab members, evaluating whether contributions merit authorship according to the above paragraph and the CRediT table below, and verifying that contributions are correctly described in any formal publication. PIs are also responsible for showing this agreement to lab members who will be authors and making sure they agree to it.

Author contributions will be reported on research products (e.g., the Registered Report submission, the final manuscript) using the CRediT taxonomy.

CRediT Contributor Roles
1. Conceptualization: Ideas; formulation or evolution of overarching research goals and aims.
2. Data curation: Management activities to annotate (produce metadata), scrub data and maintain research data (including software code, where it is necessary for interpreting the data itself) for initial use and later re-use.
3. Formal analysis: Application of statistical, mathematical, computational, or other formal techniques to analyse or synthesize study data.
4. Funding acquisition: Acquisition of the financial support for the project leading to this publication.
5. Investigation: Conducting a research and investigation process, specifically performing the experiments, or data/evidence collection.
6. Methodology: Development or design of methodology; creation of models.
7. Project administration: Management and coordination responsibility for the research activity planning and execution.
8. Resources: Provision of study materials, reagents, materials, patients, laboratory samples, animals, instrumentation, computing resources, or other analysis tools.
9. Software: Programming, software development; designing computer programs; implementation of the computer code and supporting algorithms; testing of existing code components.
10. Supervision: Oversight and leadership responsibility for the research activity planning and execution, including mentorship external to the core team.
11. Validation: Verification, whether as a part of the activity or separate, of the overall replication/reproducibility of results/experiments and other research outputs.
12. Visualization: Preparation, creation and/or presentation of the published work, specifically visualization/data presentation.
13. Writing – original draft: Preparation, creation and/or presentation of the published work, specifically writing the initial draft (including substantive translation).
14. Writing – review & editing: Preparation, creation and/or presentation of the published work by those from the original research group, specifically critical review, commentary or revision – including pre- or post-publication stages.

How will authorship order be determined?

For the final manuscript, authorship will be assigned in tiers according to contributor roles in the CRediT taxonomy (see the tables above and below). Contributions will be described in the author note.

CRediT Authorship Tiers
Tier 1: Contributions to conceptualization, methodology, formal analysis, software, or resources, plus writing – original draft, review and editing (e.g., submitting authors, project monitor).
Tier 2: Major contributions to validation, project administration, and/or writing – original draft, review and editing. Ordered alphabetically unless otherwise determined by discussion with project leadership.
Tier 3: Investigation and writing – review and editing (data collection PIs, students, and staff). Ordered alphabetically.
Tier 4: Supervision and writing – review and editing (e.g., Chartier, Ethics Committee Liaison). Ordered alphabetically with Chartier and Grahe last.

What if I have questions or concerns related to authorship?

We encourage you to ask questions and raise concerns about authorship as early as possible. We suggest directing questions or concerns to Braeden Hall (hallbf@hendrix.edu), Jordan Wagge (jordan.wagge@avila.edu), the director of the CREP (Jon Grahe, graheje@plu.edu), or to any director, associate director, or assistant director of the PSA, including associate directors Heather Urry and Charlie Ebersole and director Chris Chartier.

Educational Tools 

If you would like some awesome open resources on teaching with replication and other great educational open science tools, check out this OSF page:

Consolidating Teaching Resources

If you would like any more information or have questions about how to build a solid curriculum around open science, please don’t hesitate to reach out to us for more resources. We are a team of enthusiastic replicators and educators, and we welcome any and all to join the project, especially students! 

-Braeden Hall

The Psychological Science Accelerator Calls for Study Submissions


The Psychological Science Accelerator (PSA), a network of over 500 labs collaborating to collect psychological data from large-scale international samples of people, is currently accepting study proposals from all areas of psychological science. Anyone can submit a proposal, regardless of PSA membership status, research area, methodology, or career stage. We especially encourage submissions in research areas including but not limited to clinical psychology, community psychology, developmental psychology, and neuroscience. The deadline for submission is September 15th, 2019.

The mission of the PSA is to accelerate the accumulation of reliable and generalizable evidence in psychological science, reducing the distance between the truth about human behavior and mental processes and our current understanding. For a full overview of the PSA, please see our paper introducing our policies and procedures (https://psyarxiv.com/785qu/). For information on previous successful submissions, please see each study’s status page, with descriptions and links to study materials and preprints (https://psysciacc.org/studies/).


Proposed studies can test novel hypotheses or focus on the replication of previous findings, can be basic or applied in focus, and can be exploratory or confirmatory in nature. Because accepted studies will likely involve considerable use of resources, the study selection process begins with the evaluation of a Stage 1 Registered Report-style submission. Once submitted, proposals will be masked and checked for completeness. Then, our Study Selection Committee will conduct an initial evaluation of all proposed studies (see below for more information on the feasibility check portion of this initial evaluation) and vote on their acceptability for further consideration. Studies that pass this check will be fully evaluated by 6-11 peer reviewers, shared with all members of the PSA network to be rated, and either provisionally accepted or rejected by the Study Selection Committee.

Studies that are provisionally accepted enter a second stage of preparation and final evaluation before the PSA officially accepts the project, makes a public announcement, and recruits data collection labs. During this second stage of preparation and evaluation, the PSA and proposing authors collectively conduct a needs assessment. The goal is to identify all of the resources the provisionally accepted project will need in terms of methodology, funding, translation, ethics, data monitoring, project monitoring, and logistics. This is an in-depth feasibility evaluation that will ensure that the project, if officially accepted, can be carried out to a high standard of rigor.

Projects that pass the in-depth feasibility evaluation proceed to the preparation phase. During this phase, the PSA assigns key PSA personnel to the project, including a project monitor, methodologist, data monitor, ethics monitor, and translation monitor, in accordance with the needs assessment (these roles can be filled by proposing authors if they have the skills and capacity to do so). This group will then meet with the proposing authors to 1) establish a collaboration agreement, 2) identify who will be the lead communicator with data collection labs, and 3) establish a clear deadline for the proposing authors to convert their PSA proposal into a journal-quality draft of a Stage 1 Registered Report (RR). This journal-quality draft RR represents the final record of the motivation for the study and the methods to be used, including the analysis plan. PSA directors will review the RR to make the final decision about whether to officially accept the project and make a public announcement to recruit data collection labs (with a link to the RR). Projects that are officially accepted move to the next stage of preparation, in which proposing authors will revise the RR based on feedback on the draft sent out to data collection labs and obtain ethics approval at their primary institution.

Even after official acceptance, PSA studies may go through revisions. For instance, reviewers at journals may suggest additional measures or procedural changes. Accepted proposals are not unchangeable. However, proposing authors and the PSA will make every effort to minimize the likelihood of post-acceptance changes. If the changes necessary after acceptance are too many in number or too significant, the PSA may ask proposing authors to withdraw their project and resubmit it during the next call for studies. Changes that are most likely to trigger a request for resubmission are those that increase the burden on data collecting sites.

After official acceptance, all review materials (submissions, peer reviews, network ratings, and Study Selection Committee decision letters) will be made publicly available. Review materials for rejected submissions will not be made public.

Feasibility

All feasibility decisions are made with respect to our current, and ever-changing, resources. Although the PSA comprises hundreds of labs from around the world who have agreed, in principle, to volunteer some of their resources to PSA projects, we may not be able to accommodate all types of designs. A few important feasibility considerations that may prevent a study from moving beyond the feasibility check include:

  • Does the study require specialized equipment (e.g., eye-tracking, EEG) or proprietary experimental software (e.g., E-Prime) to be used at the data collection sites?
  • Can experimental materials and analysis scripts be shared easily and made publicly available?
  • Does the study require “hard-to-reach” samples (e.g., children, minority groups, clinical populations, etc.)?
  • Are the target sample size per site, the number of data collection sites, the duration of individual data collection sessions, and the number of required data collection sessions fully justified and balanced so as not to overburden the PSA network?
  • Is the likelihood and severity of risk to participants kept to a minimum, such that the risk is not greater than what participants would face normally and would not require special consideration or deliberation from an ethics board?

Characteristics of strong submissions

Beyond simply being feasible given current PSA resources, strong submissions will also:

  • Accurately and clearly describe literature relevant to the study’s goals and design, such that researchers unfamiliar with the subject can understand the basic concepts behind the theory/phenomenon and the purpose of the research.
  • Clearly articulate the purpose of the research, relevant research questions, and hypotheses (if confirmatory).
  • Clearly articulate the research design, with a focus on sound methodology appropriate to the research questions, including adequate power analysis to justify sample size.
  • Provide examples of relevant material, for example websites, experimental scripts (e.g., E-Prime, Inquisit, OpenSesame), precise experimental design, and/or stimuli.
  • Accurately and clearly describe an analysis strategy appropriate to the research questions and design. Pilot or simulated data and working analysis scripts are ideal for clarity.
  • Make a compelling case for the importance of large-scale collaborative data collection for the project.

Submission Format and Guidelines

The following components are required for all submissions:

  • Cover Page, including the title of the study, date of the latest draft, and keywords
  • Abstract of up to 150 words
  • Main body submission text of up to 5,000 words
  • A version of the submission with cover page included
  • A masked version of the submission without the cover page
  • References
  • Supplementary materials

The following guidelines are intended to assist you in the preparation of your study submission to the Psychological Science Accelerator. Submissions normally include a description of the key background literature and motivation for the study, hypotheses, study procedures, proposed statistical analysis plan, a statistical power analysis, and pilot data (wherever applicable).

Introduction

A review of the relevant literature that motivates the research question and a full description of the study aims and hypotheses.

Method

A full description of proposed sample characteristics, including criteria for data inclusion and exclusion (e.g., outlier extraction). Procedures for objectively defining exclusion criteria caused by technical errors or for any other reasons must be specified, including details of how and under what conditions data would be replaced.

A description of study procedures in sufficient detail to allow another researcher to repeat the methodology exactly, without requiring further information.

Proposed analysis pipeline, including all preprocessing steps, and a precise description of all planned analyses, including appropriate correction for multiple comparisons. Specify all covariates or regressors. Specify analysis decisions that are contingent on the outcome of prior analyses.
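
As an illustration of the level of precision we have in mind, a planned correction for multiple comparisons can be written directly into the analysis script. Here is a minimal sketch in base R; the p-values are placeholders, not results from any PSA study:

```r
# Hypothetical example: pre-specifying a familywise error correction
# across four planned hypothesis tests.
p_raw <- c(0.004, 0.019, 0.048, 0.210)     # placeholder p-values
p_adj <- p.adjust(p_raw, method = "holm")  # Holm step-down correction
data.frame(p_raw, p_adj, reject = p_adj < 0.05)
```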

Results

Studies involving Neyman-Pearson inference must include a statistical power analysis. Estimated effect sizes should be justified with reference to the existing literature or theory. Because publication bias inflates published estimates of effect size, power analysis should be based on the lowest available or meaningful estimate of the effect size.
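
For example, a minimal base R sketch of such a power analysis, assuming a two-group design and a deliberately conservative effect size of d = 0.2:

```r
# Hypothetical example: per-group sample size for a two-sample t-test,
# powered on the smallest effect size of interest rather than a
# (likely inflated) published estimate.
power.t.test(delta = 0.2,      # conservative effect size in SD units
             sd = 1,
             sig.level = 0.05,
             power = 0.90,
             type = "two.sample")
# yields n of roughly 527 per group
```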

In the case of highly uncertain effect sizes, a variable sample size and interim data analysis is permissible but with inspection points stated in advance, appropriate Type I error correction for ‘peeking’ employed, and a final stopping rule for data collection outlined.
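
The simulation below (base R, purely illustrative; the inspection points are invented) shows why uncorrected peeking is a problem and how even a crude Bonferroni-style split across inspection points restores the nominal error rate. Formal alpha-spending bounds (e.g., Pocock or O’Brien-Fleming) would be preferable in an actual submission.

```r
# Simulate a two-group study with no true effect, analyzed at three
# pre-specified inspection points.
set.seed(1)
n_sims <- 5000
looks  <- c(50, 100, 150)   # hypothetical per-group n at each look
alpha  <- 0.05

reject_naive <- reject_corrected <- logical(n_sims)
for (s in 1:n_sims) {
  x <- rnorm(max(looks)); y <- rnorm(max(looks))
  p <- sapply(looks, function(n) t.test(x[1:n], y[1:n])$p.value)
  reject_naive[s]     <- any(p < alpha)                  # uncorrected peeking
  reject_corrected[s] <- any(p < alpha / length(looks))  # Bonferroni-style split
}
mean(reject_naive)     # noticeably above 0.05: inflated Type I error
mean(reject_corrected) # at or below 0.05
```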

For studies involving analyses with Bayes factors, the predictions of the theory must be specified so that a Bayes factor can be calculated. Authors should indicate what distribution will be used to represent the predictions of the theory and how its parameters will be specified.
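
For instance, using the BayesFactor R package (one common choice, not a PSA requirement), the prior representing the theory’s predictions can be pre-specified explicitly; the data below are simulated placeholders:

```r
# Hypothetical example: a Bayesian t-test where the theory's predictions
# are represented by a Cauchy prior on effect size with scale
# r = sqrt(2)/2 (the package's "medium" default).
library(BayesFactor)
set.seed(1)
x <- rnorm(100, mean = 0.2)  # simulated treatment group
y <- rnorm(100, mean = 0.0)  # simulated control group
ttestBF(x = x, y = y, rscale = sqrt(2) / 2)  # Bayes factor for H1 over H0
```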

Full descriptions must be provided of any outcome-neutral criteria that must be met for successful testing of the stated hypotheses. Such quality checks might include the absence of floor or ceiling effects in data distributions, positive controls, or other quality checks that are orthogonal to the experimental hypotheses.

Supplemental Materials

Include full questionnaires, stimuli, and materials needed to conduct the study. Pilot data can be included to establish proof of concept, effect size estimations, or feasibility of proposed methods. Simulated data and analysis scripts are ideal for clarifying the exclusion criteria and analysis plan.
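
As one example of the kind of simulated-data sketch we mean, the following base R snippet (all variable names and cutoffs are hypothetical) expresses exclusion criteria as code, so reviewers can see exactly what would be dropped:

```r
# Simulate pilot-style data and apply pre-specified exclusion criteria.
set.seed(42)
n <- 200
d <- data.frame(
  rt        = rlnorm(n, meanlog = 6.2, sdlog = 0.4),  # response times (ms)
  attention = rbinom(n, 1, 0.95)                      # 1 = passed attention check
)
keep <- d$attention == 1 &
  d$rt > 200 & d$rt < mean(d$rt) + 3 * sd(d$rt)       # example RT bounds
d_clean <- d[keep, ]
nrow(d_clean) / n  # proportion of simulated participants retained
```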

These guidelines were adapted from https://osf.io/pukzy.

 

Submissions can be made via this Google form. Submissions will be accepted until 23:59 in the last time zone on Earth on September 15th, 2019. If you have any questions, do not hesitate to email the PSA Director (Chris Chartier, cchartie@ashland.edu).

News from the Accelerator – May 2019

This month we announce an upcoming call for studies, seek contributors to several key committees, and give mini progress reports on all 6 of our current studies.

New Call for Studies Coming Soon

It is almost time for another round of study selection (our 3rd)! On June 15th we will release a detailed call-for-studies post with our updated submission guidelines. The deadline for study submissions will be September 15th. To begin planning now, you can review our prior call for studies or reply to this email with any presubmission inquiries. We are excited to see what great ideas are submitted this round!

Committees Seeking Contributors

Several committees could use additional members to better support our current studies, provide training for the PSA network, and prepare for our next round of study review and selection.

  • Ethics Committee: The Ethics Committee is seeking representation from the following broad geographic regions: Africa, Asia, Australia, and Central and South America. We are currently a group of 7; however, we are almost exclusively located in the US and Europe, and we want to meet the diverse needs of all PSA-represented regions and beyond. The committee’s current roles are to: 1) provide reviews of proposed PSA projects with an eye toward ethics, 2) review accepted PSA projects’ IRB and ethics review board materials prior to distribution to the larger PSA group, 3) provide ethics-oriented consultation to ongoing PSA projects, and 4) work with other PSA committees to ensure ethical practices. Zoom-based meetings are approximately monthly. For more information and/or to submit a CV, please contact Erica Musser at emusser@fiu.edu.
  • Study Selection Committee: The Study Selection Committee serves a vital role in the workflow of the PSA by facilitating the review of proposed projects and ultimately deciding on their acceptance to be run by our network. We are currently seeking 2-4 new members to aid in our next round of selection in late summer and early fall of this year. Committee members must commit to serve for 1 year but may choose to resign at any point thereafter. Unlike many of the other PSA committees, the work of the Study Selection Committee is concentrated in a few months per year. The PSA solicits study proposal submissions on an ad hoc basis. After each call for submissions is closed, the Study Selection Committee evaluates the submissions for feasibility and general quality. We then vote on whether each proposal merits review. Suitable proposals are then reviewed by experts in and outside of the network. The Study Selection Committee compiles and synthesizes this feedback. The committee then makes the final selections on the basis of reviewers’ responses, evaluations from the PSA network, and the PSA’s mission, values, and capacity. If you would like to join the committee, please submit a CV to committee co-chair and PSA Assistant Director Kathleen Schmidt (kathleenschmidt1@gmail.com or kathleen.schmidt@siu.edu).
  • Training Committee: The Training Committee is recruiting committee coordinators to help organize and provide trainings for the PSA and larger science community. If you are interested in joining our team, please read about our openings and apply! Link: https://forms.gle/yiS89d22bNA6P5hRA
  • Translation Coordinator for Study 004, the Accelerated CREP Project: We are looking for a translation coordinator for PSA 004 (Accelerated CREP) to coordinate and monitor the translations according to the PSA’s translation policies. The Accelerated CREP is a pedagogical project conducted by student teams in collaboration with faculty supervisors. The preprint of the registered report (under review and hopefully very close to an in-principle acceptance at AMPPS) is available at https://psyarxiv.com/zeux9. The study is currently translated into nine different languages, and most of these are ready for the next step in the translation process (back translations and checks for equivalence). The translation coordinator is expected to coordinate and monitor the remainder of the translation process and will earn authorship on the resulting manuscript.

Current Studies

  • 001 FACE PERCEPTION: We have collected data from over 10,000 participants and are on track to finish data collection in September. 001 Preprint
  • 002 OBJECT ORIENTATION & 003 GENDERED PREJUDICE: The 003 pre-registration is being finalized by the lead team and data collection for the bundled projects will commence in June! 002 Preprint
  • 004 TRUE BELIEF: The Stage 1 Registered Report is under review (following 2 rounds of “revise and resubmit”) at Advances in Methods and Practices in Psychological Science. 004 Preprint
  • 005 STEREOTYPE THREAT: The Stage 1 Registered Report draft is nearly complete and will be submitted to Nature Human Behaviour very soon.
  • 006 TROLLEY PROBLEM: The Stage 1 Registered Report is under review at Nature Human Behaviour. 006 Preprint

Postdoctoral Researcher Position

Tomorrow (June 1) is the last day to apply for the PSA post-doc position. We (a team led by Hans IJzerman) are looking for a talented and motivated postdoctoral researcher. The position is for two years and will be hosted at LIP/PC2S at Université Grenoble Alpes. The position’s primary focus is to support the Psychological Science Accelerator (PSA: https://psysciacc.org) in grant writing. The first priority for the postdoctoral researcher is to help write a Synergy Grant (https://erc.europa.eu/funding/synergy-grants), for which the proposed co-PIs are John Ioannidis (Stanford University), Denny Borsboom and Eric-Jan Wagenmakers (University of Amsterdam), Lisa DeBruine and Ben Jones (University of Glasgow), and Hans IJzerman (Université Grenoble Alpes). The postdoctoral researcher will also be involved in PSA-related research and in some supervision of students who do PSA-related research. The candidate we seek should be an excellent writer and should have ample experience with open science (and be familiar with R and/or Python). To read more about the context in which the postdoctoral researcher will be embedded, please go to www.corelab.io.