Resources

last updated 2021-05-03

JEDI members are jointly developing a collection of resources for journal editors in the social sciences. We are very grateful to the JEDI members who have contributed resources for this page (see the list below) and warmly welcome further suggestions. Suggestions can be made either by posting to the mailing list or by emailing Priya Silverstein directly at [email protected]; corresponding links will then be added to this page.

Please check back here from time to time, as we hope this page will be updated regularly!

If you are a journal editor and are not yet a member, please join JEDI.

General

Publishers often offer guidelines and associated resources relating to editorial practices. While typically provided for editors of the journals they publish, these resources may also be helpful more generally. For some examples, see:

Incoming editors

If you are just starting out as a journal editor, you might find the Committee on Publication Ethics’ short guide to ethical editing for new editors helpful, as well as this glossary of publishing and editing terms.

The PKP School also offers a free course on becoming an editor, covering how to perform the major tasks required of an editor of a scholarly journal, how to analyze and solve common problems that arise when editing one, how to assist other members of the journal team, and where to look for help with difficult issues.

The Council of Science Editors has sample correspondence for an editorial office that you can customize to suit your journal.

Ethics

The Committee on Publication Ethics has many resources to help journal editors deal with ethical issues, including guidelines and case studies. For example, they have guidelines on publication manipulation.

The Council of Science Editors has a white paper on publication ethics including a guide to editor roles and responsibilities.

Diversifying social science research

Systemic inequality exists within social science research. Roberts et al. (2020) examine racial inequality in psychological research to date and offer recommendations for editors and authors for working towards research that benefits from diversity in editing, writing, and participation.

Open science

A strong consensus is emerging in the social sciences and cognate disciplines that knowledge claims are more understandable and evaluable if scholars describe the research processes in which they engaged to generate them. Citing and showing the evidence on which claims rest (when this can be done within ethical and legal constraints), discussing the processes through which evidence was garnered, and explicating the analysis that produced the claims facilitate expression, interpretation, reproduction, and replication. The Committee on Publication Ethics has a list of principles of transparency and best practice in scholarly publishing.

Nosek et al. (2015) present an overview of the Transparency and Openness Promotion (TOP) Guidelines for journals, which have been used to generate the journal-level TOP Factor and which give editors a clear view of the areas in which they can take steps towards more open science at their journals. A similar initiative is the DA-RT Journal Editors’ Transparency Statement (JETS).

Resources for authors

Aczel et al. (2020) present a consensus-based checklist for improving and documenting the transparency of research reports in social and behavioral research, along with an online application that allows users to complete the checklist and generate a report that they can submit with their manuscript or post to a public repository.

Data and code

A set of stable and easily adoptable core practices has begun to emerge around data citation and management. For example, the social sciences are increasingly adopting permanent identifiers, such as digital object identifiers (DOIs), for research products including articles and datasets. Similarly, there is now a strong consensus that sharing data via trusted digital repositories is preferable to sharing it via personal websites.
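
To illustrate what a permanent identifier provides in practice, here is a short Python sketch (using the requests library; the DOI shown is a placeholder to replace with a real one) that resolves a DOI to its current landing page and then asks the doi.org resolver for machine-readable citation metadata via content negotiation, a service offered for Crossref and DataCite DOIs:

    import requests

    # Placeholder DOI -- substitute a real one before running.
    doi = "10.1234/placeholder"

    # Follow the redirect chain from the DOI resolver to the landing page.
    resp = requests.get(f"https://doi.org/{doi}", allow_redirects=True, timeout=30)
    print("Resolves to:", resp.url)

    # Content negotiation: request citation metadata (CSL JSON)
    # instead of the HTML landing page.
    meta = requests.get(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/vnd.citationstyles.csl+json"},
        timeout=30,
    )
    print(meta.json().get("title"))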

Journals are increasingly adopting data and code availability policies. The American Economic Association provides helpful guidance on implementation of their data and code availability policy that could easily be applied to other journals and fields.

For a discussion of the impact of journal data policy strictness on the code re-execution rate (i.e., how likely the code is to run without errors) and a set of recommendations for code dissemination aimed at journals, see Trisovic et al. (2020).

Open science badges

One way of incentivizing open science is to offer open science badges that signal and reward when underlying data, materials, or preregistrations are available. Implementing badges is associated with an increased rate of data sharing, as seeing colleagues practice open science signals that new community norms have arrived. See the guidance on badges by the Center for Open Science for more information on how to implement badges at your journal. However, it is important to note that receiving a badge for sharing data and code does not necessarily mean that the analyses are reproducible -- for that, we turn to pre-publication verification of analyses.

Pre-publication verification of analyses

Some journals have adopted a policy whereby data and code are not only required for publication but are also checked before publication to ensure that the analyses are reproducible -- that is, that the results in the manuscript match the results produced when someone who is not one of the authors re-runs the code on the data. This is called pre-publication “verification of analyses”, “data and code replication”, or “reproduction of analyses”. For more information on how to implement such a policy at your journal, see the Data and Code Guidance by Data Editors developed by Lars Vilhuber and colleagues, which the American Economic Association journals, the Canadian Journal of Economics, the Review of Economic Studies, and the Economic Journal use as a reference.
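
As a rough sketch of the core step in such a check -- not any journal's official tooling, and with all paths and script names hypothetical -- a verifier might re-run the authors' replication package and compare the regenerated output files against those submitted with the manuscript:

    import hashlib
    import subprocess
    from pathlib import Path

    def sha256(path: Path) -> str:
        """Checksum a file so outputs can be compared byte for byte."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    # Checksums of the output files submitted with the manuscript.
    submitted = {p.name: sha256(p) for p in Path("submitted_outputs").iterdir()}

    # Re-execute the authors' replication package from scratch.
    subprocess.run(["python", "analysis/run_all.py"], check=True)

    # Compare each regenerated output against the submitted version.
    for name, expected in submitted.items():
        actual = sha256(Path("outputs") / name)
        print(f"{name}: {'OK' if actual == expected else 'MISMATCH'}")

In practice, verifiers usually compare the reported numbers and tables rather than raw bytes, since innocuous differences such as timestamps or floating-point noise can break exact file matches.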

Unshareable data

When data cannot be shared (e.g., because they are sensitive), synthetic data can often be shared instead, which still allows pre-publication verification of analyses to take place. Dan Quintana has done a lot of work promoting the sharing of synthetic datasets and providing resources to help authors do so -- see his YouTube video, blog post, and Quintana (2020).
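
To illustrate the idea -- as a toy sketch only, with a made-up dataset; dedicated tools such as the R package synthpop used in Quintana's materials do this far more carefully -- one can fit a simple model to the real data and then sample synthetic rows that preserve its broad statistical structure:

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(seed=1)

    # Hypothetical sensitive dataset: two correlated continuous measures.
    real = pd.DataFrame(
        rng.multivariate_normal([50, 100], [[25, 15], [15, 36]], size=200),
        columns=["anxiety_score", "heart_rate"],
    )

    # Fit a multivariate normal to the real data ...
    mean = real.mean().to_numpy()
    cov = real.cov().to_numpy()

    # ... and sample a same-sized synthetic dataset from the fit.
    # These rows mimic the joint distribution without reproducing
    # any participant's actual values.
    synthetic = pd.DataFrame(
        rng.multivariate_normal(mean, cov, size=len(real)),
        columns=real.columns,
    )
    print(synthetic.describe())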

Verification Reports

Verification Reports (VRs) are an article format focusing specifically on computational reproducibility and analytic robustness. VRs meet this objective by repeating the original analyses or reporting new analyses of original data. In doing so, they give the verifiers conducting the investigation professional credit for evaluating one of the most fundamental forms of credibility: whether the claims in previous studies are justified by their own data. Chris Chambers has introduced this format at Cortex (see his introductory editorial). For the first two VRs published by Cortex, see Chalkia et al. (2020) and Mirman et al. (2021). If you’re interested in including VRs as an article type at your journal, Cortex’s author guidelines provide more information on the format.

Registered Reports

Registered Reports is a publishing format used by over 250 journals that emphasizes the importance of the research question and the quality of methodology by conducting peer review prior to data collection. High quality protocols are then provisionally accepted for publication if the authors follow through with the registered methodology. This format is designed to reward best practices in adhering to the hypothetico-deductive model of the scientific method. It eliminates a variety of questionable research practices, including low statistical power, selective reporting of results, and publication bias, while allowing complete flexibility to report serendipitous findings. Although Registered Reports are usually reserved for hypothesis-testing research, a version for exploratory research -- Exploratory Reports -- is now also being offered.

Resources for editors

See the resources for editors by the Center for Open Science for more information on implementing Registered Reports at your journal. More advice for reviewers and editors can be found in Box 2 and 3 of this preprint by Chris Chambers and colleagues.

Resources for authors and reviewers

The Journal of Development Economics has developed a website dedicated to guidance for authors and reviewers on Registered Reports. Eike Rinke and colleagues at the Journal of Experimental Political Science have also created a great FAQ page for authors.

Computational research

Willis and Stodden (2020) highlight nine decision points for journals looking to improve the quality and rigor of computational research and suggest that journals reporting computational research aim to include “assessable reproducible research artifacts” along with published articles.

The American Journal of Political Science Verification Policy provides a model example of how computational research can be made more rigorous and error-free through additional steps in the editorial process -- but it also shows that this requires resources beyond the procedures editors have become accustomed to over decades.

Improving the quality of reviews

Although academics are expected to peer review articles as part of their job, they often receive little (or no) formal training in this. Early career researchers are often keen to be involved in reviewing papers, but without having had many (or any) of their own papers reviewed, they may not know what a review should look like. Here are some how-to guides from different fields that editors can share with their reviewers to help increase the quality of the reviews they receive.

There are also papers covering what not to do as a reviewer: for example, humiliating the authors (Comer et al., 2014) or being adversarial (Cormode, 2009).

For a more complete bibliography organized alphabetically, see the list here.

List of contributors to this page (alphabetical by first name)

  • Ana Trisovic
  • Andrew Foster
  • Ashley Randall
  • Chris Chambers
  • Colin Elman
  • Diana Kapiszewski
  • Eike Rinke
  • Elena Naumova
  • Kevin Arceneaux
  • Lars Vilhuber
  • Michael Weiss
  • Priya Silverstein