PREreview of An Inception Cohort Study Quantifying How Many Registered Studies are Published

DOI: 10.5281/zenodo.10144353
License: CC BY 4.0

Review of the manuscript “An Inception Cohort Study Quantifying How Many Registered Studies are Published” submitted to Advances in Methods and Practices in Psychological Science

Authors: Eline Ensinck & Daniel Lakens

Reviewer: Olmo van den Akker

In this manuscript, the authors present an inception cohort study in which they found that preregistrations on the Open Science Framework (OSF) often remain unpublished. Using a short survey, they also explored the causes of this leaky publication pipeline. The study cleverly uses the OSF embargo policy to extract the relevant data and would be a valuable contribution to the existing literature on preregistration, especially if the following points are addressed. These points mainly relate to improving the clarity of the methods and the overall readability of the manuscript.

Abstract

-          In the second sentence, I would report only the 169 preregistrations that were research studies, as this is the number used as the denominator when calculating the proportion of published preregistrations.

Introduction / current study

-          You use the term ‘published’ throughout the manuscript but this term is ambiguous given that preregistrations can also be said to be published (e.g. “our preregistration was published on the Open Science Framework”). Please resolve any ambiguity by defining publication/publishing in the introduction and make sure that it is clear throughout the manuscript what is being published. The same holds for the term ‘sharing’ as one can share a study in a paper but also a study (plan) in a preregistration.

-          The introduction (the part before ‘The current study’) focuses only on the first (and perhaps the third) research question. It would improve the readability of the manuscript if the other research aims were also touched upon in the introduction. Alternatively, you could label research aims 2-4 as secondary since the first one seems to be the main one.

-          Aim 2 could be described more clearly. The question seems to focus on the policies (plural) of a study registry (singular), implying a within-subjects design, but later in the sentence it seems to compare registries with different policies, implying a between-subjects design. In the study itself, only the OSF registry is assessed, so I don’t think it is warranted to make claims about differences between registries as you do in the last sentence of the paragraph. In addition, the fact that the research question is later split into two questions can cause some confusion. I suggest omitting these ‘subquestions’ and focusing on the question presented in the first sentence.

-          In the paragraph outlining the third research aim, the subject of the first sentence is ‘fellow researchers’ and the subject of the second sentence is ‘a registry’. I think it would improve readability if the subject of these sentences were the same.

-          “Registries often require researchers to update a registered study with the results. An example of such a study registry is ClinicalTrials.gov. The requirement to register certain types of studies does not exist in psychology” → This section seems to use two types of requirements: (1) the requirement from registries that registrations should be updated with results, and (2) the requirement from governmental authorities to register studies in the first place. This double meaning makes this section difficult to parse. I suggest splitting these meanings up, and possibly using different wordings. For the requirement that clinical trials in medicine be registered, you could use the following references:

o   European Commission. (2012). Commission guideline: Guidance on posting and publication of result-related information on clinical trials in relation to the implementation of Article 57(2) of Regulation (EC) No 726/2004 and Article 41(2) of Regulation (EC) No 1901/2006. Retrieved from https://web.archive.org/web/20230305174330/https://op.europa.eu/en/publication-detail/-/publication/9a64920e-1134-11e2-8e28-01aa75ed71a1/language-en

o   Food and Drug Administration Amendments Act of 2007. (2018). Retrieved from https://www.govinfo.gov/content/pkg/PLAW-110publ85/pdf/PLAW-110publ85.pdf

Method

-          I applaud the choice of categorizing preprints as published but it would be good to see some results for preprints vs. peer-reviewed publications in an exploratory analysis. The same holds for other publication types that may not be considered standard, like posters. Also, since the definition is so broad, would slides of presentations at a conference be considered published? I don’t believe these are currently included in the database. It would be good to include them or provide a reason for their absence.

-          I think it is important to provide the actual data from your study in Table 3. The top half of the table could contain the actual data and the bottom half the numbers extrapolated to the whole of the OSF. As it is, it appears as if you assessed all OSF registrations and present those data in Table 3, which is not the case.

-          Please elaborate on what you mean by ‘final classification’ on page 5 and why April 2022 was chosen as a cutoff date. In general, I can imagine this section is hard to parse for readers who are not familiar with the OSF user interface. Perhaps a short explanation, possibly with some images, could help.

-          In the section ‘Data sources’ you discuss sampling registries in one sentence and studies in the sentence after. Having consistent subjects could help here as well.

-          “Our sampling strategy increased the probability that studies with multiple registrations were included in the study” → Compared to which situation?

-          “We aimed to examine at least 300 registrations (150 for which the registration was made public by the researchers, and 150 for which the registration was made public automatically after four years). We had no justification for our sample size given uncertainty about the number of studies that would fit our inclusion criteria but relied on an informal trade-off between time constraints and sufficiently accurate estimates.” → You mention a goal of examining at least 300 registrations, but only 169 are included in the analysis. Please discuss this discrepancy, as well as why you believe this is a large enough sample to draw conclusions about the publication rate of preregistrations on the OSF and beyond.

-          “We classified OSF registrations as made public by the researcher when any of the associated OSF project pages was public, and classified the project as made public automatically by the OSF after four years when the associated project page was not made public.” → In the first part of the sentence you talk about classifying OSF registrations, while in the second part you talk about classifying the project. I believe the second part should be about registrations as well. In general, it would be good to clearly distinguish between registrations and projects in this section, given that the link between them is crucial to the proxy you are using. Maybe a table with all the possibilities and their resulting classifications would help.

-          “During data analysis we made sure that we did not include the same study twice (even though a single study can have multiple registrations, for example because an analysis subcomponent and a hypothesis subcomponent each get a separate ID in the OSF database).” → From this description it is not clear how you made sure there were no duplicates. In my experience, avoiding duplicates is challenging given that authors often relabel projects and preregistration/paper titles.

-          “To circumvent this limitation, we relied on a proxy indicator to classify projects as opened by the user, or opened automatically when the embargo was lifted.” → I wonder whether you have been in touch with the COS (the Center for Open Science, which runs the OSF) to validate this method. Such a validation would strengthen the method, so I would encourage contacting the COS.

-          Please explain (1) how you ended up with 315 registrations (i.e., what was the reason for stopping there and not checking more registrations) and (2) what the randomization process looked like for selecting the registrations from the population of 17,729 registrations (for point 2, see the illustrative sketch at the end of my comments on this section).

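To illustrate the level of detail that would answer point (2) above: the following is a hypothetical sketch, not a reconstruction of the authors’ actual procedure. The file name and seed are invented; only the figures 315 and 17,729 come from the manuscript. Reporting something along these lines (ideally the actual script and seed) would fully resolve the point.

    import csv
    import random

    random.seed(2021)  # example seed only; reporting the actual seed makes the draw reproducible
    with open("osf_registrations_pre_nov2017.csv") as f:  # hypothetical export of the 17,729 registration IDs
        all_registration_ids = [row[0] for row in csv.reader(f)]
    sampled_ids = random.sample(all_registration_ids, 315)  # simple random sample without replacement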

Results

-          I would refer to Table 2 as soon as you discuss the reasons for exclusion on page 7 and possibly merge this section with the section about exclusions before and after emails. A PRISMA flow diagram could also help to make the process of including registrations clearer to readers.

-          “The number of included and excluded registrations in the sample is summarized in Table 1.” → It seems that the table only reports included studies, so phrasing it like this is a little confusing.

-          “Furthermore, if no publication could be found, there always remained the possibility that we failed to find it.” → This seems to be a tautological statement. Please elaborate.

-          “Based on an extensive search” → It might be my affinity with the topic, but I find this search very interesting from a methodological standpoint. I would prefer to see it in the main text.

-          “We remained uncertain when, for example, there was very little information in the preregistration, and the information that was available seemed related to a paper without a link to an OSF registration.” → This sentence can be omitted given the information provided earlier.

-          “for one registration we did not have to email the researcher because we could retrieve the required information from a paper in which they explicitly discussed their file drawer (van Elk, 2021)” → To improve readability, I think it’s better to provide this information in a separate sentence.

-          “After updating our classification based on responses the publication rate in group 1 increased by 6% (meaning that we failed to identify some publications corresponding to registrations) while the publication rate in group 2 decreased by 1%.” → For clarity, it would be nice to have a short explanation within brackets for the second part of the sentence as well.

-          Page 10 mentions 50 registrations that were classified as unpublished and 25 registrations that were classified as published. This does not add up to the 76 registrations for which you contacted the authors. Also, if I’m not mistaken, the section does not report the number of authors for whom you could not find contact information.

-          “Our sample contained an almost equal number of manually opened and automatically opened registrations to examine our research question about the consequences of automatically opening registrations after an embargo.” → This sentence seems to be redundant given the information provided elsewhere.

-          “Multiplying our publication estimates by the estimated number registrations on the OSF before November 2017 that can be classified as an actual research study we find that 5550 registrations are published, and 3994 remain unpublished” → Please stress in this sentence that this is an estimate. Also, I believe you forgot an ‘of’ between ‘number’ and ‘registrations’.

-          I assume that the discrepancy between the sample mean (61.3%) and the population estimate of 58.2% is due to differences in the proportion of manually opened and automatically opened registrations, but I think it would be good to make that explicit in the text (see also the back-of-the-envelope check at the end of my comments on this section).

-          “This shows that a platform that enables researchers to make their registration public will make studies that would otherwise remain in the file-drawer known to the research community” → I don’t think you can make this claim based on the data. Policies like this probably differ strongly between registries in all kinds of ways. The same holds for the conclusion about automatically opened registrations.

-          “The goal of the study, which was not publishing the results because the project was educational or intended to be shared with stakeholders” → This was also an inclusion criterion in your study. How do you explain the prevalence of this motivation given that you already filtered for ‘educational registrations’?

-          “According to the self-reported causes for non-publication researchers shared with us logistical issues are the biggest cause of unpublished studies in our sample. Researchers leave academia and fail to complete a project before their contract ends, move on to a new position with a different research focus, or experience a lack of time to complete this specific project given other responsibilities.” → This sentence seems superfluous given the information provided in the preceding paragraph.

-          “Out of 43 responses more diverse reasons were given for keeping the OSF project closed” → For readability, I think it would be good to mimic the structure of the sentence before.

-          Throughout the manuscript you refer to the supplement, typically “for more details”.  It would be good, however, to know what kind of details readers can expect there. That way, they can make a more informed decision about whether they want to access those supplementary materials or not.

-          “Our study provides an important data point to understand how many studies are performed but remain unpublished.” → I think this is an unnecessary sentence unless you indicate explicitly why you think this is a particularly important data point.

-          “The possibility that around 41.8% of research studies registered on the OSF (excluding multi-lab studies and Registered Reports) are not shared should make scientists reflect on the efficiency of scientific research.” → It might be good to come full circle with the introduction and add a sentence or two about research waste (over and above what you say in the sentence after it).

-          “Better time-management, designing projects from the outset in a way that they can be completed by other team members, or making unfinished projects available to potential collaborators could mitigate these issues.” → I think a more expansive discussion is warranted here. How can we attain these goals in practice? What are the most important barriers? The same holds for getting researchers to conduct Registered Reports.
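As a final note on the Results, here is the back-of-the-envelope check referred to above. This is my own calculation using only numbers reported in the manuscript; the symbols p and w below are mine and are not taken from the paper. If the population estimate is a weighted average of the group-specific publication rates,

\[
\hat{p}_{\text{population}} = w_{\text{manual}}\, p_{\text{manual}} + w_{\text{automatic}}\, p_{\text{automatic}},
\qquad w_{\text{manual}} + w_{\text{automatic}} = 1,
\]

with the weights taken from the estimated shares of manually and automatically opened registrations on the OSF, then the reported 58.2% is consistent with the reported absolute numbers, since

\[
\frac{5550}{5550 + 3994} \approx 0.582,
\]

whereas the sample mean of 61.3% implicitly weights the two groups roughly equally (the sample contained an almost equal number of each). If this is indeed how the 58.2% was obtained, stating it explicitly in the text would resolve my question.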

Discussion

-          “Extrapolating from our sample to the population of all registrations” → You already extrapolated in the first sentence, so I would rephrase this.

-          “In short, the exact percentage of studies that remain unpublished will be different in other contexts” → I believe this makes sense, but I would welcome a more elaborate discussion of how the estimate would change if these other factors were considered, as well as a discussion of how future studies could incorporate these factors. An additional factor that could be discussed is that the OSF contains mostly psychology registrations, with other disciplines less well represented (see Van den Akker et al., 2023: ‘Selective Hypothesis Reporting in Psychology’).

-          “Our study demonstrates that the percentage of unpublished studies can be high. Empirically examining the size of the file drawer can determine where there is room for improvement” → This seems like a superfluous sentence given the earlier information.

-          “It is recommended to follow a comprehensive preregistration template to increase the quality of the reregistration (Akker et al., 2023)” → Please refer to this paper as (Van den Akker et al., 2023), also in the reference list.

Please don’t hesitate to contact me if you have any questions or comments about this review. All the best with the revision!

Olmo van den Akker (ovdakker@gmail.com)

Competing interests

The author declares that they have no competing interests.