Checklist Elements
Guidance around each Checklist element, including instructions, examples, roles, workflow positions, and implementation notes.
What follows is a description and guidance for each Checklist element. It can be used on its own to assess compliance of a manuscript, or in conjunction with the other components.
Each Checklist element may Pass, Fail or be N/A (not applicable). A manuscript is presumed to be compliant with the Handbook if every Checklist element either passes or is not applicable. If any element fails, your journal should take corrective action according to its own internal protocols: for example, contacting the authors separately for each failure, or collating all failures and passing them on to the authors in a single communication.
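Where a journal automates parts of this workflow, the pass/fail logic above maps naturally onto a small script. The following is a minimal sketch only; the element names, and the idea of collating failures into one message, are illustrative rather than part of the Handbook.

```python
# Minimal sketch (not part of the Handbook): collating element statuses and
# producing a single corrective communication. Element names are illustrative.
from enum import Enum

class Status(Enum):
    PASS = "Pass"
    FAIL = "Fail"
    NA = "N/A"

def manuscript_complies(results: dict) -> bool:
    """Presume compliance if every element passes or is not applicable."""
    return all(s in (Status.PASS, Status.NA) for s in results.values())

def collate_failures(results: dict) -> str:
    """Collect every failing element into one message for the authors."""
    failures = [name for name, s in results.items() if s is Status.FAIL]
    if not failures:
        return "All Checklist elements pass or are not applicable."
    return "Please address the following elements: " + "; ".join(failures)

results = {
    "availability statement present": Status.PASS,
    "identifier resolves": Status.FAIL,
    "anonymisation": Status.NA,
}
print(manuscript_complies(results))  # False
print(collate_failures(results))     # one combined communication
```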
Example from [1]: Pass (this manuscript complies with this element by having a data availability statement; no other statements are required by this publisher).
Possible values: Pass or Fail
Implementation note: n/a; simple presence/absence check.
Source: F1000
Role: Administrator
Workflow position: Initial QC checks
Example from [1]: Pass (This manuscript has just one availability statement (for data) and one digital object within it (a container for a few data files and a reporting guideline document). This manuscript passes by having this digital object listed in the appropriate availability statement, and by having the digital object and its constituent files correctly labelled and formatted. There are no other digital objects or availability statements to review.)
Possible values: Pass or Fail
Implementation note: Although this is a simple presence/absence check, please note that the way 'clearly and correctly' is defined is highly dependent upon your particular journal's guidance. For example, some journals allow accession numbers (e.g. from UniProt) without any accompanying resolvable portion, while others require DOIs or other types of persistent, globally-unique, resolvable identifiers.
Source: F1000 (data); PRO-MaP Table 3 Recommendation 1.5 (materials, equipment); MDAR Material (materials); TOP1:Citation; STORMS 8.1-8.5, 16, 17
Role: Administrator
Workflow position: Initial QC checks
Example from [1]: one (there is only a single digital object, used as a project container for all data files relevant to this manuscript)
Possible values: a number
Implementation note: n/a; simple numeric value.
Source: Pilot member request
Role: Administrator
Workflow position: Initial QC checks
Instructions: Check that this digital object's identifier (e.g. DOI, ARK, URL) works, that it resolves to the correct object, and that it is of a type recognised by your journal. If it resolves, next check that the identifier is written in the correct format. This element passes if the identifier is of an appropriate type and resolves to the correct object; if the identifier is not of an appropriate type, or it fails to resolve correctly, this element fails.
Example from [1]: Pass (I click on the identifier and it takes me to the correct OSF page containing the documents listed within the availability statement; the identifier is a DOI, which is recognised by my journal)
Possible values: Pass or Fail
Implementation note: While checking that the identifier resolves is straightforward, the test of the identifier's suitability is more involved. Usually journals require that identifiers for digital objects be both globally unique and persistent (e.g. DOI), though individual requirements may vary. Additionally, some identifiers (such as database accession numbers) only become unique when combined with a (sometimes non-persistent) URL prefix. It may be that the digital object's identifier type is clear, allowing you to assess its validity very quickly. If there is any confusion with regards to the identifier type, it may help to review the identifier type's record in FAIRsharing: search for the identifier name, open the FAIRsharing record (if present), and review the record's information to help you determine if the identifier meets your journal's requirements.
Source: F1000 (data, software); MDAR Analysis.Data Availability, Analysis.Code Availability, Materials (data, materials, software); TOP2:Transparency; TOP1:Citation; ARRIVE 20; GigaScience; Nature; PRO-MaP Table 3 Recommendation 1.5 (materials); FAIR4RS F1
Role: Editorial Office
Workflow position: Initial QC checks
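For journals that pre-screen identifiers automatically, the resolution part of this check can be sketched as below. This is an illustrative assumption, not an official tool: it assumes DOIs expressed as bare 10.xxxx/... strings resolved via https://doi.org/, and it cannot confirm that the landing page is the correct object, which remains a human judgement.

```python
# Hedged sketch: assumes bare DOIs (10.xxxx/...) resolved via https://doi.org/.
# A 200 response shows the identifier resolves; whether it resolves to the
# correct object still needs a human check against the manuscript.
import re
import requests

DOI_PATTERN = re.compile(r"^10\.\d{4,9}/\S+$")

def identifier_resolves(doi: str, timeout: float = 10.0) -> bool:
    if not DOI_PATTERN.match(doi):
        return False  # not in the expected bare-DOI format
    try:
        response = requests.get(f"https://doi.org/{doi}",
                                allow_redirects=True, timeout=timeout)
        return response.status_code == 200
    except requests.RequestException:
        return False  # network error or non-resolving identifier

print(identifier_resolves("10.5281/zenodo.1234567"))  # hypothetical DOI
```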
Example from [1]: Pass (I click on the identifier and it takes me to the correct OSF page, which shows a CC0 licence)
Possible values: Pass or Fail
Implementation note: It may be that the digital object's licence is clearly stated when you follow its link, allowing you to assess its validity very quickly. If there is any confusion with regards to the licensing, it may help to read the repository's record in FAIRsharing: search for the repository's name, open the FAIRsharing record (if present), and review the record's licence information to see whether the licences available within that repository are ones that your journal allows. You may wish to combine this information with a review of the information on the digital object's page within the repository.
Source: F1000 (data); FAIR4RS R1.1
Role: Editorial Office
Workflow position: Initial QC checks
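If your journal's allowed licences can be written down as a simple list, the comparison itself is easy to sketch, as below. The licence identifiers and the allow-list are assumptions for illustration; your journal's actual policy may be more nuanced (e.g. allowing any open licence).

```python
# Minimal sketch, assuming your journal states an explicit licence allow-list;
# both the SPDX-style identifiers and the list itself are illustrative.
from typing import Optional

ALLOWED_LICENCES = {"CC0-1.0", "CC-BY-4.0"}  # assumed journal policy

def licence_status(declared: Optional[str]) -> str:
    if declared is None:
        return "Fail"  # no licence stated on the repository record
    return "Pass" if declared in ALLOWED_LICENCES else "Fail"

print(licence_status("CC0-1.0"))  # Pass, matching the OSF example above
```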
Example from [1]: Pass (I click on the identifier and it takes me to the correct OSF page, where I am able to download all files without restriction)
Possible values: Pass or Fail
Implementation note: Follow the digital object's link to its repository record. Can you download and view it? If not, is there a clear description of why access is controlled? It may be that the availability of the digital object is clearly stated when you follow its link, allowing you to assess this element very quickly. If there is any confusion around the digital object's openness, it may help to read the repository's record in FAIRsharing. You may already have that record open from checking earlier elements; if not, simply search for the repository's name in FAIRsharing, open the record (if present), and review its data availability information to determine if this element passes or fails.
Source: F1000 (data); TOP2:Transparency
Role: Editorial Office
Workflow position: Initial QC checks
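As a rough first pass before consulting FAIRsharing, the HTTP response you get when fetching the object can hint at its openness. This sketch is a heuristic under stated assumptions (a directly fetchable URL, conventional status codes); many repositories serve landing pages that return 200 even for controlled-access objects, so it never replaces a manual check.

```python
# Heuristic sketch only: many repositories return 200 for a landing page even
# when the files themselves are access-controlled, so this is a hint, not a
# verdict. The URL is whatever the digital object's identifier resolves to.
import requests

def openness_hint(url: str) -> str:
    try:
        response = requests.get(url, allow_redirects=True, timeout=10)
    except requests.RequestException:
        return "unreachable: re-check the identifier first"
    if response.status_code == 200:
        return "possibly open: fetched without credentials"
    if response.status_code in (401, 403):
        return "access controlled: look for a stated reason on the record"
    return f"inconclusive (HTTP {response.status_code}): inspect manually"
```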
Instructions: This check is not applicable ("N/A") if the digital object is openly available (and therefore has a Pass status for the previous element). If the digital object is not open, but access is provided to reviewers of the manuscript, then this element passes. Otherwise, this element fails.
Example from [1]: N/A (the previous element passed because the digital object is openly available, therefore this element is automatically N/A)
Possible values: Pass, Fail or N/A
Implementation note: If the digital object is not open, you will need to determine whether there is access for peer review of this manuscript. Visit the digital object's repository record and follow any provided instructions for accessing the digital object. Are you able to view it? It may be that you can view the digital object simply by following the instructions, allowing you to assess this element very quickly. If there is any issue with determining the status of this element, it may help to read the repository's record in FAIRsharing. You may already have that record open from checking earlier elements; if not, simply search for the repository's name in FAIRsharing, open the record (if present), and review its data availability information to determine if this element passes or fails.
Source: F1000 (data); TOP2:Transparency; pilot member request
Role: Editorial Office
Workflow position: Initial QC checks
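The status cascade described above is easy to misapply, so a small sketch may help; the two boolean inputs are assumed to come from your earlier checks of this manuscript.

```python
# Sketch of the cascade described above; the two booleans are assumed to come
# from your earlier checks of this manuscript.
def reviewer_access_status(openly_available: bool,
                           reviewer_access_provided: bool) -> str:
    if openly_available:
        return "N/A"   # the openness element already passed
    return "Pass" if reviewer_access_provided else "Fail"

print(reviewer_access_status(openly_available=True,
                             reviewer_access_provided=False))  # N/A
```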
Example from [1]: Pass (I click on the identifier and it takes me to the correct OSF page. Either by following the Implementation note below or from my own understanding of OSF, I note that this is a generalist and NOT a domain-specific repository. I also know that domain-specific repositories are not required by F1000 for this type of digital object, so this check passes)
Possible values: Pass or Fail
Implementation note: Follow the digital object's link to its repository record. Check whether the repository is 'appropriate' according to the guidance table below and your journal's requirements. The table provides type-specific guidance for a number of common digital object types; match the digital object being assessed with its type in the table and follow the guidance given there. Note that there is also a row for 'other' types: if your digital object does not fit any of the types listed, follow that advice. It may be that the decision on the 'appropriateness' of a repository is clear, allowing you to assess this element very quickly. If there is any confusion around the type of repository, it may help to read its record in FAIRsharing. You may already have the repository's FAIRsharing record open from checking earlier elements; if not, simply search for the repository's name in FAIRsharing and open the record (if present). Review the record's subject area(s) and the types of data it accepts, and establish whether they match the subject area / domain of the digital object you are assessing; this determines whether this element passes or fails.
Dataset
Guidance: Domain-specific repositories can make the dataset more FAIR, and many journals either prefer or require deposition in domain-specific repositories where appropriate repositories exist; please refer to your journal's specific requirements for the dataset you are assessing. Your journal may have a list of recommended resources; alternatively, you may be asked to assess this element either through your own expertise or using other journal requirements. If the dataset is not in a domain-specific repository, it may instead have been deposited in a generalist repository; check your particular journal requirements, as some journals may not allow the use of generalist repositories for the type of dataset you are assessing.
Examples: VizieR for astronomy; PANGAEA for Earth and Environmental Sciences; TROLLing for Linguistics; ENA for Genomics.

Software
Guidance: Software repositories have specialist capabilities such as versioning, forking, code checks and more. Information regarding versioning is available in FAIRsharing repository records.
Examples: Software Heritage; GitHub.

Workflow
Guidance: Workflow metadata is highly specialised.
Example: WorkflowHub.

Protocols and methods
Guidance: Linking out to protocols provides valuable methodological detail for all original research articles, especially methods papers. Please refer to your journal's specific requirements for protocols and methods, as relevant. Where your journal requires protocol availability statements, it may also ask for information on protocol versioning and forking, and on the presence/absence of a long-term preservation strategy; information regarding versioning is available in FAIRsharing repository records. Additionally, there are often checks that the version of the protocol/method stored within the repository is plausible and the same as that actually used in the research (no outcome switching).
Example: protocols.io.

Novel standards
Guidance: Register novel terminologies, models/formats, reporting guidelines and identifier schemata at FAIRsharing.
Example: FAIRsharing.

Newly-developed databases
Guidance: Register newly-developed repositories and knowledgebases at FAIRsharing.
Example: FAIRsharing.

Other
Guidance: Generalist repositories may be used in this case, or other specialist repositories that are not included in this table.
Examples: indicative lists of generalist repositories are available.

In all cases, once you have reviewed this guidance, continue reading below this table to determine whether the repository is recognised by your journal.
Once you have determined that the repository is suitable for your type of digital object, you next need to discover whether it is one that your journal recognises. You may have internal guidance around this that allows you to assess this element very quickly. If there is any uncertainty, it may help to review the repositories listed within your journal's or publisher's FAIRsharing policy record: search FAIRsharing using your journal or publisher name, and once you have found the policy record, review its list of related databases to determine if the repository containing the digital object is recognised by your journal. Remember that not all journals provide a list of recommended repositories, and even where lists are present they may be indicative rather than exhaustive. Alternatively, your journal may provide a list of approved characteristics/attributes of repositories, in which case you may wish to confirm those characteristics against the metadata within the FAIRsharing record for the repository in which the digital object is stored.
Source: F1000 (data, software); MDAR (Design.Laboratory Protocol); TOP2:Transparency; ARRIVE 19, 20; Nature; PRO-MaP Table 3 Recommendation 2.1; GigaScience; REAPPRAISED A - Analysis and Methods
Role: Editor (internal or academic, as appropriate for your journal)
Workflow position: Editor Assessment
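Where your journal does maintain an explicit list of recognised repositories per digital object type, the lookup might be sketched as follows. Both mappings here are illustrative placeholders, not any journal's real policy, and (as noted above) real lists may be indicative rather than exhaustive.

```python
# Illustrative lookup only: neither mapping is any journal's real policy, and
# recognised-repository lists may be indicative rather than exhaustive.
RECOGNISED = {
    "dataset": {"PANGAEA", "ENA", "VizieR", "TROLLing", "OSF"},
    "software": {"Software Heritage", "GitHub"},
    "workflow": {"WorkflowHub"},
}

def repository_recognised(object_type: str, repository: str) -> str:
    allowed = RECOGNISED.get(object_type)
    if allowed is None:
        return "Fail: no guidance for this digital object type; escalate"
    return "Pass" if repository in allowed else "Fail"

print(repository_recognised("dataset", "OSF"))  # Pass under these assumptions
```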
Example from [1]: N/A (the digital object does not require anonymisation)
Possible values: Pass, Fail or N/A
Implementation note: none.
Source: F1000 (data)
Role: Editor (internal or academic, as appropriate for your journal)
Workflow position: Editor Assessment
Instructions: If evidence is provided that the originating research for the digital object has been approved by a specific, recognised committee (e.g. an ethics committee), then this element passes. Evidence may be given via references to approval documents; an example for digital objects describing materials is the listing of permits and/or reference number(s) from an IRB or equivalent committee(s) to show approval for any ethics requirements. If the digital object does not require such approval, then set the status to 'N/A' (not applicable). If the research that created the digital object should have been approved by an appropriate committee but there is no evidence of this, the element fails.
Example from [1]: N/A (the digital object does not require approval by a recognised committee)
Possible values: Pass, Fail or N/A
Implementation note: none.
Source: REAPPRAISED E - Ethics; MDAR (Design.Ethics)
Role: Editor (internal or academic, as appropriate for your journal)
Workflow position: Editor Assessment
Instructions: This element is specialised; while it is important to provide metadata in domain-specific formats, the stringency of this Checklist element depends on a number of factors, such as whether or not the journal requires domain-specific repositories, and whether or not an appropriate format exists. If the digital object applies an appropriate domain-specific metadata format, this element passes. If no such format exists, or if such a format is neither applied nor mandated, the element is not applicable ('N/A' status). Finally, if the format is mandated but not used, the element fails.
Example from [1]: N/A (the digital object does not have an appropriate domain-specific metadata format, and none is mandated for this data type and this journal)
Possible values: Pass, Fail or N/A
Implementation note: Follow the digital object's link to its repository record. Can you determine the format used (or formats, if the digital object is a container)? It may be that this information is clearly stated when you follow its link, allowing you to assess this element very quickly. If there is any confusion around the format used within this digital object, it may help to read the repository's record in FAIRsharing. You may already have that record open from checking earlier elements; if not, simply search for the repository's name in FAIRsharing, open the record (if present), and review its associated models/formats to determine if this element passes or fails. If a format is provided as part of the output of a repository, then you may be able to download the digital object in that format.
Source: F1000 (data); PRO-MaP Table 3 Recommendation 1.4 (methods); MDAR Reporting (materials); NIH Standards (all); FAIR4RS R3
Role: Peer reviewer
Workflow position: Peer reviewer and revisions
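If you need a quick first guess at the formats inside a container object, file extensions can provide a crude hint. This sketch is a heuristic under loud assumptions: the extension-to-format mapping is invented for illustration, and extensions alone cannot confirm that a domain-specific format was genuinely applied.

```python
# Crude heuristic sketch: the extension-to-format table is invented for
# illustration, and an extension alone cannot prove a format was applied.
from pathlib import Path

KNOWN_FORMATS = {".fasta": "FASTA", ".nc": "NetCDF", ".mzml": "mzML"}

def guess_formats(filenames: list) -> dict:
    """Map each file in a container digital object to a guessed format."""
    return {name: KNOWN_FORMATS.get(Path(name).suffix.lower(), "unknown")
            for name in filenames}

print(guess_formats(["reads.fasta", "notes.txt"]))
# {'reads.fasta': 'FASTA', 'notes.txt': 'unknown'}
```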
Instructions: This element is specialised; while it is important to provide complete metadata in domain-specific formats, the relevance of this element depends on a number of factors, such as whether or not the journal requires domain-specific repositories, and whether or not an appropriate format exists. Set this element to not applicable ('N/A' status) if the preceding metadata-format element is either N/A or Fail. However, if your journal mandates the use of a domain-specific format for this digital object, then a Pass or Fail is determined by whether or not all mandatory fields within the format have been provided.
Example from [1]: N/A (because the preceding metadata-format element was also N/A)
Possible values: Pass, Fail or N/A
Implementation note: Use the documentation provided by the format specification and the format's website to determine the completeness of the metadata for your digital object. To find these documents, you may wish to use the format's record in FAIRsharing, if present. Format records in FAIRsharing provide general information regarding that standard as well as links out to documentation, specifications and related standards, databases and policies.
Source: MDAR Material (materials); PRO-MaP Table 3 Recommendation 1, Recommendation 3 and Recommendation 6 (methods); FAIR4RS F2-F4 and R1
Role: Peer reviewer
Workflow position: Peer reviewer and revisions
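Where a format specification publishes its mandatory fields, the completeness check can sometimes be mechanised. The sketch below assumes, purely for illustration, that the mandatory fields can be expressed as a JSON Schema; the schema shown is made up and does not correspond to any real domain standard.

```python
# Sketch assuming the format's mandatory fields can be written as JSON Schema;
# the schema here is made up and is not a real domain standard.
from jsonschema import ValidationError, validate  # pip install jsonschema

SCHEMA = {
    "type": "object",
    "required": ["title", "creator", "licence"],  # assumed mandatory fields
}

def completeness_status(metadata: dict) -> str:
    try:
        validate(instance=metadata, schema=SCHEMA)
        return "Pass"
    except ValidationError as err:
        return f"Fail: {err.message}"

print(completeness_status({"title": "Example dataset", "creator": "A. Author"}))
# Fail: 'licence' is a required property
```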
Example from [2]: Pass (the three files listed within the Data Availability section are all listed within the References section)
Possible values: Pass, Fail or N/A
Implementation note: If you are tagging citations with a type, the type should be as precise and accurate as possible and not a catch-all term (e.g. "other"); there is, for example, a specific reference type for "dataset".
Source: F1000
Role: Production Editor
Workflow position: Production and Typesetting
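As a final illustration, a production system might cross-check the availability statement against the reference list by simple string matching on identifiers. This is a sketch under that (naive) assumption; the DOI shown is hypothetical, and real reference handling is more involved.

```python
# Naive sketch: match identifier strings from the availability statement
# against the reference list text. The DOI shown is hypothetical.
def availability_objects_cited(availability_ids: list,
                               reference_text: str) -> dict:
    return {obj_id: obj_id in reference_text for obj_id in availability_ids}

refs = "1. Author A. Dataset X. https://doi.org/10.1234/example"
print(availability_objects_cited(["10.1234/example"], refs))
# {'10.1234/example': True}
```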