Checklist Elements
Guidance around each Checklist element, including instructions, examples, roles, workflow positions, and implementation notes.
What follows is a description of, and guidance for, each Checklist element. It can be used on its own to assess a manuscript’s compliance, or in conjunction with the other components of the Handbook.
Each Checklist element may Pass, Fail or be N/A. A manuscript is presumed to be compliant with the Handbook if all Checklist elements either pass or are not applicable. If any element fails, your journal should take appropriate corrective action according to its own internal protocols: for example, contacting the authors separately upon each failure, or collating all failures and passing them to the authors for correction in a single communication.
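To make that decision rule concrete, here is a minimal Python sketch, assuming checklist outcomes are recorded as simple status strings; the element names and the collation helper are illustrative, not part of the Handbook.

```python
# Minimal sketch of the compliance rule above: a manuscript is presumed
# compliant when every Checklist element either passes or is N/A.
def is_compliant(statuses: dict[str, str]) -> bool:
    """Return True when no Checklist element has failed."""
    return all(status in {"Pass", "N/A"} for status in statuses.values())

def failures(statuses: dict[str, str]) -> list[str]:
    """Collate failed elements, e.g. for a single communication to authors."""
    return [element for element, status in statuses.items()
            if status == "Fail"]

checks = {"1 - availability statements present": "Pass",
          "7 - reviewer access": "N/A",
          "13 - citation in References": "Fail"}
print(is_compliant(checks))  # False
print(failures(checks))      # ['13 - citation in References']
```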
Manuscript-level checks
1 - Are the availability statements for relevant digital objects present?
Example from [1]: Pass (This manuscript complies with this element by having a data availability statement; no other statements are required by this publisher.)
Possible status values: Pass or Fail
Implementation note: n/a; simple presence/absence check.
Source (see Included Sources): F1000
Role: Administrator
Workflow position: Initial QC checks
2 - Are all digital objects and their contents clearly and correctly represented within the appropriate availability statement(s)?
Example from [1]: Pass (This manuscript has just one availability statement (for data) and one digital object within it (a container for a few data files and a reporting guideline document). This manuscript passes by having this digital object listed in the appropriate availability statement, and by having the digital object and its constituent files correctly labelled and formatted. There are no other digital objects or availability statements to review.)
Possible status values: Pass or Fail
Implementation note: Although this is a simple presence/absence check, please note that the way 'clearly and correctly' is defined is highly dependent upon your particular journal's guidance. For example, some journals allow accession numbers (e.g. P12345 from UniProt) without any accompanying resolvable portion, while others would require DOIs or other types of persistent, globally-unique, resolvable identifiers.
Source (see Included Sources): F1000 (data); PRO-MaP Table 3 Recommendation 1.5 (materials, equipment); MDAR Material (materials); TOP1:Citation; STORMS 8.1-8.5, 16, 17
Role: Administrator
Workflow position: Initial QC checks
3 - How many digital objects are present across all availability statements?
Example from [1]: one (there is only a single digital object used as a project container for all data files relevant to this manuscript)
Possible status values: a numeric value
Implementation note: n/a; simple numeric count (a sketch covering elements 1-3 follows below).
Source (see Included Sources): Pilot member request
Role: Administrator
Workflow position: Initial QC checks
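As an illustration of elements 1-3, the following sketch assumes the availability statement text has already been extracted from the manuscript and counts DOI-like identifiers as a proxy for digital objects; the heading detection and the DOI pattern are simplifications, not publisher rules.

```python
import re

# Heuristic DOI pattern; real identifiers may also be accession numbers etc.
DOI_PATTERN = re.compile(r"10\.\d{4,9}/[^\s\"<>]+")

def has_availability_statement(manuscript_text: str) -> bool:
    """Element 1: a simple presence check for an availability heading."""
    return re.search(r"(data|code|software)\s+availability",
                     manuscript_text, re.IGNORECASE) is not None

def count_digital_objects(statement_text: str) -> int:
    """Element 3: count unique DOI-like identifiers in the statement(s)."""
    return len(set(DOI_PATTERN.findall(statement_text)))

statement = ("Data availability: underlying data are available from OSF at "
             "https://doi.org/10.17605/OSF.IO/T765V")
print(has_availability_statement(statement))  # True
print(count_digital_objects(statement))       # 1
```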
Digital Object-level checks
4 - Is the identifier provided for this digital object valid and recognised?
Example from [1]: Pass (I click on https://doi.org/10.17605/OSF.IO/T765V and it takes me to the correct OSF page containing the documents listed within the availability statement; the identifier is a DOI, which is recognised by my journal)
Possible status values: Pass or Fail
Implementation note: While checking that the identifier resolves is straightforward (a sketch follows below), the test of the identifier’s suitability is more involved. Usually journals require that identifiers for digital objects be both globally unique and persistent (e.g. DOI), though individual requirements may vary. Additionally, some identifiers (such as database accession numbers) only become unique when combined with a (sometimes non-persistent) URL prefix. It may be that the digital object’s identifier type is clear, allowing you to assess its validity very quickly. If there is any confusion with regards to the identifier type, it may help you to review the identifier type’s record in FAIRsharing. Search for the identifier name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s information to help you determine if the identifier meets your journal’s requirements.
Source (see Included Sources): F1000 (data, software); MDAR Analysis.Data Availability, Analysis.Code Availability, Materials (data, materials, software); TOP2:Transparency; TOP1:Citation; ARRIVE 20; GigaScience; Nature; PRO-MaP Table 3 Recommendation 1.5 (materials); FAIR4RS F1
Role: Editorial Office
Workflow position: Initial QC checks
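A minimal resolution check for element 4 might look like the sketch below; it assumes the requests library and only confirms that the identifier resolves via the public doi.org resolver, leaving the suitability judgements (persistence, global uniqueness) to the human steps described in the implementation note.

```python
import requests

def doi_resolves(doi: str, timeout: float = 10.0) -> bool:
    """Return True if the DOI resolves through the doi.org proxy.

    Some landing pages reject HEAD requests, so fall back to GET on 405.
    """
    url = f"https://doi.org/{doi}"
    response = requests.head(url, allow_redirects=True, timeout=timeout)
    if response.status_code == 405:  # method not allowed: retry with GET
        response = requests.get(url, allow_redirects=True, timeout=timeout)
    return response.ok

print(doi_resolves("10.17605/OSF.IO/T765V"))  # True while the record is live
```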
5 - Is the licence for the digital object allowed by your journal?
Example from [1]: Pass (I click on https://doi.org/10.17605/OSF.IO/T765V and it takes me to the correct OSF page, which shows a CC0 licence)
Possible status values: Pass or Fail
Implementation note: It may be that the digital object’s licence is clearly stated when you follow its link, allowing you to assess its validity very quickly (a hedged sketch of a programmatic lookup follows below). If there is any confusion with regards to the licensing, it may help you to read this repository’s record in FAIRsharing. Search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s licence information (see ‘Licencing for Databases and Standards’) to see if the available licences within that repository are ones that your journal allows. You may wish to combine this information with a review of the information on the digital object’s page within the repository.
Source (see Included Sources): F1000 (data); FAIR4RS R1.1
Role: Editorial Office
Workflow position: Initial QC checks
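Where the digital object’s DOI is registered with DataCite (as OSF DOIs are), licence metadata can often be retrieved programmatically. The sketch below assumes the DataCite REST API’s rightsList field as we understand it, and an illustrative (not prescriptive) allow-list; licence metadata varies by repository, so treat an empty result as ‘check manually’ rather than a failure.

```python
import requests

ALLOWED_LICENCES = {"cc0-1.0", "cc-by-4.0"}  # example journal policy only

def licence_identifiers(doi: str) -> set[str]:
    """Fetch SPDX-style rights identifiers from the DataCite record, if any."""
    record = requests.get(f"https://api.datacite.org/dois/{doi}",
                          timeout=10).json()
    rights = record.get("data", {}).get("attributes", {}).get("rightsList") or []
    return {entry.get("rightsIdentifier", "").lower()
            for entry in rights if entry.get("rightsIdentifier")}

found = licence_identifiers("10.17605/OSF.IO/T765V")
print(found, bool(found & ALLOWED_LICENCES))  # empty set means check manually
```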
6 - Is the digital object openly available? If not, are there clearly-stated and valid ethical or data protection reasons for access to be controlled?
Example from [1]: Pass (I click on https://doi.org/10.17605/OSF.IO/T765V and it takes me to the correct OSF page, where I am able to download all files without restriction)
Possible status values: Pass or Fail
Implementation note: Follow the digital object’s link to its repository record. Can you download and view it? If not, is there a clear description of why access is controlled? It may be that the availability of the digital object is clearly stated when you follow its link, allowing you to assess this element very quickly (a heuristic sketch follows below). If there is any confusion around the digital object’s openness, it may help you to read this repository’s record in FAIRsharing.
You may already have the repository’s FAIRsharing record open from checking earlier elements; if not, simply search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s data availability (see ‘Conditions for Data Access’) to determine if this element passes or fails.
Source (see Included Sources): F1000 (data); TOP2:Transparency
Role: Editorial Office
Workflow position: Initial QC checks
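A rough programmatic pre-check for element 6 is sketched below: objects behind access control typically answer with 401/403 (or a login redirect) rather than the content. This is a heuristic only; you should still download and inspect the files as described above.

```python
import requests

def access_status(url: str) -> str:
    """Heuristically classify access from the HTTP status code."""
    response = requests.get(url, allow_redirects=True, timeout=10, stream=True)
    response.close()  # we only need the status, not the payload
    if response.status_code == 200:
        return "appears open: download and inspect to confirm"
    if response.status_code in (401, 403):
        return "appears controlled: look for stated ethical/data-protection reasons"
    return f"unclear (HTTP {response.status_code}): check manually"

print(access_status("https://doi.org/10.17605/OSF.IO/T765V"))
```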
7 - If access is controlled, is the digital object available to peer reviewers?
Example from [1]: N/A (Element 6 passed because the digital object is openly available; this element is therefore not applicable.)
Possible status values: Pass, Fail or N/A
Implementation note: If the digital object is not open, you will now need to determine if there is access for peer review of this manuscript. Visit the digital object’s repository record and follow any provided instructions for accessing the digital object. Are you able to view the digital object? It may be that you can view the digital object as a result of following the instructions, allowing you to assess this element very quickly. If there is any issue with determining the status of this element, it may help you to read this repository’s record in FAIRsharing.
You may already have the repository’s FAIRsharing record open from checking earlier elements; if not, simply search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s data availability (see ‘Accessibility for pre-publication review’) to determine if this element passes or fails.
Source (see Included Sources): F1000 (data); TOP2:Transparency; Pilot member request
Role: Editorial Office
Workflow position: Initial QC checks
8 - Has the digital object been deposited in an appropriate repository recognised by your journal?
Example from [1]: Pass (I click on https://doi.org/10.17605/OSF.IO/T765V and it takes me to the correct OSF page. I either follow the Implementation note below or draw on my own understanding of OSF to determine that this is a generalist and NOT a domain-specific repository. I also know that F1000 does not require domain-specific repositories for this type of digital object, so this element passes.)
Possible status values: Pass or Fail
Implementation note: Follow the digital object’s link to its repository record. Check if the repository is ‘appropriate’ according to the guidance table below and your journal’s requirements. In this table, we have provided type-specific guidance for a number of common digital object types; match the digital object being assessed with its type in the table below and follow its guidance. Note that there is also a row for “other” types; if you have a digital object that does not fit any of the types listed, please follow the advice in that row. It may be that the decision on the ‘appropriateness’ of a repository is clear, allowing you to assess this element very quickly. If there is any confusion around the type of repository, it may help you to read its record in FAIRsharing.
You may already have the repository’s FAIRsharing record open from checking earlier elements; if not, simply search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s subject area(s) (see ‘Finding Repositories by Subject’) and the type of data it accepts; the element passes if the record’s subject area(s) match the subject area/domain of the digital object you are assessing.
Dataset
Guidance: Domain-specific repositories can make a dataset more FAIR, and many journals either prefer or require deposition in a domain-specific repository where an appropriate one exists; please refer to your journal’s specific requirements for the dataset you are assessing. Your journal may have a list of recommended resources; alternatively, you may be asked to assess this element through your own expertise or using other journal requirements. If the dataset is not in a domain-specific repository, it may instead have been deposited in a generalist repository; please check your particular journal requirements, as some journals may not allow the use of generalist repositories for the type of dataset you are assessing.
Examples: VizieR (https://doi.org/10.25504/FAIRsharing.hLKD2V) for astronomy; PANGAEA (https://doi.org/10.25504/FAIRsharing.6yw6cp) for Earth and environmental sciences; TROLLing for linguistics; ENA (https://doi.org/10.25504/FAIRsharing.dj8nt8) for genomics.
Software
Guidance: Software repositories have specialist capabilities such as versioning, forking, code checks and more. Information on versioning is available in FAIRsharing repository records; see ‘Versioning within Databases’.
Examples: Software Heritage (https://doi.org/10.25504/FAIRsharing.6ffb92); GitHub (https://doi.org/10.25504/FAIRsharing.c55d5e).
Workflow
Guidance: Workflow metadata is highly specialised.
Examples: WorkflowHub (https://doi.org/10.25504/FAIRsharing.07cf72).
Protocols and methods
Guidance: Linking out to protocols provides valuable methodological detail for all original research articles, especially methods papers. Please refer to your journal’s specific requirements for protocols and methods, as relevant. Where your journal requires protocol availability statements, it may also ask for information on protocol versioning and forking, and on the presence or absence of a long-term preservation strategy; information on preservation is available in FAIRsharing repository records (see ‘Data Preservation Policies’). Additionally, there are often checks that the version of the protocol/method stored within the repository is plausible and the same as that actually used in the research (no outcome switching); information on versioning is available in FAIRsharing repository records (see ‘Versioning within Databases’).
Examples: protocols.io (https://doi.org/10.25504/FAIRsharing.132b10).
Novel standards
Guidance: Register novel terminologies, models/formats, reporting guidelines and identifier schemata at FAIRsharing.
Examples: FAIRsharing (https://doi.org/10.25504/FAIRsharing.2abjs5).
Newly-developed databases
Guidance: Register newly-developed repositories and knowledgebases at FAIRsharing.
Examples: FAIRsharing (https://doi.org/10.25504/FAIRsharing.2abjs5).
Other
Guidance: Generalist repositories may be used in this case, as may other specialist repositories not included in this table.
Examples: indicative generalist repositories are listed at https://fairsharing.org/3541.
Once you have determined that the repository is suitable for your type of digital object, you next need to discover if it is one that your journal recognises. It may be that you have an internal set of guidance around this that allows you to assess this element very quickly (a hypothetical sketch of such a check follows below). If there is any uncertainty, it may help you to review the repositories listed within your journal or publisher’s FAIRsharing policy record by searching using your journal or publisher name. Once you have found the policy record, you can review the list of related databases (see ‘How journals and publishers list recognised standards and databases’) and determine if the repository containing the digital object is also recognised by your journal. Remember that not all journals provide a list of recommended repositories, and even where lists are present they may be indicative rather than exhaustive. Instead, your journal may provide a list of approved characteristics/attributes of repositories, in which case you may wish to confirm those characteristics against the metadata in the FAIRsharing record of the repository that holds the digital object.
Source (see Included Sources): F1000 (data, software); MDAR (Design.Laboratory Protocol); TOP2:Transparency; ARRIVE 19, 20; Nature; PRO-MaP Table 3 Recommendation 2.1; GigaScience; REAPPRAISED A - Analysis and Methods
Role: Editor (internal or academic, as appropriate for your journal)
Workflow position: Editor Assessment
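Where your journal does maintain a list of recognised repositories, part of element 8 can be pre-screened mechanically. The sketch below is hypothetical: it resolves the DOI and matches the landing-page host against an invented allow-list; a real implementation would use your journal’s own list, or the repositories linked from your policy’s FAIRsharing record.

```python
from urllib.parse import urlparse

import requests

# Invented, illustrative allow-list; substitute your journal's own.
RECOGNISED_HOSTS = {
    "osf.io": "OSF (generalist)",
    "zenodo.org": "Zenodo (generalist)",
    "www.ebi.ac.uk": "EMBL-EBI resources, e.g. ENA (domain-specific)",
}

def recognised_repository(doi: str) -> str | None:
    """Resolve a DOI and look up the landing host in the allow-list."""
    landing = requests.head(f"https://doi.org/{doi}",
                            allow_redirects=True, timeout=10)
    return RECOGNISED_HOSTS.get(urlparse(landing.url).hostname or "")

print(recognised_repository("10.17605/OSF.IO/T765V"))  # "OSF (generalist)"
```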
9 - Has the digital object been anonymised if necessary?
Example from [1]: N/A (the digital object does not require anonymisation)
Possible status values: Pass, Fail or N/A
Implementation note: none.
Source (see Included Sources): F1000 (data)
Role: Editor (internal or academic, as appropriate for your journal)
Workflow position: Editor Assessment
10 - Where applicable, is there evidence that the research has been approved by a specific, recognised committee?
Example from [1]: N/A (the digital object does not require approval by a recognised committee)
Possible status values: Pass, Fail or N/A
Implementation note: none.
Source (see Included Sources): REAPPRAISED E - Ethics; MDAR (Design.Ethics)
Role: Editor (internal or academic, as appropriate for your journal)
Workflow position: Editor Assessment
11 - Where applicable, has an appropriate domain-specific metadata format been used?
Example from [1]: N/A (the digital object has no appropriate domain-specific metadata format, and one is not mandated for this data type by this journal)
Possible status values: Pass, Fail or N/A
Implementation note: Follow the digital object’s link to its repository record. Can you determine the format used (or formats, if the digital object is a container)? It may be that this information is clearly stated when you follow its link, allowing you to assess this element very quickly. If there is any confusion around the format used within this digital object, it may help you to read this repository’s record in FAIRsharing.
You may already have the repository’s FAIRsharing record open from checking earlier elements; if not, simply search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s associated models/formats (see ‘Retrieving Model/Formats Linked to Databases’) to determine if this element passes or fails. If the repository provides its output in a given format, you may be able to download the digital object in that format.
Source (see Included Sources): F1000 (data); PRO-MaP Table 3 Recommendation 1.4 (methods); MDAR Reporting (materials); NIH Standards (all); FAIR4RS R3
Role: Peer reviewer
Workflow position: Peer reviewer and revisions
12 - Is the accompanying metadata complete according to format requirements or community best practices?
Example from [1]: N/A (because Element 11 was also N/A)
Possible status values: Pass, Fail or N/A
Implementation note: Use the documentation provided by the format specification and format website to determine the completeness of the metadata for your digital object (a minimal sketch of such a check follows below). To find these documents, you may wish to use the format’s record in FAIRsharing (see ‘Searching FAIRsharing’), if present. Format records in FAIRsharing will provide general information regarding that standard as well as links out to documentation, specifications and related standards, databases and policies.
Source (see Included Sources): MDAR Material (materials); PRO-MaP Table 3 Recommendation 1, Recommendation 3 and Recommendation 6 (methods); FAIR4RS F2-F4 and R1
Role: Peer reviewer
Workflow position: Peer reviewer and revisions
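Completeness checks like element 12 ultimately depend on the format’s own specification, but once the required fields are known they can be verified mechanically. In this sketch both the required-field set and the metadata record are invented for illustration.

```python
# Invented required-field set; a real check would take this from the
# format's specification (found, for example, via its FAIRsharing record).
REQUIRED_FIELDS = {"title", "creator", "date", "licence", "description"}

def missing_fields(metadata: dict) -> set[str]:
    """Report required fields that are absent or empty in the metadata."""
    return {field for field in REQUIRED_FIELDS if not metadata.get(field)}

record = {"title": "Survey data", "creator": "A. Author", "date": "2024-01-05"}
print(missing_fields(record))  # {'description', 'licence'} (order may vary)
```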
13 - Does the digital object have a citation in the article's 'References' section?
Example from [2]: Pass (The three files listed within the Data Availability section are all cited within the References section.)
Possible status values: Pass, Fail or N/A
Implementation note: If you are tagging citations with a type, the type should be as precise and accurate as possible and not a catch-all term (e.g. “other”). There is a specific reference type for “dataset” in the JATS structure (a sketch of a check using it follows below).
Source (see Included Sources): F1000
Role: Production Editor
Workflow position: Production and Typesetting
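As a production-side illustration of element 13, the sketch below scans a JATS XML file for references tagged as data citations. It assumes JATS4R-style publication-type="data" tagging on <element-citation> elements; adjust to your house tagging rules (e.g. <mixed-citation>, or a different attribute value) as needed.

```python
import xml.etree.ElementTree as ET

def data_citation_count(jats_path: str) -> int:
    """Count <element-citation> entries tagged as data citations."""
    tree = ET.parse(jats_path)
    return sum(1 for cite in tree.iter("element-citation")
               if cite.get("publication-type") == "data")

# Usage sketch: compare the result against the number of digital objects
# found in the availability statements (element 3).
# print(data_citation_count("article.xml"))
```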