
Checklist Elements

Guidance around each Checklist element, including instructions, examples, roles, workflow positions, and implementation notes.


What follows is a description of each Checklist element, together with guidance on its use. The Checklist can be used on its own to assess the compliance of a manuscript, or in conjunction with the other components.

Each Checklist element may Pass, Fail or be N/A (not applicable). A manuscript is presumed to be compliant with the Handbook if all Checklist elements either pass or are not applicable. If any element fails, your journal should take appropriate action; what that action is depends on your journal's internal protocols and guidelines. For example, you might contact the authors separately for each failure, or collate all failures and pass them to the authors for correction in a single communication.
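The collation approach described above can be sketched as a short script. This is an illustration only, not part of the Handbook; the `ElementResult` structure and the example statuses are hypothetical.

```python
# Illustrative sketch of collating Checklist results for a single
# author communication; structure and element notes are hypothetical.
from dataclasses import dataclass

@dataclass
class ElementResult:
    number: int     # Checklist element number (1-13)
    status: str     # "Pass", "Fail" or "N/A"
    note: str = ""  # free-text note for the authors

def collate_failures(results):
    """Return the failed elements so they can be sent to the
    authors in a single communication, per journal preference."""
    return [r for r in results if r.status == "Fail"]

def is_compliant(results):
    """A manuscript is presumed compliant if every element
    either passes or is not applicable."""
    return all(r.status in ("Pass", "N/A") for r in results)

results = [
    ElementResult(1, "Pass"),
    ElementResult(4, "Fail", "DOI does not resolve"),
    ElementResult(9, "N/A"),
]
print(is_compliant(results))          # False: element 4 failed
print(len(collate_failures(results))) # 1
```

Whether failures are sent one by one or batched like this is a journal-level decision; the script only shows that either protocol works from the same per-element results.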

Manuscript-level checks

1 - Are the availability statements for relevant digital objects present?

Instructions: Availability statements (e.g. Data Availability Statement, Software Availability Statement) should be present for all types of digital object applicable to this manuscript type and publisher, even if only to state that the resource type in question is not applicable (e.g. opinion or letter article type). Note that this is a check to see if the appropriate availability statements exist; it is not a statement about the quality or completeness of those statements. If all required availability statements are present, then this element passes. If any required availability statement is missing, this element fails.
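As a rough illustration (not part of the Handbook), this presence/absence check could be automated. The required statement names below are assumptions; substitute the statements your journal and manuscript type actually require.

```python
# Hypothetical sketch of the presence/absence check for availability
# statements; the required statement names are journal-specific.
import re

REQUIRED_STATEMENTS = [      # assumed requirements, for illustration only
    "Data Availability",
    "Software Availability",
]

def missing_statements(manuscript_text, required=REQUIRED_STATEMENTS):
    """Return required availability statement headings that are absent.
    An empty list means this Checklist element passes."""
    return [name for name in required
            if not re.search(re.escape(name), manuscript_text, re.IGNORECASE)]

text = "Data Availability\nAll data are available from OSF."
print(missing_statements(text))   # ['Software Availability']
```

Note this only checks that the statements exist, mirroring the element above; it says nothing about their quality or completeness (that is element 2).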

Example from [1]: Pass (this manuscript complies with this element by having a data availability statement; no other statements are required by this publisher)

Possible values: Pass or Fail
Implementation note: n/a; simple presence/absence check

Source (see Included Sources): F1000
Role: Administrator
Workflow position: Initial QC checks

2 - Are all digital objects and their contents clearly and correctly represented within the appropriate availability statement(s)?

Instructions: This check is about the structure and completeness of the availability statement. All digital objects and their constituent files (if the digital object acts as a container) present in the manuscript text must be listed and correctly named within the availability statement appropriate for its type (e.g. data, software, protocol/material). This includes both novel and third-party data. Any formatting requirements for your availability statements from your publisher must be correctly implemented. If a digital object is present in the text but missing from the availability statements, or if it is included in the wrong availability statement, or if it is incorrectly labelled or formatted, then this element fails.

Example from [1]: Pass (this manuscript has just one availability statement (for data) and one digital object within it (a container for a few data files and a reporting guideline document). This manuscript passes by having this digital object listed in the appropriate availability statement, and by having the digital object and its constituent files correctly labelled and formatted. There are no other digital objects or availability statements to review.)

Possible values: Pass or Fail
Implementation note: Although this is a simple presence/absence check, please note that the way 'clearly and correctly' is defined is highly dependent upon your particular journal's guidance. For example, some journals allow accession numbers (e.g. P12345 from UniProt) without any accompanying resolvable portion, while others require DOIs or other types of persistent, globally-unique, resolvable identifiers.

Source (see Included Sources): F1000 (data); PRO-MaP Table 3 Recommendation 1.5 (materials, equipment); MDAR Material (materials); TOP1:Citation; STORMS 8.1-8.5, 16, 17
Role: Administrator
Workflow position: Initial QC checks

3 - How many digital objects are present across all availability statements?

Instructions: This element ensures completeness of the checks by ensuring that there is a Digital Objects Elements tab for each digital object within the current availability statement. This includes both novel and third-party data. Please enter the number of digital objects that are being checked in this availability statement in the Status column. Then create that many copies of this tab, labelling each after its owning availability statement and name (e.g. DAS - OSF OSF.IO/T765V).

Digital Object-level checks

Instructions (element 5, licence): If a licence has been applied to the digital object, and it is one that your journal recognises, then this element passes. If there is no licence, or it is not one appropriate for your journal, this element fails. You may need to review the list of licences used by the repository where the digital object has been deposited. Sometimes a single licence is applied to all of the repository's content (in which case it may not be listed within the digital object's metadata), or the authors may have selected a licence specific to their digital object.
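A minimal sketch of this licence check, assuming a hypothetical allowlist; the licences below are examples only, and should be replaced with those your journal actually accepts.

```python
# Sketch of the licence check; the allowlist is an assumption --
# substitute the licences your journal recognises.
ALLOWED_LICENCES = {"CC0-1.0", "CC-BY-4.0", "MIT", "Apache-2.0"}

def licence_check(licence):
    """Return 'Pass' if a licence is present and allowed, else 'Fail'."""
    if licence is None:            # no licence applied at all
        return "Fail"
    return "Pass" if licence in ALLOWED_LICENCES else "Fail"

print(licence_check("CC0-1.0"))    # Pass
print(licence_check(None))         # Fail
```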

Instructions (element 6, open availability): If the digital object is openly available to view and retrieve, then this element passes. If the digital object is not open, this element still passes if the reasons provided by the authors meet your journal's guidelines. In all other cases, this element fails.

Instructions (element 8, appropriate repository): An 'appropriate' repository means different things for different types of digital object. These instructions take you through type-specific, domain-specific and generalist repositories to help you determine whether the digital object has been deposited in an appropriate repository. We strongly encourage you to review the extra information provided within the implementation note for this element. Datasets, software, workflows, protocols/methods, novel standards, newly-developed databases and materials all require different types of repository. This element passes if the digital object is stored within an appropriate repository as outlined in these instructions and the linked implementation guidance; otherwise, the element fails.

Guidance by type of digital object

Dataset

Domain-specific repositories can make the dataset more FAIR, and many journals either prefer or require deposition in domain-specific repositories where appropriate repositories exist; please refer to your journal’s specific requirements for the dataset you are assessing. Your journal may have a list of recommended resources; alternatively, you may be asked to assess this element either through your own expertise or using other journal requirements. If the dataset is not in a domain-specific repository, it may have instead been deposited within a generalist repository. Please check your particular journal requirements as some journals may not allow the use of generalist repositories for the type of dataset you are assessing. Now continue reading below this table to help you determine if the repository is recognised by your journal.

Software

Workflow

Workflow metadata is highly specialised. Now continue reading below this table to help you determine if the repository is recognised by your journal.

Protocols and methods

Novel standards

Register novel terminologies, model/formats, reporting guidelines and identifier schemata at FAIRsharing. Now continue reading below this table to help you determine if the repository is recognised by your journal.

Newly-developed databases

Register newly-developed repositories and knowledgebases at FAIRsharing.

Now continue reading below this table to help you determine if the repository is recognised by your journal.

Other

Generalist repositories may be used in this case, or other specialist repositories that are not included in this table.

Now continue reading below this table to help you determine if the repository is recognised by your journal.

Instructions (element 9, anonymisation): If the digital object has been appropriately anonymised, then this element passes. If it does not require anonymisation, then set the status to ‘N/A’ (not applicable). If the digital object has not been de-identified when it should be, this element fails.

10 - Where applicable, is there evidence that the research has been approved by a specific, recognised committee?

11 - Where applicable, has an appropriate domain-specific metadata format been used?

12 - Is the accompanying metadata complete according to format requirements or community best practices?

13 - Does the digital object have a citation in the article's 'References' section?

Instructions (element 13, citation): Machine-readable links between the published article and each of the digital objects must be included in the article metadata to be compliant with this check. The most common way of introducing and curating these links is via citations. A check near the end of the submission workflow ensures compliance for all digital objects, especially if new objects have been made available and checked as part of implementing this Handbook. If the digital object has been properly cited according to your journal’s guidelines, then this element passes. If the digital object has not been properly cited, the element fails.
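The cross-check described above, that every digital object in the availability statements also appears in the References section, can be sketched as follows. This is an illustration only; the identifier and reference text are hypothetical.

```python
# Sketch of cross-checking that every digital object identifier in the
# availability statements also appears in the References section.
def uncited_objects(object_identifiers, references_text):
    """Return identifiers with no match in the References section.
    An empty list means this Checklist element passes."""
    return [ident for ident in object_identifiers
            if ident.lower() not in references_text.lower()]

refs = ("1. Smith J (2024). Survey dataset. OSF. "
        "https://doi.org/10.17605/OSF.IO/T765V")
print(uncited_objects(["https://doi.org/10.17605/OSF.IO/T765V"], refs))  # []
```

A plain substring match like this only approximates the machine-readable linking the element asks for; real production systems would compare against the tagged citation metadata rather than the rendered reference text.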

Example from [1] (element 3): one (there is only a single digital object, used as a project container for all data files relevant to this manuscript)

Possible values: numeric
Implementation note: n/a; simple numeric value

Source (see Included Sources): pilot member request
Role: Administrator
Workflow position: Initial QC checks

4 - Is the identifier provided for this digital object valid and recognised?

Instructions: Check that this digital object's identifier (e.g. DOI, ARK, URL) works, that it resolves to the correct object, and that it is of a type recognised by your journal. If it resolves, next check that the identifier is in the correct format (e.g. correctly as https://doi.org/10.25504/FAIRsharing.2hqa97 vs incorrectly as https://fairsharing.org/FAIRsharing.2hqa97). This element passes if the identifier is of an appropriate type and resolves to the correct object; if the identifier is not of an appropriate type or it fails to resolve correctly, this element fails.

Example from [1]: Pass (I click on https://doi.org/10.17605/OSF.IO/T765V and it takes me to the correct OSF page containing the documents listed within the availability statement; the identifier is a DOI, which is recognised by my journal)

Possible values: Pass or Fail
Implementation note: While checking that the identifier resolves is straightforward, testing the identifier’s suitability is more involved. Journals usually require that identifiers for digital objects be both globally unique and persistent (e.g. DOIs), though individual requirements may vary. Additionally, some identifiers (such as database accession numbers) only become unique when combined with a (sometimes non-persistent) URL prefix. It may be that the digital object’s identifier type is clear, allowing you to assess its validity very quickly. If there is any confusion with regards to the identifier type, it may help you to review the identifier type’s record in FAIRsharing. Search for the identifier name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s information to help you determine if the identifier meets your journal’s requirements.

Source (see Included Sources): F1000 (data, software); MDAR Analysis.Data Availability, Analysis.Code Availability, Materials (data, materials, software); TOP2:Transparency; TOP1:Citation; ARRIVE 20; GigaScience; Nature; PRO-MaP Table 3 Recommendation 1.5 (materials); FAIR4RS F1
Role: Editorial Office
Workflow position: Initial QC checks
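The format portion of element 4 (is a DOI presented in the canonical, resolvable https://doi.org/ form?) can be sketched with a simple pattern check. The pattern below follows the common display recommendation for DOIs but is indicative, not a complete DOI grammar; actual resolution would still need an HTTP request.

```python
# Sketch of a DOI format check: is the identifier presented in the
# canonical, resolvable https://doi.org/ form? Indicative pattern only.
import re

DOI_PATTERN = re.compile(r"^https://doi\.org/10\.\d{4,9}/\S+$")

def canonical_doi(identifier):
    """True when the identifier is already in https://doi.org/ form."""
    return bool(DOI_PATTERN.match(identifier))

print(canonical_doi("https://doi.org/10.25504/FAIRsharing.2hqa97"))  # True
print(canonical_doi("https://fairsharing.org/FAIRsharing.2hqa97"))   # False
```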

5 - Is the licence for the digital object allowed by your journal?

Example from [1]: Pass (I click on https://doi.org/10.17605/OSF.IO/T765V and it takes me to the correct OSF page, which shows a CC0 licence)

Possible values: Pass or Fail
Implementation note: It may be that the digital object’s licence is clearly stated when you follow its link, allowing you to assess its validity very quickly. If there is any confusion with regards to the licensing, it may help you to read this repository’s record in FAIRsharing. Search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s licence information (see ‘Licencing for Databases and Standards’) to see if the licences available within that repository are ones that your journal allows. You may wish to combine this information with a review of the information on the digital object’s page within the repository.

Source (see Included Sources): F1000 (data); FAIR4RS R1.1
Role: Editorial Office
Workflow position: Initial QC checks

6 - Is the digital object openly available? If not, are there clearly-stated and valid ethical or data protection reasons for access to be controlled?

Example from [1]: Pass (I click on https://doi.org/10.17605/OSF.IO/T765V and it takes me to the correct OSF page, where I am able to download all files without restriction)

Possible values: Pass or Fail
Implementation note: Follow the digital object’s link to its repository record. Can you download and view it? If not, is there a clear description of why access is controlled? It may be that the availability of the digital object is clearly stated when you follow its link, allowing you to assess this element very quickly. If there is any confusion around the digital object’s openness, it may help you to read this repository’s record in FAIRsharing.

You may already have the repository’s FAIRsharing record open from checking earlier elements; if not, simply search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s data availability conditions (see ‘Conditions for Data Access’) to determine if this element passes or fails.

Source (see Included Sources): F1000 (data); TOP2:Transparency
Role: Editorial Office
Workflow position: Initial QC checks

7 - If access is controlled, is the digital object available to peer reviewers?

Instructions: This check is not applicable ("N/A") if the digital object is openly available (and therefore has a Pass status for element 6). If the digital object is not open, but does provide access to reviewers of the manuscript, then this element passes. Otherwise, this element fails.

Example from [1]: N/A (element 6 passed because the digital object is openly available, so this element is automatically not applicable)

Possible values: Pass, Fail or N/A
Implementation note: If the digital object is not open, you will need to determine whether there is access for peer review of this manuscript. Visit the digital object’s repository record and follow any provided instructions for accessing the digital object. Are you able to view the digital object? It may be that you can view the digital object as a result of following the instructions, allowing you to assess this element very quickly. If there is any issue with determining the status of this element, it may help you to read this repository’s record in FAIRsharing.

You may already have the repository’s FAIRsharing record open from checking earlier elements; if not, simply search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s accessibility information (see ‘Accessibility for pre-publication review’) to determine if this element passes or fails.

Source (see Included Sources): F1000 (data); TOP2:Transparency; pilot member request
Role: Editorial Office
Workflow position: Initial QC checks

8 - Has the digital object been deposited in an appropriate repository recognised by your journal?

Example from [1]: Pass (I click on https://doi.org/10.17605/OSF.IO/T765V and it takes me to the correct OSF page. I either follow the implementation note below or my own understanding of OSF to note that this is a generalist and NOT a domain-specific repository. I also know that domain-specific repositories are not required by F1000 for this type of digital object, so this check passes)

Possible values: Pass or Fail
Implementation note: Follow the digital object’s link to its repository record. Check whether the repository is ‘appropriate’ according to the guidance table for this element and your journal’s requirements: match the digital object being assessed with its type in the table and follow the guidance for that row. Note that there is also a row for “other” types; if you have a digital object that does not fit any of the listed types, please consider that advice. It may be that the decision on the ‘appropriateness’ of a repository is clear, allowing you to assess this element very quickly. If there is any confusion around the type of repository, it may help you to read its record in FAIRsharing.

You may already have the repository’s FAIRsharing record open from checking earlier elements; if not, simply search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s subject area(s) (see ‘Finding Repositories by Subject’) and the types of data it accepts; this element passes if they match the subject area/domain of the digital object you are assessing, and fails otherwise.

Exemplar repositories and additional type-specific guidance:

Dataset: VizieR for astronomy; PANGAEA for Earth and Environmental Sciences; TROLLing for Linguistics; ENA for Genomics.

Software: software repositories have specialist capabilities such as versioning, forking, code checks and more; information regarding versioning is available in FAIRsharing repository records (see ‘Versioning within Databases’). Exemplars: Software Heritage, GitHub.

Workflow: WorkflowHub.

Protocols and methods: linking out to protocols provides valuable methodological detail for all original research articles, especially methods papers. Please refer to your journal’s specific requirements for protocols and methods, as relevant. Where your journal requires protocol availability statements, it may also ask for information on protocol versioning, forking and the presence or absence of a long-term preservation strategy (see ‘Versioning within Databases’ and ‘Data Preservation Policies’ in FAIRsharing repository records). There are often also checks that the version of the protocol/method stored within the repository is plausible and the same as that actually used in the research (no outcome switching). Exemplar: Protocols.io.

Novel standards: FAIRsharing.

Newly-developed databases: FAIRsharing.

Other: indicative generalist repositories are available at https://fairsharing.org/3541.

Once you have determined that the repository is suitable for your type of digital object, you next need to discover if it is one that your journal recognises. It may be that you have internal guidance around this that allows you to assess this element very quickly. If there is any uncertainty, it may help you to review the repositories listed within your journal or publisher’s FAIRsharing policy record by searching FAIRsharing using your journal or publisher name. Once you have found the policy record, you can review the list of related databases (see ‘How journals and publishers list recognised standards and databases’) and determine whether the repository containing the digital object is recognised by your journal. Remember that not all journals provide a list of recommended repositories, and even where lists are present they may be indicative rather than exhaustive. Your journal may instead provide a list of approved characteristics/attributes of repositories, in which case you may wish to confirm those characteristics against the metadata within the FAIRsharing record for the repository in which the digital object is stored.

Source (see Included Sources): F1000 (data, software); MDAR (Design.Laboratory Protocol); TOP2:Transparency; ARRIVE 19, 20; Nature; PRO-MaP Table 3 Recommendation 2.1; GigaScience; REAPPRAISED A - Analysis and Methods
Role: Editor (internal or academic, as appropriate for your journal)
Workflow position: Editor Assessment

9 - Has the digital object been anonymised if necessary?

Example from [1]: N/A (the digital object does not require anonymisation)

Possible values: Pass, Fail or N/A
Implementation note: none.

Source (see Included Sources): F1000 (data)
Role: Editor (internal or academic, as appropriate for your journal)
Workflow position: Editor Assessment

Instructions (element 10, committee approval): If evidence is provided that the originating research for the digital object has been approved by a specific, recognised committee (e.g. an ethics committee), then this element passes. Evidence may be given via references to approval documents; for material digital objects, an example is the listing of permits and/or reference numbers from an IRB or equivalent committee to show approval for any ethics requirements. If the digital object does not require such approval, then set the status to ‘N/A’ (not applicable). If the research that created the digital object should have been approved by an appropriate committee but there is no evidence of this, the element fails.

Example from [1]: N/A (the digital object does not require approval by a recognised committee)

Possible values: Pass, Fail or N/A
Implementation note: none.

Source (see Included Sources): REAPPRAISED E - Ethics; MDAR (Design.Ethics)
Role: Editor (internal or academic, as appropriate for your journal)
Workflow position: Editor Assessment

Instructions (element 11, metadata format): This element is specialised; while it is important to provide metadata in domain-specific formats, the stringency of this Checklist element depends on a number of factors, such as whether or not the journal requires domain-specific repositories, and whether or not an appropriate format exists. If the digital object applies an appropriate domain-specific metadata format, this element passes. If no such format exists, or if such a format is neither applied nor mandated, the element is not applicable (‘N/A’ status). Finally, if the format is mandated but is not used, the element fails.

Example from [1]: N/A (the digital object does not have an appropriate domain-specific metadata format, and one is not mandated for this data type by this journal)

Possible values: Pass, Fail or N/A
Implementation note: Follow the digital object’s link to its repository record. Can you determine the format used (or formats, if the digital object is a container)? It may be that this information is clearly stated when you follow its link, allowing you to assess this element very quickly. If there is any confusion around the format used within this digital object, it may help you to read this repository’s record in FAIRsharing.

You may already have the repository’s FAIRsharing record open from checking earlier elements; if not, simply search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s associated models/formats (see ‘Retrieving Model/Formats Linked to Databases’) to determine if this element passes or fails. If a format is provided as part of the output of a repository, then you may be able to download the digital object in that format.

Source (see Included Sources): F1000 (data); PRO-MaP Table 3 Recommendation 1.4 (methods); MDAR Reporting (materials); NIH Standards (all); FAIR4RS R3
Role: Peer reviewer
Workflow position: Peer reviewer and revisions

Instructions (element 12, metadata completeness): This element is specialised; while it is important to provide complete metadata in domain-specific formats, the relevance of this element depends on a number of factors, such as whether or not the journal requires domain-specific repositories, and whether or not an appropriate format exists. Please set this element to not applicable (‘N/A’ status) if element 11 (‘Where applicable, has an appropriate domain-specific metadata format been used?’) is either N/A or Fail. However, if your journal mandates the use of a domain-specific format for this digital object, then a Pass or Fail is determined by whether or not all mandatory fields within the format have been provided.

Example from [1]: N/A (because element 11 was also N/A)

Possible values: Pass, Fail or N/A
Implementation note: Use the documentation provided by the format specification and format website to determine the completeness of the metadata for your digital object. To find these documents, you may wish to use the format’s record in FAIRsharing (see ‘Searching FAIRsharing’), if present. Format records in FAIRsharing provide general information regarding that standard as well as links out to documentation, specifications and related standards, databases and policies.

Source (see Included Sources): MDAR Material (materials); PRO-MaP Table 3 Recommendation 1, Recommendation 3 and Recommendation 6 (methods); FAIR4RS F2-F4 and R1
Role: Peer reviewer
Workflow position: Peer reviewer and revisions

Example from [2]: This element passes because the three files listed within the Data Availability section are all listed within the References section.

Possible values: Pass, Fail or N/A
Implementation note: If you are tagging citations with a type, the type should be as precise and accurate as possible and not a catch-all term (e.g. “other”). There is a specific reference type for “dataset” in the JATS structure.

Source (see Included Sources): F1000
Role: Production Editor
Workflow position: Production and Typesetting
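As an illustration of typed data citations, JATS4R guidance recommends tagging a data citation with publication-type="data" on the element-citation. The snippet below is a minimal, hypothetical sketch of checking for that tag; the reference content is invented.

```python
# Minimal sketch of checking that a reference is tagged as a data
# citation in JATS; the snippet and its contents are illustrative.
# JATS4R recommends publication-type="data" on <element-citation>.
import xml.etree.ElementTree as ET

jats_ref = """
<ref id="ref1">
  <element-citation publication-type="data">
    <source>OSF</source>
    <pub-id pub-id-type="doi">10.17605/OSF.IO/T765V</pub-id>
  </element-citation>
</ref>
"""

def is_data_citation(ref_xml):
    """True when the reference's element-citation is typed as data."""
    ref = ET.fromstring(ref_xml)
    cite = ref.find("element-citation")
    return cite is not None and cite.get("publication-type") == "data"

print(is_data_citation(jats_ref))  # True
```

Using the precise "data" type rather than a catch-all such as "other" is exactly what the implementation note above asks production staff to verify.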
