Checklist Elements

Guidance around each Checklist element, including instructions, examples, roles, workflow positions, and implementation notes.

What follows is a description of, and guidance for, each Checklist element. This guidance can be used on its own to assess the compliance of a manuscript, or in conjunction with the other components.

Each Checklist element may Pass, Fail or be N/A (not applicable). A manuscript is presumed to be compliant with the Handbook if all Checklist elements either pass or are not applicable. If any element fails, your journal should take corrective action according to its own internal protocols; for example, you might contact the authors separately upon each failure, or collate all failures and pass them on to the authors for correction in a single communication.

Manuscript-level checks

1 - Are the availability statements for relevant digital objects present?

Instructions: Availability statements (e.g. Data Availability Statement, Software Availability Statement) should be present for all types of digital object applicable to this manuscript type and publisher, even if only to state that the resource type in question is not applicable (e.g. opinion or letter article type). Note that this is a check to see if the appropriate availability statements exist; it is not a statement about the quality or completeness of those statements. If all required availability statements are present, then this element passes. If any required availability statement is missing, this element fails.

Example: Pass (This manuscript complies with this element by having a data availability statement; no other statements are required by this publisher.)

Possible status values: Pass or Fail
Implementation note: n/a; simple presence/absence check

Source (see Included Sources): F1000
Role: Administrator
Workflow position: Initial QC checks
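If your journal processes many manuscripts, the presence/absence part of this check can be scripted. The sketch below is illustrative only: the statement headings and the file name are assumptions, and should be replaced with your journal's actual requirements.

```python
import re

# Illustrative headings only; substitute the availability statements your
# journal actually requires for this manuscript type.
REQUIRED_STATEMENTS = ["Data availability", "Software availability"]

def check_statements_present(manuscript_text: str) -> dict:
    """Return Pass/Fail for each required availability statement heading."""
    results = {}
    for heading in REQUIRED_STATEMENTS:
        found = re.search(re.escape(heading), manuscript_text, re.IGNORECASE)
        results[heading] = "Pass" if found else "Fail"
    return results

# 'manuscript.txt' is a hypothetical plain-text export of the manuscript.
with open("manuscript.txt") as fh:
    for heading, status in check_statements_present(fh.read()).items():
        print(f"{heading}: {status}")
```

A heading match only shows the statement exists; it says nothing about the statement's quality, which is the subject of the next element.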

2 - Are all digital objects and their contents clearly and correctly represented within the appropriate availability statement(s)?

Instructions: This check is about the structure and completeness of the availability statement. All digital objects, and their constituent files (if the digital object acts as a container), present in the manuscript text must be listed and correctly named within the availability statement appropriate for their type (e.g. data, software, protocol/material). This includes both novel and third-party data. Any formatting requirements your publisher places on availability statements must be correctly implemented. If a digital object is present in the text but missing from the availability statements, if it is included in the wrong availability statement, or if it is incorrectly labelled or formatted, then this element fails.

Example: Pass (This manuscript has just one availability statement (for data) and one digital object within it (a container for a few data files and a reporting guideline document). This manuscript passes by having this digital object listed in the appropriate availability statement, and by having the digital object and its constituent files correctly labelled and formatted. There are no other digital objects or availability statements to review.)

Possible status values: Pass or Fail
Implementation note: Although this is a simple presence/absence check, please note that the way 'clearly and correctly' is defined is highly dependent upon your particular journal's guidance. For example, some journals allow accession numbers (e.g. P12345 from UniProt) without any accompanying resolvable portion, while others would require DOIs or other types of persistent, globally-unique, resolvable identifiers.

Source (see Included Sources): F1000 (data); PRO-MaP Table 3 Recommendation 1.5 (materials, equipment); MDAR Material (materials); TOP1:Citation; STORMS 8.1-8.5, 16, 17
Role: Administrator
Workflow position: Initial QC checks
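The object-matching portion of this check lends itself to partial automation when digital objects are cited by DOI. Below is a minimal sketch, assuming the manuscript body and the availability statement are available as plain text; the regular expression is a widely used Crossref-recommended pattern and catches most, though not all, DOIs.

```python
import re

# Widely used Crossref-recommended DOI pattern; catches most modern DOIs.
DOI_PATTERN = re.compile(r"10\.\d{4,9}/[-._;()/:A-Za-z0-9]+")

def dois_missing_from_statement(body_text: str, statement_text: str) -> set:
    """DOIs cited in the manuscript body but absent from the statement."""
    in_body = set(DOI_PATTERN.findall(body_text))
    in_statement = set(DOI_PATTERN.findall(statement_text))
    return in_body - in_statement
```

Any DOIs the function returns are candidates for a Fail and should be confirmed by hand; identifiers that are not DOIs (e.g. accession numbers) still require manual review.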

3 - How many digital objects are present across all availability statements?

Instructions: This element ensures the completeness of the checks by confirming that there is a Digital Objects Elements tab for each digital object within the current availability statement. This includes both novel and third-party data. Please enter the number of digital objects that are being checked in this availability statement in the Status column. Then create that many copies of this tab, labelling each after its owning availability statement and name (e.g. DAS - OSF OSF.IO/T765V).

Example: one (there is only a single digital object used as a project container for all data files relevant to this manuscript)

Possible status values: a numeric count
Implementation note: n/a; simple numeric value.

Source (see Included Sources): Pilot member request
Role: Administrator
Workflow position: Initial QC checks

Digital Object-level checks

4 - Is the identifier provided for this digital object valid and recognised?

Instructions: Check that this digital object's identifier (e.g. DOI, ARK, URL) works, that it resolves to the correct object, and that it is of a type recognised by your journal. If it resolves, next check that the identifier is in the correct format (e.g. correctly as https://doi.org/10.25504/FAIRsharing.2hqa97 vs incorrectly as https://fairsharing.org/FAIRsharing.2hqa97). This element passes if the identifier is of an appropriate type and resolves to the correct object; if the identifier is not of an appropriate type or it fails to resolve correctly, this element fails.

Example: Pass (I click on https://doi.org/10.17605/OSF.IO/T765V and it takes me to the correct OSF page containing the documents listed within the availability statement; the identifier is a DOI, which is recognised by my journal)

Possible status values: Pass or Fail
Implementation note: While checking the identifier resolves is straightforward, the test of the identifier’s suitability is more involved. Usually journals require that identifiers for digital objects be both globally unique and persistent (e.g. DOI), though individual requirements may vary. Additionally, some identifiers (such as database accession numbers) only become unique upon combining with a (sometimes non-persistent) URL prefix. It may be that the digital object’s identifier type is clear, allowing you to assess its validity very quickly. If there is any confusion with regards to the identifier type, it may help you to review the identifier type’s record in FAIRsharing. Search for the identifier name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s information to help you determine if the identifier meets your journal’s requirements.

Source (see Included Sources): F1000 (data, software); MDAR Analysis.Data Availability, Analysis.Code Availability, Materials (data, materials, software); TOP2:Transparency, TOP1:Citation; ARRIVE 20; GigaScience; Nature; PRO-MaP Table 3 Recommendation 1.5 (materials); FAIR4RS F1
Role: Editorial Office
Workflow position: Initial QC checks
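The resolution part of this check can be automated with a simple HTTP request against the doi.org proxy. A minimal sketch, assuming the identifier is a DOI; note that successful resolution does not confirm that the landing page describes the correct object, which remains a human judgement.

```python
import requests  # third-party; install with 'pip install requests'

def doi_resolves(doi: str, timeout: float = 10.0) -> bool:
    """Return True if the DOI resolves via the doi.org proxy."""
    url = f"https://doi.org/{doi}"
    try:
        resp = requests.head(url, allow_redirects=True, timeout=timeout)
        if resp.status_code == 405:  # some landing pages reject HEAD requests
            resp = requests.get(url, allow_redirects=True, timeout=timeout,
                                stream=True)
        return resp.ok
    except requests.RequestException:
        return False

print(doi_resolves("10.17605/OSF.IO/T765V"))  # the DOI from the example above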

5 - Is the licence for the digital object allowed by your journal?

Instructions: If a licence has been applied to the digital object, and it is one that your journal recognises, then this element passes. If there is no licence, or it is not one appropriate for your journal, this element fails. You may need to review the list of licences used by the repository where the digital object has been deposited. Sometimes a single licence is applied to all of the repository's content (in which case it may not be listed within the digital object's metadata), or the authors may have selected a licence specific to their digital object.

Example: Pass (I click on https://doi.org/10.17605/OSF.IO/T765V and it takes me to the correct OSF page, which shows a CC0 licence)

Possible status values: Pass or Fail
Implementation note: It may be that the digital object’s licence is clearly stated when you follow its link, allowing you to assess its validity very quickly. If there is any confusion with regards to the licensing, it may help you to read this repository’s record in FAIRsharing. Search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s licence information (see ‘Licencing for Databases and Standards’) to see if the available licences within that repository are ones that your journal allows. You may wish to combine this information with a review of the information on the digital object’s page within the repository.

Source (see Included Sources): F1000 (data), FAIR4RS R1.1
Role: Editorial Office
Workflow position: Initial QC checks
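Where the repository offers an API, the licence lookup can sometimes be scripted. The sketch below targets the OSF API v2, matching the OSF example above; the response structure shown is an assumption and should be verified against the current OSF API documentation, and other repositories will differ.

```python
import requests

def osf_licence_name(node_id: str):
    """Return the licence name recorded for an OSF node, or None if absent."""
    url = f"https://api.osf.io/v2/nodes/{node_id}/"
    resp = requests.get(url, params={"embed": "license"}, timeout=10)
    resp.raise_for_status()
    # Assumed response shape: the embedded licence sits under
    # data -> embeds -> license -> data -> attributes -> name.
    embedded = resp.json()["data"].get("embeds", {}).get("license", {})
    return embedded.get("data", {}).get("attributes", {}).get("name")

print(osf_licence_name("t765v"))  # e.g. 'CC0 1.0 Universal'
```

A None result does not necessarily mean there is no licence: as noted above, some repositories apply a single licence to all of their content rather than recording one per object.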

6 - Is the digital object openly available? If not, are there clearly-stated and valid ethical or data protection reasons for access to be controlled?

Instructions: If the digital object is openly available to view and retrieve, then this element passes. If the digital object is not open, this element still passes if the reasons provided by the authors meet your journal's guidelines. In all other cases, this element fails.

Example: Pass (I click on https://doi.org/10.17605/OSF.IO/T765V and it takes me to the correct OSF page, where I am able to download all files without restriction)

Possible status values: Pass or Fail
Implementation note: Follow the digital object’s link to its repository record. Can you download and view it? If not, is there a clear description of why access is controlled? It may be that the availability of the digital object is clearly stated when you follow its link, allowing you to assess this element very quickly. If there is any confusion around the digital object’s openness, it may help you to read this repository’s record in FAIRsharing.

You may already have the repository’s FAIRsharing record open from checking earlier elements; if not, simply search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s data availability (see ‘Conditions for Data Access’) to determine if this element passes or fails.

Source (see Included Sources): F1000 (data); TOP2:Transparency
Role: Editorial Office
Workflow position: Initial QC checks

7 - If access is controlled, is the digital object available to peer reviewers?

Instructions: This check is not applicable ("N/A") if the digital object is openly available (and therefore has a Pass status for element 6 - Is the digital object openly available? If not, are there clearly-stated and valid ethical or data protection reasons for access to be controlled?). If the digital object is not open, but does provide access to reviewers of the manuscript, then this element passes. Otherwise, this element fails.

Example: N/A (Element 6 passed because the digital object is openly available, therefore this element is not applicable)

Possible status values: Pass, Fail or N/A
Implementation note: If the digital object is not open, you will now need to determine if there is access for peer review of this manuscript. Visit the digital object’s repository record and follow any provided instructions for accessing the digital object. Are you able to view the digital object? It may be that you can view the digital object as a result of following the instructions, allowing you to assess this element very quickly. If there is any issue with determining the status of this element, it may help you to read this repository’s record in FAIRsharing.

You may already have the repository’s FAIRsharing record open from checking earlier elements; if not, simply search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s data availability (see ‘Accessibility for pre-publication review') to determine if this element passes or fails.

Source (see Included Sources): F1000 (data); TOP2:Transparency; pilot member request
Role: Editorial Office
Workflow position: Initial QC checks

8 - Has the digital object been deposited in an appropriate repository recognised by your journal?

Instructions: An ‘appropriate’ repository means different things for different types of digital objects. These instructions take you through type-specific, domain-specific and generalist repositories to help you determine whether the digital object has been deposited in an appropriate repository. We strongly encourage you to review the extra information provided within the implementation note (below) for this element. Datasets, software, workflows, protocols/methods, novel standards, newly-developed databases and materials all require different types of repositories. This element passes if the digital object is stored within an appropriate repository as outlined in these instructions and the linked implementation guidance; otherwise, the element fails, meaning that an appropriate repository has not been used for this digital object.

Example: Pass (I click on https://doi.org/10.17605/OSF.IO/T765V and it takes me to the correct OSF page. I either follow the Implementation note below or my own understanding of OSF to note that this is a generalist and NOT a domain-specific repository. I also know that domain-specific repositories are not required by F1000 for this type of digital object, so this check passes)

Possible status values: Pass or Fail
Implementation note: Follow the digital object’s link to its repository record. Check if the repository is ‘appropriate’ according to the guidance table below and your journal’s requirements. In this table, we have provided type-specific guidance for a number of common digital object types; match the digital object being assessed with its type in the table below and follow its guidance. Note that there is also a row for “other” types; if you have a digital object that does not fit with any of the types listed in the table, please consider this advice. It may be that the decision on the ‘appropriateness’ of a repository is clear, allowing you to assess this element very quickly. If there is any confusion around the type of repository, it may help you to read its record in FAIRsharing.

You may already have the repository’s FAIRsharing record open from checking earlier elements; if not, simply search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s subject area(s) (see ‘Finding Repositories by Subject’) and type of data it accepts to determine if this element passes or fails by establishing whether or not the record’s subject area(s) match the subject area / domain of the digital object you are assessing.

Guidance by type of digital object:

Dataset: Domain-specific repositories can make the dataset more FAIR, and many journals either prefer or require deposition in domain-specific repositories where appropriate repositories exist; please refer to your journal’s specific requirements for the dataset you are assessing. Your journal may have a list of recommended resources; alternatively, you may be asked to assess this element either through your own expertise or using other journal requirements. If the dataset is not in a domain-specific repository, it may have instead been deposited within a generalist repository. Please check your particular journal requirements, as some journals may not allow the use of generalist repositories for the type of dataset you are assessing. Now continue reading below this table to help you determine if the repository is recognised by your journal.

Software

Workflow: Workflow metadata is highly specialised. Now continue reading below this table to help you determine if the repository is recognised by your journal.

Protocols and methods

Novel standards: Register novel terminologies, model/formats, reporting guidelines and identifier schemata at FAIRsharing. Now continue reading below this table to help you determine if the repository is recognised by your journal.

Newly-developed databases: Register newly-developed repositories and knowledgebases at FAIRsharing. Now continue reading below this table to help you determine if the repository is recognised by your journal.

Other: Generalist repositories may be used in this case, or other specialist repositories that are not included in this table. Now continue reading below this table to help you determine if the repository is recognised by your journal.

Once you have determined that the repository is suitable for your type of digital object, you next need to discover if it is one that your journal recognises. It may be that you have an internal set of guidance around this that allows you to assess this element very quickly. If there is any uncertainty, it may help you to review those repositories listed within your journal or publisher’s FAIRsharing policy record by searching using your journal or publisher name. Once you have found the policy record, you can review the list of related databases (see ‘How journals and publishers list recognised standards and databases’) and determine if the repository containing the digital object is also recognised by your journal. Remember that not all journals provide a list of recommended repositories, and even where lists are present they may be indicative rather than exhaustive. Instead, your journal may provide a list of approved characteristics/attributes of repositories, in which case you may wish to confirm those characteristics with the metadata within the FAIRsharing record for the repository the digital object is stored within.

Source (see Included Sources): F1000 (data, software); MDAR (Design.Laboratory Protocol); TOP2:Transparency; ARRIVE 19, 20; Nature; PRO-MaP Table 3 Recommendation 2.1; GigaScience; REAPPRAISED A - Analysis and Methods
Role: Editor (internal or academic, as appropriate for your journal)
Workflow position: Editor Assessment
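Teams that build internal QC tooling may find it useful to encode the guidance table above as data. A minimal sketch follows; the type labels and condensed guidance strings are illustrative paraphrases, not normative wording.

```python
# Condensed, illustrative paraphrases of the guidance table above;
# replace with your journal's own wording and requirements.
REPOSITORY_GUIDANCE = {
    "dataset": ("Prefer a domain-specific repository; a generalist repository "
                "may be acceptable, subject to journal policy."),
    "workflow": "Workflow metadata is highly specialised; check journal policy.",
    "novel standard": "Register at FAIRsharing.",
    "newly-developed database": "Register at FAIRsharing.",
    "other": ("Generalist repositories may be used, or other specialist "
              "repositories not included in the table."),
}

def guidance_for(digital_object_type: str) -> str:
    """Look up type-specific guidance, falling back to the 'other' row."""
    key = digital_object_type.strip().lower()
    return REPOSITORY_GUIDANCE.get(key, REPOSITORY_GUIDANCE["other"])

print(guidance_for("Dataset"))
```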

9 - Has the digital object been anonymised if necessary?

Instructions: If the digital object has been appropriately anonymised, then this element passes. If it does not require anonymisation, then set the status to ‘N/A’ (not applicable). If the digital object has not been de-identified when it should be, this element fails.

Example: N/A (the digital object does not require anonymisation)

Possible status values: Pass, Fail or N/A
Implementation note: none.

Source (see Included Sources): F1000 (data)
Role: Editor (internal or academic, as appropriate for your journal)
Workflow position: Editor Assessment
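Automated screening cannot prove that a digital object is anonymised, but it can flag obvious direct identifiers for human review. A rough sketch, assuming a plain-text export of the data; the patterns are illustrative, and any hit (or absence of hits) still requires a human decision.

```python
import re

# Rough screens for obvious direct identifiers. A clean result does NOT
# demonstrate anonymisation; treat this only as a prompt for human review.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone-like": re.compile(r"\+?\d[\d\s().-]{8,}\d"),
}

def screen_for_identifiers(text: str) -> dict:
    """Return potential direct identifiers found in a text export."""
    return {label: pattern.findall(text)
            for label, pattern in PII_PATTERNS.items()
            if pattern.search(text)}
```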

10 - Where applicable, is there evidence that the research has been approved by a specific, recognised committee?

Instructions: If evidence is provided that the originating research for the digital object has been approved by a specific, recognised committee (e.g. an ethics committee), then this element passes. Evidence may be given via references to approval documents; an example, for digital objects representing materials, is the listing of permits and/or reference number(s) from an IRB or equivalent committee(s) to show approval for any ethics requirements. If the research does not require such approval, then set the status to ‘N/A’ (not applicable). If the research that created the digital object should have been approved by an appropriate committee but there is no evidence of this, the element fails.

Example: N/A (the digital object does not require approval by a recognised committee)

Possible status values: Pass, Fail or N/A
Implementation note: none.

Source (see Included Sources): REAPPRAISED E - Ethics; MDAR (Design.Ethics)
Role: Editor (internal or academic, as appropriate for your journal)
Workflow position: Editor Assessment

11 - Where applicable, has an appropriate domain-specific metadata format been used?

Instructions: This element is specialised; while it is important to provide metadata in domain-specific formats, the stringency of this element depends on a number of factors, such as whether or not the journal requires domain-specific repositories, and whether or not an appropriate format exists. If the digital object applies an appropriate domain-specific metadata format, this element passes. If no such format exists, or if such a format is neither applied nor mandated, the element is not applicable (‘N/A’ status). Finally, if the format is mandated but is not used, the element fails.

Example: N/A (the digital object does not have an appropriate domain-specific metadata format, and it is not mandated for this data type and this journal)

Possible status values: Pass, Fail or N/A
Implementation note: Follow the digital object’s link to its repository record. Can you determine the format used (or formats, if the digital object is a container)? It may be that this information is clearly stated when you follow its link, allowing you to assess this element very quickly. If there is any confusion around the format used within this digital object, it may help you to read this repository’s record in FAIRsharing.

You may already have the repository’s FAIRsharing record open from checking earlier elements; if not, simply search for the repository’s name in FAIRsharing (see ‘Searching FAIRsharing’) and open the FAIRsharing record (if present). Review the record’s associated models/format (see ‘Retrieving Model/Formats Linked to Databases’) to determine if this element passes or fails. If a format is provided as part of the output of a repository, then you may be able to download that digital object in that format.

Source (see Included Sources): F1000 (data), PRO-MaP Table 3 Recommendation 1.4 (methods), MDAR Reporting (materials), NIH Standards (all), FAIR4RS R3
Role: Peer reviewer
Workflow position: Peer reviewer and revisions

12 - Is the accompanying metadata complete according to format requirements or community best practices?

Instructions: This element is specialised; while it is important to provide complete metadata in domain-specific formats, the relevance of this element depends on a number of factors, such as whether or not the journal requires domain-specific repositories, and whether or not an appropriate format exists. Please set this element to not applicable (‘N/A’ status) if ‘11 - Where applicable, has an appropriate domain-specific metadata format been used?’ is either N/A or Fail. However, if your journal mandates the use of a domain-specific format for this digital object, then a Pass or Fail is determined by whether or not all mandatory fields within the format have been provided.

Example: N/A (because Element 11 was also N/A)

Possible status values: Pass, Fail or N/A
Implementation note: Use the documentation provided by the format specification and format website to determine the completeness of the metadata for your digital object. To find these documents, you may wish to use the format’s record in FAIRsharing (see ‘Searching FAIRsharing’), if present. Format records in FAIRsharing will provide general information regarding that standard as well as links out to documentation, specifications and related standards, databases and policies.

Source (see Included Sources): MDAR Material (materials); PRO-MaP Table 3 Recommendation 1, Recommendation 3 and Recommendation 6 (methods), FAIR4RS F2-F4 and R1
Role: Peer reviewer
Workflow position: Peer reviewer and revisions
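Where a format's mandatory fields are known, completeness can be checked mechanically. A minimal sketch against a hypothetical field list; real formats usually publish their own schemas (e.g. JSON Schema or XSD), which should be preferred when available.

```python
# Hypothetical mandatory fields; substitute the list mandated by the
# actual format specification for this digital object.
REQUIRED_FIELDS = ["title", "creator", "date", "identifier", "licence"]

def missing_mandatory_fields(metadata: dict) -> list:
    """Return mandatory fields that are absent or empty in the metadata."""
    return [field for field in REQUIRED_FIELDS
            if field not in metadata or metadata[field] in (None, "", [])]

record = {"title": "Example dataset", "creator": "A. Author"}
print(missing_mandatory_fields(record))  # ['date', 'identifier', 'licence']
```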
