Wednesday 22nd October 2025—Crossref, the open scholarly infrastructure nonprofit, today releases an enhanced dashboard showing metadata coverage and individual organisations’ contributions to documenting the process and outputs of scientific research in the open. The tool helps research-performing, funding, and publishing organisations identify gaps in open research information, and provides supporting evidence for movements like the Barcelona Declaration for Open Research Information, which encourages more substantial commitment to stewarding and enriching the scholarly record through open metadata.
Welcome back to our series of case studies of research funders using the Grant Linking System. In this interview, I talk with Cátia Laranjeira, PTCRIS Program Manager at FCCN|FCT, Portugal’s main public funding agency, about the agency’s approach to metadata, persistent identifiers, Open Science and Open Infrastructure.
With a holistic approach to managing, producing, and providing access to information on science, FCCN|FCT’s decision to implement the Grant Linking System within its processes was not simply a technical upgrade, but a coordinated effort to continue building a strong culture of openness. With the mantra “register once, reuse always”, FCCN|FCT’s move to embrace open funding metadata was only logical.
Repositories are home to a wide range of scholarly content; they often archive theses, dissertations, preprints, datasets, and other valuable outputs. But to truly serve their purpose, repository records need to be connected to each other, to the broader research ecosystem, and to the people behind the research. Metadata is what makes that possible. Enhancing metadata is a way to tell a fuller, more accurate story of research. It helps surface relationships between works, people, funders, and institutions, and allows us as a community to build and use a more connected, more useful network of knowledge: what Crossref calls the ‘Research Nexus’.
The Crossref Grant Linking System (GLS) has been facilitating the registration, sharing, and re-use of open funding metadata for six years now, and we have reached some important milestones recently! What started as an interest in identifying funders through the Open Funder Registry evolved into a more nuanced and comprehensive way to share and re-use open funding data systematically. That’s how, in collaboration with the funding community, the Crossref Grant Linking System was developed. Open funding metadata is fundamental for the transparency and integrity of the research endeavour, so we are happy to see it included in the Research Nexus.
We’ve just released an update to our participation report, which provides a view for our members into how they are each working towards best practices in open metadata. Prompted by some of the signatories and organizers of the Barcelona Declaration, which Crossref supports, and with the help of our friends at CWTS Leiden, we have fast-tracked the work to include an updated set of metadata best practices in participation reports for our members. The reports now give a more complete picture of each member’s activity.
What do we mean by ‘participation’?
Crossref runs open infrastructure to link research objects, entities, and actions, creating a lasting and reusable scholarly record. As a not-for-profit with over 20,000 members in 160 countries, we drive metadata exchange and support nearly 2 billion monthly API queries, facilitating global research communication.
To make this system work, members strive to provide as much metadata as possible through Crossref to ensure it is openly distributed throughout the scholarly ecosystem at scale rather than bilaterally, thereby realizing the collective benefit of membership. Together, our membership provides and uses a rich nexus of information, known as the research nexus, on which the community can build tools to help progress knowledge.
Each member commits to certain terms, such as keeping metadata current, keeping the URLs their DOIs resolve to up to date, linking references and other objects, and preserving their content in perpetuity. Beyond this, we also encourage members to register as much rich metadata as is relevant and possible.
Creating and providing richer metadata is a key part of participation in Crossref; we’ve long encouraged a more complete scholarly record, such as through Metadata 20/20, and through supporting or leading initiatives for specific metadata, like open citations (I4OC), open abstracts (I4OA), open contributors (ORCID), and open affiliations (ROR).
Which metadata elements are considered best practices?
Alongside basic bibliographic metadata such as title, authors, and publication date(s), we encourage members to register metadata in the following fields:
Example participation report for Crossref member University of Szeged
References
A list of all the references used by a work. This is particularly relevant for journal articles, but references can include any type of object, including datasets, versions, preprints, and more. Additionally, we encourage members to register these as relationships, where relevant.
Abstracts
A description of the work. These are particularly useful for discovery systems that will promote the work, and are often used in downstream analyses such as for detecting integrity issues.
Contributor IDs (ORCID)
All authors should be included in a work’s metadata, ideally alongside their verified ORCID identifier.
Affiliations / Affiliation IDs (ROR)
Members can register contributor affiliations as free text, but the recommended best practice is to add ROR IDs for affiliations, as these disambiguate organisations and avoid mistyped names. These two fields were newly added to the participation reports interface in the most recent update.
Funder IDs (OFR)
Acknowledging the organisation(s) that funded the work. We encourage the inclusion of Open Funder Registry identifiers to make the funding metadata more usable. This will evolve into an additional use case for ROR over time.
Funding award numbers / Grant IDs (Crossref)
A number or identifier assigned by the funding organisation to identify the specific award of funding or other support such as use of equipment or facilities, prizes, tuition, etc. The Crossref Grant Linking System includes a unique persistent link that can be connected with outputs, activities, people, and organisations.
Crossmark
The Crossmark service gives readers quick and easy access to the current status of a record, including any corrections, retractions, or updates, via a button embedded on PDFs or a web article. Openly adding corrections, retractions, and errata is a critical part of publishing, and the button provides readers with an easy in-context alert.
Similarity Check URLs
The Similarity Check service helps editors to identify text-based plagiarism through our collective agreement giving the membership access to Turnitin’s powerful text comparison tool, iThenticate. Specific full-text links are required to participate in this service.
License URLs
URLs pointing to a license that explains the terms and conditions under which readers can access content. These links are crucial to denote intended downstream use.
Text mining URLs
Full-text URLs that help researchers in meta-science easily locate your content for text and data mining.
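To see which of these elements a given record carries, you can inspect its metadata via the Crossref REST API. The sketch below checks a work record for the elements discussed above; the field names follow the REST API’s works route, but the sample record itself is abbreviated and entirely hypothetical (the DOI, names, and identifiers are made up for illustration).

```python
import json

# Abbreviated, illustrative example of a work record in the shape returned
# by the Crossref REST API works route (https://api.crossref.org/works/{doi}).
# All values below are hypothetical.
work = json.loads("""
{
  "DOI": "10.5555/example",
  "abstract": "<jats:p>An example abstract.</jats:p>",
  "reference": [{"key": "ref1", "DOI": "10.5555/cited-work"}],
  "author": [{
    "given": "Ada", "family": "Example",
    "ORCID": "https://orcid.org/0000-0000-0000-0000",
    "affiliation": [{"name": "Example University",
                     "id": [{"id": "https://ror.org/00x0x0x00",
                             "id-type": "ROR"}]}]
  }],
  "funder": [{"name": "Example Funder",
              "DOI": "10.13039/100000001",
              "award": ["ABC-123"]}],
  "license": [{"URL": "https://creativecommons.org/licenses/by/4.0/"}],
  "link": [{"URL": "https://example.org/fulltext.pdf",
            "intended-application": "text-mining"}],
  "update-policy": "https://doi.org/10.5555/crossmark-policy"
}
""")

# Map each best-practice element to a simple presence check on the record.
checks = {
    "references": bool(work.get("reference")),
    "abstract": bool(work.get("abstract")),
    "ORCID iDs": any("ORCID" in a for a in work.get("author", [])),
    "affiliations": any(a.get("affiliation") for a in work.get("author", [])),
    "ROR IDs": any(i.get("id-type") == "ROR"
                   for a in work.get("author", [])
                   for aff in a.get("affiliation", [])
                   for i in aff.get("id", [])),
    "funder IDs": any("DOI" in f for f in work.get("funder", [])),
    "award numbers": any(f.get("award") for f in work.get("funder", [])),
    "license URLs": bool(work.get("license")),
    "text-mining URLs": any(l.get("intended-application") == "text-mining"
                            for l in work.get("link", [])),
    "Crossmark": "update-policy" in work,
}

for element, present in checks.items():
    print(f"{element}: {'present' if present else 'missing'}")
```

Running the same checks against your own records (fetched live from the API) is one quick way to spot which elements are missing before you consult your participation report.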
What is a participation report?
Participation reports are a visualization of the data representing members’ participation in the scholarly record, which is available via our open REST API. There’s a separate participation report for each member, and each report shows what percentage of that member’s metadata records include 11 key metadata elements. These key elements add context and richness, and help to open up members’ work to easier discovery and wider and more varied use. As a member, you can use participation reports to see for yourself where the gaps in your organisation’s metadata are, and perhaps compare your performance to others. Participation reports are free and open to everyone, so you can also check the report for any other members you are interested in.
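Because the underlying data is open, the figures behind a report can also be fetched programmatically. A minimal sketch, assuming the REST API’s members route (https://api.crossref.org/members/{id}), whose response includes a coverage object of ratios per metadata element; the member ID, name, and coverage figures below are made up for illustration.

```python
# Illustrative slice of a response from the Crossref REST API members route
# (https://api.crossref.org/members/{id}). The ID, name, and coverage
# figures here are hypothetical.
member = {
    "id": 99999,
    "primary-name": "Example Press",
    "coverage": {  # ratios between 0.0 and 1.0 for current content
        "references-current": 0.91,
        "abstracts-current": 0.42,
        "orcids-current": 0.65,
        "funders-current": 0.18,
        "licenses-current": 1.0,
    },
}

# Sort the coverage fields ascending, so the biggest gaps come first.
gaps = sorted(member["coverage"].items(), key=lambda kv: kv[1])

for field, ratio in gaps:
    print(f"{field:20s} {ratio:6.1%}")
```

For this made-up member, funding metadata would surface as the biggest gap, which is exactly the kind of finding the visual report is designed to make obvious at a glance.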
We first introduced participation reports in 2018. At the time, Anna Tolwinska and Kirsty Meddings wrote:
Metadata is at the heart of all our services. With a growing range of members participating in our community—often compiling or depositing metadata on behalf of each other—the need to educate and express obligations and best practice has increased. In addition, we’ve seen more and more researchers and tools making use of our APIs to harvest, analyze and re-purpose the metadata our members register, so we’ve been very aware of the need to be more explicit about what this metadata enables, why, how, and for whom.
All of that still rings true today. But as the research nexus continues to evolve, so should the tools that intend to reflect it. For example, in 2022, we removed the Open references field from participation reports after a board vote changed our policy and updated the membership terms so that all references deposited with Crossref are open by default. And now we’ve expanded the list of fields again, adding coverage data for contributor affiliation text and ROR identifiers.
Putting it in practice
To find out how you measure up when it comes to participation, type the name of your member organisation into the search box. You may be surprised by what you find—we often speak to members who thought they were registering a certain type of metadata for all their records, only to learn from their participation report that something is getting lost along the way.
You can only address gaps in your metadata if you know that they exist.
More information, including a breakdown of the 11 key metadata elements now listed in every participation report and tips on improving your scores, is available in our documentation.
And if you have any questions or feedback, come talk to us on the community forum or request a metadata Health Check by emailing the community team.