Lots of exciting innovations are being made in scientific publishing, often raising fundamental questions about established publishing practices. In this guest post, Ludo Waltman and André Brasil discuss the recently launched MetaROR publish-review-curate platform and the questions it raises about good practices for Crossref DOI registration in this emerging landscape.
Crossref and the Public Knowledge Project (PKP) have been working closely together for many years, sharing resources and supporting our overlapping communities of organisations involved in communicating research. Now we’re delighted to share that we have agreed on a new set of objectives for our partnership, centred on further development of the tools that our shared community relies upon, as well as building capacity to enable richer metadata registration for organisations using Open Journal Systems (OJS).
To mark Crossref’s 25th anniversary, we launched our first Metadata Awards to highlight members with the best metadata practices.
GigaScience Press, based in Hong Kong, was the leader among small publishers, defined as organisations with less than USD 1 million in publishing revenue or expenses. We spoke with Scott Edmunds, Ph.D., Editor-in-Chief at GigaScience Press, about how discoverability drives their high metadata standards.
What motivates your organisation/team to work towards high-quality metadata? What objectives does it support for your organisation?
Our objective is to communicate science openly and collaboratively, without barriers, to solve problems in a data- and evidence-driven manner through Open Science publishing. High-quality metadata helps us address these objectives by improving the discoverability, transparency, and provenance of the work we publish. It is an integral part of the FAIR principles and the UNESCO Recommendation on Open Science, playing a role in increasing the accessibility of research for both humans and machines. As one of the authors of the FAIR principles paper and an advisor to the Make Data Count project, I’ve also personally been very careful to practice what I preach.
On behalf of the Nominating Committee, I’m pleased to share the slate of candidates for the 2025 board election.
Each year we do an open call for board interest. This year, the Nominating Committee received 51 submissions from members worldwide to fill five open board seats.
We have four large member seats and one small member seat open for election in 2025. We maintain a balanced board of 8 large member seats and 8 small member seats. Size is determined by the organization’s membership tier (small members fall in the $0-$1,650 tiers and large members in the $3,900-$50,000 tiers).
Members can participate in Cited-by by completing the following steps:
Deposit references for one or more prefixes as part of your content registration process. Use your Participation Report to see your progress with depositing references. This step is not mandatory, but highly recommended to ensure that your citation counts are complete.
We will match the metadata in the references to DOIs to establish Cited-by links in the database. As new content is registered, we automatically update the citations and, for those members with Cited-by alerts enabled, we notify you of the new links.
Display the links on your website. We recommend displaying the citations you retrieve on your DOI landing pages (see the retrieval sketch after this list).
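As an illustration of the retrieval step, here is a minimal sketch in Python that queries the Cited-by XML interface for the works citing a single DOI. The credentials and DOI are placeholders, and the endpoint parameters and element names are recalled from the Cited-by documentation rather than taken from this page, so confirm them against the current documentation before relying on them.

```python
"""Minimal sketch: fetch Cited-by links for one DOI via the XML query interface.

Everything specific here is an assumption for illustration: the credentials and
DOI are placeholders, and the endpoint/element names are written from memory of
the Cited-by documentation, so check the current docs before relying on them.
"""
import xml.etree.ElementTree as ET

import requests

ENDPOINT = "https://doi.crossref.org/servlet/getForwardLinks"
USERNAME = "your-crossref-username"  # placeholder credential
PASSWORD = "your-crossref-password"  # placeholder credential


def citing_dois(doi: str) -> list[str]:
    """Return DOIs of works that cite `doi`, according to Cited-by."""
    response = requests.get(
        ENDPOINT,
        params={"usr": USERNAME, "pwd": PASSWORD, "doi": doi},
        timeout=30,
    )
    response.raise_for_status()
    root = ET.fromstring(response.content)
    # Each <forward_link> element describes one citing work; collect the <doi>
    # elements nested inside it, ignoring XML namespaces for simplicity.
    found = []
    for element in root.iter():
        if element.tag.endswith("forward_link"):
            found.extend(
                child.text.strip()
                for child in element.iter()
                if child.tag.endswith("doi") and child.text
            )
    return found


if __name__ == "__main__":
    for citing_doi in citing_dois("10.5555/example-doi"):
        print(citing_doi)
```

The bibliographic metadata returned alongside each link (journal title, article title, year, and so on) can then be rendered in a “cited by” list on the landing page.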
If you are a member through a Sponsor, you may have access to Cited-by through your sponsor – please contact them for more details. OJS users can use PKP’s Cited-by (OJS Scopus/Crossref) plugin: as of OJS 3.2, this third-party plugin allows journals to display citations and citation counts (using article DOIs) from Scopus and/or Crossref.
Citation matching
Members sometimes submit references without including a DOI tag for the cited work. When this happens, we look for a match based on the metadata provided. If we find one, the reference metadata is updated with the DOI and we add the "doi-asserted-by": "crossref" tag. If we don’t find a match immediately, we will try again at a later date.
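One way to see the outcome of this matching is through the public REST API: once a DOI is attached to a reference, the reference record carries a "doi-asserted-by" field. The sketch below assumes a placeholder DOI and that references have been deposited and made openly available; it simply prints that field for each reference of a work.

```python
"""Minimal sketch: see which deposited references have been matched to DOIs.

Uses the public REST API; the DOI below is a placeholder, and the "reference"
field only appears when references have been deposited and are openly available.
"""
import requests


def reference_matching_summary(doi: str) -> None:
    work = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30).json()["message"]
    for ref in work.get("reference", []):
        if "DOI" in ref:
            # "doi-asserted-by" is "publisher" when the DOI came with the deposit
            # and "crossref" when it was added by Crossref's matching.
            print(ref.get("key"), ref["DOI"], ref.get("doi-asserted-by"))
        else:
            print(ref.get("key"), "-> no DOI matched (yet)")


reference_matching_summary("10.5555/example-doi")
```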
There are some references for which we won’t find matches, for example where the DOI has been registered with an agency other than Crossref (such as DataCite), or where the reference refers to an object without a DOI, including conferences, manuals, blog posts, and articles in some journals.
To perform matching, we first check whether a DOI tag is included in the reference metadata. If so, we assume it is correct and link the corresponding work. If there is no DOI tag, we search using the metadata supplied and select candidate results whose relevance scores exceed a threshold; the best match is then confirmed through a further validation step. Learn more about how we match references. The same process is used for the results shown in our Simple Text Query tool.
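For intuition only, here is a toy version of the metadata-search step, built on the public REST API’s bibliographic query rather than Crossref’s internal matching pipeline. The score threshold, contact address, and sample reference string are assumptions made for this sketch, and the real process adds the validation step described above.

```python
"""Toy illustration of metadata-based matching; not Crossref's internal pipeline.

The score threshold, contact address, and sample reference string are all
assumptions made for this sketch.
"""
import requests

SCORE_THRESHOLD = 60.0  # arbitrary cut-off chosen for illustration only


def guess_doi(reference: str) -> str | None:
    """Search the REST API with an unstructured reference and keep a confident match."""
    response = requests.get(
        "https://api.crossref.org/works",
        params={
            "query.bibliographic": reference,
            "rows": 5,
            "mailto": "metadata@example.org",  # polite-pool contact, placeholder
        },
        timeout=30,
    )
    response.raise_for_status()
    candidates = response.json()["message"]["items"]
    if not candidates:
        return None
    best = candidates[0]  # items are ordered by relevance score
    return best["DOI"] if best.get("score", 0) >= SCORE_THRESHOLD else None


print(guess_doi("Smith J (2019) An example article. Journal of Examples 12(3):45-67."))
```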
All citations to a work are returned in the corresponding Cited-by query.
Page maintainer: Isaac Farley. Last updated: 2023-April-28