The recent Metric Tide report proposed the notion of responsible metrics as a way of framing appropriate uses of quantitative indicators in the governance, management and assessment of research. One of the crucial dimensions of responsible metrics is transparency of measurement systems. Data collection and analytical processes should be kept open and transparent (including university rankings), so that those being evaluated can test and verify the results.
The move to open access and open research means there are now opportunities for more open citation systems.
In a short experiment we’re investigating new forms of research citation and measures that could offer more accurate and transparent methods of measuring research impact based on an open approach.
We think that CORE, which aggregates open access research outputs from repositories and journals worldwide, provides a great basis for this kind of demonstration. CORE is delivered by Jisc in partnership with the Open University (OU).
The open citations experiment team (Drahomira Hermannova and Petr Knoth) will develop a graphical demonstrator of a new class of metrics for evaluating research. The demonstrator will allow us to visualise and compare, on a small set of preselected publications from CORE, traditional citation counts with the score for the new metric (the contribution score).
While this new method relies on access to a citation network, it does not use citation counts as evidence of impact. Instead it falls into the class of semantometric/contribution methods, which aim to use the full text of resources to assess research value. You can find out more about them in this 2014 D-Lib article.
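To make the idea concrete, here is a minimal, purely illustrative sketch of a contribution-style score. It is not the formula from the D-Lib article: it swaps the article's full-text semantic similarity for a crude Jaccard word-overlap distance, and all of the texts and the `jaccard_distance` helper are hypothetical. It only captures the underlying intuition that a publication bridging semantically distant work (the papers it cites versus the papers that cite it) contributes more than one circulating within a single topic.

```python
def jaccard_distance(text_a: str, text_b: str) -> float:
    """Toy semantic-distance proxy: 1 minus the Jaccard similarity
    of the two texts' word sets (the real method would use a proper
    full-text similarity measure)."""
    a, b = set(text_a.lower().split()), set(text_b.lower().split())
    if not a or not b:
        return 1.0
    return 1.0 - len(a & b) / len(a | b)

def contribution_score(cited_texts: list[str], citing_texts: list[str]) -> float:
    """Average pairwise distance between the texts of the publications
    a paper cites and the texts of the publications citing it."""
    pairs = [(a, b) for a in cited_texts for b in citing_texts]
    if not pairs:
        return 0.0
    return sum(jaccard_distance(a, b) for a, b in pairs) / len(pairs)

# A paper whose cited and citing literatures share no vocabulary
# scores higher than one embedded in a single topic.
bridging = contribution_score(
    ["protein folding energy landscapes"],
    ["deep learning sequence models"],
)
insular = contribution_score(
    ["protein folding energy landscapes"],
    ["protein folding energy landscapes"],
)
```

Here `bridging` evaluates to 1.0 (no shared words) and `insular` to 0.0 (identical texts), showing how the score rewards connecting distant areas of the literature without counting a single citation.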
As part of the experiment a short report will be produced that:
- provides a qualitative comparative evaluation of the semantometric/contribution method against a baseline of traditional impact metrics
- compares the technical challenges of large-scale use of semantometrics and traditional metrics, including their possible adoption in the CORE aggregator
- proposes possible modifications/improvements to the contribution method and discusses the advantages and disadvantages of the metrics with respect to time delay, validity across subject fields, transparency, and resistance to gaming/manipulation
The demonstrator and report will be available in January 2016, and we’ll post a link to them on this blog.
Following this experiment we plan to evaluate the approach and hope to work with researchers, librarians, research managers, funders and other projects working in this space to assess feasibility and to inform thinking on new services and approaches for research citation.
If you have any thoughts, ideas or connections do let me know.