As work comes to a close on the OA Dashboard project, we wanted to share our findings and conclusions and give an outline of what we are planning to do next in this space. Taken forward by Research Consulting in partnership with Pleiade Management and Consultancy and Digirati, the project aimed to assess the feasibility of a dashboard that would support institutions by combining and visualising data on OA. Such a system has the potential to improve institutional workflows by providing easier access to information on OA.
To investigate this further, the project followed a three-phase approach to scope the creation of an OA dashboard: we analysed five alternative dashboard options, created a prototype for one of these, and considered the business case for further development.
Phase 1 – User requirements of a Jisc OA dashboard
An exploratory study led to the development of five possible dashboards/use cases (for more information, please see the blog post on Phase 1), along with hypothetical data sources. Following consultation with institutional representatives, the following use cases were prioritised for technical assessment:
- Dashboard A (monitoring OA articles) would help determine whether an article is open access and, if so, which type of open access applies. This is currently a manual process that leads to frequent duplication of effort across institutions.
- Dashboard B (effectiveness of OA policy) would support OA advocacy to academics if it were able to show a clear advantage, in terms of citation or altmetric indicators, for OA articles.
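To make the Dashboard B use case concrete, the sketch below compares mean citation counts for OA and non-OA articles. This is a hypothetical illustration only: the field names (`is_oa`, `citations`) and the sample data are invented, and a real analysis would need to control for age, discipline, and other confounders before claiming a citation advantage.

```python
from statistics import mean

def citation_advantage(articles):
    """Return (mean citations for OA articles, mean citations for closed articles).

    `articles` is a list of dicts with hypothetical `is_oa` (bool) and
    `citations` (int) fields -- a stand-in for real bibliometric data.
    """
    oa_counts = [a["citations"] for a in articles if a["is_oa"]]
    closed_counts = [a["citations"] for a in articles if not a["is_oa"]]
    return mean(oa_counts), mean(closed_counts)

# Invented sample data for illustration
sample = [
    {"is_oa": True, "citations": 12},
    {"is_oa": True, "citations": 8},
    {"is_oa": False, "citations": 5},
    {"is_oa": False, "citations": 7},
]
print(citation_advantage(sample))
```

A dashboard panel built on this kind of comparison would only be persuasive for advocacy if the underlying data were complete and the comparison methodologically sound, which is precisely the concern raised in the findings below.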
Phase 2 – Technical features of a dashboard prototype and user testing
As Dashboard A would provide the basis for Dashboard B, technical development of the Dashboard A prototype began using:
- Crossref, to provide the universe of publications;
- oaDOI, to provide licence information and OA status; and
- Sherpa/RoMEO, to provide publishers’ policies on copyright and self-archiving.
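To illustrate in principle how the prototype combines these sources, the sketch below classifies a single article's OA status from an oaDOI-style record. The field names (`is_oa`, `journal_is_oa`, `best_oa_location`, `host_type`) follow the shape of the public oaDOI/Unpaywall API response for a DOI, but the classification rules here are a simplification for illustration, not the prototype's actual logic.

```python
def classify_oa(record):
    """Classify an article's OA status from an oaDOI/Unpaywall-style record.

    `record` is a dict mirroring the JSON the oaDOI/Unpaywall API returns
    for a DOI: `is_oa` and `journal_is_oa` are booleans, and
    `best_oa_location` (if present) includes a `host_type` of either
    "publisher" or "repository".
    """
    if not record.get("is_oa"):
        return "closed"
    location = record.get("best_oa_location") or {}
    if record.get("journal_is_oa"):
        return "gold"      # published in a fully OA journal
    if location.get("host_type") == "publisher":
        return "hybrid"    # OA copy on the publisher site, non-OA journal
    if location.get("host_type") == "repository":
        return "green"     # self-archived OA copy in a repository
    return "unknown"

# Abbreviated, invented record following the oaDOI response shape
sample = {
    "doi": "10.1234/example",
    "is_oa": True,
    "journal_is_oa": False,
    "best_oa_location": {"host_type": "repository", "license": "cc-by"},
}
print(classify_oa(sample))  # green
```

Even this toy version shows why data quality matters: a missing or mislabelled `best_oa_location` silently changes the classification, which is the kind of issue the project encountered at scale.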
Each data source for Dashboard A required significant effort to obtain and normalise a large amount of information. Dashboard B would have required yet more data sources and would have replicated some of the work previously completed in the Library Data Labs, so it was deemed out of scope for the prototype. Another key issue was the difficulty of sourcing a ‘universe of publications’ for UK HEIs from open data sources, which would affect the completeness of the dashboard.
In terms of UX design, limitations in the data visualisation tool used for the prototype meant it was not possible to produce a fully responsive and easily navigable set of interlinked dashboards. In addition, a custom data model was created to circumvent some limitations in existing data models and support the prototype.
Once the Dashboard A prototype was ready (see Figure 1), it was shared with prospective institutional users in a series of demonstration sessions. These were used to gather structured feedback through a brief survey, and qualitative comments on the business case for such a product.
Figure 1: Jisc OA Dashboard main screen in Tableau.
Phase 3 – Business case development
Following consultation with prospective institutional users, it appeared that, whilst they appreciated the intuitiveness of the Dashboard A prototype, data quality and coverage were key concerns.
Interviewees also struggled to quantify the dashboard’s contribution to streamlining workflows, as the Dashboard A prototype would provide additional insights rather than replace existing activities. They therefore felt that time savings would likely be limited.
Other observations raised were that:
- Usage of the dashboard would likely be ad hoc, rather than regular.
- The dashboard would have limited value as a standalone service and could be better marketed if it were embedded within other Jisc services.
In terms of delivering a national picture of overall progress in the take-up of OA, funders cited existing mechanisms to obtain some of the information presented in the prototype dashboard and were pursuing other developments (e.g., via EuropePMC and Researchfish) to address known gaps.
Conclusions and recommendations
We concluded that a full business case cannot be built at this time: the available evidence is, on average, weak and does not support a strong case for further investment. A key factor is that, although there is a clear gap in the analysis of data on OA, open data sources are not yet mature enough to power a dashboard and could undermine the validity of its outputs.

Whilst we recommend that development of a dashboard of this nature be put on hold and re-evaluated in the future, Jisc recognises the importance of centralised systems that enable libraries to monitor their OA activity, encourage the discovery of OA content and, more generally, support decision-making relating to their library holdings. The sector should therefore be assured that work will continue in earnest to investigate new, innovative ways of working in this area.
In addition, Jisc will continue to seek to improve the quality and availability of data sources to enable further efforts, by:
- Considering how a comprehensive, open-source record of UK HEIs’ publication output might be developed;
- Ensuring that the terms and conditions for existing Jisc services permit re-use of relevant data in future services;
- Promoting greater uptake of institutional identifiers within key data sources;
- Continuing its support for ORCID;
- Improving internal consistency of Jisc data sources;
- Considering re-use of data models in development (e.g., the one used for the RDSS).
Based on the report’s conclusions, Jisc is not planning to take forward development work on an OA Dashboard in the near future. However, as noted above, it will continue working to offer the sector new ways to monitor OA effectively and to support new paths for OA content discovery, particularly where this affects decisions about wider library holdings.
Additionally, we will continue to work on the other relevant aspects of the recommendations and may revisit the idea in the future once some of the technical constraints have been mitigated.
A copy of the report, “Defining and Prototyping an OA Dashboard – Final Report”, is now available and provides much more context for the findings.