Recently, there has been a shift in attitudes against the improper use of metrics to evaluate research. In our effort to develop alternative approaches for identifying relevant measures of research impact at the institute level – beyond simple frequency counts and rankings based on single indicators – we matched the publications of a targeted institute to the Web of Science (WoS) database and then used machine learning techniques to explore the results and visualization tools to showcase them.
We present the Responsible Institute Impact Assessment (RI2A) for evaluating the research impact of a targeted institute, illustrated with a case study. We retrieved 448 publications from the University of Southern Denmark’s (SDU) research registration database for the Department of Marketing and Management for 2012–2017. Of these, 170 publications satisfied both criteria of being peer-reviewed and having a DOI, and could therefore be matched in the WoS database. We then used the bibliographic coupling algorithm to cluster the articles that cited the work produced at the targeted institute. This algorithm groups units together based on the number of references they share, at the article, source-journal and institution level. Lastly, a co-word analysis was used to identify pair-wise relationships between the keywords found in the citing articles.
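To make the clustering step concrete, the following is a minimal sketch of how bibliographic coupling strength can be computed over the citing articles. The input format, the field names and the toy data are assumptions made for illustration, not a description of the exact pipeline used in the case study.

```python
from collections import defaultdict
from itertools import combinations

# Each citing article is represented here as an identifier plus the set of
# references it cites (e.g. parsed from a WoS export); illustrative data only.
articles = [
    {"id": "A1", "refs": {"r1", "r2", "r3"}},
    {"id": "A2", "refs": {"r2", "r3", "r4"}},
    {"id": "A3", "refs": {"r5", "r6"}},
]

# Bibliographic coupling strength = number of cited references two articles share.
coupling = defaultdict(int)
for a, b in combinations(articles, 2):
    shared = len(a["refs"] & b["refs"])
    if shared > 0:
        coupling[(a["id"], b["id"])] = shared

print(dict(coupling))  # {('A1', 'A2'): 2} – A1 and A2 share r2 and r3
```

One way to obtain the journal- and institution-level groupings mentioned above is to aggregate the same pairwise counts over source journals or institutions before clustering.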
A total of 1,195 citing articles (excluding self-citations) were identified. The results of RI2A assist the targeted institute by allowing them to discover:
- the researchers who cite their publications and the relationships among them,
- the journals that cite their publications and the relationships among them,
- the universities, institutes and organizations that use their publications and the relationships among them, and
- the main groups of keywords used in the citing articles (see the co-word sketch after this list).
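The co-word analysis behind the last point reduces to counting how often pairs of keywords co-occur in the citing articles. The keyword values below are invented for the sake of the sketch; in the case study the analysis was run on the keywords of the 1,195 citing articles.

```python
from collections import Counter
from itertools import combinations

# Author keywords of the citing articles (illustrative values only).
keyword_sets = [
    {"strategy", "organisation design", "innovation"},
    {"organisation design", "innovation"},
    {"marketing", "innovation"},
]

# Count pair-wise co-occurrences of keywords across the citing articles.
co_occurrence = Counter()
for keywords in keyword_sets:
    for pair in combinations(sorted(keywords), 2):
        co_occurrence[pair] += 1

# The strongest keyword pairs indicate the main thematic groups.
for pair, count in co_occurrence.most_common(3):
    print(pair, count)
```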
Visualizing these results as graphs and making sense of them requires more work than simple publication/citation counts. Although not described in detail here, a collaboration between the SDU library and the university’s research support and policy services has started, in which it has been proposed that evaluation be based on joint work between evaluators and evaluatees, focusing on strengths and weaknesses as well as on comparisons with previous assessments over time. A straightforward approach would be to compare the results of the mapping exercise with an already known description of the evaluated department’s profile. For instance, by overlaying the targeted department’s research groups on the titles of the citing journals, we discovered that the “Strategic Organisation Design” group is cited frequently in “Organisation Studies”, “Journal of Management Studies”, “Strategic Management Journal” and “Human Relations”, and that these journals form a distinct cluster based on shared citing practices.
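Such an overlay presupposes a journal-level network that a visualization tool can render. The sketch below shows one possible way to assemble and export such a network; the use of networkx, the GEXF export and the edge weights are assumptions for illustration, not the tool chain used in the case study.

```python
import networkx as nx

# Journal-level link weights, e.g. obtained by summing the article-level
# coupling counts over the citing journals. Names follow the journals
# mentioned above; the weights are invented for illustration.
journal_links = [
    ("Organisation Studies", "Journal of Management Studies", 14),
    ("Organisation Studies", "Strategic Management Journal", 9),
    ("Strategic Management Journal", "Human Relations", 7),
]

G = nx.Graph()
for src, dst, weight in journal_links:
    G.add_edge(src, dst, weight=weight)

# Export the network so research-group labels can be overlaid on the journal
# nodes in a graph visualization tool (e.g. Gephi).
nx.write_gexf(G, "citing_journals.gexf")
```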
Our approach meets the requirement of keeping the process relatively simple and short for use in a library setting, yet meaningful as a combined quantitative/qualitative evaluation for both management and the faculty whose research is affected by the evaluation procedure. Apart from showcasing the academic impact of an institute, RI2A also provides an opportunity to explore remote connections that might otherwise go unnoticed.
48th LIBER Annual Conference: Research Libraries for Society, Dublin, June 26–28, 2019, pp. 138–139.