France wants the European Union to wage war against bibliometrics

https://pacheco-torgal.blogspot.com/2021/11/evaluating-researchers-in-fast-and.html

Building on the previous discussion regarding the thought-provoking Koltun presentation (posted in the link above), I invite you to explore an article recently published in Science Business. The article delves into a concept dubbed a ‘disruptive’ strategy for reshaping research assessment. https://sciencebusiness.net/news/france-helps-brussels-move-ahead-disruptive-plan-research-assessment

In fact, there’s absolutely nothing truly disruptive about it. Instead, it is just a way to spend more money on peer review rather than harnessing bibliometrics to enhance the peer-review process. The latter could potentially result in significant cost savings, amounting to millions of dollars, which could then be directed towards supporting research endeavors and recruiting researchers. As Peter Drucker used to say, “You can’t manage what you can’t measure”.

Indeed, as Koltun rightly emphasized, every scientist recognizes the necessity of thoroughly delving into a researcher’s work to evaluate it effectively. This entails not only comprehending the intricacies of the work but also understanding the underlying context in which it was conducted and, ideally, possessing the capability to reproduce it. Regrettably, the constraints of both time and financial resources often preclude such a comprehensive examination.

Allow me to provide you with some insights into my homeland, Portugal, which has experimented with both approaches. In a prior Portuguese research assessment conducted in 2013, the international experts serving on the evaluation panels enjoyed complete autonomy. They had the freedom to evaluate research units through on-site visits and also had access to a comprehensive bibliometric analysis, utilizing data from Scopus, which was expertly conducted by Elsevier and generated a range of valuable metrics (Publications per FTE, Citations per FTE, h-index, Field-Weighted Citation Impact, Top cited publications, National and International Collaborations).

However, in recent years we experienced a shift in perspective, with a Science Minister who shared the sentiments of the critics of bibliometrics in France. During the most recent research assessment, in 2018, which involved the evaluation of 348 research units comprising nearly 20,000 researchers, the Evaluation Guide clearly dictated that absolutely no metric could be used by the panels (note that all panels were composed of international experts: 51 from the UK, 21 from the USA, 17 from Germany, 17 from France, 11 from the Netherlands, 8 from Finland, 8 from Ireland, 7 from Switzerland, 6 from Sweden, 5 from Norway, and others from further countries).

Nonetheless, once the research assessment had concluded, I conducted an extensive search through all the reports across the various scientific areas. What I discovered was that the reviewers assigned significant importance to the quantity of publications and the perceived “quality” of journals, even though such considerations were expressly prohibited by the Evaluation Guide. I found that “publications”, “quartiles” and even “impact factors” were mentioned in the assessment reports more than 500 times. This means that, in the absence of any metric, the international experts (somewhat ironically) decided to use the worst of them all.

PS – Could it be that France’s hatred of metrics stems from the fact that it has a low number of highly cited articles and a low number of highly cited scientists, so low that it does not even appear in this group of 17 countries ranked by Scopus Highly Cited Scientists per million inhabitants? https://pacheco-torgal.blogspot.com/2021/11/switzerland-denmark-and-sweden-has.html Furthermore, when evaluating France’s research performance through the lens of funding efficiency (as proposed both by Wohlrabe et al. 2019 and by de Marco 2019), it becomes evident that France’s performance falls even further below expectations.