National peer-review research assessment exercises for the hard sciences can be a complete waste of money: the Italian case

Abstract

There has been ample demonstration that bibliometrics is superior to peer-review for national research assessment exercises in the hard sciences. In this paper we examine the Italian case, taking the bibliometrics-based 2001–2003 university performance ranking list as a benchmark. We compare the accuracy of the first national evaluation exercise, conducted entirely by peer-review, to that of other ranking lists prepared at zero cost, based on indicators indirectly linked to performance or available on the Internet. The results show that, for the hard sciences, the costs of conducting the Italian evaluation of research institutions could have been completely avoided.

Notes

  1. Research Excellence Framework, page 34, downloadable at www.hefce.ac.uk/pubs/hefce/2009/09_38/, last accessed on Sept. 5, 2012.

  2. Available at http://www.scimagoir.com/pdf/sir_2010_world_report.pdf, last accessed on Sept. 5, 2012.

  3. http://www.censis.it/1, last accessed on Sept. 5, 2012.

  4. Complete list accessible at http://cercauniversita.cineca.it/php5/settori/index.php, last accessed on Sept. 5, 2012.

  5. http://vtr2006.cineca.it/index_EN.html, last accessed on Sept. 5, 2012.

  6. Mathematics and computer sciences; physics; chemistry; earth sciences; biology; medicine; agricultural and veterinary sciences; industrial and information engineering.

  7. http://cercauniversita.cineca.it/php5/docenti/cerca.php, last accessed on Sept. 5, 2012.

  8. Observed as of 30/06/2009.

  9. For publications in multidisciplinary journals, the standardized value is calculated as a weighted average of the standardized values for each subject category (a minimal code sketch follows these notes).

  10. For the life sciences, position in the list of authors reflects the varying contributions to the work. Italian scientists active in these fields have proposed an algorithm for quantification: if the first and last authors belong to the same university, 40 % of citations are attributed to each of them and the remaining 20 % are divided among all other authors; if the first two and last two authors belong to different universities, 30 % of citations are attributed to each of the first and last authors, 15 % to each of the second and second-to-last authors, and the remaining 10 % are divided among all others. This algorithm could also be adapted to suit other national contexts (a code sketch of the scheme follows these notes).

  11. Prior to the VTR, all universities were almost completely financed through non-competitive MIUR allocation.

  12. http://www.istat.it/it/, last accessed on Sept. 5, 2012.

  13. SCImago 2010 World Report, available at http://www.scimagoir.com/pdf/sir_2010_world_report.pdf, last accessed on Sept. 5, 2012.

  14. SCImago methodology available at http://www.scimagoir.com/methodology.php?page=indicators#, last accessed on Sept. 5, 2012.

  15. We followed the guidelines of Cohen (1988) on the strength of association; the conventional thresholds are spelled out in a small helper after these notes.

  16. We note that, for universities which do not shift quartile, the change of rank within the quartile may be larger than that of universities which do shift quartile (see the illustration after these notes).
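
As a minimal sketch of the weighted-average standardization in note 9 (all names are ours, and we assume, purely for illustration, that each per-category standardized value is the citation count divided by the category's mean citations — the paper does not specify the scaling):

```python
def standardized_value(citations, category_means, category_weights):
    """Weighted-average standardized value for a multidisciplinary
    publication (sketch of note 9; the per-category scaling by mean
    citations is an assumption, not taken from the paper).

    category_means:   {subject category: mean citations in that category}
    category_weights: {subject category: weight, summing to 1}
    """
    return sum(
        weight * citations / category_means[category]
        for category, weight in category_weights.items()
    )

# Example: a publication with 10 citations indexed in two subject categories.
means = {"Biology": 12.0, "Chemistry": 8.0}
weights = {"Biology": 0.5, "Chemistry": 0.5}
print(standardized_value(10, means, weights))  # 0.5*10/12 + 0.5*10/8 ≈ 1.04
```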
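The attribution scheme in note 10 translates directly into code. A sketch follows; function and variable names are ours, the note's second case conditions on the first two and last two authors being at different universities while this sketch branches only on the first and last, and byline configurations the note leaves open are flagged in the docstring:

```python
def attribute_citations(citations, authors):
    """Fractional attribution of a publication's citations by byline
    position, following the scheme in note 10. `authors` is the ordered
    byline as (author, university) pairs; returns {author: share}.

    Bylines too short for a case (e.g. no middle authors) leave that
    fraction unassigned, since the note does not specify them.
    """
    shares = dict.fromkeys((name for name, _ in authors), 0.0)

    if authors[0][1] == authors[-1][1]:
        # First and last authors at the same university:
        # 40 % each; remaining 20 % split among the middle authors.
        shares[authors[0][0]] += 0.40 * citations
        shares[authors[-1][0]] += 0.40 * citations
        middle = authors[1:-1]
        for name, _ in middle:
            shares[name] += 0.20 * citations / len(middle)
    else:
        # Different universities: 30 % each to first and last authors,
        # 15 % each to the second and second-to-last authors,
        # remaining 10 % split among all the other authors.
        shares[authors[0][0]] += 0.30 * citations
        shares[authors[-1][0]] += 0.30 * citations
        shares[authors[1][0]] += 0.15 * citations
        shares[authors[-2][0]] += 0.15 * citations
        middle = authors[2:-2]
        for name, _ in middle:
            shares[name] += 0.10 * citations / len(middle)
    return shares

# Example: five authors; first and last at different universities.
byline = [("A", "U1"), ("B", "U2"), ("C", "U3"), ("D", "U2"), ("E", "U4")]
print(attribute_citations(100, byline))
# {'A': 30.0, 'B': 15.0, 'C': 10.0, 'D': 15.0, 'E': 30.0}
```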
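Note 15's benchmarks from Cohen (1988) for the strength of a correlation are the conventional ones: |r| ≥ .10 small, ≥ .30 medium, ≥ .50 large. A trivial helper, ours rather than anything in the paper, makes the thresholds explicit:

```python
def cohen_strength(r):
    """Cohen's (1988) conventional labels for correlation strength."""
    r = abs(r)
    if r >= 0.50:
        return "large"
    if r >= 0.30:
        return "medium"
    if r >= 0.10:
        return "small"
    return "negligible"

print(cohen_strength(0.42))  # 'medium'
```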
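To make note 16 concrete, assume (our illustration, not a definition from the paper) that the quartile of a 1-based rank among n universities is ceil(4 · rank / n):

```python
import math

def quartile(rank, n):
    """Quartile (1-4) of a 1-based rank among n ranked universities."""
    return math.ceil(4 * rank / n)

# With n = 40: falling from rank 1 to rank 10 is a nine-position shift
# that stays inside quartile 1, while slipping from rank 10 to rank 11
# is a one-position shift that crosses into quartile 2.
print(quartile(1, 40), quartile(10, 40), quartile(11, 40))  # 1 1 2
```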

References

  • Abramo, G., & D’Angelo, C. A. (2011). Evaluating research: from informed peer-review to bibliometrics. Scientometrics, 87(3), 499–514.

  • Abramo, G., D’Angelo, C. A., & Di Costa, F. (2008). Assessment of sectoral aggregation distortion in research productivity measurements. Research Evaluation, 17(2), 111–121.

  • Abramo, G., D’Angelo, C. A., & Di Costa, F. (2011). National research assessment exercises: a comparison of peer-review and bibliometrics rankings. Scientometrics, 89(3), 929–941.

  • Aksnes, D. W., & Taxt, R. E. (2004). Peer reviews and bibliometric indicators: a comparative study at a Norwegian university. Research Evaluation, 13(1), 33–41.

  • Bornmann, L., & Leydesdorff, L. (2012). Which are the best performing regions in information science in terms of highly cited papers? Some improvements of our previous mapping approaches. Journal of Informetrics, 6(2), 336–345.

  • Cohen, J. (1988). Statistical Power Analysis for the Behavioral Sciences (2nd edn.). Hillsdale, NJ: Lawrence Erlbaum Associates.

  • D’Angelo, C. A., Giuffrida, C., & Abramo, G. (2011). A heuristic approach to author name disambiguation in large-scale bibliometric databases. Journal of the American Society for Information Science and Technology, 62(2), 257–269.

  • ERA (2010). The excellence in research for Australia initiative. http://www.arc.gov.au/era/. Accessed 5 Sept 2012.

  • Franceschet, M., & Costantini, A. (2011). The first Italian research assessment exercise: A bibliometric perspective. Journal of Informetrics, 5(2), 275–291.

  • Guisan, M. C. (2005). Universities and research expenditure in Europe and the USA, 1993–2003: an analysis of countries and regions. Regional and Sectoral Economic Studies, AEEADE, 5(2), 35–46.

  • Horrobin, D. F. (1990). The philosophical basis of peer-review and the suppression of innovation. Journal of the American Medical Association, 263(10), 1438–1441.

  • MacRoberts, M. H., & MacRoberts, B. R. (1996). Problems of citation analysis. Scientometrics, 36(3), 435–444.

  • Meho, L. I., & Sonnenwald, D. H. (2000). Citation ranking versus peer evaluation of senior faculty research performance: a case study of Kurdish Scholarship. Journal of the American Society for Information Science, 51(2), 123–138.

  • Moed, H. F. (2002). The impact-factors debate: the ISI’s uses and limits. Nature, 415, 731–732.

  • Moxham, H., & Anderson, J. (1992). Peer-review. A view from the inside. Science and Technology Policy, 5, 7–15.

  • Oppenheim, C. (1997). The correlation between citation counts and the 1992 research assessment exercise ratings for British research in genetics, anatomy and archaeology. Journal of Documentation, 53, 477–487.

  • Oppenheim, C., & Norris, M. (2003). Citation counts and the research assessment exercise V: archaeology and the 2001 RAE. Journal of Documentation, 56(6), 709–730.

  • Pendlebury, D. A. (2009). The use and misuse of journal metrics and other citation indicators. Archivum Immunologiae et Therapiae Experimentalis, 57(1), 1–11.

  • RAE (2008). Research assessment exercise. http://www.rae.ac.uk/aboutus/. Accessed 5 Sept 2012.

  • Rinia, E. J., van Leeuwen, Th. N., van Vuren, H. G., & van Raan, A. F. J. (1998). Comparative analysis of a set of bibliometric indicators and central peer-review criteria. Evaluation of condensed matter physics in the Netherlands. Research Policy, 27(1), 95–107.

  • Serenko, A., & Dohan, M. (2011). Comparing the expert survey and citation impact journal ranking methods: Example from the field of Artificial Intelligence. Journal of Informetrics, 5(4), 629–648.

  • Thomas, P. R., & Watkins, D. S. (1998). Institutional research rankings via bibliometric analysis and direct peer-review: A comparative case study with policy implications. Scientometrics, 41(3), 335–355.

  • Van Raan, A. F. J. (2005). Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62(1), 133–143.

  • Van Raan, A. F. J. (2006). Comparison of the Hirsch-index with standard bibliometric indicators and with peer judgment for 147 chemistry research groups. Scientometrics, 67(3), 491–502.

  • VTR (2006). Three-year research evaluation (2001–2003). Results separated by scientific area. http://vtr2006.cineca.it/index_EN.html. Accessed 5 Sept 2012.

Author information

Correspondence to Giovanni Abramo.

About this article

Cite this article

Abramo, G., Cicero, T. & D’Angelo, C.A. National peer-review research assessment exercises for the hard sciences can be a complete waste of money: the Italian case. Scientometrics 95, 311–324 (2013). https://doi.org/10.1007/s11192-012-0875-6
