Academic Evaluation: Universal Instrument? Tool for Development?

Abstract

Research agendas and academic evaluation are inevitably linked. Through economic incentives, promotion, research funding, and reputation, academic evaluation exerts a powerful influence on the production of knowledge; moreover, it is often conceived as a universal instrument, without consideration of the context in which it is applied. Evaluation systems are contested social constructions, currently at the center of international debates over criteria, indicators, and their associated methods. A universalist set of productivity indicators is gaining centrality in academic evaluation, with profound effects on the content of the research conducted everywhere. Specifically, evaluation systems based on such indicators send negative signals to scientists willing to pursue contextualized research agendas, particularly those negotiated with non-scientists. On the basis of theoretical and empirical studies documented in the specialized literature, and of extensive personal engagement with university research policy in Uruguay, we argue that the consolidation of evaluation practices of alleged universal validity degrades and discourages a type of research that is undeniably important in developing contexts.

Notes

  1. According to Barré (2010), indicators are neither a given reality nor a technical and scientific truth. Their construction is the outcome of a political process, because they rest on one specific model (among many possible ones) of how science works or how it should work. At the same time, interpreting these indicators requires multiple judgments in which the experts involved bring their own values, rules, and personal visions to their decisions. In this sense, evaluation, like the development and interpretation of indicators, is a normative and therefore debatable process.

  2. Uruguay is a high-income country in terms of GDP per capita according to the World Bank classification. However, in terms of science, technology, and innovation (STI), it belongs to what can be considered the developing world.

  3. Merton (1988) coined this expression in direct reference to a passage from the Gospel of Matthew: “Whoever has will be given more, and they will have abundance. Whoever does not have, even what they have will be taken from them.” In science, this means that those with strong scientific capacities will be given more opportunities to strengthen them further, as a result of the structure of the academic system, while those whose capacities are weak will be left with even less.

  4. In May 2013, the editor of Science cited the San Francisco Declaration and emphasized the need to stop using the impact factor in research assessment (Alberts 2013).

  5. López-Piñeiro and Hicks (2015) show how decontextualized approaches in Spanish sociology are promoted by the fact that the specificities of Spanish society are of little interest to English-language audiences. Since the Spanish evaluation system emphasizes publications in high-impact-factor journals, these authors predict narrower and more abstract research agendas for Spanish sociology in the long run.

  6. Chavarro et al. (2014) provide empirical evidence on the association between interdisciplinary research and research focused on local issues for the case of Colombia.

References

  • Alberts, Bruce. 2013. Impact factor distortions. Science 340: 787.

  • Arocena, Rodrigo, and Judith Sutz. 2010. Weak knowledge demand in the South, learning divides and innovation policies. Science and Public Policy 37(8): 571–582.

  • Barré, Remi. 2010. Towards socially robust S&T indicators: indicators are debatable devices, enabling collective learning. Research Evaluation 19(3): 227–231.

  • Benninghoff, Martin, and Dietmar Braun. 2010. Research Funding, Authority Relations, and Scientific Production in Switzerland. In Reconfiguring Knowledge Production: Changing Authority Relationships in the Sciences and their Consequences for Intellectual Innovation, eds. Richard Whitley, Jochen Gläser, and Lars Engwall, 81–108. New York: Oxford University Press Inc.

  • Bensusán, Graciela, Natalia Gras, Daniel Inclán, Carlos E. Rodríguez, Giovanna Valenti, and Gonzalo Varela. 2014. Reflexiones sobre la evaluación a los investigadores: una mirada desde diferentes perspectivas. http://www.foroconsultivo.org.mx/libros_editados/evaluacion_de_la_evaluacion_subgrupos_individuos.pdf. Accessed 27 May 2015.

  • Bernal, John D. 1994. Historia Social de la Ciencia. Barcelona: Península Ed.

  • Bianco, Mariela, Natalia Gras, and Judith Sutz. 2014a. Reflexiones sobre la práctica de la evaluación académica. In Análisis y reflexiones sobre 20 años de políticas de investigación en la Universidad de la República: aciertos, dudas y aprendizajes, eds. Mariela Bianco, and Judith Sutz, 209–235. Montevideo: CSIC-UdelaR, Trilce.

  • Bianco, Mariela, Maria Goñi, and Cecilia Tomassini. 2014b. Señales transmitidas por el sistema de fomento a la investigación: tensiones en la orientación de la producción de conocimiento y las carreras académicas en Uruguay. REDES 20(39): 159–182.

  • Bianco, Mariela, Carlos Bianchi, Andrea Bielli, Claudia Cohanoff, Ana Laura de Giorgi, Natalia Gras, and Judith Sutz (coordinator). 2006. Pensando el Plan Estratégico Nacional en Ciencia, Tecnología e Innovación. Document for the First PENCTI Workshop, CSIC-UdelaR. http://csic.edu.uy/renderPage/index/pageId/275#heading_892. Accessed 18 March 2016.

  • Bunders, Joske. 1987. The practical management of scientists’ actions: the influence of patterns of knowledge development in biology on cooperations between university biologists and non-scientists. In The Social Direction of the Public Sciences, eds. Stuart S. Blume, Joske Bunders, Loet Leydesdorff, and Richard Whitley, 39–74. Dordrecht: Reidel Publishing Co.

  • Bunders, Joske. 1990. Biotechnology for small-scale farmers in developing countries. Analysis and assessment procedures. Amsterdam: VU University Press.

  • Bunders, Joske, and Jacqueline Broerse. 1991. Appropriate biotechnology in small-scale agriculture: How to orient research and development. Wallingford: CAB International.

  • Chavarro, Diego, Puay Tang, and Ismael Rafols. 2014. Interdisciplinarity and research on local issues: Evidence from a developing country. Research Evaluation 23(3): 195–209.

  • Dahler-Larsen, Peter. 2014. Constitutive Effects of Performance Indicators: Getting beyond unintended consequences. Public Management Review 16(7): 969–986.

  • de Jong, Stefan P.L., Pleun van Arensbergen, Floortje Daemen, Barend van der Meulen, and Peter van den Besselaar. 2011. Evaluation of research in context: an approach and two cases. Research Evaluation 20(1): 61–72.

  • De la Mothe, John, and Gilles Paquet. 1996. Evolutionary Economics and the New International Political Economy. London: Pinter.

  • Dobbins, Michael, and Christoph Knill. 2014. Higher Education Governance and Policy Change in Western Europe. International Challenges to Historical Institutions. London: Palgrave Macmillan.

  • DORA. 2012. San Francisco Declaration on Research Assessment. http://am.ascb.org/dora/. Accessed 20 June 2015.

  • Dicyt-MEC. 2012. Informe a la Sociedad. Ciencia, Tecnología e Innovación en Uruguay en los últimos años. Montevideo: Ministerio de Educación y Cultura.

  • Elzinga, Aant. 1988. The consequences of evaluation for academic research. Science Studies 1: 5–14.

  • Evidence Ltd. 2007. The use of bibliometrics to measure research quality in the UK higher education institutions. http://www.universitiesuk.ac.uk/highereducation/Documents/2007/Bibliometrics.pdf. Accessed 20 August 2015.

  • Gläser, Jochen, and Grit Laudel. 2016. Governing Science. European Journal of Sociology 57(1): 117–168.

  • Gläser, Jochen, and Grit Laudel. 2007a. The social construction of bibliometric evaluation. In The Changing Governance of the Sciences. The Advent of Research Evaluation Systems, eds. Richard Whitley, and Jochen Gläser. Dordrecht: Springer.

  • Gläser, Jochen, and Grit Laudel. 2007b. Evaluation Without Evaluators: The Impact of Funding Formulae on Australian University Research. In The Changing Governance of the Sciences. The Advent of Research Evaluation Systems, eds. Richard Whitley, and Jochen Gläser, 127–151. Dordrecht: Springer.

  • Hemlin, Sven, and Soren Barlebo Rasmussen. 2006. The Shift in Academic Quality Control. Science, Technology, & Human Values 31(2): 173–198.

  • Hess, David. 2007. Alternative Pathways in Science and Industry. Activism, Innovation, and the Environment in an Era of Globalization. Cambridge: The MIT Press.

  • Hessen, Boris. 1931. The Social and Economic Roots of Newton’s Principia. Paper presented at the 2nd International Congress of the History of Science and Technology, London, 20 June–3 July.

  • Hicks, Diana. 2004. The Four Literatures of Social Science. In Handbook of Quantitative Science and Technology Research, eds. Henk Moed, Wolfgang Glänzel, and Ulrich Schmoch, 473–496. Dordrecht: Kluwer Academic Publishers.

  • Hicks, Diana. 2006. The Dangers of Partial Bibliometric Evaluation in the Social Sciences. Economia Politica XXIII(2): 145–162.

  • Hicks, Diana. 2013. One size doesn’t fit all: On the co-evolution of national evaluation systems and social science publishing. Confero: Essays on Education Philosophy and Politics. doi:10.3384/confero13v1121207b.

  • Hicks, Diana, Paul Wouters, Ludo Waltman, Sarah de Rijcke, and Ismael Rafols. 2015. The Leiden Manifesto for research metrics. Nature 520(April): 429–431.

  • Katz, J. Sylvan. 1999. Bibliometric Indicators and the Social Sciences. ESRC. http://users.sussex.ac.uk/~sylvank/pubs/ESRC.pdf. Accessed 20 August 2015.

  • Kuhn, Thomas. 1970. The Structure of Scientific Revolutions, 2nd ed. Chicago: University of Chicago Press.

  • López-Piñeiro, Carla, and Diana Hicks. 2015. Reception of Spanish sociology by domestic and foreign audiences differs and has consequences for evaluation. Research Evaluation 24(1): 78–89.

  • Martin, Ben, and Richard Whitley. 2010. The UK Research Assessment Exercise: A Case of Regulatory Capture? In Reconfiguring Knowledge Production: Changing Authority Relationships in the Sciences and their Consequences for Intellectual Innovation, eds. Richard Whitley, Jochen Gläser, and Lars Engwall, 51–79. New York: Oxford University Press Inc.

  • McNay, Ian. 2009. Research Quality Assessment: objectives, approaches, responses and consequences. In Academic Research and Researchers, eds. Angela Brew, and Lisa Lucas, 35–53. London: McGraw Hill.

  • Meier, Frank, and Uwe Schimank. 2010. Mission Now Possible: Profile Building and Leadership in German Universities. In Reconfiguring Knowledge Production: Changing Authority Relationships in the Sciences and their Consequences for Intellectual Innovation, eds. Richard Whitley, Jochen Gläser, and Lars Engwall, 211–237. New York: Oxford University Press Inc.

  • Merton, Robert. 1988. The Matthew effect in science. ISIS 79(4): 606–623.

  • Merton, Robert. 1942. The Normative Structure of Science. In The Sociology of Science. Chicago: University of Chicago Press.

  • Morris, Norma. 2010. Authority Relations as Condition for, and Outcome of, Shifts in Governance: The Limited Impact of the UK Research Assessment Exercise on the Biosciences. In Reconfiguring Knowledge Production: Changing Authority Relationships in the Sciences and their Consequences for Intellectual Innovation, eds. Richard Whitley, Jochen Gläser, and Lars Engwall, 239–264. New York: Oxford University Press Inc.

  • PEDECIBA (Programa de Desarrollo de las Ciencias Básicas). 2004. Criterios, herramientas y procedimientos generales para la evaluación de la actividad académica de los investigadores. http://www.pedeciba.edu.uy/docspd/CritEvalInv04.pdf. Accessed 21 Aug 2015.

  • Power, Michael. 1997. The Audit Society: Rituals of Verification. Oxford: Oxford University Press.

  • Regeer, Barbara, Anne-Charlotte Hoes, Mariëtte van Amstel-van Saane, Francisca Caron-Flinterman, and Joske Bunders. 2009. Six Guiding Principles for Evaluating Mode-2 Strategies for Sustainable Development. American Journal of Evaluation 30: 515–537.

  • Rosenberg, Nathan. 1982. Inside the Black Box. Technology in Economics. Cambridge: Cambridge University Press.

  • Sábato, Jorge, and Natalio Botana. 1968. La ciencia y la tecnología en el desarrollo futuro de América Latina. Revista de la Integración 1(3): 15–36.

  • Sahel, Jose Alain. 2011. Quality versus quantity: assessing individual research performance. Science Translational Medicine. doi:10.1126/scitranslmed.3002249.

  • Schimank, Uwe. 2005. “New Public Management” and the Academic Profession: Reflections on the German Situation. Minerva 43: 361–376.

  • UA-CSIC (Unidad Académica de CSIC). 2003. Grupos de Investigación en la Universidad de la República. Montevideo: CSIC-UdelaR.

  • van Dalen, Hendrik, and Kéne Henkens. 2012. Intended and Unintended Consequences of a Publish-or-Perish Culture: A Worldwide Survey. Journal of the American Society for Information Science and Technology 63(7): 1282–1293.

  • van der Most, Frank. 2010. Use and non-use of research evaluation: A literature review. Centre for Innovation, Research and Competence in the Learning Economy (CIRCLE), Lund University, Working Paper. http://wp.circle.lu.se/upload/CIRCLE/workingpapers/201016_vanderMost.pdf. Accessed 20 August 2015.

  • Van Raan, Anthony F.J. 1996. Advanced bibliometric methods as quantitative core of peer review based evaluation and foresight exercises. Scientometrics 36(3): 397–420.

  • Vessuri, Hebe, Jean-Claude Guédon, and Ana María Cetto. 2014. Excellence or quality? Impact of the current competition regime on science and scientific publishing in Latin America and its implications for development. Current Sociology 62(5): 647–665.

  • Whitley, Richard. 1984. The Intellectual and Social Organization of the Sciences. Oxford: Oxford University Press.

  • Whitley, Richard. 2007. Changing Governance of the Public Sciences: The Consequences of Establishing Research Evaluation Systems for Knowledge Production in Different Countries and Scientific Fields. In The Changing Governance of the Sciences. The Advent of Research Evaluation Systems, eds. Richard Whitley, and Jochen Gläser, 3–27. Dordrecht: Springer.

  • Whitley, Richard. 2010. Reconfiguring the Public Sciences: The Impact of Governance Changes on Authority and Innovation in Public Science Systems. In Reconfiguring Knowledge Production: Changing Authority Relationships in the Sciences and their Consequences for Intellectual Innovation, eds. Richard Whitley, Jochen Gläser, and Lars Engwall, 3–47. New York: Oxford University Press Inc.

  • Ziman, John. 1994. Prometheus Bound. Science in a Dynamic Steady State. Cambridge: Cambridge University Press.

Author information

Corresponding author

Correspondence to Mariela Bianco.

Cite this article

Bianco, M., Gras, N. & Sutz, J. Academic Evaluation: Universal Instrument? Tool for Development?. Minerva 54, 399–421 (2016). https://doi.org/10.1007/s11024-016-9306-9
