Metricide, death by metrics, is catalyzed by an accelerated culture of irrationality that parades itself under the guise of reason.
I think of the epidemic of metricide each time that I speak with a junior colleague, each time that I write a promotion and tenure letter, and each time that I sit on a review committee. Mentoring assistant professors is an everyday reminder of this death by metrics.
The burden of metrics is borne by the most junior academics, subjecting them to a continual state of anxiety.
The suicidal anxieties produced in academics by the race for metrics have deleterious health effects, in many instances resulting in poor mental health outcomes among academics, and in some instances resulting in death (recall the story of a colleague dying of a heart attack in the office next door).
Beyond killing academics, metrics kill academia. They take the creativity, joy, and freedom of academia, and turn these positive emotions into an accelerated chase for numbers. The number frenzy makes numbers the end goal of academic work, obfuscating the fundamental spirit of inquiry.
Underneath its veneer of rationality (that numbers would offer a standard for quality), the metrics game is entirely irrational. The irrationality of the metrics game becomes apparent once we consider the various ironies in how metrics are determined and implemented.
One of the striking ironies of the culture of metrics is the mass implementation of numbers, carried out often by academic-managers with mediocre academic track records and a whole lot of ambition. That the managers implementing the metrics are mostly mediocre or failed academics who don't really understand the research process creates and reproduces the conditions for metricide.
You have a Head who has never published in a top-tier journal telling junior academics that "without a solo-authored publication in a top-tier journal, you would not even have a tenure-track job." You have a Dean with an h-index of 3 telling an Associate Professor that an h-index of 17 is nothing to be proud of, "the University is looking for excellence these days." You have a Vice President of Research with 12 publications in sub-standard journals telling an Associate Professor that her productivity, with 5 publications in the last three years, has been low.
Without a deep understanding of what guides numbers, a new number, be it total citations, the h-index, the i10-index, or the field-weighted citation index, takes center stage. The ever-accelerating rush for new metrics also works to maintain the opacity of the metric epidemic.
This is the second irony of the metric culture. The propaganda of rationality and the drive toward standards obfuscate the opaque processes through which decisions are made regarding which metrics to apply, and the very absence of agreed-upon metrics. Once the ideology that metrics are ever-evolving and in a continual state of being calibrated in the search for excellence is accepted, it becomes the basis for tyrannical and prejudiced decisions made by management, all under the veneer of searching for excellence. A colleague with 18 peer-reviewed publications and an h-index of 9 does not make it to associate professor; management states, "She did not measure up to the continually evolving standards of excellence." Another colleague with 7 articles and an h-index of 5 gets promotion and tenure; management argues, "excellence is in the quality of the work." Excellence itself becomes a trope that justifies the prejudice built into academic systems of evaluation.
Meanwhile, hearing the story of the colleague with 18 publications not making tenure, assistant professors push themselves to 25-30 publications, believing this is what would earn them tenure. The eternal perpetuation of anxiety is the underpinning principle of the game of metrics.
I have often argued that metrics kill creativity. I have also often written about the ways in which metrics, articulated in narrow frameworks of evaluation, constrain and limit the possibilities of new thought. Narrowly driven by how much to produce, where to produce, and how to generate citations, scholars are driven to kill all that which is creative within.
In this blog entry, I argue further that the veneer of rationality around metrics works ideologically to cover over a fundamentally irrational process driven by the tyranny of mediocre academic management. Whereas metric-mania is meant to portray a drive toward excellence, what it actually does is write over an array of political practices and power plays that are inherently unequal. Whereas metrics are projected as instruments for calibrating the drive toward excellence, junior academics would do well to recognize the irrationality and prejudice built into how metrics are implemented and applied.
For academia to retain its culture of creativity and for academics to fight the onslaught on their wellbeing by a culture of metrics, academics ought to consider the ways in which they can build networks of solidarity and collective claims-making. Unions and academic labour collectives have key roles to play in challenging the epidemic of metrics.