
Education International

The Mismeasure of Higher Education

published 12 July 2013 | updated 10 September 2013

In reflecting upon the OECD’s proposed AHELO programme, one may well ask if a similar critique could be levelled against attempts to measure and order the worth and quality of higher education. To what degree does such an assessment, and its close cousin in the Times Higher Education University rankings, suffer from the same basic fallacies as “scientific” assessments of intelligence? Do such assessments make the same mistake of trying to convert abstract and complex concepts into a single number? Do these assessments have the effect of reproducing and justifying existing social and economic hierarchies?

First floated at the OECD education ministers’ meeting in Athens in 2006, AHELO was initially packaged as a ‘PISA for higher education’: a tool that “could provide member governments with a powerful instrument to judge the effectiveness and international competitiveness of their higher education institutions, systems and policies.”

Although the proposal received a lukewarm response from most members and stakeholders, the OECD nevertheless pressed ahead and launched a multi-million euro feasibility study to see if such a tool was even scientifically possible. Skeptics noted that AHELO raised significant methodological issues. Given the diversity of higher education systems, institutional missions, and student populations both within and across countries, would it be possible to agree upon a set of standardized learning outcomes, let alone measure those outcomes in a way that would provide reliable international comparisons?

To explore this, the OECD’s feasibility study tested three different tools or “strands”: a generic strand, based upon the Collegiate Learning Assessment (CLA) administered in the United States, a standardized test that evaluates the general skills of all students, regardless of discipline, toward the end of their undergraduate degree (e.g. critical thinking, problem solving, and written communication); a discipline-specific strand that focused on assessing the knowledge and abilities of students in engineering and economics; and a contextual strand that sought to gather information about the institutional environment and background of students.

The results of the AHELO feasibility study were presented earlier this year, with the conclusion that it appears to be possible to assess discipline-specific skills. There is less scientific certainty around the reliability of the generic skills strand. In fact, the AHELO technical advisory group found that the questions, which were based on the CLA, “proved excessively ‘American’ in an international context.” In an echo of Gould’s critique of the mismeasure of human intelligence, AHELO seems to privilege and reinforce particular Euro-American values and knowledge systems.

Beyond the methodological shortcomings lie some serious political considerations concerning the potential use, misuse and abuse of AHELO results. While the OECD insisted AHELO would not be a ranking, it is difficult to see how it could be anything but, particularly when it was explicitly intended to help governments benchmark the performance of their institutions against those in other jurisdictions. Once a number is assigned to the performance of an institution or a programme, whether based on research performance, as in the current global university rankings, or on learning outcomes, as proposed in AHELO, governments and the media will invariably arrange the results into simplistic league tables and put those tables to improper use. No matter what the initial intention may be, the danger is that results will not be used to improve and support institutions and teachers, but rather to exert more external control.

In truth, what makes a good university or college can, at best, only be partially captured statistically. The quality of the education and free inquiry that takes place within an institution cannot be easily or accurately parsed, quantified, ordered and compared. Quality higher education is not a singular product or outcome subject to one simple definition or numerical score. It has to do with a diverse range of activities and processes. Assessments such as AHELO require that these complex aspects of a university or college be reduced to a number, no matter how absurd the exercise becomes.