Op-ed: We need better ways to measure social mobility

Written by
David Winter, Head of Research & Organisational Development at the Careers Group, University of London

Writing for WonkHE, David Winter argues that salaries and outcomes are used to map social mobility because the data is available, rather than because they are good tools for the job. As originally published on WonkHE.

There has been a recent flurry of organisations producing indicators which purport to measure the impact of education (particularly higher education) on social mobility.

The Social Mobility Commission has produced a report on the Labour Market Value of Higher and Further Education Qualifications. The Sutton Trust produced its Universities and Social Mobility Data Explorer. And TASO (Transforming Access and Student Outcomes in Higher Education) produced a review of evidence, The Value of Higher Education.

All three of these make significant use of Longitudinal Education Outcomes (LEO) data – which links HM Revenue and Customs income data to student records to capture median salary at 1, 3 and 5 years after graduation. This article is not about pointing out the various fundamental flaws in LEO data — that’s been done by others. However, I do want to question some of the assumptions behind these measures and to propose a methodology that is more aligned with the mission of promoting social mobility.

To be fair to TASO, their report looks at a whole range of evidence — what it tells us, what it doesn’t tell us and, importantly, where the gaping holes are. This makes it clear that people are using LEO data, not because it provides a conceptually valid measure of social mobility, but because there’s very little else out there that can be used at the moment.

The problem of availability

One of the worst ways to develop a metric is to look at what data you happen to have available and then try to make it fit your purpose, especially if you don’t question the assumptions behind why that data was collected in the first place. A better way is to clearly define what change you are trying to bring about (and why), to establish what information you need to track that change, and to fully explore the logical consequences of measuring it in that way (or how it could be gamed).

The Sutton Trust metric takes the proportion of disadvantaged students who enter an institution and then uses LEO data to see how many of them go on to be in the top 20 per cent of earners at age 30. So, to improve your ranking on the Sutton Trust metric you would need to increase the proportion of disadvantaged students you recruited and then force as many of them as possible into the highest-paid careers. Rather than arbitrarily stipulating that you have to be a top earner, the Social Mobility Commission metric uses LEO to look at the boost in earnings a degree gets you compared with what you might have earned without one. Obviously, highly prestigious institutions and courses linked to high-earning professions (so, STEM subjects) do best, although the evidence in the TASO report indicates that this education bonus does not apply equally to students from different backgrounds.

On a personal level, my own background ticks most of the usual social deprivation markers but I managed to get into Oxford to study Physics, which should have set me up for future success. However, I don’t think I’ve ever been in the top 20 per cent of earners and my income is probably less than the average of my peers. So, according to both of these measures, I am a social mobility failure.

The problem of following the money

Based on these metrics it would seem that the only outcomes, successes or gaps that matter are financial. In my career, I may not have been raking in the cash but my higher education experience has allowed me to obtain things from my career that I value more than money — such as lifelong learning, making a difference to other people and just enjoying what I do. Are people from disadvantaged backgrounds not entitled to have these things too or are they only allowed to desire greater wealth? What about the fact that HESA’s analysis of the ‘graduate voice’ and wellbeing questions in Graduate Outcomes shows that increasing income is only associated with career satisfaction and wellbeing up to a certain salary level?

The other issue with focusing on high salaries is that you are, as a result, indirectly focusing on particular career destinations to the exclusion of others. It’s true that people from disadvantaged backgrounds are under-represented in the highest-paid professions, but that is also true for less well-paid areas such as the media, the creative arts, the charity sector and policy work — where entry often depends on periods of parental support or the acquisition of unpaid experience or expensive postgraduate qualifications. So, presumably, we should also be scrutinising the proportions of disadvantaged students entering all of these areas, not just the highest-paid ones. A true measure of social mobility would monitor all the walks of life where people from disadvantaged backgrounds are under-represented.

Measuring discrepancy

The starting point for a good metric should be the strategic mission it is trying to monitor, not whatever vaguely relevant data you can cobble together. The missions of the organisations mentioned above boil down to the aspiration that an individual’s life outcomes should not be related to their background. So, in statistical terms, we’re aiming for a null hypothesis scenario in which there is no significant relationship between background and life outcomes. And our indicator should allow us to know if we’re achieving this. Based on this, for any process (such as education) that has a range of possible outcomes, the distribution of disadvantaged and privileged individuals within each of those outcomes should be the same as the distribution of the input population as a whole. Any discrepancy between the distribution in a particular outcome category and the base population would indicate inequality that needs to be addressed.
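
To make that discrepancy idea concrete, here is a minimal sketch in Python of how the socio-economic mix within a single outcome category could be tested against the mix of the whole intake using a chi-square goodness-of-fit test. The category labels and counts are entirely hypothetical; nothing here comes from LEO or from the reports discussed above.

```python
# Minimal sketch of the discrepancy check described above, using hypothetical figures.
# Under the null hypothesis (no relationship between background and outcome), the
# socio-economic mix within an outcome category should match the mix of the intake.
from scipy.stats import chisquare

# Socio-economic mix of the whole student intake (hypothetical proportions, summing to 1)
population_mix = {"disadvantaged": 0.25, "intermediate": 0.35, "privileged": 0.40}

# Observed counts within one outcome category, e.g. graduates entering one sector
# (hypothetical counts)
observed = {"disadvantaged": 30, "intermediate": 70, "privileged": 100}

total = sum(observed.values())
expected = [population_mix[group] * total for group in observed]

stat, p_value = chisquare(list(observed.values()), f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")
# A small p-value indicates that this outcome category does not reflect the intake,
# i.e. a discrepancy of the kind the article argues should be treated as an equality gap.
```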

This is the approach taken in other areas of diversity work and is consistent with the approach recommended in the Social Mobility Commission’s own toolkit for employers, which looks at socio-economic patterning. In fact, the report Elitist Britain 2019, produced jointly by the Social Mobility Commission and the Sutton Trust, has a perfect example of this. Figure 1 (page 11) shows the socio-economic mix of various sectors compared to the mix for the whole workforce, using parental occupation as an indicator of background.

It is possible to imagine how this approach could be adapted to provide a more nuanced understanding of the impact of higher education on social mobility. However you categorise the outcomes of higher education (graduate/non-graduate level, sector, salary, wellbeing level, sense of purpose, and so on), does each category have the same socio-economic distribution of graduates as the student population? Asking this question, and producing metrics based on the answers, would provide a clearer indication of where the equality gaps are and whether they are closing.
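
One way such metrics could be reported (again a hypothetical sketch, not a measure proposed in any of the reports discussed here) is a representation index per outcome category: the share of disadvantaged graduates within the category divided by their share of the student population, where 1.0 means parity and values below 1.0 flag under-representation.

```python
# Hypothetical sketch: representation index for disadvantaged graduates by outcome category.
# An index of 1.0 means the category mirrors the student population; lower means
# under-representation, higher means over-representation.

disadvantaged_share_of_population = 0.25  # hypothetical share of the student intake

# Hypothetical categories with (disadvantaged graduates, all graduates in category)
outcomes = {
    "top 20 per cent of earners": (40, 300),
    "media and creative arts": (15, 120),
    "charity and policy work": (20, 110),
    "graduate-level employment": (180, 800),
}

for category, (disadvantaged, total) in outcomes.items():
    index = (disadvantaged / total) / disadvantaged_share_of_population
    print(f"{category}: representation index = {index:.2f}")
```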
