#VisualisingHE does International League tables…
A layman's introduction to the pros and cons, and uses and abuses, of league tables.
The main reason I’ve been holding off vizzing a league table to date is that our eyes have been caught by the less obvious, less visualised and, potentially, less contentious open data sources. We are currently experiencing very exciting times in the UK: the drive for open access to data has never been stronger. Thanks to bodies like HESA, DfE and HEFCE, much previously locked-away data is making its way into annual publication cycles – many thanks for these rich sources of data to get our teeth into!
This post takes a side step from some of the HESA and UCAS sources #VisualisingHE have been blogging about recently and turns to the wonderful world of league tables…
From experience, “you either love them or hate them” – which camp do you pitch your tent in?
For me, it’s been a way of life and work for the last ten years. Content drawn out of these ranking tables feeds HE providers’ marketing machines with key headlines but, more importantly, may provide benchmarking against competitors’ performance and key insights into global trends; it can also be used to evidence key areas for strategic investment and a required focus for improvements. To stay afloat in this heavily competitive HE environment, a provider needs to be clued up on what’s going on around it in order to stay in the game – whether that’s the recruitment of students or the ability to land key research bids, for example.
Why use league tables, and what are some of the limitations?
On the domestic league table front, the primary league tables include: The Times and Sunday Times Good University Guide (paywall protected), the Complete University Guide and the Guardian University Guide. Whilst their stated primary aim is to provide a guide to prospective applicants seeking information to steer their university applications, one could argue they are also a handy source of benchmarked performance data. Over time, the robustness of this data and the methodologies used to compile the metrics have improved, allowing them to be a valid reference source. Compilers are much more open to involving professionals from the HE sector in the process of building a rankings table, seeking their buy-in and professional expertise to advise on what metrics and methodologies are sensible and robust – and what the data can support.
So what’s the crack with the International league tables? Quite simply, it’s the International’ness’ of it all… Getting yourself onto a WORLD ranking stage is quite some accolade, and may mean big business and potential revenue. Take a look at any UK provider’s mission statement or corporate plan and it will have an international strategy strand. A league table, like it or not, is a key source for international students, agents and funders alike, seeking the definitive TOP ‘X’ shortlist of providers.
It is also, arguably, just about bragging rights for some providers and academics, to boost their egos!
Some of the International tables look pretty on the outside but, in all honesty, don’t have a lot going on in the mechanics to substantiate the rankings popping out the other end. They are, to a lesser or greater extent, heavily subjective depending on the editorial slant. And, for the analyst in us, they do not always warrant much deeper analysis or use than a promo email signature or a throwaway Twitter brag.
Caveats and subjectivity aside, a league table can provide real insights into common performance across the world. However, within the HE bubble and on the public International league table stage, only the brave wrangle with key issues like identifying ‘robust’ globally common metrics, taking a peek into Pandora’s box and exposing varying data quality standards and governance processes. International league tables must therefore grapple with these limitations at their core in order to execute a ranking table at all.
How do the Times Higher Education (THE) World University Rankings reach their final rankings?
- Teaching (30%) – 15% survey driven 15% data driven
- Research (30%) – 18% survey driven 12% data driven
- Citations (30%) – 30% data driven
- International Outlook (7.5%) – 7.5% data driven
- Industry Income (2.5%) – 2.5% data driven
For a full breakdown and debrief on how the table is made up, check out the THE methodology page.
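Taken together, the five weightings above mean an overall score is essentially a weighted sum of the pillar scores. Here is a minimal sketch of that arithmetic in Python – note the pillar scores below are illustrative placeholders, not real THE data, and THE’s own aggregation includes normalisation steps not shown here.

```python
# THE pillar weightings from the list above (they sum to 100%).
WEIGHTS = {
    "teaching": 0.30,
    "research": 0.30,
    "citations": 0.30,
    "international_outlook": 0.075,
    "industry_income": 0.025,
}

def overall_score(pillar_scores: dict) -> float:
    """Weighted sum of pillar scores, each out of 100."""
    return sum(WEIGHTS[p] * pillar_scores[p] for p in WEIGHTS)

# Made-up pillar scores for a hypothetical provider.
example = {
    "teaching": 57.4, "research": 63.0, "citations": 94.6,
    "international_outlook": 90.6, "industry_income": 88.5,
}
print(round(overall_score(example), 1))  # 73.5
```

Notice how a provider can post stellar International and Industry scores yet barely move the overall figure – those two pillars carry only 10% between them.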
One key thing to mention is that THE lists the TOP1000 providers in the world; however, anything outside the TOP200 is banded. Frankly, this is a little annoying for visualisation, so I have focused on the subset of the publication I am most interested in – the TOP200.
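If you did want to plot the banded rows anyway, you need to turn strings like “201–250” into numbers. One pragmatic option – an assumption of mine, not anything THE prescribes – is to take the band midpoint:

```python
# Sketch: convert THE rank strings to numbers for plotting.
# Outside the TOP200, ranks are published as bands like "201-250";
# tied ranks appear as "=23". Band midpoint is one pragmatic choice.
def numeric_rank(rank: str) -> float:
    rank = rank.replace("=", "").replace("\u2013", "-")  # normalise en dash
    if "-" in rank:
        lo, hi = (int(x) for x in rank.split("-"))
        return (lo + hi) / 2
    return float(rank)

print(numeric_rank("=23"))      # 23.0
print(numeric_rank("201-250"))  # 225.5
```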
Where does the analyst and the vizzer come in?
A league table is, by definition, a ranked data table; that’s its brief and it generally does it well… which is great, but it isn’t very insightful nor visually appealing to investigate. Therefore, #VisualisingHE have taken some time to make it a little easier to explore what’s going on.
How? You may wish to try tools like Import.io or Google Sheets’ ‘IMPORTHTML‘ function – they come in very handy – and EasyMorph can also be a great tool if you need to spin things around.
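If you prefer scripting, the same “grab the table off the web page” idea can be sketched in plain Python using only the standard library – the HTML below is a made-up sample, not the real THE page markup:

```python
# Sketch of the IMPORTHTML idea: pull <th>/<td> cell text out of an HTML
# ranking table into a list of rows, using only Python's standard library.
from html.parser import HTMLParser

class TableScraper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True
            self._row.append("")  # start a new cell

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row[-1] += data.strip()

# Made-up sample table for illustration.
sample = """
<table>
  <tr><th>Rank</th><th>Provider</th></tr>
  <tr><td>1</td><td>University A</td></tr>
  <tr><td>2</td><td>University B</td></tr>
</table>
"""
scraper = TableScraper()
scraper.feed(sample)
print(scraper.rows)
```

In practice you would fetch the page first and feed its body to the parser; for a one-off pull, though, the spreadsheet formula is honestly less work.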
Starting to question the data
Q1 – Which countries make up the TOP200?
Q2 – How is the little old UK doing in the big wide world of Higher Education?
Check out the interactive Viz to flip through the TOP5 per Continent: THE TOP200
- UK providers make up 16% (31/200) of the TOP200 world rankings
- European providers hold 51% of the TOP200 ranks
UK vs the Rest of the World | Digging a little deeper…
Q3 – Which pillars do the continents excel on?
- The UK outperforms the rest of the world on two pillars: International (worth 7.5%) and Citations (worth 30%).
- The Industry criterion (worth 2.5%) is an emerging shortfall in the UK’s balanced scorecard
- Teaching and Research carry the largest weightings, and here the UK falls short of the rest-of-the-world median scores – but let’s not forget it is the rest of the world’s TOP200 we are comparing against.
Q4 – Where is this pillar prowess distributed across the globe?
- It’s a mixed picture, and fascinating to spend a few moments teasing out the key strengths and weaknesses that emerge when providers are grouped by continent.
Which leads me to the main visualisation of this blog post. It sets the stage with an overview of the whole TOP200 in one view, encourages the viewer to explore the overall ranking and the relationship between the scores across the pillars of assessment, and also gives the ability to highlight a specific HE provider of interest.
Q5 – How do the continents compare in the TOP200? And who excels in what Pillars of assessment?
Go on… go beyond this screenshot and dive into this interactive viz: TOP200 – How do the Continents compare
Continental ranking by median TOP200 rank
- #1 – N America leads the way with a median rank of 63
- #2 – Oceania (80)
- #3 – Asia (95)
- #4 – Europe (125)
- #5 – Africa (171)
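The “continental ranking” above is simply the median of each continent’s provider ranks. A quick sketch of that computation – the ranks below are a tiny made-up sample, whereas the blog’s figures come from the full TOP200 table:

```python
# Sketch: rank continents by the median of their providers' TOP200 ranks.
from collections import defaultdict
from statistics import median

# Hypothetical (continent, rank) pairs for illustration only.
sample = [
    ("N America", 1), ("N America", 63), ("N America", 150),
    ("Europe", 30), ("Europe", 125), ("Europe", 190),
]

by_continent = defaultdict(list)
for continent, rank in sample:
    by_continent[continent].append(rank)

medians = {c: median(ranks) for c, ranks in by_continent.items()}

# Sort ascending: a lower median rank means a stronger continent.
for continent, m in sorted(medians.items(), key=lambda kv: kv[1]):
    print(continent, m)
```

The median is a deliberate choice here: one outlier provider at #1 or #199 barely moves it, unlike a mean.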
#1 ‘Pillar’ talk (Continental Median score)
- Overall score | #1 N America (67.5/100)
- Teaching | #1 N America (57.4/100)
- Research | #1 Asia (63/100)
- Citations | #1 N America (94.6/100)
- International | #1 Oceania (90.6/100)
- Industry | #1 Africa (88.5/100)
Hope you have enjoyed this little foray into International league tables, and seeing how, by creating a few visuals and presenting the data a little differently, we can glean a little more insight from the raw ranking tables.