Times Higher World League table | Pillar Talk


#VisualisingHE does International League tables…

A layman's introduction to the pros and cons, and the uses and abuses, of league tables.

The main reason I've been holding off vizzing a league table to date is that our eyes have been caught by the less obvious, less visualised and, potentially, less contentious open data sources. We are currently experiencing very exciting times in the UK: the drive for open access to data has never been stronger. Thanks to bodies like HESA, the DfE and HEFCE, much previously locked-away data is making its way into annual publication cycles – many thanks for these rich sources of data to get our teeth into!

This post takes a side step from some of the HESA and UCAS sources #VisualisingHE have been blogging about recently and turns to the wonderful world of league tables…

From experience, "you either love them or hate them" – which camp do you pitch your tent in?

For me, it's been a way of life and work for the last ten years. Content drawn out of these ranking tables feeds HE providers' marketing machines with key headlines but, more importantly, may provide benchmarking against competitors' performance and key insights into global trends. It can also be used to evidence key areas for strategic investment and required focus for improvement. To stay afloat in this heavily competitive HE environment, a provider needs to be clued up on what's going on around it in order to stay in the game, whether that's the recruitment of students or the ability to land key research bids, for example.

Why use league tables, and what are some of the limitations?

On the domestic front, the primary league tables include the Times and Sunday Times Good University Guide (paywall protected), the Complete University Guide and the Guardian University Guide. Whilst their stated primary aim is to provide a guide to prospective applicants seeking information to inform their university applications, one could argue they are also a handy source of benchmarked performance data. Over time, the robustness of this data and the methodologies used to compile the metrics have improved, allowing them to be a valid reference source. Compilers are much more open to involving professionals from the HE sector in the process of building a rankings table, seeking their buy-in and professional expertise to advise on what metrics and methodologies are sensible and robust – and what the data can support.

So what's the crack with the international league tables? Quite simply, it's the international'ness' of it all… Getting yourself on a WORLD ranking stage is quite some accolade, and may mean big business and potential revenue. Take a look at any UK provider's mission statement or corporate plan and it will have an international strategy strand. A league table, like it or not, is a key source for international students, agents and funders alike, seeking the definitive TOP 'X' shortlist of providers.

It is also, arguably, just bragging rights for some providers and academics to boost their egos!

Some of the international tables look pretty on the outside but, in all honesty, don't have a lot going on in the mechanics to substantiate the rankings popping out the other end. They are, to a lesser or greater extent, heavily subjective depending on the editorial slant, and for the analyst in us they do not always warrant much deeper analysis or use than a promo email signature or a throwaway Twitter brag.

Caveats and subjectivity aside, a league table can provide real insights into common performance measures across the world. However, within the HE bubble and on the public international league table stage, only the brave wrangle with key issues like identifying 'robust' globally common metrics, taking a peek into Pandora's box and exposing varying data quality standards and governance processes. International league tables may therefore be challenged at the core, and must wrangle with these limitations to execute a ranking table.

How does the Times Higher Education (THE) World University Rankings reach its final rankings?

  • Teaching (30%) – 15% survey driven, 15% data driven
  • Research (30%) – 18% survey driven, 12% data driven
  • Citations (30%) – 30% data driven
  • International Outlook (7.5%) – 7.5% data driven
  • Industry Income (2.5%) – 2.5% data driven

For a full breakdown and debrief on how the table is made up, check out the THE methodology page.
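At heart, the overall score is a weighted sum of the five pillar scores using the weights above. A minimal Python sketch with illustrative pillar scores (the real methodology also standardises scores before combining, so treat this as the final combining step only):

```python
# THE 2018 pillar weights, as listed above
weights = {
    "Teaching": 0.30,
    "Research": 0.30,
    "Citations": 0.30,
    "International Outlook": 0.075,
    "Industry Income": 0.025,
}

# Illustrative pillar scores out of 100 for a hypothetical provider
scores = {
    "Teaching": 57.4,
    "Research": 63.0,
    "Citations": 94.6,
    "International Outlook": 90.6,
    "Industry Income": 88.5,
}

# Overall score = weighted sum across pillars
overall = sum(weights[pillar] * scores[pillar] for pillar in weights)
print(round(overall, 1))  # 73.5
```

Note how the three 30% pillars dominate: a provider can top the International Outlook and Industry Income pillars and barely move its overall score.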

One key thing to mention is that THE lists the TOP1000 providers in the world; however, anything outside the TOP200 is banded. Frankly, this is a little annoying for visualisation, so I have focused on the subset of the publication I am most interested in – the TOP200.

Where do the analyst and the vizzer come in?

League tables are, by definition, ranking data tables – that's their brief and they generally do it well. That's great, but a raw table isn't very insightful nor visually appealing to investigate. Therefore, #VisualisingHE have taken some time to make it a little easier to explore what's going on.

How? You may wish to try tools like Import.io or Google Sheets' IMPORTHTML, which come in very handy; EasyMorph can also be a great tool if you need to spin things around.
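If you'd rather stay in code, Python's standard library can pull a table out of a page in the same spirit as IMPORTHTML. A minimal sketch using `html.parser` on an inline snippet (in practice you would fetch the rankings page first; the table content here is just a stand-in):

```python
from html.parser import HTMLParser

class TableExtractor(HTMLParser):
    """Collects the text of each <td>/<th> cell, one list per <tr> row."""
    def __init__(self):
        super().__init__()
        self.rows = []        # completed rows
        self._row = []        # cells of the row in progress
        self._cell = []       # text fragments of the cell in progress
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True
            self._cell = []

    def handle_endtag(self, tag):
        if tag in ("td", "th"):
            self._in_cell = False
            self._row.append("".join(self._cell).strip())
        elif tag == "tr" and self._row:
            self.rows.append(self._row)

    def handle_data(self, data):
        if self._in_cell:
            self._cell.append(data)

html = """
<table>
  <tr><th>Rank</th><th>Provider</th></tr>
  <tr><td>1</td><td>University of Oxford</td></tr>
</table>
"""
parser = TableExtractor()
parser.feed(html)
print(parser.rows)  # [['Rank', 'Provider'], ['1', 'University of Oxford']]
```

From there the rows drop straight into a CSV or a Tableau extract.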

Starting to question the data

Q1 – Which countries make up the TOP200?

map_location of TOP200.png

Q2 – How is the little old UK doing in the big wide world of Higher Education?

THE World University Rankings 2018  TOP200_UKvs.PNG

Check out the interactive Viz to flip through the TOP5 per Continent: THE TOP200

Takeaways

  • UK providers make up 16% (31/200) of the TOP200 world rankings
  • European providers hold 51% of the TOP200 ranks

UK vs the Rest of the World | Digging a little deeper…

Q3 – Which pillars do the continents excel on?

UK vs_overview

  • The UK outperforms the rest of the world on two pillars: International (worth 7.5%) and Citations (worth 30%).
  • Industry (worth 2.5%) is an emergent shortfall in the UK's balanced scorecard.
  • Teaching and Research carry the most weight and fall short of the rest-of-the-world median scores – but it is the rest of the world's TOP200 we are comparing against here, lest we forget.

Q4 – Where is this pillar prowess distributed across the globe?

UK vs.png

  • It's a mixed picture, and fascinating to spend a few moments teasing out the key strengths and weaknesses that emerge when grouped by continent.

Which leads me to the main visualisation of this blog post. It sets the stage with an overview of the whole TOP200 in one view and encourages the viewer to explore the overall ranking and the relationship between the scores across the pillars of assessment, whilst also giving the ability to highlight a specific HE provider of interest.

Q5 – How do the continents compare in the TOP200? And who excels in what Pillars of assessment?

THE World University Rankings 2018 TOP200_v2_overallscore

Go on… go beyond this screenshot and dive into this interactive viz: TOP200 – How do the Continents compare

Takeaways

Continental ranking by median TOP200 rank

  • #1 – N America leads the way with a median rank of 63
  • #2 – Oceania (80)
  • #3 – Asia (95)
  • #4 – Europe (125)
  • #5 – Africa (171)

#1 ‘Pillar’ talk (Continental Median score)

  • Overall score | #1 N America (67.5/100)
  • Teaching | #1 N America (57.4/100)
  • Research | #1 Asia (63/100)
  • Citations | #1 N America (94.6/100)
  • International | #1 Oceania (90.6/100)
  • Industry | #1 Africa (88.5/100)
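Both of the takeaway lists above are just "group by continent, take the median" on the TOP200 extract. A stdlib-only sketch of the rank version, using made-up rows deliberately chosen so the medians echo the figures above (this is not the real THE data):

```python
from collections import defaultdict
from statistics import median

# Toy rows in the shape of the TOP200 extract: (provider, continent, rank)
rows = [
    ("Provider A", "N America", 1),
    ("Provider B", "N America", 125),
    ("Provider C", "Europe", 50),
    ("Provider D", "Europe", 200),
    ("Provider E", "Oceania", 80),
]

# Group ranks by continent
ranks_by_continent = defaultdict(list)
for provider, continent, rank in rows:
    ranks_by_continent[continent].append(rank)

# Median rank per continent, best (lowest) first
league = sorted(
    ((continent, median(ranks)) for continent, ranks in ranks_by_continent.items()),
    key=lambda pair: pair[1],
)
print(league)  # [('N America', 63.0), ('Oceania', 80), ('Europe', 125.0)]
```

The pillar-score version is identical, swapping rank for the pillar score and sorting highest first.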

Hope you have enjoyed this little foray into international league tables, and how a little more insight can be gleaned from the raw ranking tables by creating a few visuals and presenting the data a little differently.

Best

Adam

#VisualisingHE

Opinions and thoughts are mine and are not in any way linked to those of my employer.

Universities in the UK: From Research to Patents

Higher Education Providers (HEPs) in the UK pride themselves on delivering excellent teaching and world-class research, but here is a question: how can we measure how 'innovative' these institutions are? One way is to look at how many patents each institution holds, using data from the Higher Education Business and Community Interaction (HE-BCI) return. Though this is nowhere near as comprehensive a method as the numerous rankings, league tables and other excellence framework exercises, it is still a simple way of showing who invests in protecting their intellectual property.

What are the caveats?

First, this is not a sign of research strength or quality but rather shows what the universities choose to do with their intellectual property.

Second, whether universities have a large patent portfolio or not depends on a range of factors: some universities do not focus on research as much as others do (e.g. some HEPs are traditionally research focused whilst other, more recently established ones put more emphasis on teaching); some areas of research are less likely to be commercialised (e.g. research in the social sciences and humanities would less frequently lead to practical inventions than research in the applied scientific subject areas); some universities may choose not to file any patents if they do not seek to commercialise their inventions; and so on.

These are only a few out of many possible explanations, so keeping this information in mind, have a look at the insights.

The Size of the UK's Patent Portfolio

In 2016, the 162 UK higher education providers held a total of 18,723 live patents! 18% of those, or 3,357, were owned by a single institution: the University of Oxford. What is also striking is that the university with the second-largest patent portfolio, University College London, has approximately half the number of patents that Oxford has.

tableau_2017-09-08_22-00-06.png

Applying the Pareto principle to the data, we see that 80% of the patents were owned by the top 15% of institutions (the top 24 out of 162 HEIs). If you would like to find out more about the Pareto principle, have a look at this article: https://betterexplained.com/articles/understanding-the-pareto-principle-the-8020-rule/
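The "top 24 hold 80%" figure falls out of a cumulative-share calculation: sort institutions by patent count, accumulate down the list, and stop when the running total crosses 80%. A sketch on hypothetical counts (not the real HE-BCI figures):

```python
# Hypothetical live-patent counts per provider, sorted descending
counts = [3357, 1700, 900, 600, 400, 300, 200, 150, 100, 50, 30, 20, 10, 5, 3, 2, 1]

total = sum(counts)
running = 0
cutoff = 0  # how many providers it takes to reach 80% of all patents
for i, c in enumerate(counts, start=1):
    running += c
    if running / total >= 0.80:
        cutoff = i
        break

share_of_providers = cutoff / len(counts)
print(cutoff, round(share_of_providers, 3))  # 4 0.235
```

With these toy numbers, just 4 of the 17 providers (about 24%) hold 80% of the patents; on the real data the same calculation lands on 24 of 162.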

The Region Split

London is the region with the largest number of patents overall – 5,152, followed by the South East with 4,265. Whilst London has twice as many HEPs as the South East (38 vs. 19), the region only has around 20% more patents.

tableau_2017-09-08_22-00-44.png

Another interesting region is Northern Ireland – of its four HEPs, only two held any live patents as of 2016: the Queen's University of Belfast and the University of Ulster.

The Management of Intellectual Property

The last set of charts does not focus solely on patents but rather on the overall management of the intellectual property generated by an institution. This may include any licences, designs, trademarks, etc. When comparing whether the management of IP is handled in-house or outsourced, it is clear to see the difference between the top 24 institutions and the remaining institutions with at least one active patent. Amongst the top 24, the preferred approach is a combination of in-house work and outsourced expertise, whilst amongst the remaining universities outsourcing is the most popular choice.

tableau_2017-09-08_22-00-53.png

One very interesting observation from the data is that, amongst the 97 HEIs that held at least one patent in 2016, all but two disclosed that members of staff whose research generates the intellectual property are rewarded in some form. Although it is not easy to compare what these rewarding methods are, remember that you can still find out more about them by clicking on an institution in either of the charts.

Is This All the Data Could Tell Us?

Absolutely not. The data is very rich and allows for a great depth of analysis, which may be covered in other #VisualisingHE projects. Some of the questions that weren't answered due to limitations of the data include:

  • any 5–10 year trend analyses (not all data for the last 10 years is made freely available by HESA);
  • any cost-revenue insights (this topic explored patents in particular, and the cost-revenue analysis available in the dataset covers the entire intellectual property of an institution);
  • any breakdown of the science fields in which patents are held (a topic that may be explored with data from the UK patent office, if such an open dataset exists).

Link to interactive viz:

https://public.tableau.com/profile/elena.hristozova#!/vizhome/UKHEIsIP/IP

Elena

#VisualisingHE