2012 QS World University Rankings by Subject: Methodology

QS World University Rankings® has been published annually since 2004, and since 2011 the study has been extended to encompass a range of popular individual subjects.

Any international ranking faces various challenges relating to the equal availability and applicability of data across different countries and university systems. Many indicators of university quality commonly used in domestic rankings, such as the average entry tariff of admitted students, are not yet applicable at an international level and are thus not included in any of these exercises. And in areas where universities themselves provide data, the efficiency with which it is collected varies by region. While the depth of data available from the UK, Australia and the US may be exemplary, it is yet to be matched in, for example, India, Greece or Brazil.

These challenges become more pronounced when the focus of a ranking is narrowed to a particular aspect of university performance. While it may be reasonable to expect a university to have a decent understanding of its average faculty-student ratio, to break that down by faculty or department is difficult in even the most advanced cultures of data provision.

For this reason, the methodology for QS World University Rankings by Subject has been narrowed to include only those indicators that bypass the direct involvement of institutions and can reliably be stratified by subject discipline. This page outlines the QS approach for doing so, and how it has been used to produce the new QS World University Rankings by Subject.

Figure 1: The 52 candidate subject areas and the indicator weightings applied to each
Figure 2: Academic and employer response counts for each qualifying discipline
Figure 3: Indexed paper affiliations and paper thresholds for each discipline
Included Subjects
Accounting & Finance
Agriculture & Forestry
Anatomy & Physiology
Anthropology
Archaeology
Architecture / Built Environment
Art & Design
Biological Sciences
Business & Management Studies
Chemistry
Classics & Ancient History
Communication & Media Studies
Computer Science & Information Systems
Dentistry
Development Studies
Earth & Marine Sciences
Economics & Econometrics
Education
Engineering – Chemical
Engineering – Civil & Structural
Engineering – Electrical & Electronic
Engineering – General
Engineering – Mechanical, Aeronautical & Manufacturing
Engineering – Mineral & Mining
English Language & Literature
Environmental Sciences
Geography
History
History of Art, Architecture & Design
Hospitality & Leisure Management
Law
Library & Information Management
Linguistics
Mathematics
Medicine
Materials Science
Modern Languages
Nursing
Other Studies & Professions Allied to Medicine
Performing Arts
Pharmacy & Pharmacology
Philosophy
Physics & Astronomy
Politics & International Studies
Psychology
Social Policy & Administration
Social Work
Sociology
Sports-related Subjects
Statistics & Operational Research
Theology, Divinity & Religious Studies
Veterinary Science

Figure 4: Subjects currently qualifying for publication

Overview

Three extensive datasets have been employed to produce QS World University Rankings by Subject: our academic and employer reputation surveys, and the Scopus data we use for our Citations per Faculty indicator in the overall rankings.

There are, of course, innumerable subject disciplines and sub-disciplines. Through analysis of academic survey results over a protracted period, together with publication data from SciVerse Scopus, the QS Intelligence Unit has identified 52 subject areas which may, at some stage in the next few years, reach the data levels necessary to facilitate a ranking. These are listed in Figure 1.

In 2012, 29 of these will be published. These 29 subjects have been selected because they meet all of the following criteria:

·  Inclusion of specialists

QS has ensured that surveys have included all key specialist institutions operating within the discipline, regardless of whether they may have been expected to feature in the overall QS World University Rankings®.

·  Academic response level

The subject meets a minimum threshold of academic responses.

·  Overall appropriateness of indicators

Indicators and approach prove appropriate and effective in highlighting excellence within a given discipline.

Figure 4 also denotes which of the 52 subjects have currently qualified for publication. Every effort is being made to diversify and increase the number of subjects over the next couple of years.

In order to feature in any discipline table, an institution must meet three simple prerequisites:

o  Attract more than 20 responses from academics and/or employers

o  Exceed the five-year threshold for number of papers published in the given discipline

o  Offer undergraduate or taught postgraduate programs in the given discipline
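
By way of illustration, these prerequisites amount to a simple eligibility check along the following lines. This is a sketch in Python; the function and field names are placeholders, and only the thresholds stated above come from the methodology.

def qualifies_for_table(responses, papers_last_five_years, paper_threshold, offers_programs):
    """Hypothetical eligibility check mirroring the three prerequisites above."""
    return (responses > 20                                  # more than 20 academic/employer responses
            and papers_last_five_years >= paper_threshold   # meets the five-year paper threshold
            and offers_programs)                            # undergraduate or taught postgraduate provision

# Example: 35 responses, 240 papers against a hypothetical threshold of 200
print(qualifies_for_table(35, 240, 200, True))   # True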

Not all disciplines can be considered equal. Publication and citation rates are far higher in the life sciences and natural sciences than in the social sciences or arts & humanities, so there is far more citation data available for them. It would not make sense to place the same emphasis on citations in medicine as in English language and literature.

Similarly, the popularity of particular disciplines amongst employers varies greatly, so placing the same emphasis on employer opinion in economics and philosophy makes little sense. Taking these factors into account leads to a variable approach to the weightings for the different subjects, which can be seen in Figure 1 above.

In principle, additional indicators may be introduced in the future that contribute to as few as a single subject area. This adaptive approach to weightings enables us to recognize the different measures of strength in different subjects and to embrace new datasets as we identify them.
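
As a rough illustration of how the variable weightings combine the three indicators, consider the following Python sketch. The weights shown are invented placeholders for the purpose of the example; the actual values differ by subject and are published in Figure 1.

def subject_score(academic, employer, citations, weights):
    """Combine normalized indicator scores (0-100) using per-subject weights."""
    return (weights["academic"] * academic
            + weights["employer"] * employer
            + weights["citations"] * citations)

# Invented weights for two contrasting disciplines; the real values are in Figure 1.
citation_heavy = {"academic": 0.50, "employer": 0.10, "citations": 0.40}
employer_heavy = {"academic": 0.50, "employer": 0.40, "citations": 0.10}

print(subject_score(82.0, 75.0, 90.0, citation_heavy))  # 84.5
print(subject_score(82.0, 75.0, 90.0, employer_heavy))  # 80.0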

Academic reputation

Academic reputation has been the centrepiece of the QS World University Rankings® since their inception in 2004. In 2011 we drew on over 33,000 respondents to compile our results. The survey is structured in the following way:

SECTION 1: PERSONAL INFORMATION

Respondents provide their name, contact details, job title and the institution where they are based.

SECTION 2: KNOWLEDGE SPECIFICATION

Respondents identify the countries, regions and faculty areas with which they are most familiar, and up to two narrower subject disciplines in which they consider themselves expert.

SECTION 3: TOP UNIVERSITIES

For EACH of the (up to five) faculty areas they identify, respondents are asked to list up to ten domestic and thirty international institutions that they consider excellent for research in the given area. They are not able to select their own institution.

SECTION 4: ADDITIONAL INFORMATION

Additional questions relating to general feedback and recommendations.

A thorough breakdown of respondents by geography, discipline and seniority is available in the methodology section of our main rankings here.

As part of QS Global Academic Survey, respondents are asked to identify universities they consider excellent within one of five areas:

·  Arts & humanities

·  Engineering & technology

·  Life sciences & medicine

·  Natural sciences

·  Social sciences & management

The results of the academic reputation component of the new subject rankings have been produced by filtering responses according to the narrow area of expertise identified by respondents. While academics can select up to two narrow areas of expertise, greater emphasis is placed on respondents who have identified with only one.

The threshold for academic respondents that any discipline must reach for us to consider publication has been set at 150. As responses build over time, new subjects from the above list may qualify.

The number of academic respondents considered for each qualifying discipline can be seen, along with employer responses, in Figure 2 above. As with the overall tables, our analysis places an emphasis on international reputation over domestic. Domestic responses are individually weighted at half the influence of an international response. This is a global exercise and will recognize institutions that have an international influence in these disciplines. As in the main QS World University Rankings®, weightings are also applied to balance the representation by region.
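
A minimal sketch of how an individual academic response might be weighted is given below, in Python. The half-weighting of domestic responses is stated above; the extra emphasis factor for respondents naming a single narrow specialism and the regional balancing factors are not published, so the values used here are placeholders.

def academic_response_weight(is_domestic, narrow_areas_selected, regional_factor=1.0):
    """Weight applied to one academic survey response (illustrative only)."""
    weight = 0.5 if is_domestic else 1.0   # domestic = half an international response
    if narrow_areas_selected == 1:
        weight *= 1.5                      # placeholder emphasis for single-specialism respondents
    return weight * regional_factor        # regional balancing factor (placeholder)

print(academic_response_weight(False, 1))  # 1.5 - international respondent, single specialism
print(academic_response_weight(True, 2))   # 0.5 - domestic respondent, two specialisms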

NEW FOR 2012 – Direct Subject Responses

Until 2010, the survey could only infer specific opinion on subject strength by aggregating the broad faculty-area opinions of academics from a specific discipline. From the 2011 survey, additional questions have been asked to gather specific opinion in the respondent’s own narrow field of expertise. These responses are given greater emphasis from 2012.

Employer reputation

QS World University Rankings® are unique in incorporating employability as a key factor in the evaluation of international universities, and in 2011 drew on over 16,000 responses to compile the results for the overall rankings. The employer survey works on a similar basis to the academic one, but without the channelling into different faculty areas. Employers are asked to identify up to ten domestic and thirty international institutions they consider excellent for the recruitment of graduates. They are also asked to identify the disciplines from which they prefer to recruit. By examining where these two questions intersect, we can infer a measure of excellence in a given discipline.

A full breakdown of respondents by geography and sector is available in the methodology section of our main rankings here.

Of course, employability is a slightly wider concern than this alone would imply. Many students’ career paths are only indirectly related to their degree discipline: many engineers become accountants, and few history students wind up pursuing careers closely related to their program. On this basis, employers citing a preference for hiring students from ‘any discipline’ or from broader category areas are also included in the subject scores, but at a considerably lower individual weighting. From 2012, greater emphasis is placed on the opinions of employers that are interested specifically in the given discipline.

It is our view, based on focus groups and feedback from students, that employment prospects are a key consideration for prospective students when choosing a program and a university, regardless of whether or not they envisage a career directly linked to the discipline they choose to study.

Employers seeking graduates from any discipline are weighted at 0.1, and those targeting a parent category (e.g. social sciences) are weighted at 0.25, relative to the weight of a direct response for the subject area. Responses from employers exclusively targeting the specific subject carry a relative weighting of 2.
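
Putting those relative weightings together, a single employer response could be weighted along the following lines. This is an illustrative Python sketch: the 0.1, 0.25, 1 and 2 factors are taken from the text above, while the half-weighting of domestic responses is described further below.

# Relative weights stated above: 'any discipline' 0.1, parent category 0.25,
# direct response 1.0, exclusive interest in the subject 2.0.
EMPLOYER_WEIGHTS = {
    "any_discipline": 0.10,
    "parent_category": 0.25,
    "direct": 1.00,
    "exclusive": 2.00,
}

def employer_response_weight(breadth, is_domestic):
    """Weight for a single employer response (illustrative sketch)."""
    weight = EMPLOYER_WEIGHTS[breadth]
    return weight * 0.5 if is_domestic else weight   # domestic responses carry half the weight

print(employer_response_weight("exclusive", False))       # 2.0
print(employer_response_weight("any_discipline", True))   # 0.05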

Figure 2 shows the total number of employers contributing to our employer index in each of the corresponding disciplines, alongside the academic response counts. The similarity between the numbers recorded in each of the engineering sub-disciplines is down to the fact that employers were asked to comment on engineering in general rather than on the specific sub-disciplines. A small number of respondents specified their preference through the ‘other’ option provided in the survey, leading to a slightly different total for mechanical engineering. The threshold for including the employer component in any discipline is 300 responses.

As with the overall tables, our analysis places an emphasis on international reputation over domestic, with domestic responses carrying half the individual weighting of international responses. This is a global exercise and recognizes institutions that have an international influence in these disciplines. A weighting is also applied to balance representation by region.

Citations per paper

In the overall QS World University Rankings® we use a measure of citations per faculty. This has the advantage of taking into account the size of an institution while still allowing us to penetrate deeply into the global research landscape. Because reliably gathering faculty numbers broken down by discipline is impractical, for the purposes of this exercise we have measured citations per paper instead. A minimum publication threshold has been set for each subject to avoid potential anomalies stemming from small numbers of highly cited papers.
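
A simple sketch of the citations per paper calculation, including the per-subject paper threshold, might look like the following Python; the numbers in the example are invented.

from typing import Optional

def citations_per_paper(papers: int, citations: int, paper_threshold: int) -> Optional[float]:
    """Citations per paper for one institution in one subject (illustrative).

    Institutions below the subject's five-year paper threshold are excluded,
    avoiding anomalies from small numbers of highly cited papers.
    """
    if papers < paper_threshold:
        return None   # does not qualify for the citations indicator
    return citations / papers

print(citations_per_paper(papers=480, citations=3600, paper_threshold=200))  # 7.5
print(citations_per_paper(papers=35, citations=900, paper_threshold=200))    # None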

Journals in Scopus are tagged with a number of ASJC (All Science Journal Classification) codes, which identify their principal foci; papers are allocated to subject areas according to the codes of the journal in which they were published (multidisciplinary journals are excluded). When aggregated, these paper totals and their associated citations provide an indicator of the volume and quality of output within a given discipline.

One of the advantages of the “per faculty” measure used in the overall rankings is that a small number of papers achieving a high level of citations has limited impact, due to the divisor. Conventionally, in a citations per paper analysis, a paper threshold is required to eliminate such anomalies. Publication patterns, of course, differ greatly between subjects, and this needs to be taken into account both in the thresholds used and in the weights applied to the citations indicator.

Figure 3 above lists the subjects we will be working with, as identified by the strength of response to the academic and employer surveys. It shows the number of paper affiliations indexed in each discipline (the total of all distinct paper affiliations we have been able to attribute to one of the 1,000+ universities mapped into the Scopus database), which serves as a proxy for the scale of global research in that discipline. The resulting paper threshold for each discipline is also shown, representing the minimum number of papers an institution must have published in the last five years in order to qualify for our tables in a given subject.

There are certain subjects in which academic publications are not a feasible or appropriate measure of academic output. These subjects have either zero or a low number of papers in Scopus, and are denoted in Figure 3 by a paper threshold of 0. A discipline must have at least 6,000 identifiable papers for us to include the citations indicator in its table.

There are a few other clarifying points regarding our treatment of Scopus data, most if not all of which also apply to our analysis for the forthcoming cycle of regional and global rankings:

1.  Our analysis is based on an extract from Scopus (custom data), and not on the live database, so that we draw on a consistent dataset within each cycle of research. We receive this extract in February/March each year. As the live Scopus database evolves, the two diverge, so a comparison with the current Scopus dataset will not yield an exact match

2.  The window for both publications and citations is five years, from 2006 to 2010 inclusive

3.  Self-citations are excluded from all citation counts

4.  Multidisciplinary publications do not contribute towards counts for any discipline (although they do if you run a search in Scopus, so be sure to edit your search query if you are trying to verify our numbers)

5.  All affiliations we know about are considered. Universities are invited to inform us of hospitals, laboratories and schools with which they are affiliated

6.  NEW FOR 2012 – Papers in journals with ASJC codes that map exclusively to only one subject area carry additional weight
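
By way of illustration only, points 4 and 6 above might translate into paper-counting logic like the following Python sketch. The journal names, ASJC abbreviations and the size of the extra weight for exclusively mapped journals are all invented placeholders; only the exclusion of multidisciplinary journals and the principle of additional weight for exclusive mappings come from the points above.

from collections import defaultdict

# Hypothetical journal records: each journal carries one or more ASJC subject
# areas; "MULT" marks a multidisciplinary journal.
JOURNAL_SUBJECTS = {
    "Journal of Hypothetical Medicine": ["MEDI"],           # maps exclusively to one subject
    "Interdisciplinary Bio & Medicine": ["MEDI", "BIOC"],   # maps to two subjects
    "General Science Letters": ["MULT"],                    # multidisciplinary: excluded
}

EXCLUSIVE_BONUS = 1.0   # placeholder size of the extra weight for exclusive mappings

def weighted_paper_counts(paper_journals):
    """Aggregate weighted paper counts per subject area (illustrative)."""
    counts = defaultdict(float)
    for journal in paper_journals:
        subjects = JOURNAL_SUBJECTS.get(journal, [])
        if not subjects or subjects == ["MULT"]:
            continue                                        # point 4: no contribution to any discipline
        weight = 1.0 + (EXCLUSIVE_BONUS if len(subjects) == 1 else 0.0)   # point 6
        for subject in subjects:
            counts[subject] += weight
    return dict(counts)

print(weighted_paper_counts([
    "Journal of Hypothetical Medicine",
    "Interdisciplinary Bio & Medicine",
    "General Science Letters",
]))   # {'MEDI': 3.0, 'BIOC': 1.0}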