
QS World University Rankings


QS World University Rankings is an annual publication of university rankings by Quacquarelli Symonds (QS). Previously known as the THE-QS World University Rankings, the publication was produced in collaboration with Times Higher Education magazine (THE) from 2004 to 2009; the two organisations then began publishing separate tables, with QS retaining the pre-existing methodology. The QS system now comprises the global overall and subject rankings, alongside three independent regional tables (Asia, Latin America, and BRICS).[1] It is viewed as one of the most widely read university rankings, along with the Academic Ranking of World Universities and the Times Higher Education World University Rankings,[2][3][4][5] but it has also been criticised for giving undue weight to subjective indicators.[6][7][8]

QS World University Rankings
Editor: Danny Byrne
Categories: Higher education
Frequency: Annual
Publisher: Quacquarelli Symonds Limited
First issue: 2004 (in partnership with THE); 2010 (on its own)
Country: United Kingdom
Language: English
Website: www.topuniversities.com

History

The need for an international ranking of universities was highlighted in December 2003 in Richard Lambert's review of university-industry collaboration in Britain[9] for HM Treasury, the finance ministry of the United Kingdom. Amongst its recommendations were world university rankings, which Lambert said would help the UK to gauge the global standing of its universities.

The idea for the rankings was credited in Ben Wildavsky's book, The Great Brain Race: How Global Universities are Reshaping the World,[10] to then-editor of Times Higher Education (THE), John O'Leary. THE chose to partner with educational and careers advice company Quacquarelli Symonds (QS) to supply the data, appointing Martin Ince,[11] formerly deputy editor and later a contractor to THE, to manage the project.

Between 2004 and 2009, QS produced the rankings in partnership with THE. In 2009, THE announced that it would produce its own rankings, the Times Higher Education World University Rankings, in partnership with Thomson Reuters. THE cited a weakness in the methodology of the original rankings[12] and a perceived favoritism toward science over the humanities[13] as key reasons for the decision to split with QS.

QS retained the intellectual property in the rankings and the methodology used to compile them[citation needed] and continues to produce the rankings, now called the QS World University Rankings.[14] THE created a new methodology with Thomson Reuters, published as the Times Higher Education World University Rankings in September 2010.

Global rankings

Overall

Methodology

Methodology of QS World University Rankings[15]

Indicator                     Weighting   Elaboration
Academic peer review          40%         Based on an internal global academic survey
Faculty/student ratio         20%         A measurement of teaching commitment
Citations per faculty         20%         A measurement of research impact
Employer reputation           10%         Based on a survey of graduate employers
International student ratio   5%          A measurement of the diversity of the student community
International staff ratio     5%          A measurement of the diversity of the academic staff
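
Taken together, the table implies that an institution's overall score is a weighted sum of its six indicator scores. The sketch below illustrates that arithmetic only; it is not QS's actual code, the helper function and example institution are hypothetical, and each indicator is assumed to be pre-normalized to a 0-100 scale.

```python
# Illustrative sketch of the weighted composite implied by the table above.
# The weights are the published ones; everything else is hypothetical.

WEIGHTS = {
    "academic_peer_review": 0.40,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "employer_reputation": 0.10,
    "international_students": 0.05,
    "international_staff": 0.05,
}

def overall_score(indicators: dict) -> float:
    """Weighted sum of indicator scores, each assumed normalized to 0-100."""
    return sum(WEIGHTS[name] * indicators[name] for name in WEIGHTS)

# A hypothetical institution:
example = {
    "academic_peer_review": 95.0,
    "faculty_student_ratio": 80.0,
    "citations_per_faculty": 70.0,
    "employer_reputation": 90.0,
    "international_students": 60.0,
    "international_staff": 65.0,
}
print(round(overall_score(example), 2))  # 83.25
```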

QS publishes its results through media around the world, including Chosun Ilbo in Korea. The first rankings produced by QS independently of THE, retaining its original methodology, were released on September 8, 2010; the second edition appeared on September 6, 2011.

QS designed its rankings to examine a broad range of university activity.[16]

Academic peer review

This is the most controversial part of the methodology[weasel words][citation needed]. Using a combination of purchased mailing lists, applications, and suggestions, the survey asks active academics across the world to identify the top universities in the fields they know about. QS has published the job titles and geographical distribution of the participants.[17]

The 2011 rankings made use of responses from 33,744 people in over 140 nations for the Academic Peer Review, including votes from the previous two years rolled forward where no more recent response was available from the same individual. Participants can nominate up to 30 universities but cannot vote for their own. They nominate a median of about 20, which means the survey includes over 500,000 data points.[17]
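
As a rough consistency check on these figures (treating the median of about 20 nominations per respondent as a stand-in for the average, which the source does not give):

33,744 respondents × ≈20 nominations each ≈ 675,000 nominations,

which is consistent with the stated figure of over 500,000 data points.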

In 2004, when the rankings first appeared, academic peer review accounted for half of a university's possible score. In 2005, its share was cut to 40 per cent because of the introduction of the Recruiter Review.

Faculty student ratio

This indicator accounts for 20 per cent of a university's possible score in the rankings. It is a classic measure used in various ranking systems as a surrogate for teaching commitment, but QS has admitted that it is less than satisfactory.[18]

Citations per faculty

Citations of published research are among the most widely used inputs to national and global university rankings. The QS World University Rankings used citations data from Thomson (now Thomson Reuters) from 2004 to 2007, and has since used data from Scopus, part of Elsevier. The total number of citations over a five-year period is divided by the number of academic staff at a university to yield the score for this measure, which accounts for 20 per cent of a university's possible score in the rankings.
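
In symbols (notation introduced here for clarity; this simply restates the calculation described above):

$$ S_{\text{citations}} = \frac{C_{\text{5yr}}}{N_{\text{faculty}}} $$

where $C_{\text{5yr}}$ is the total number of citations recorded over the five-year window and $N_{\text{faculty}}$ is the number of academic staff.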

QS has explained that it uses this approach, rather than the citations per paper preferred for other systems, because it reduces the effect of biomedical science on the overall picture – bio-medicine has a ferocious "publish or perish" culture. Instead QS attempts to measure the density of research-active staff at each institution. But issues still remain about the use of citations in ranking systems, especially the fact that the arts and humanities generate comparatively few citations.[19]

QS has conceded the presence of some data collection errors regarding citations per faculty in previous years' rankings.[20]

One interesting issue is the difference between the Scopus and Thomson Reuters databases. For major world universities, the two systems capture more or less the same publications and citations. For less mainstream institutions, Scopus includes more non-English-language and smaller-circulation journals in its database. But as the papers in those journals are less heavily cited, this can also mean fewer citations per paper for the universities that publish in them.[19] This area has been criticized for undermining universities which do not use English as their primary language,[21] since publications and citations in languages other than English are less visible, and English, being the most internationalized language of scholarship, attracts the most citations.

Recruiter review

This part of the ranking is obtained by a similar method to the Academic Peer Review, except that it samples recruiters who hire graduates on a global or significant national scale. The numbers are smaller – 16,875 responses from over 130 countries in the 2011 Rankings – and are used to produce 10 per cent of any university's possible score. This survey was introduced in 2005 in the belief that employers track graduate quality, making this a barometer of teaching quality, a famously problematic thing to measure. University standing here is of special interest to potential students.[22]

International orientation

The final ten per cent of a university's possible score is derived from measures intended to capture its internationalism: five per cent from the percentage of international students, and another five per cent from the percentage of international staff. This is of interest partly because it shows whether a university is putting effort into being global, but also because it indicates whether the institution is taken seriously enough by students and academics around the world for them to want to be there.[23]

Commentary

Reception

Several universities in the UK and the Asia-Pacific region have commented on the rankings positively. The Vice-Chancellor of New Zealand's Massey University, Professor Judith Kinnear, called the Times Higher Education-QS ranking a "wonderful external acknowledgement of several University attributes, including the quality of its research, research training, teaching and employability." She said the rankings are a true measure of a university's ability to fly high internationally: "The Times Higher Education ranking provides a rather more sophisticated, robust and well rounded measure of international and national ranking than either New Zealand's Performance Based Research Fund (PBRF) measure or the Shanghai rankings."[24] In September 2012 the British newspaper The Independent described the QS World University Rankings as "widely recognised throughout higher education as the most trusted international tables".[25]

Martin Ince,[11] chair of the Advisory Board for the Rankings, points out that their volatility has been reduced since 2007 by the introduction of the Z-score calculation method and that, over time, the quality of QS's data gathering has improved to reduce anomalies. In addition, the academic and employer reviews are now so large that even modestly ranked universities receive a statistically valid number of votes. QS has published extensive data[26] on who the respondents are, where they are, and the subjects and industries to which the academics and employers respectively belong.
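
The Z-score calculation referred to here is the standard statistical normalization (the textbook formulation; the text does not spell out QS's exact implementation): for a raw indicator value $x$, with mean $\mu$ and standard deviation $\sigma$ taken across all ranked institutions,

$$ z = \frac{x - \mu}{\sigma} $$

which expresses each institution's score in standard deviations from the mean and damps year-to-year swings caused by outlying raw values.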

Criticisms

Much of the criticism concerns the use, or potential misuse, of survey data.

Since the split from Times Higher Education, further concerns about the methodology QS uses for its rankings have been raised by several experts. Simon Marginson, professor of higher education at the University of Melbourne and a member of the THE editorial board, wrote in the article "Improving Latin American universities' global ranking" for University World News on 10 June 2012: "I will not discuss the QS ranking because the methodology is not sufficiently robust to provide data valid as social science."[27]

In an article for the New Statesman entitled "The QS World University Rankings are a load of old baloney", David Blanchflower, a leading labour economist, said: "This ranking is complete rubbish and nobody should place any credence in it. The results are based on an entirely flawed methodology that underweights the quality of research and overweights fluff... The QS is a flawed index and should be ignored."[28]

In an article titled The Globalisation of College and University Rankings and appearing in the January/February 2012 issue of Change magazine, Philip Altbach, professor of higher education at Boston College and also a member of the THE editorial board, said: "The QS World University Rankings are the most problematical. From the beginning, the QS has relied on reputational indicators for half of its analysis … it probably accounts for the significant variability in the QS rankings over the years. In addition, QS queries employers, introducing even more variability and unreliability into the mix. Whether the QS rankings should be taken seriously by the higher education community is questionable."[29]

The QS World University Rankings have been criticised by many for placing too much emphasis on peer review, which accounts for 40 per cent of the overall score. Some observers have expressed concern about the manner in which the peer review has been carried out.[30] In a report,[31] Peter Wills from the University of Auckland, New Zealand wrote of the Times Higher Education-QS World University Rankings:

But we note also that this survey establishes its rankings by appealing to university staff, even offering financial enticements to participate (see Appendix II). Staff are likely to feel it is in their greatest interest to rank their own institution more highly than others. This means the results of the survey and any apparent change in ranking are highly questionable, and that a high ranking has no real intrinsic value in any case. We are vehemently opposed to the evaluation of the University according to the outcome of such PR competitions.

QS points out that no survey participant, academic or employer, is offered a financial incentive to respond, and that academics cannot vote for their own institution.

Although THES-QS introduced several changes in methodology in 2007 aimed at addressing these criticisms,[32] the rankings have continued to attract criticism. In an article[33] in the peer-reviewed journal BMC Medicine, several scientists from the US and Greece pointed out:

If properly performed, most scientists would consider peer review to have very good construct validity; many may even consider it the gold standard for appraising excellence. However, even peers need some standardized input data to peer review. The Times simply asks each expert to list the 30 universities they regard as top institutions of their area without offering input data on any performance indicators. Research products may occasionally be more visible to outsiders, but it is unlikely that any expert possesses a global view of the inner workings of teaching at institutions worldwide. Moreover, the expert selection process of The Times is entirely unclear. The survey response rate among the selected experts was only <1% in 2006 (1,600 of 190,000 contacted). In the absence of any guarantee for protection from selection biases, measurement validity can be very problematic.

Alex Usher, vice president of Higher Education Strategy Associates in Canada, commented:

Most people in the rankings business think that the main problem with The Times is the opaque way it constructs its sample for its reputational rankings - a not-unimportant question given that reputation makes up 50% of the sample. Moreover, this year's switch from using raw reputation scores to using normalized Z-scores has really shaken things up at the top-end of the rankings by reducing the advantage held by really top universities - University of British Columbia (UBC) for instance, is now functionally equivalent to Harvard in the Peer Review score, which, no disrespect to UBC, is ludicrous. I'll be honest and say that at the moment the THES Rankings are an inferior product to the Shanghai Jiao Tong's Academic Ranking of World Universities.

Academics have also been critical of the use of the citation database, arguing that it undervalues institutions which excel in the social sciences. Ian Diamond, former chief executive of the Economic and Social Research Council, now vice-chancellor of the University of Aberdeen and a member of the THE editorial board, wrote to Times Higher Education in 2007, saying:[34]

The use of a citation database must have an impact because such databases do not have as wide a cover of the social sciences (or arts and humanities) as the natural sciences. Hence the low position of the London School of Economics, caused primarily by its citations score, is a result not of the output of an outstanding institution but the database and the fact that the LSE does not have the counterweight of a large natural science base.

The most recent criticism of the old system came from Fred L. Bookstein, Horst Seidler, Martin Fieder and Georg Winckler, who wrote in the journal Scientometrics of the unreliability of QS's methods:

Several individual indicators from the Times Higher Education Survey (THES) data base (the overall score, the reported staff-to-student ratio, and the peer ratings) demonstrate unacceptably high fluctuation from year to year. The inappropriateness of the summary tabulations for assessing the majority of the "top 200" universities would be apparent purely for reason of this obvious statistical instability regardless of other grounds of criticism. There are far too many anomalies in the change scores of the various indices for them to be of use in the course of university management.[6]

The QS subject rankings have been dismissed as unreliable by some critics, most notably Brian Leiter, who points out that programmes known to be of high quality, and which rank highly in the Blackwell rankings (e.g., the University of Pittsburgh), fare poorly in the QS ranking for reasons that are not at all clear.[35]

Results

QS World University Rankings — Top 50[note 1]
Institution 2010/11[36] 2011/12[37] 2012/13[38] 2013/14[39] 2014/15[40] 2015/16[41]
 Massachusetts Institute of Technology 5 3 1 1 1 1
 Harvard University 2 2 3 2 4 2
 Stanford University 13 11 15 7 7 3
 University of Cambridge 1 1 2 3 2 3
 California Institute of Technology 9 12 10 10 8 5
 University of Oxford 6 5 5 6 5 6
 University College London 4 7 4 4 5 7
 Imperial College London 7 6 6 5 2 8
 Swiss Federal Institute of Technology in Zurich 18 18 13 12 12 9
 University of Chicago 8 8 8 9 11 10
 Princeton University 10 13 9 10 9 11
 National University of Singapore 31 28 25 24 22 12
 Nanyang Technological University 74 58 47 41 39 13
 Swiss Federal Institute of Technology in Lausanne 32 35 29 19 17 14
 Yale University 3 4 7 8 10 15
 The Johns Hopkins University 17 16 16 16 14 16
 Cornell University 16 15 14 15 19 17
 University of Pennsylvania 12 9 12 13 13 18
 King's College London 21 27 26 19 16 19
 Australian National University 20 26 24 27 25 19
 University of Edinburgh 22 20 21 17 17 21
 Columbia University 11 10 11 14 14 22
 Ecole Normale Supérieure 33 33 34 28 24 23
 McGill University 19 17 18 21 21 24
 Tsinghua University 54 47 48 48 47 25
 University of California, Berkeley 28 21 22 25 27 26
 University of California, Los Angeles 35 34 31 40 37 27
 The Hong Kong University of Science and Technology 40 40 33 34 40 28
 Duke University 14 19 20 23 25 29
 The University of Hong Kong 23 22 23 26 28 30
 University of Michigan 15 14 17 22 23 30
 Northwestern University 26 24 27 29 34 32
 The University of Manchester 30 29 32 33 30 33
 University of Toronto 29 23 19 17 20 34
 London School of Economics and Political Science -- -- 69 68 71 35
 Seoul National University 50 42 37 35 31 36
 University of Bristol 27 30 28 30 29 37
 Kyoto University 25 32 35 35 36 38
 The University of Tokyo 24 25 30 32 31 39
 Ecole Polytechnique 36 36 41 41 35 40
 Peking University 47 46 44 46 57 41
 The University of Melbourne 38 31 36 31 33 42
 Korea Advanced Institute of Science and Technology -- -- 63 60 51 43
 University of California, San Diego -- -- 70 63 59 44
 The University of Sydney 37 38 39 38 37 45
 University of New South Wales 39 39 52 52 48 46
 The University of Queensland 43 48 46 43 43 46
 The University of Warwick -- -- -- 58 64 48
 Brown University 39 39 42 47 52 49
 University of British Columbia 44 51 45 49 43 50

Young Universities

QS also releases the QS Top 50 under 50 annually, ranking universities established within the last 50 years. These institutions are judged on their positions in the previous year's overall table.[42]

Faculties and subjects

QS also ranks universities by academic discipline, organized into five faculties: Arts & Humanities, Engineering & Technology, Life Sciences & Medicine, Natural Sciences, and Social Sciences & Management. These annual rankings are drawn up on the basis of academic opinion, recruiter opinion, and citations.

Categories of QS World University Rankings by Faculty and Subject[43]

Arts & Humanities: Arts & Design; English Language & Literature; History; Linguistics; Modern Languages; Philosophy
Engineering & Technology: Architecture; Chemical Engineering; Civil & Structural Engineering; Computer Science; Electrical & Electronic Engineering; Mechanical, Aeronautical & Manufacturing; Surveying
Life Sciences & Medicine: Agriculture & Forestry; Biological Sciences; Dentistry; Medicine; Pharmacy & Pharmacology; Psychology; Veterinary Science
Natural Sciences:[note 2] Physics & Astronomy; Mathematics; Environmental Sciences; Earth & Marine Sciences; Chemistry; Materials Sciences; Geography
Social Sciences & Management: Accounting & Finance; Business & Management; Communication & Media Studies; Development Studies; Economics & Econometrics; Education; Law; Politics & International Studies; Sociology; Statistics

Regional rankings

Asia

In 2009, QS launched the QS Asian University Rankings (also known as QS University Rankings: Asia) in partnership with The Chosun Ilbo newspaper in Korea to rank universities in Asia independently.

These rankings use some of the same criteria as the world rankings, but with changed weightings and new criteria, one addition being the numbers of incoming and outgoing exchange students. Accordingly, the QS World University Rankings and the QS Asian University Rankings released in the same academic year differ from each other.[1]

QS University Rankings: Asia — Top 10[note 1]
Institution 2009[44] 2010[45] 2011[46] 2012[47] 2013[48] 2014[49] 2015[50]
 National University of Singapore 10 3 3 2 2 1 1
 The University of Hong Kong 1 1 2 3 2 3 2
 Korea Advanced Institute of Science and Technology 7 13 11 7 6 2 3
 Nanyang Technological University 14 18 17 17 10 7 4
 The Hong Kong University of Science and Technology 4 2 1 1 1 5 5
 The Chinese University of Hong Kong 2 4 5 5 7 6 6
 Peking University 10 12 13 6 5 8 7
 Seoul National University 8 6 6 4 4 4 8
 The City University of Hong Kong 18 15 15 12 12 11 9
 Pohang University of Science and Technology 17 14 12 9 7 9 10

Latin America

The QS Latin American University Rankings or QS University Rankings: Latin America were launched in 2011. They use academic opinion (30%), employer opinion (20%), publications per faculty member, citations per paper, academic staff with a PhD, faculty/student ratio and web visibility (10 per cent each) as measures.[51]
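
As a check on the stated weights (simple arithmetic on the figures above): 30% + 20% + 5 × 10% = 100%, with the five measures listed at "10 per cent each" accounting for half of the total score.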

BRICS

This set of rankings uses eight indicators to select the top 100 institutions of higher learning in the BRICS countries. Institutions in Hong Kong, Macau and Taiwan are not ranked here; for China, only mainland institutions are included.

QS University Rankings: BRICS — Top 10[note 1]
Institution 2013[52] 2014[53] 2015[54]
 Tsinghua University 1 1 1
 Peking University 2 2 2
 Fudan University 4 5 3
 Lomonosov Moscow State University 3 3 4
 Indian Institute of Science Bangalore - - 5
 Shanghai Jiao Tong University 6 8 6
 University of Science and Technology of China 6 4 6
 Nanjing University 5 6 8
 Universidade de São Paulo 8 7 9
 Beijing Normal University 12 14 10

QS Stars

QS also offers universities a way of examining their own strengths and weaknesses in depth. Called QS Stars, this service is separate from the QS World University Rankings. It involves a detailed look at a range of functions that mark out a modern university. Universities can receive from one star to five, or Five Star Plus for the truly exceptional.

QS Stars ratings are derived from scores on eight criteria. They are:

  • Research Quality
  • Teaching Quality
  • Graduate Employability
  • University Infrastructure
  • Internationalisation
  • Innovation and knowledge transfer
  • Third mission activity, measuring areas of social and civic engagement
  • Special criteria for specific subjects

Stars is an evaluation system, not a ranking. About 100 institutions had opted for the Stars evaluation as of early 2013. In 2012, the fees for participation were $9,850 for the initial audit plus an annual license fee of $6,850.[55]
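
For example (illustrative arithmetic only; the license term is not specified in the source), a university participating for three years at those rates would pay $9,850 + 3 × $6,850 = $30,400.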

Notes

  1. ^ a b c Order shown in accordance with the latest result.
  2. ^ The term "Natural Sciences" here actually refers to physical sciences since life sciences are also a branch of natural sciences.

References

  1. ^ a b "Asian University Rankings - QS Asian University Rankings vs. QS World University Rankings™". The methodology differs somewhat from that used for the QS World University Rankings...
  2. ^ "University rankings: which world university rankings should we trust?". The Telegraph. 2015. Retrieved 27 January 2015. It is a remarkably stable list, relying on long-term factors such as the number of Nobel Prize-winners a university has produced, and number of articles published in Nature and Science journals. But with this narrow focus comes drawbacks. China's priority was for its universities to "catch up" on hard scientific research. So if you're looking for raw research power, it's the list for you. If you're a humanities student, or more interested in teaching quality? Not so much.
  3. ^ Ariel Zirulnick. "New world university ranking puts Harvard back on top". The Christian Science Monitor. Those two, as well as Shanghai Jiao Tong University, produce the most influential international university rankings out there
  4. ^ Indira Samarasekera & Carl Amrhein. "Top schools don't always get top marks". The Edmonton Journal. Archived from the original on October 3, 2010. There are currently three major international rankings that receive widespread commentary: The Academic World Ranking of Universities, the QS World University Rankings and the Times Higher Education Rankings.
  5. ^ Philip G. Altbach (11 November 2010). "The State of the Rankings". Inside Higher Ed. Retrieved 27 January 2015. The major international rankings have appeared in recent months — the Academic Ranking of World Universities, the QS World University Rankings, and the Times Higher Education World University Rankings (THE).
  6. ^ a b "Scientometrics, Volume 85, Number 1". SpringerLink. Retrieved 2010-09-16.
  7. ^ "Methodology of QS rankings comes under scrutiny". www.insidehighered.com. Retrieved 2016-04-29.
  8. ^ "Competition and controversy in global rankings - University World News". www.universityworldnews.com. Retrieved 2016-04-29.
  9. ^ Lambert Review of Business-University Collaboration
  10. ^ Princeton University Press, 2010
  11. ^ a b "Martin Ince Communications". Martin Ince Communications. Retrieved 31 May 2015.
  12. ^ Mroz, Ann. "Leader: Only the best for the best". Times Higher Education. Retrieved 2010-09-16.
  13. ^ Baty, Phil (2010-09-10). "Views: Ranking Confession". Inside Higher Ed. Retrieved 2010-09-16.
  14. ^ Labi, Aisha (2010-09-15). "Times Higher Education Releases New Rankings, but Will They Appease Skeptics?". The Chronicle of Higher Education. London, UK. Retrieved 2010-09-16.
  15. ^ "QS World University Rankings: Methodology". QS (Quacquarelli Symonds). 2014. Retrieved 29 April 2015.
  16. ^ "MS and MBA in USA". MS MBA in USA. Retrieved 31 May 2015.
  17. ^ a b "2011 Academic Survey Responses". Archived from the original on February 6, 2012. Retrieved 12 September 2013.
  18. ^ QS Intelligence Unit | Faculty Student Ratio. Iu.qs.com. Retrieved 2013-08-12.
  19. ^ a b QS Intelligence Unit | Citations per Faculty. Iu.qs.com. Retrieved 2013-08-12.
  20. ^ Richard Holmes. "University Ranking Watch". Retrieved 31 May 2015.
  21. ^ "Global university rankings and their impact,". "European University Association". Retrieved 3, September, 2012
  22. ^ QS Intelligence Unit | Employer Reputation. Iu.qs.com. Retrieved 2013-08-12.
  23. ^ QS Intelligence Unit | International Indicators. Iu.qs.com. Retrieved 2013-08-12.
  24. ^ Flying high internationally Archived 2007-12-11 at the Wayback Machine
  25. ^ "Cambridge loses top spot to Massachusetts Institute of Technology". The Independent. 11 September 2012. Retrieved 11 September 2012.
  26. ^ "QS Intelligence Unit - QS World University Rankings". Retrieved 31 May 2015.
  27. ^ "Improving Latin American universities' global ranking - University World News". Retrieved 31 May 2015.
  28. ^ "The QS World University Rankings are a load of old baloney". Retrieved 31 May 2015.
  29. ^ Change Magazine - Taylor & Francis (13 January 2012). "Change Magazine - January-February 2012". Retrieved 31 May 2015.
  30. ^ Holmes, Richard (2006-09-05). "So That's how They Did It". Rankingwatch.blogspot.com. Retrieved 2010-09-16.
  31. ^ Response to Review of Strategic Plan by Peter Wills
  32. ^ Sowter, Ben (1 November 2007). "THES – QS World University Rankings 2007 – Basic explanation of key enhancements in methodology for 2007".
  33. ^ "1741-7015-5-30.fm" (PDF). Retrieved 2010-09-16.
  34. ^ "Social sciences lose 1". Timeshighereducation.co.uk. 2007-11-16. Retrieved 2010-09-16.
  35. ^ Leiter Reports: A Philosophy Blog: Guardian and "QS Rankings" Definitively Prove the Existence of the "Halo Effect". Leiterreports.typepad.com (2011-06-05). Retrieved on 2013-08-12.
  36. ^ "QS World University Rankings (2010/11)".
  37. ^ "QS World University Rankings (2011/12)" (PDF).
  38. ^ "QS World University Rankings (2012/13)".
  39. ^ "QS World University Rankings (2013/14)".
  40. ^ "QS World University Rankings (2014/15)".
  41. ^ "QS World University Rankings (2015/16)".
  42. ^ "QS Top 50 under 50". Quacquarelli Symonds. Retrieved 2013-07-07.
  43. ^ "QS World University Rankings by Subject 2015". Quacquarelli Symonds (QS). Retrieved 12 August 2014.
  44. ^ "QS Asian University Rankings (2009)".
  45. ^ "QS Asian University Rankings (2010)".
  46. ^ "QS Asian University Rankings (2011)".
  47. ^ "QS Asian University Rankings (2012)".
  48. ^ "QS Asian University Rankings (2013)".
  49. ^ "QS Asian University Rankings (2014)".
  50. ^ "QS Asian University Rankings (2015)".
  51. ^ "Methodology (QS University Rankings – Latin America)". Quacquarelli Symonds. Retrieved 12 August 2014. {{cite web}}: Italic or bold markup not allowed in: |publisher= (help)
  52. ^ "QS University Rankings: BRICS 2013". Quacquarelli Symonds Limited. 2013. Retrieved August 23, 2015.
  53. ^ "QS University Rankings: BRICS 2014". Quacquarelli Symonds Limited. 2014. Retrieved August 23, 2015.
  54. ^ "QS University Rankings: BRICS 2015". Quacquarelli Symonds Limited. 2015. Retrieved August 23, 2015.
  55. ^ "Ratings at a Price for Smaller Universities". The New York Times. 30 December 2012. Retrieved 10 September 2013.