Tuesday, 6 August 2013

The Complete Efficiency Rankings


At last. Here are the complete Efficiency Rankings, measuring the efficiency with which universities turn inputs into citations. I am using the method of Professor Dirk van Damme, which is to divide the scores for Citations: Research Influence in the THE World University Rankings by the scores for Research: Volume, Income and Reputation. Here is the method as cited in a previous post:

"The input indicator takes scaled and normalised measures of research income and volume into account, and also considers reputation, while the output indicator looks at citations to institutional papers in Thomson Reuters’ Web of Science database, normalised for subject differences.
Professor van Damme said that the results - which show that university systems outside the Anglo-American elite are able to realise and increase outputs with much lower levels of input - did not surprise him.
“For example, Switzerland really invests in the right types of research. It has a few universities in which it concentrates resources, and they do very well,” he said.
Previous studies have found the UK to have the most efficient research system on measures of citation per researcher and per unit of spending.
But Professor van Damme explained that under his approach, productivity - output per staff member - was included as an input.
“With efficiency I mean the total research capacity of an institution, including its productivity, divided by its impact. The UK is not doing badly at all, but other countries are doing better, such as Ireland, which has a very low research score but a good citations score,” he said.
Given the severity of the country’s economic crisis, Ireland’s success was particularly impressive, he said.
“I think it is really conscious of the effort it has to make to maintain its position and is doing so.”
Low efficiency scores for China and South Korea reflected the countries’ problems in translating their huge investment into outputs, he added."
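As a back-of-the-envelope illustration of the procedure (the university names and scores below are invented, not taken from the THE tables), the calculation amounts to a simple divide-and-sort:

```python
# Illustrative sketch of the efficiency calculation with invented scores:
# divide each university's Citations score by its Research score and rank by the ratio.

scores = {
    # name: (citations_score, research_score) -- made-up numbers
    "University A": (90.0, 30.0),
    "University B": (75.0, 60.0),
    "University C": (60.0, 80.0),
}

efficiency = {name: c / r for name, (c, r) in scores.items()}

for rank, (name, ratio) in enumerate(
        sorted(efficiency.items(), key=lambda kv: kv[1], reverse=True), start=1):
    print(f"{rank}. {name}: {ratio:.2f}")
```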




1. Tokyo Metropolitan
2. Moscow Engineering Physics Institute
3. Florida Institute of Technology
4. Southern Methodist
5. Hertfordshire
6. Portsmouth
7= King Mongkut's University of Technology
7= Vigo
9. Creighton
10. Fribourg
11. King Abdulaziz
12. University of the Andes
13. Trieste
14. Renmin
15. Medical University of Vienna
16. Polytech University Valencia
17= Bayreuth
18= Montana
19. Mainz
20. Ferrara
21. Drexel
22. Valencia
23. Linz
24. Crete
25. Colorado School of Mines
26. Technical University of Dresden
27. Innsbruck
28. Nurnberg
29= Dauphine
29= Wake Forest
29= Maryland Baltimore County
32. St George's London
33. William and Mary College
34. Hong Kong Baptist
35. Basel
36. Texas San Antonio
37. Duisburg
38. Lyon 1
39. Wurzburg
40. Charles Darwin
41. Wayne State
42. Northeastern
43. Bicocca
44. Royal Holloway
45. Koc
46. Georgia University of Health Science
47. Modena
48. Dundee
49. Southern Denmark
50= IIT Roorkee
50= Pompeu Fabra
52. Graz
53= Oregon
53= Diderot
55. Bielefeld
56. Munster
57. Waikato
58= Grenoble
59= East Anglia
60= Bonn
61= Pavia
62. ENS Lyon
63. Eastern Finland
64. Padua
65. Brandeis
66. Aberystwyth
67. Tulane
68. Tubingen
69= Warsaw
70= Sun Yat Sen
71= Keele
72. Tromso
73. Brunel
74. Liege
75. Queen Mary
76= Vermont
77= Trento
78. Turin
79. Jyvaskyla
80. Carleton
81. Kansas
82. California Riverside
83. SUNY Stony Brook
84= George Washington
85= Pisa
86. Tasmania
87. George Mason
88. Boston College
89= Oregon State
90= Texas Dallas
91. Trinity College Dublin
92= University Science and Technology China
92= Murdoch
92= Cincinnati
92= Galway
92= Yeshiva
97= Tufts
97= Minho
99. Miami
100. Lehigh
101. Technical University Denmark
102= Rice
102= Iceland
104= California Santa Cruz
104= Milan
106. Montpellier 2
107. Frankfurt
108= Bergen
109= Strasbourg
110. Victoria
111. Rochester
112. Cork
113. Dartmouth
114. Oklahoma
115. Birkbeck
116. Porto
117. Canterbury
118= Newcastle UK
118= Notre Dame
118= University College Dublin
121. Binghamton
122. Aveiro
123= Kiel
123= Sussex
125. Temple
126. Aachen
127= Fribourg
127= Queens Belfast
127= Colorado Boulder
130. Iowa State
131. Tokyo Medical Dental
132= Autonomous Madrid
132= Swedish Agriculture
132= Tampere
135= Deakin
135= Barcelona
137= Stockholm
137= Stirling
139. Laval
140. Durham
141. Bangor
142= Aberdeen
142= Vanderbilt
144. Istanbul Technical
145. Nanjing
146= Exeter
146= Emory
146= Leicester
149. Southampton
150. Paris Mines
151. Vrije Universiteit Brussel
152. Polytechnic Milan
153. Kwazulu-Natal
154= Linkoping
154= Bilkent
154= Heriot-Watt
154= Bologna
158= Wyoming
158= Utah
158= Massey
161= Glasgow
161= Bern
163. ENS Paris
164. Zurich
165= Case Western Reserve
166= California Irvine
167= Tartu
168= Wellington
169= Salento
170. South Carolina
171. York UK
172. Aalto
173= Curie
173= Macquarie
173= Boston
176= Delaware
177= Copenhagen
178= Hannover
179. Norway University of Science and Technology
180. Antwerp
181= Dalhousie
181= Rensselaer Polytechnic Institute
183= Konstanz
184= Paris Sud
185. Technical University Munich
186. Lancaster
187. Waseda
188. Otago
189. Arizona State
190= SUNY Albany
190= Gottingen
190= Autonomous Barcelona
193= Cape Town
194= St Andrews
195= Colorado State
195= Bath
195= Wollongong
198= Tsukuba
198= Simon Fraser
198= Liverpool
198= Umea
202= Geneva
202= Newcastle Australia
204= Universite Libre Bruxelles
204= Virginia
206= Lausanne
206= Louvain
208= Connecticut
208= Georgetown
208= York Canada
211. EPF Lausanne
212= North Carolina State
212= Bristol
212= Aalborg
212= Free University Amsterdam
216= Indiana
216= Kentucky
218. Maryland College Park
219. Karlsruhe Institute of Technology
220= University Technology Sydney
220= Iowa
222. Charles
223. Flinders
224. Cardiff
225= Auckland
225= Oslo
227. Pittsburgh
228= Heidelberg
228= Guelph
228= Washington State
228= Sheffield
232= Chinese University Hong Kong
232= Strathclyde
234= Ottawa
234= Gothenburg
234= Washington St Louis
237. Medical South Carolina
238= McMaster
238= Brown
238= National Sun Yat Sen
238= Reading
242. Ecole Polytechnique
243. Helsinki
244= Quebec
244= National Central Taiwan
246. Bogazici
247= Southern California
247= Arizona
249. Keio
250= Houston
250= Stellenbosch
250= Kings College London
250= Darmstadt
250= Western Australia
255= Pohang
255= IIT Bombay
257= Wageningen
257= Manitoba
259= South Australia
259= Nagoya
261= Leeds
261= UC Santa Barbara
261= Nijmegen
261= Jagiellonian
265= New York University
265= Calgary
265= Ohio State
268. Aarhus
269= Witwatersrand
269= North Carolina Chapel Hill
269= Michigan State
269= Fudan
273= Bochum
273= Munich
275= SUNY Buffalo
275= Adelaide
275= Sapienza
278= Utrecht
278= Edinburgh
278= Queensland University of Technology
281= Lund
281= Ghent
283. Erasmus
284= Massachusetts
284= Illinois Chicago
284= Nottingham
287= Eindhoven
287= Amsterdam
289. UC San Diego
290. Birmingham
291= Western Ontario
291= Twente
293= Washington Seattle
293= Duke
295= Penn State
295= NUI Maynooth
297= Maastricht
297= Groningen
297= Columbia
297= Leiden
297= Georgia
302. UC Davis
303= Southern Florida
303= Chalmers University of Technology
305= Minnesota
305= Essex
305= Manchester
305= Georgia Institute of Technology
309= Rutgers
309= Texas at Austin
311= Northwestern
311= Warwick
311= Vienna
311= MIT
315. Johns Hopkins
316= Wisconsin Madison
316= Carnegie Mellon
318. Alberta
319. Pennsylvania
320= Hong Kong University of Science and Technology
320= Kyushu
322= Chicago
322= Vienna University of Technology
324= Queensland
324= Montreal
326. British Columbia
327= Yale
327= Imperial College London
327= UCLA
327= Hebrew University of Jerusalem
327= Karolinska
332= Melbourne
332= Humboldt
332= National Tsinghua Taiwan
332= Cambridge
332= Harvard
332= Stanford
338= Monash
338= Princeton
338= Caltech
338= Michigan
338= UC Berkeley
338= Cornell
344= Waterloo
344= KTH Sweden
344= Missouri
347. University College London
348= Oxford
348= Middle East Technical University
350. Yonsei
351= Toronto
351= Illinois Urbana-Champaign
351= Peking
351= Leuven
355= Zhejiang
355= Hokkaido
355= Hong Kong Polytechnic University
355= McGill
359= ETH Zurich
359= Tokyo Institute of Technology
361= Berlin
361= Uppsala
363= Korea
363= Sydney
365= Florida
365= New South Wales
367= Australian National
367= Tohoku
367= Purdue
367= Technion
371= Surrey
371= IIT Kharagpur
373= KAIST
373= Texas A and M
375. Virginia Polytechnic Institute
376= Osaka
376= Nanyang Technological University
376= Shanghai Jiao Tong
379. LSE
380. Sungkyunkwan
381. Sharif University of Technology
382. Tokyo
383= National Taiwan University of Science and Technology
383= National Autonomous University of Mexico
385= Kyoto
385= National University of Singapore
387. Loughborough
388. National Cheng Kung
389. Tel Aviv
390= Hong Kong
390= Tsinghua
392. Chinese University of Hong Kong
393. National Taiwan
394. National Chiao Tung
395. Tilburg
396. Delft
397. Seoul National
398. State University Campinas
399. Sao Paulo
400. Moscow State

What about a Research Influence Ranking?

Keeping up with the current surge of global university rankings is becoming next to impossible. Still there are a few niches that have remained unoccupied. One might be a ranking of universities according to their ability to spread new knowledge around the world. So it might be a good idea to have a Research Influence Ranking based on the citations indicator in the Times Higher Education -- Thomson Reuters World University Rankings.

Thomson Reuters are the world's leading collectors and analysts of citations data, so such an index ought to provide an invaluable data source for governments, corporations and other stakeholders deciding where to place research funding. Data for 400 universities can be found on the THE iPhone/iPad app.

The top place in the world would be jointly held by Rice University in Texas and Moscow State Engineering Physics Institute, closely followed by MIT and the University of California Santa Cruz.

Then there are the first places in various regions and countries. (MEPhI would be first in Europe and Rice in the US and North America.)


Canada
University of Toronto

Latin America
University of the Andes, Colombia

United Kingdom (and Western Europe)
Royal Holloway London

Africa
University of Cape Town 

Middle East
Koc University, Turkey

Asia (and Japan)
Tokyo Metropolitan University

ASEAN
King Mongkut's University of Technology, Thailand

Australia and the Pacific
University of Melbourne

On second thoughts, perhaps not such a good idea.

The Efficiency Rankings

Times Higher Education has a story about a study by Dirk Van Damme, head of the Centre for Educational Research and Innovation at the OECD. This will be presented at the Global University Summit held in Whitehall, London from the 28th to the 30th May.

The Summit "brings an invitation-only audience of leaders from the world’s foremost universities, senior policy-makers and international business executives to London in 2013." It is a "prestigious event" held in a "spectacular setting" and is sponsored by the University of Warwick, Times Higher Education, Thomson Reuters and UK Universities International Unit. Speakers include Vince Cable, Boris Johnson, the Russian ambassador and heads of various universities from around the world.

What Professor Van Damme has done is to treat the THE World University Rankings Research Indicator scores as an input and the Research Influence (Citations) scores as an output. The output scores are divided by the input scores and the result is a measure of the efficiency with which the inputs are turned into citations, which, as we all know, is the main function of the modern university.

According to THE:

"The input indicator takes scaled and normalised measures of research income and volume into account, and also considers reputation, while the output indicator looks at citations to institutional papers in Thomson Reuters’ Web of Science database, normalised for subject differences.
Professor van Damme said that the results - which show that university systems outside the Anglo-American elite are able to realise and increase outputs with much lower levels of input - did not surprise him.
“For example, Switzerland really invests in the right types of research. It has a few universities in which it concentrates resources, and they do very well,” he said.
Previous studies have found the UK to have the most efficient research system on measures of citation per researcher and per unit of spending.
But Professor van Damme explained that under his approach, productivity - output per staff member - was included as an input.
“With efficiency I mean the total research capacity of an institution, including its productivity, divided by its impact. The UK is not doing badly at all, but other countries are doing better, such as Ireland, which has a very low research score but a good citations score,” he said.
Given the severity of the country’s economic crisis, Ireland’s success was particularly impressive, he said.
“I think it is really conscious of the effort it has to make to maintain its position and is doing so.”
Low efficiency scores for China and South Korea reflected the countries’ problems in translating their huge investment into outputs, he added."

One hesitates to be negative about a paper presented at a prestigious event in a spectacular setting to an invitation-only audience, but this is frankly rather silly.

I would accept that income can be regarded as an input but surely not reputation and surely not volume of publications. Also, unless Van Damme's methodology has undisclosed refinements he is treating research scores as having the same value regardless of whether they are composed mainly of scores for reputation or for number of publications or for research income.

Then there is the time period concerned. Research income is income for one year, while publications are drawn from a five-year period. These are then compared with citations over a six-year period. So the paper is asking how research income for 2010 produces citations in the years 2006 - 2011 of papers published in the years 2006 - 2010. A university is certainly being remarkably efficient if its 2010 income is producing citations in 2006, 2007, 2008 and 2009.

Turning to the citations side of the equation, it should be recalled that the THE citations indicator includes an adjustment by which the citation impact score for universities is divided by the square root of the citation impact score for the country as a whole. In other words a university located in a country where papers are not cited very much gets a big boost and the lower the national citation impact score the bigger the boost for the university. This is why Hong Kong universities suffered reduced scores when Thomson Reuters took them out of China when counting citations and put them in their own separate category.

So, it is not surprising that universities from outside the Anglo-Saxon elite do well for citations and thus appear to be very efficient. Thomson Reuters methodology gives such universities a very substantial weighting just for being located in countries that are less productive in terms of citations.

None of this is new. In 2010 Van Damme did something similar at a seminar in London.

Van Damme is just analysing the top 200 universities in the THE rankings. It would surely be more interesting to analyse the top 400 whose scores are obtainable from an iPad/iPhone app.

So here are the top ten universities in the world according to the efficiency with which they turn income, reputation and publications into citations. The procedure is to divide the citations score from the 2012 THE rankings by the research indicator score.

1. Tokyo Metropolitan University
2. Moscow State Engineering Physics Institute
3. Florida Institute of Technology
4. Southern Methodist University
5. University of Hertfordshire
6. University of Portsmouth
7. King Mongkut's University of Technology
8. Vigo University
9. Creighton University
10. Fribourg University

No doubt the good and the great of the academic world assembled in Whitehall will make a trip to Portsmouth or even to Vigo or Creighton if they can find them on the map.

And now for the hall of shame. Here are the bottom ten of the THE top 400, ranked according to efficiency as measured by citations indicator scores divided by research scores. The heads of these failing institutions will no doubt be packing their bags and looking for jobs as junior administrative assistants at technical colleges in Siberia or the upper Amazon.


391. Tsinghua University
392. Chinese University of Hong Kong
393. National Taiwan University
394. National Chiao Tung University
395. Tilburg University
396. Delft University of Technology
397. Seoul National University
398. State University of Campinas
399. Sao Paulo University
400. Lomonosov Moscow State University

In a little while I hope to publish the full 400 after I have finished being sarcastic about the QS subject rankings.

QS Rankings by Subject

QS have produced their annual subject rankings. At the top there are no real surprises and, while there is certainly room for argument, I do not think that anyone will be shocked by the top ten or twenty in each subject.


The university with the most number-one positions is Harvard:

Medicine
Biology
Psychology
Pharmacy and Pharmacology
Earth and Marine Sciences
Politics and International Studies
Law
Economics and Econometrics
Accounting and Finance
Education

MIT has seven:
Computer Science
Chemical Engineering
Electrical Engineering
Mechanical Engineering
Physics and Astronomy
Chemistry
Materials Science

Then there is Berkeley with exactly the four you would expect:
Environmental Science
Statistics and Operational Research
Sociology
Communication and Media Studies

Oxford has three:

Philosophy
Modern Languages
Geography

Cambridge another three:
History
Linguistics
Mathematics


Imperial College London is top for Civil Engineering and University of California, Davis for Agriculture and Forestry.


These rankings are based on the academic opinion survey, the employer survey, citations per paper and the h-index, a measure of both output and influence that eliminates outliers, in proportions that vary for each subject. They are very research-focused, which is unfortunate since there seems to be a consensus emerging at conferences and seminars that the THE-TR rankings are for policy makers, the Shanghai ARWU for researchers and the QS rankings for undergraduate students.
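For readers unfamiliar with the h-index component, here is a short sketch of how it can be computed from a list of per-paper citation counts (the counts below are invented):

```python
def h_index(citation_counts):
    """Largest h such that there are at least h papers with at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Invented example: ten papers with these citation counts give an h-index of 4.
print(h_index([25, 8, 5, 4, 3, 3, 2, 1, 0, 0]))
```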

Outside the top fifty or top one hundred there are some oddities resulting from the small number of survey responses. I will leave it to specialists to find them.

Serious Wonkiness


Alex Usher at HESA had a post on the recent THE Under-50 Rankings. Here is an excerpt about the Reputation and Citations indicators.



"But there is some serious wonkiness in the statistics behind this year’s rankings which bear some scrutiny. Oddly enough, they don’t come from the reputational survey, which is the most obvious source of data wonkiness. Twenty-two percent of institutional scores in this ranking come from the reputational ranking; and yet in the THE’s reputation rankings (which uses the same data) not a single one of the universities listed here had a reputational score high enough that the THE felt comfortable releasing the data. To put this another way: the THE seemingly does not believe that the differences in institutional scores among the Under-50 crowd are actually meaningful. Hmmm.

No, the real weirdness in this year’s rankings comes in citations, the one category which should be invulnerable to institutional gaming. These scores are based on field-normalized, 5-year citation averages; the resulting institutional scores are then themselves standardized (technically, they are what are known as z-scores). By design, they just shouldn’t move that much in a single year. So what to make of the fact that the University of Warwick’s citation score jumped 31% in a single year, Nanyang Polytechnic’s by 58%, or UT Dallas’ by a frankly insane 93%? For that last one to be true, Dallas would have needed to have had 5 times as many citations in 2011 as it did in 2005. I haven’t checked or anything, but unless the whole faculty is on stims, that probably didn’t happen. So there’s something funny going on here."

Here is my comment on his post.


Your comment at University Ranking Watch and your post at your blog raise a number of interesting issues about the citations indicator in the THE-TR World University Rankings and the various spin-offs.



You point out that the scores for the citations indicator rose at an unrealistic rate between 2011 and 2012 for some of the new universities in the 100 Under 50 Rankings and ask how this could possibly reflect an equivalent rise in the number of citations.



Part of the explanation is that the scores for all indicators and nearly all universities in the WUR, and not just for the citations indicator and a few institutions, rose between 2011 and 2012. The mean overall score of the top 402 universities in 2011 was 44.3 and for the top 400 universities in 2012 it was 49.5.



The mean scores for every single indicator or group of indicators in the top 400 (402 in 2011) have also risen although not all at the same rate. Teaching rose from 37.9 to 41.7, International Outlook from 51.3 to 52.4, Industry Income from 47.1 to 50.7, Research from 36.2 to 40.8 and Citations from 57.2 to 65.2.



Notice that the scores for citations are higher than for the other indicators in 2011 and that the gap further increases in 2012.



This means that the citations indicator had a disproportionate effect on the rankings in 2011, one that became more disproportionate in 2012.



It should be remembered that the scores for the indicators are z scores and therefore they measure not the absolute number of citations but the distance, in standard deviations, from the mean number of normalised citations of all the universities analysed. That mean is not the mean of the 200 universities listed in the printed and online rankings, or of the 400 included in the iPad/iPhone app, but of the total number of universities that have asked to be ranked. That number seems to have increased by a few hundred between 2011 and 2012 and will no doubt go on increasing over the next few years, though probably at a steadily decreasing rate.



Most of the newcomers to the world rankings have overall scores and indicator scores that are lower than those of the universities in the top 200 or even the top 400. That means that the mean of the unprocessed scores on which the z scores are based decreased between 2011 and 2012 so that the overall and indicator scores of the elite universities increased regardless of what happened to the underlying raw data.
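To see the arithmetic behind this, here is a toy simulation with entirely invented numbers: when weaker newcomers join the pool from which the mean and standard deviation are calculated, the z scores of the universities at the top rise even though their raw values are unchanged.

```python
import statistics

def z_scores(values):
    """Standard z score: distance from the mean in standard deviations."""
    mu = statistics.mean(values)
    sigma = statistics.pstdev(values)
    return [(v - mu) / sigma for v in values]

# Invented raw normalised-citation values for an "elite" group...
elite = [3.0, 2.5, 2.0]
# ...and for the rest of the ranked population, where the second year
# adds several weaker newcomers (here just a handful of low values).
rest_2011 = [1.5, 1.2, 1.0, 0.9]
rest_2012 = rest_2011 + [0.5, 0.4, 0.4, 0.3, 0.3]

for year, rest in (("2011", rest_2011), ("2012", rest_2012)):
    z = z_scores(elite + rest)
    print(year, [round(v, 2) for v in z[:len(elite)]])
# The elite z scores are higher in the second year although their raw values did not change.
```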



However, the indicator scores did not increase at the same rate. The scores for the citations indicator, as noted, were much higher in 2011 and in 2012 than they were for the other indicators. It is likely that this was because the difference between the top 200 or 400 universities and those just below the elite is greater for citations than it is for indicators like income, publications and internationalisation. After all, most people would probably accept that internationally recognised research is a major factor in distinguishing world class universities from those that are merely good.



Another point about the citations indicator is that after the score for field and year normalised citations for each university is calculated it is adjusted according to a “regional modification”. This means that the score, after normalisation for year and field, is divided by the square root of the average for the country in which the university is located. So if University A has a score of 3.0 citations per paper and the average for its country is 3.0, then the score will be divided by 1.73, the square root of 3, and the result is 1.73. If a university in country B has the same score of 3.0 citations per paper but the national average is just 1.0 citation per paper, the final score will be 3.0 divided by the square root of 1, which is 1, and the result is 3.



University B therefore gets a much higher final score for citations even though its number of citations per paper is exactly the same as University A's. The reason for the apparently higher score is simply that the two universities are being compared to all the other universities in their own countries. The lower the score for universities in general, the higher the regional modification for specific universities. The citations indicator is not just measuring the number of citations produced by universities but also, in effect, the difference between the bulk of a country's universities and the elite that make it into the top 200 or 400.
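A minimal sketch of the regional modification as just described, repeating the invented figures for Universities A and B:

```python
import math

def regional_modification(university_impact, country_impact):
    """Divide a university's normalised citation impact by the square root
    of the average impact for its country, as described above."""
    return university_impact / math.sqrt(country_impact)

# University A: 3.0 citations per paper in a country averaging 3.0
print(round(regional_modification(3.0, 3.0), 2))  # 1.73
# University B: 3.0 citations per paper in a country averaging 1.0
print(round(regional_modification(3.0, 1.0), 2))  # 3.0
```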



It is possible then that a university might be helped into the top 200 or 400 by having a high score for citations that resulted from being better than other universities in a particular country that were performing badly.



It is also possible that if a country’s research performance took a dive, perhaps because of budget cuts, with the overall number of citations per paper declining, this would lead to an improvement in the score for citations of a university that managed to remain above the national average.



It is quite likely that -- assuming the methodology remains unchanged -- if countries like Italy, Portugal or Greece experience a fall in research output as a result of economic crises, their top universities will get a boost for citations because they are benchmarked against a lower national average.



Looking at the specific places mentioned, it should be noted once again that Thomson Reuters do not simply count the number of citations per paper but compare them with the mean citations for papers in particular fields published in particular years and cited in particular years.



Thus a paper in applied mathematics published in a journal in 2007 and cited in 2007, 2008, 2009, 2010, 2011 and 2012 will be compared to all papers in applied maths published in 2007 and cited in those years.



If it is usual for a paper in a specific field to receive few citations in the year of publication or the year after then even a moderate amount of citations can have a disproportionate effect on the citations score.
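To illustrate the point with invented baseline figures (the real field-and-year averages are Thomson Reuters' own data), a few citations measured against a very low baseline produce a large normalised score:

```python
# Hypothetical world baselines: average citations received so far by papers
# in a given (field, publication year) cell. All numbers are invented.
baseline = {
    ("applied mathematics", 2011): 0.4,  # recent papers: very few citations yet
    ("applied mathematics", 2007): 6.0,  # older papers have had time to accumulate
}

def normalised_impact(field, year, citations):
    """Citations relative to the world average for that field and year."""
    return citations / baseline[(field, year)]

# Three citations to a brand-new paper count for far more than
# three citations to an older one.
print(normalised_impact("applied mathematics", 2011, 3))  # 7.5
print(normalised_impact("applied mathematics", 2007, 3))  # 0.5
```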



It is very likely that Warwick’s increased score for citations in 2012 had a lot to do with participation in a number of large scale astrophysical projects that involved many institutions and produced a larger than average number of citations in the years after publication. In June 2009, for example, the Astrophysical Journal Supplement Series published ‘The seventh data release of the Sloan Digital Sky Survey’ with contributions from 102 institutions, including Warwick. In 2009 it received 45 citations. The average for the journal was 13. The average for the field is known to Thomson Reuters but it is unlikely that anyone else has the technical capability to work it out. In 2010 the paper was cited 262 times: the average for the journal was 22. In 2011 it was cited 392 times: the average for the journal was 19 times.



This and similar publications have contributed to an improved performance for Warwick, one that was enhanced by the relatively modest number of total publications by which the normalised citations were divided.



With regard to Nanyang Technological University, it seems that a significant role was played by a few highly cited publications in Chemical Reviews in 2009 and in Nature in 2009 and 2010.



As for the University of Texas at Dallas, my suspicion was that publications by faculty at the University of Texas Southwestern Medical Center had been included, a claim that had been made about the QS rankings a few years ago. Thomson Reuters have, however, denied this and say they have observed unusual behaviour by UT Dallas which they interpret as an improvement in the way that affiliations are recorded. I am not sure exactly what this means but assume that the improvement in the citations score is an artefact of changes in the way data is recorded rather than any change in the number or quality of citations.



There will almost certainly be more of this in the 2013 and 2014 rankings.

A bad idea but not really new

University teachers everywhere are subject to this sort of pressure but it is unusual for it to be stated so explicitly.




"A university put forward plans to assess academics’ performance according to the number of students receiving at least a 2:1 for their modules, Times Higher Education can reveal.
According to draft guidance notes issued by the University of Surrey - and seen by THE - academics were to be required to demonstrate a “personal contribution towards achieving excellence in assessment and feedback” during their annual appraisals.
Staff were to be judged on the “percentage of students receiving a mark of 60 per cent or above for each module taught”, according to the guidance form, issued in June 2012, which was prefaced by a foreword from Sir Christopher Snowden, Surrey’s vice-chancellor, who will be president of Universities UK from 1 August.
“The intention of this target is not to inflate grades unjustifiably but to ensure that levels of good degrees sit comfortably within subject benchmarks and against comparator institutions,” the document explained.
After “extensive negotiations” with trade unions, Surrey dropped the proposed “average target mark”, with replacement guidance instead recommending that staff show there to be “a normal distribution of marks” among students."

Competition and controversy in global rankings

Higher education is becoming more competitive by the day. Universities are scrambling for scarce research funds and public support. They are trying to recruit from increasingly suspicious and cynical students. The spectre of online education is haunting all but the most confident institutions.


Rankings are also increasingly competitive. Universities need validation that will attract students and big-name researchers and justify appeals for public largesse. Students need guidance about where to take their loans and scholarships. Government agencies have to figure out where public funds are going.

It is not just that the overall rankings are competing with one another, but also that a range of subsidiary products have been let loose. Times Higher Education (THE) and QS have released Young University Rankings within days of each other. Both have published Asian rankings. THE has published reputation rankings and QS Latin American rankings. QS’s subject rankings have been enormously popular because they provide something for almost everybody.

There are few countries without a university somewhere that can claim to be in the top 200 for something, even though these rankings sometimes manage to find quality in places lacking even departments in the relevant fields.

QS’s academic survey

Increasing competition can also be seen in the growing vehemence of the criticism directed against and between rankings, although there is one ranking organisation that so far seems exempt from criticism. The QS academic survey has recently come under fire from well-known academics although it has been scrutinised by University Ranking Watch and other blogs since 2006.

It has been reported by Inside Higher Ed that QS had been soliciting opinions for its academic survey from a US money-for-surveys company that also sought consumer opinion about frozen foods and toilet paper.

The same news story revealed that University College Cork had been trying to find outside faculty to nominate the college in this year's academic survey.

QS has been strongly criticised by Professor Simon Marginson of the University of Melbourne, who assigns it to a unique category among national and international ranking systems, saying, “I do think social science-wise it’s so weak that you can’t take the results seriously”.

This in turn was followed by a heated exchange between Ben Sowter of QS and Marginson.

Although it is hard to disagree with Marginson’s characterisation of the QS rankings, it is strange he should consider their shortcomings to be unique.

U-Multirank and the Lords

Another sign of intensifying competition is the response to proposals for U-Multirank. This is basically a proposal, sponsored by the European Union, not for a league table in which an overall winner is declared but for a series of measures that would assess a much broader range of features, including student satisfaction and regional involvement, than rankings have offered so far.

There are obviously problems with this, especially with the reliance on data generated by universities themselves, but the disapproval of the British educational establishment has been surprising and perhaps just a little self-serving and hypocritical.

In 2011, the European Union Committee of the House of Lords took evidence from a variety of groups about various aspects of European higher education, including U-Multirank. Among the witnesses was the Russell Group of elite research intensive universities, formed after many polytechnics were upgraded to universities in 1992.

The idea was to make sure that research funding remained in the hands of those who deserved it. The group, named after the four-star Russell Hotel in a “prestigious location in London” where it first met, is not an inexpensive club: recently the Universities of Exeter, Durham and York and Queen Mary College paid £500,000 apiece to join.

The Lords also took evidence from the British Council, the Higher Education Funding Council for England, the UK and Scottish governments, the National Union of Students and Times Higher Education.

The committee’s report was generally negative about U-Multirank, stating that the Russell Group had said "ranking universities is fraught with difficulties and we have many concerns about the accuracy of any ranking”.

“It is very difficult to capture fully in numerical terms the performance of universities and their contribution to knowledge, to the world economy and to society,” the report said. “Making meaningful comparisons of universities both within, and across, national borders is a tough and complex challenge, not least because of issues relating to the robustness and comparability of data.”

Other witnesses claimed there was a lack of clarity about the proposal’s ultimate objectives, that the ranking market was too crowded, that it would confuse applicants and be “incapable of responding to rapidly changing circumstances in institutional profiles”, that it would “not allow different strengths across diverse institutions to be recognised and utilised” and that money was better spent on other things.

The committee also observed that the UK Government’s Department of Business Innovation and Skills was “not convinced that it [U-Multirank] would add value if it simply resulted in an additional European ranking system alongside the existing international ranking systems” and the minister struck a less positive tone when he told us that U-Multirank could be viewed as "an attempt by the EU Commission to fix a set of rankings in which [European universities] do better than [they] appear to do in the conventional rankings”.

Just why should the British government be so bothered about a ranking tool that might show European (presumably they mean continental here) universities doing better than in existing rankings?

Finally, the committee reported that “(w)e were interested to note that THES (sic) have recently revised their global rankings in 2010 in order to apply a different methodology and include a wider range of performance indicators (up from six to 13)”.

The committee continued: “They told us that their approach seeks to achieve more objectivity by capturing the full range of a global university's activities – research, teaching, knowledge transfer and internationalisation – and allows users to rank institutions (including 178 in Europe) against five separate criteria: teaching (the learning environment rather than quality); international outlook (staff, students and research); industry income (innovation); research (volume income and reputation); and citations (research influence).”

It is noticeable the Lords showed not the slightest concern, even if they were aware of it, about the THE rankings’ apparent discovery in 2010 that the world’s fourth most influential university for research was Alexandria University.

The complaints about U-Multirank seem insubstantial, if not actually incorrect. The committee’s report says the rankings field is overcrowded. Not really: there are only two international rankings that make even the slightest attempt to assess anything to do with teaching. The THE World University Rankings included only 178 European universities in 2011 so there is definitely a niche for a ranking that aims at including up to 500 European universities and includes a broader range of criteria.

All of the other complaints about U-Multirank, especially reliance on data collected from institutions, would apply to the THE and QS rankings, although perhaps in some cases to a somewhat lesser extent. The suggestion that U-Multirank is wasting money is ridiculous; €2 million would not even pay for four subscriptions to the Russell Group.

Debate

In the ensuing debate in the Lords there was predictable scepticism about the U-Multirank proposal, although Baroness Young of Hornsey was quite uncritical about the THE rankings, declaring that “(w)e noted, however, that existing rankings, which depend on multiple indicators such as the Times Higher Education world university rankings, can make a valuable contribution to assessing the relative merits of universities around the world”.

In February, the League of European Research Universities, or LERU, which includes Oxford, Cambridge and Edinburgh, announced it would have nothing to do with the U-Multirank project.

Its secretary general said "(w)e consider U-Multirank, at best an unjustifiable use of taxpayers' money and at worst a serious threat to a healthy higher education system". He went on to talk about "the lack of reliable, solid and valid data for the chosen indicators in U-Multirank”, about the comparability between countries, about the burden put upon universities to collect data and about “the lack of 'reality-checks' in the process thus far".

In May, the issue resurfaced when the UK Higher Education International Unit, which is funded by British universities and various government agencies, issued a policy statement that repeated the concerns of the Lords and LERU.

Since none of the problems with U-Multirank are in any way unique, it is difficult to avoid the conclusion that higher education in the UK is turning into a cartel and is extremely sensitive to anything that might undermine its market dominance.

And what about THE?

What is remarkable about the controversies over QS and U-Multirank is that Times Higher Education and Thomson Reuters, its data provider, have been given a free pass by the British and international higher education establishments.

Imagine what would happen if QS had informed the world that, in the academic reputation survey, its flagship indicator, the top position was jointly held by Rice University and the Moscow State Engineering Physics Institute (MEPhI)! And that QS argued this was because these institutions were highly focused, that they had achieved their positions because they had outstanding reputations in their areas of expertise and that QS saw no reason to apologise for uncovering pockets of excellence.

Yet THE has put Rice and MEPhI at the top of its flagship indicator, field- and year-normalised citations, given very high scores to Tokyo Metropolitan University and Royal Holloway London among others, and this has passed unremarked by the experts and authorities of university ranking.

For example, a recent comprehensive survey of international rankings by Andrejs Rauhvargers for the European University Association describes the results of the THE reputation survey as “arguably strange” and “surprising”, but it says nothing about the results of the citation indicator, which ought to be much more surprising.

Let us just look at how MEPhI got to be joint top university in the world for research influence, despite its lack of research in anything but physics and related fields. It did so because one of its academics was a contributor to two multi-cited reviews of particle physics. This is a flagrant case of the privileging of the citation practices of one discipline which Thomson Reuters and THE supposedly considered to be unacceptable. The strange thing is that these anomalies could easily have been avoided by a few simple procedures which, in some cases, have been used by other ranking or rating organisations.

They could have used fractionalised counting, for example, the default option in the Leiden ranking, so that MEPhI would get 1/119th credit for its 1/119th contribution to the Review of Particle Physics for 2010. They could have excluded narrowly specialised institutions. They could have normalised for five or six subject areas, which is what Leiden University and Scimago do. They could have used several indicators for research influence drawn from the Leiden menu.
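A minimal sketch of the difference between full and fractional counting for a single many-author paper; the 119 contributing institutions follow the example above, while the citation total and the function names are invented for illustration:

```python
def full_counting_credit(citations, n_institutions):
    """Full counting: every contributing institution is credited with all the citations."""
    return citations  # the institution count is ignored

def fractional_counting_credit(citations, n_institutions):
    """Fractional counting: each institution gets its proportional share."""
    return citations / n_institutions

# A review with 119 contributing institutions and, say, 2000 citations
# (the citation total is invented purely for illustration).
print(full_counting_credit(2000, 119))        # 2000 -- full credit for a 1/119th contribution
print(fractional_counting_credit(2000, 119))  # ~16.8 -- the 1/119th share
```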

There are other things they could do that would not have had much effect, if any, on last year’s rankings, but that might pre-empt problems this year and later on. One is to stop counting self-citations, a step already taken by QS. This would have prevented Alexandria University getting into the world’s top 200 in 2010 and it might prevent a similar problem next year.

Another sensible precaution would be to count only one affiliation per author. This would prevent universities benefitting from signing up part-time faculty in strategic fields. Something else they should think about is the regional adjustment for the citations indicator, which has the effect of giving universities a boost just for being in a low-achieving country.

To suggest that two universities in different countries with the same score for citations are equally excellent – when, in fact, one of them has merely benefitted from being in a country with a poor research profile – is very misleading. It is in effect conceding, as John Stuart Mill said of a mediocre contemporary, that its eminence is “due to the flatness of the surrounding landscape”.

Finally, if THE and Thomson Reuters are not going to change anything else, at the very least they could call their indicator a measure of research quality instead of research influence. Why have THE and Thomson Reuters not taken such obvious steps to avoid such implausible results?

Probably it is because of a reluctance to deviate from their InCites system, which evaluates individual researchers.

THE and Thomson Reuters may be lucky this year. There will be only two particle physics reviews to count instead of three so it is likely that some of the places with inflated citation scores will sink down a little bit.

But in 2014 and succeeding years, unless there is a change in methodology, the citations indicator could look very interesting and very embarrassing. There will be another edition of the Review of Particle Physics, with its massive citations for its 100-plus contributors, and there will be several massively cited multi-authored papers on dark matter and the Higgs Boson to skew the citations indicator.

It seems likely that the arguments about global university rankings will continue and that they will get more and more heated.

Bad Mood Rising

In 2006 I tried to get an article published in the Education section of the Guardian, that fearless advocate of radical causes and scourge of the establishment, outlining the many flaws and errors in the Times Higher Education Supplement -- Quacquarelli Symonds (as they were then) World University Rankings, especially its "peer review". Unfortunately, I was told that they would be wary of publishing an attack on a direct rival. That was how University Ranking Watch got started.


Since then QS and Times Higher Education have had an unpleasant divorce, with the latter now teaming up with Thomson Reuters. New rankings have appeared, some of them to rapidly disappear -- there was one from Wuhan and another from Australia but they seem to have vanished. The established rankings are spinning off subsidiary rankings at a bewildering rate.

As the higher education bubble collapses in the West everything is getting more competitive including rankings and everybody -- except ARWU -- seems to be getting rather bad-tempered.

Rankers and academic writers are no longer wary about "taking a pop" at each other. Recently, there has been an acrimonious exchange between Ben Sowter of QS and Simon Marginson of Melbourne University. This has gone so far as to include the claim that QS has used the threat of legal action to try to silence critics.

"[Ben] Sowter [of QS] does not mention that his company has twice threatened publications with legal action when publishing my bona fide criticisms of QS. One was The Australian: in that case QS prevented my criticisms from being aired. The other case was University World News, which refused to pull my remarks from its website when threatened by QS with legal action.

If Sowter and QS would address the points of criticism of their ranking and their infamous star system (best described as 'rent a reputation'), rather than attacking their critics, we might all be able to progress towards better rankings. That is my sole goal in this matter. As long as the QS ranking remains deficient in terms of social science, I will continue to criticise it, and I expect others will also continue to do so."

Meanwhile the Leiter Reports has a letter from "a reader in the UK".

THES DID drop QS for methodological reasons. The best explanation is here: http://www.insidehighered.com/views/2010/03/15/baty
But there may have been more to it? Clearly QS's business practices leave an awful lot to be desired. See: http://www.computerweekly.com/news/1280094547/Quacquarelli-Symonds-pays-80000-for-using-unlicensed-software
Also I understand that the "S" from QS -- Matt Symonds -- walked out on the company due to exasperation with the business practices. He has been airbrushed from QS history, but can be found at: https://twitter.com/SymondsGSB
And as for the reputation survey, there was also this case of blatant manipulation: http://www.insidehighered.com/news/2013/04/08/irish-university-tries-recruit-voters-improve-its-international-ranking
And of course there's the high-pressure sales: http://www.theinternationalstudentrecruiter.com/how-to-become-a-top-500-university/
And the highly lucrative "consultancy" to help universities rise up the rankings: http://www.iu.qs.com/projects-and-services/consulting/
There are "opportunities" for branding -- a snip at just $80,000 -- with QS Showcase: http://qsshowcase.com/main/branding-opportunities/
Or what about some relaxing massage, or a tennis tournament and networking with the staff who compile the rankings: http://www.qsworldclass.com/6thqsworldclass/
Perhaps most disturbing of all is the selling of dubious Star ratings: http://www.nytimes.com/2012/12/31/world/europe/31iht-educlede31.html?pagewanted=all&_r=0
Keep up the good work. It's an excellent blog.

All of this is true although I cannot get very excited about using pirated software and the bit about relaxing massage is rather petty -- I assume it is something to do with having a conference in Thailand. Incidentally, I don't think anyone from THE sent this since the reader refers to THES (The S for Supplement was removed in 2008).

This is all a long way from the days when journalists refused to take pops at their rivals, even when they knew the rankings were a bit rum.

Times Higher Education Under 50s Rankings

Times Higher Education has now published its ranking of universities less than fifty years old.
The top five are:

1. Pohang University of Science and Technology
2. EPF Lausanne
3. Korea Advanced Institute of Science and Technology
4. Hong Kong University of Science and Technology
5. University of California, Irvine

They are quite a bit different from the QS young universities rankings. In a while I hope to provide a detailed comparison.

A University without Faculty: The Demise of the University of Phoenix and the Rise of the MOOCs

The University of Phoenix is now the largest university in America, but this may soon change. This mostly online institution is facing an accreditation sanction, which could force it to lose its Pell Grants, student loans, and other federal subsidies. Not only is the stock price taking a major beating, but massive layoffs are underway. 

Although we should not take enjoyment in other people’s job losses, it is important to focus on what happens when higher education is taken over by a soulless corporation. As the founder of the university has become a billionaire and has just received a $5 million retirement package, the school is shedding many of its on-the-ground employees. Like many other for-profit schools, the U. of Phoenix receives most of its funding from public monies, and then uses these funds to enrich administrators and shareholders and hire an army of marketers and recruiters in order to turn mostly under-represented minority students into unemployed debt slaves, and they do this by hiring all of their faculty off of the tenure system. In many ways, this school represents the extreme logic of the online education movement: eliminate tenure for the faculty, develop questionable distance education, cater to private corporations, and make students suffer with high debt levels and bogus degrees (actually very few students ever get their degrees, and very few get their promised jobs).

While online course providers like Coursera and Udacity appear to represent a much more progressive version of this high-tech education promotion, let us look at some of the statements that are coming out of the mouths of these not-for-profit, profit-seeking marketers. Here is Sebastian Thrun, founder of Udacity, from the UCLA forum (these quotes come from the rush transcript on Remaking the University): “Students rarely learn listening . . . or they never learn by listening. The challenge for us is to take this new medium and really bring it to a mode where students do something and learn by doing. And if you look at the broad spectrum of online technology with what happens. It doesn’t really take long time to point to video games. And most of us look down on video games. We’ve also played them. I know there are people in this room who play angry birds. Some people do. Some people don’t admit it. Angry birds is an wonderful learning environment because you get drawn in, you solve the physics problems but the big problem is that it stops at angry birds . . . if the angry birds was good enough to get into the masters students in physics. It would be an amazing experience and you could do this at scale.” The point I want to stress here is the claim that students never learn from listening. Following this logic, most of current education is simply useless, and we should just have students take out their smart phones and play Angry Birds all day.

During Thrun’s presentation at UCLA, this downgrading of traditional learning environments was connected to a downsizing of the faculty: “As we know that higher education is moving at a slower pace compared to the industry moves. We have been funded by a whole bunch of corporations that make the classes with us and there’s a number of classes launching soon on topics to be not covered in academia. If you look at the way the technology turns over, it will be 5-10 years in computer science [and] if you look at the way colleges turn over, it’s much more difficult because [with] tenure they are gonna be with us for 30 years so the national turnover rate for colleges is about 30 years. Industries it’s like 5-10 years. So there’s a disconnect between how the world changes and how colleges are able to keep up. Therefore in computer science it would be hard to find courses that teach technologies that are useful today such as IOS and all the wonderful things that they do. So the industries jumped in and funded us to build these classes.” According to this logic, since tenure requires a thirty-year commitment to the faculty, and industry and technology change at a much faster rate, we need to get rid of the secure faculty and replace them with student mentors and the latest technology.

Thrun’s argument fails to recognize that faculty also develop and change, and most faculty, including his own wife, now teach without tenure. His point of view also pushes the idea that technological change is always for the better, and even if it is not good, there is no way to resist it. As I have previously argued, we need to compare online courses to our best courses and not our worst, and we have to defend and define quality education and push for more funds to be spent on small, interactive classes. However, Thrun and other MOOC celebrators appear to have a disdain for their own teaching: “But in the existing classes, the level of services are often not that great. . . .I talked to numerous instructors and you divide the time the communal time and the personal time you give back to the students in terms of advising and grading . . . you can be lucky as a student for 3 credits class to get 3 hours of personal time. Many people laugh and many say I spend 10 min/student per class and the rest I give to my TAs. Charging $1,000-$4,000 for that to me is gonna be a question going forward.” Although I have often questioned what students are actually paying for in higher education, what Thrun is really questioning is the validity and value of large, impersonal lecture classes, and on this point, we are in agreement; still the question that remains is if large online courses can really provide the quality education they advertise.

At the last Regents meeting, many of these themes were continued as three computer science professors attempted to convince the UC system that online courses would make higher education “better, faster, and cheaper.” In her presentation for Coursera, Daphne Koller insisted that since students now have a very short attention span, the classic lecture has to be broken up into a series of short videos followed by an interactive question and answer system. She argued that this method paradoxically makes mass education personalized as it pushes students to constantly learn and be tested on material before they advance.

Like the other online course providers, in order to differentiate her “product” from the “traditional” model of education, Koller had to constantly put down the current way we educate students. Thus, she derided the “sage on the stage” and the inability of most students to ask questions in their large lecture classes. She also bemoaned the fact that no one wants to read students’ tests with identical questions and answers, and so the whole grading process can be given to computers and fellow students. Once again, this argument not only degrades the value and expertise of faculty, but it also treats students as if they need to be reimagined as programmable machines and free laborers. Yes, let’s have the students grade each other’s papers, and while they pay for their education, let us train them to work for free.

Another alarming aspect of the rhetoric of these providers is their constant reference to experimenting on students as they attempt to increase access to higher education. The idea presented at the Regents meeting is that since so many under-represented students cannot find places in the UC system, these students from underfunded high schools should be given an online alternative. Some have called this the Digital Jim Crow because wealthy students will still have access to traditional higher education, while the nonwealthy, under-represented minority students will be sent to an inferior online system. Of course this new form of educational segregation is being pushed under the progressive banner of expanding access.

Francis Marion University Review


Francis Marion University is a really great school. As people quoted that it is easy to get into but hard to get out. It is what a person does with his or her major that counts. The university prepares the student to go out into the real world and start a great career. There may be some professors that may appear as slackers but there are some really great professors at FMU who care and want to see the students progress. For the negative reviews about FMU, I am a graduate from FMU and have met lots of students who were slackers. Therefore, if a student does not put in the effort then what can FMU do if the student is not willing to put their best foot forward.


My time at FMU was generally positive. I am not sure why there are so many negative reviews about the school, but I have been gone for a while and do not know if things have changed that much. I came to FMU the first year that Dr. Carter took over and witnessed many changes to the school and its physical presence in Florence until I moved away in 2007. I enjoyed my time in the History Dept. The professors were engaging and worked with students. They expected students to make an effort in their studies and were not tolerant of slackers. I was also involved in the School of Education. It focused mainly on education theory and not enough on practical classroom issues. I have come to learn that this is fairly normal for most colleges and universities.


I am outraged at what happened to me while I was a student at FMU. They couldn't find me an advisor while I was a marketing major, so I had to get my schedule made by a woman in one of the offices in the school of business. Well, it turns out she gave me a Biology 105 class with a Biology 103 lab. Once I found out about this mistake, I informed her; she said she made no such mistake, and I was given an F because of it. They also have HORRIBLE chemistry teachers, especially that Anderson guy. He moved way too quickly and was in no way a good professor. Most of the students in my class were failing and complained about his teaching. Why isn't he fired yet? One last thing: I had a C in Spanish 101, and when grades came in, there was a D there instead. That caused my GPA to be a little lower than I wanted it to be, but the following semester the C miraculously appeared in place of the D, even though I hadn't retaken the course. I will never recommend anyone go to FMU (Frauds Merging University).

Clemson University Review

Clemson is a good school, no doubt about it. The campus is pretty and most of the students are friendly. However, there are the occasional snooty students who think they're so brilliant and better than everyone else. There are also kids nobody can figure out how in the world got accepted. Transfer students, for the most part, are a joke. Some transfer for financial purposes, but a lot are unintelligent and end up at Clemson somehow. If you're the outdoorsy type, you'll like Clemson. There is a lake on campus, the mountains are 45 minutes away, and beaches are 4-5 hours away. There really isn't anything to do in Clemson. The closest small city is 30 minutes away, and Charlotte and Atlanta are 2 hours away. Overall, the academics are good and the athletics are also good. Clemson football has recently become very good, except USC has become better. Clemson will always be the best school in the state of South Carolina. I would recommend it for the most part.

I love Clemson to death. I am an in-state student who always knew what college I was going to be attending. I would honestly never want to be at any other college. There are some major drawbacks to Clemson, though. This college is way overpriced, both with materials for class (books, eclickers, etc.) and housing on campus. I mean, why do I want to pay 7,000 dollars for a two-person apartment that's been there for 30 years? They will nickel-and-dime you every step of the way. If you can get past the huge price tag on this school, everything else is great. The campus is very diverse. With that in mind, I get tired of non-American TAs that I have to strain to understand or who cannot understand me.

Okay, so I'm literally just writing this review to warn/help/advise others on what to do if you're thinking about coming here. Background about me: in-state, engineering major, and I seriously still can't decide how I feel about this place; I'm not sure if I ever will. There are some truly great things about Clemson, in my opinion. The campus is well laid out and the school runs well and systematically. For some people, this school has everything you will ever want in a university. However, this school may 'shrink your world' in the process. I believe that a university setting should expand your world. Unfortunately, that's not the case with Clemson the majority of the time. Clemson may be a good fit for you if you...
-love sports -- love watching them, playing them, whatever
-love the idea of Greek Life - Greek life here is huge - I personally don't know too much about how it works on the inside, but from the outside - everyone that's in it seems to think the world of it. In fact, it seems to be their world.
-are very focused on maintaining your faith. This definitely isn't a negative aspect. Religion helps people remain grounded. There are many religious organizations on campus, and the majority of them are Christian. So, if this is something truly important to you, you'll find friends.
Clemson may not be a good fit for you if you...
-are used to a more cultured atmosphere - if you've spent your life interacting with people who are extremely knowledgeable about affairs in the modern world, you might be surprised by the lack of open-minded thinking here. Many people are going to hold the same, conservative opinions. I'm not saying everyone's like that; I've met plenty of people who are extremely knowledgeable, cultured, etc., but it's not as common. This is coming from someone who considers themselves to be conservative.
-aren't so enthusiastic about the outdoors. Clemson is hardly a town; it's not even really a village. Literally, downtown is one street. Again, plenty of people love going downtown and drinking, but there's no active 'night scene.' People go to frat parties and such, but if you're looking for entertainment elsewhere, the closest 'city' is Greenville, which is quite popular.
-are looking for a college with a lot of diversity. Whether that's good or bad really doesn't matter; Clemson is not diverse. It's one of the least diverse campuses in the nation. This really isn't a big issue to most people. I suppose the only people who would really notice are the people who aren't white, but whatever. That's a fact you should take into account.
Clemson has great weather, great sports, and some fantastic students. It all depends on what you're looking for. I tried to make this review as helpful as I could. I don't want to bash Clemson. It really has some positives, but there are negatives. If you really want to come here, make sure you understand what it's like. If you're out-of-state, I'm not sure why you would come here unless you were planning on rushing, playing a sport, or had some national scholarship. I'm sure the schools in your state are great. If you're in-state, you're still getting a good education for a comparatively reasonable price. Just know what kind of atmosphere you're looking for in a college.

The University of Florida Reviews

One description I heard of UF students (by a professor) was "bright but naive." I agree -- students were bright and curious but often lacked the knowledge and experiences more sophisticated students would take for granted. Most are from Florida and publicly schooled, but there is more diversity than you would expect from the remaining 20 to 25 percent who are out-of-state or international. Professors can be top notch -- I had one professor with a PhD from Princeton and another from Oxford, England -- and others were top in their field, highly published and regarded. Some students will take on the world; others will get married and stay in Florida. UF has many top-notch programs, and funding from sports plus the huge intake in education and allied health sciences helps keep the coffers filled, but it makes the place appear less intellectual. It's a mix, and the Greek system, while influential in student government, is diluted thanks to the large campus and many student activities.

In high school I was told I wouldn't get into UF. It was as if UF were on the level of an Ivy League school, but it's far from even a decent school for my major. I went to a community college, where I pulled all As and Bs, and got into UF. Not without trouble, though; these people aren't the brightest. They rejected me because I failed to answer an email of theirs regarding what class I was going to take. Really? Send something that important in an email and not a letter? Why did that matter anyway? I told them my choice several times and they simply didn't listen. I had to deal with this while I was in Germany doing an independent study, where I had internet access maybe once every other week. Other majors may have better resources, but mine was so horribly underfunded that I couldn't complete my major here. Language requires a lot of resources and money. I chose UF for its apparent "budget", its reputation, study abroad, club, faculty reviews, library, and language lab.
Well, they cut a massive number of programs and classes from the language department and several other departments, including computer science, philosophy, arts, music, and humanities. I cannot even remain a full-time student because they simply don't have enough classes that count toward my major. The TAs tend to be awesome, but it's really a 50/50 shot. One of them was mediocre (knew her stuff but didn't know how to teach) and the other was simply amazing and the best so far. I learned more from him than I did from the professor. The funds simply go to sports or their research.
The rep it has is really inflated. It's only good in its sciences and sports, and even that depends on the science. Most of the professors suck, plain and simple: experts who can't communicate or teach. The good ones are hard to get classes with because everyone flocks to them. Are they all bad? Of course not; there are some amazing professors! But not without its poor ones. This is typical of most schools, the good and the bad. The problem is that, with its reputation, this school shouldn't have that problem to the same degree.
Now, the study abroad opportunities are a mixed bag as well. The semester-long programs are VERY good and well worth it. Depending on your major or language preference, the head of that department can affect the quality. Most of the language abroad programs are excellent. The shorter ones, though, are nearly as expensive and last only one to six weeks, which is much less cost-effective and generally too rushed. Unless they are the only ones available in the department, try for the longer studies.
My language was supposed to have a club. Many of the clubs are not very active, and some are so inactive that they are impossible to get any benefit from because they don't do anything.
I researched the program prior to applying. I knew all the credentials of my professors in the department. I decided that a few inexperienced or poor student reviews were to be expected of any school. The problem is that many of the language programs are underfunded and the format of the classes has had to change. Many are now "hybrid" classes, although they are listed as in-class lectures. This simply doesn't work for many students. You're paying and working to interact with experts in their field; instead you're wallowing around talking to someone who knows as much as you (if that) and not improving your accent. I learned nothing in the higher classes besides how to expect a bad grade and no help. Whatever rules are set down, the professors must follow them, regardless of whether the students are benefiting.
For the most part, the library is amazing: study rooms, computers, and an amazing collection. It's awesome in 95% of the subjects you can study at UF. But if you are in a major with a small population of students, the library is largely useless. Most of the minority languages have no updated material.
For those who are language majors, go visit the language lab before enrolling and ask for the material. There was NOTHING updated for me. They handed me an ancient dictionary and a grammar book containing such old language that I got marked wrong for using it. Other languages are worse off.
If you are accepted into a major that has good funding and a large population, and you can afford it and don't mind the quality not living up to the rep, then at least you will get the UF name on your degree. If you are not in a popular major, beware. The money and budget are horribly abused and wasted on non-vital things while VITAL things need fixing. Poor management.
The campus is okay as far as safety. I did not live on campus, but if you pay for the better housing it is pretty safe. I hear of problems in the more communal living. Still, crimes are pretty regular; you get alerts sent to your phone. There is rape, sexual assault, etc. on campus. If you're irresponsible about yourself, you can bet something is going to happen. Act responsibly, avoid these people, keep cautious, and you'll be fine. Campus security is poor in my opinion. The city of Gainesville isn't safe, but it isn't horrible either; just typical city crime, with lots of beggars and petty theft. I wouldn't worry too much if you use your common sense.
Overall it's an okay school. If you're in one of their popular majors, you will be fine. Don't expect the best, but it will be good. If you aren't, then find somewhere else. Tuition hikes come every year, so prepare for that. If you don't mind being a number, like me, you will enjoy relatively small class sizes except in the most popular classes.


I've recently transferred from UF to another state school. HS counselors pump this school up like it's MIT or Harvard, when in reality it's not even in the same league as the Ivy League schools. This school is so freaking big that you're just another number. Most of the dorms are like 60 years old with no AC, much of the faculty doesn't speak English well, and you feel like they have no time for you because they're doing research. Sure, UF has good athletics, but I didn't come to college to party 24/7 or live for college football. Interestingly enough, everybody drinks underage here, and the cops don't care on game day. Don't believe the hype about UF; go visit other schools in the state before you pull the trigger. I'm now a senior at the University of Miami and loving it. More expensive, but worth every penny.