The QS folks were booted by the Times Higher Education folks a while back, partly for methodological reasons, and perhaps partly because their business model is dubious and full of conflicts of interest (see also). We've commented on the "Quirky Silliness" rankings before: see here, here, here, and here.
In any case, the new rankings are out, and the 2017 edition includes a "world" ranking for philosophy, based on some meaningless, non-academic factors, as well as several putatively academic considerations. We noted several years ago the disreputable method by which QS rounded up evaluators, but things appear to have gotten better in the interim. I have filled out these surveys several years running now, including this year. QS still uses a foolish methodology, in which evaluators are asked to name the top programs in the specialty, meaning the ultimate rank is determined by the dumbest evaluator (e.g., the one who forgot to list NYU among the top 10 or 15 programs in the field). The resulting academic reputation scores are a weird artifact of (1) some actual knowledge on the part of evaluators; (2) pure halo effect (e.g., Oxford has an academic reputation score of 90.2, Cambridge 89.0, even though Oxford is dramatically stronger than Cambridge, and Cambridge is not better than, e.g., Princeton [see below]); and (3) the geographic and institutional distribution of evaluators, which QS still, absurdly, does not disclose. Here is the "academic reputation" of the top 20 U.S. programs according to QS:
1. University of Pittsburgh (100)
2. New York University (92.5)
3. Rutgers University, New Brunswick (92.4)
4. Harvard University (88.7)
5. University of California, Berkeley (86.2)
6. Princeton University (85.8)
7. University of Notre Dame (84.0)
8. Stanford University (83.4)
9. Yale University (82.8)
10. University of Chicago (79.7)
11. Columbia University (79.6)
12. University of Michigan, Ann Arbor (79.1)
13. University of California, Los Angeles (77.9)
14. Massachusetts Institute of Technology (77.8)
15. University of North Carolina, Chapel Hill (76.1)
16. City University of New York (71.6)
17. Boston University (70.6)
18. Boston College (69.2)
19. Cornell University (66.5)
20. University of California, San Diego (64.9)
While in the past NYU and Rutgers did not do well in the academic reputation component, that has changed, correctly. I infer from the results that QS got a high response rate from philosophers in Germany--thus programs with faculty who have a strong presence in Germany (like Pittsburgh [e.g., Robert Brandom], Chicago [e.g., Robert Pippin], and BU [e.g., Manfred Kuehn]) did surprisingly well. (German programs also score quite highly, which confirms my suspicion.) Throughout large parts of the world, the local Catholic or "Pontifical" university is often a major center of research excellence, and so I surmise many evaluators are solicited from those schools--which would explain the surprisingly strong showings of Notre Dame and, especially, Boston College. Obviously top but somewhat narrowly "analytic" programs like Michigan and UCLA do not do as well as they should, presumably because they lack visibility in other parts of the world.
Even weirder are the results for "citations per paper":
2. Princeton University (95.4)
3. Rutgers University, New Brunswick (94.6)
4. University of Pennsylvania (94.3)
5. University of Colorado, Boulder (94.2)
6. University of California, San Diego (94.0)
7. Yale University (93.8)
8. Massachusetts Institute of Technology (93.5)
9. Carnegie Mellon University (92.6)
10. University of Southern California (92.1)
11. University of North Carolina, Chapel Hill (91.7)
12. University of Pittsburgh (91.5)
13. New York University (91.2)
14. University of Washington, Seattle (91)
15. University of Michigan, Ann Arbor (90.9)
16. University of Miami (90.7)
17. Harvard University (90.3)
18. University of Texas, Austin (90.1)
18. University of Wisconsin, Madison (90.1)
20. Northwestern University (89.5)
Since QS does not disclose the faculty lists used for these calculations, it is again impossible to know how to interpret these data. But the citations listing at least includes programs that are missing from the academic reputation survey solely because of geographic and halo-effect biases on the part of evaluators.