Benchmarking Research Performance in Department of Computer Science,
School of Computing, National University of Singapore

In April 1999, the Department of Computer Science at the National University of Singapore conducted a study to benchmark its research performance. The study shows that, using publication counts alone, NUS ranks 26th relative to a list of the top 70 CS departments in the US.

In our study, we chose to use statistics on conference publications. A total of 39 top conferences in Computer Science were used. We counted papers published from 1995 to 1997; we stopped at 1997 so that the proceedings from the most recent year would be likely to be available in the library. The following top 70 US CS departments, in the ranking published by the National Research Council (NRC), were used as our "authoritative ranking".
 

1 Stanford University
2 Massachusetts Inst of Technology
3 University of California-Berkeley
4 Carnegie Mellon University
5 Cornell University
6 Princeton University
7 University of Texas at Austin
8 U of Illinois at Urbana-Champaign
9 University of Washington
10 University of Wisconsin-Madison
11 Harvard University
12 California Institute of Technology
13 Brown University
14 Yale University
15 Univ of California-Los Angeles
16 Univ of Maryland College Park
17 New York University
18 U of Massachusetts at Amherst
19 Rice University
20 University of Southern California
21 University of Michigan
22 Univ of California-San Diego
23 Columbia University
24 University of Pennsylvania
25 University of Chicago
26 Purdue University
27 Rutgers State Univ-New Brunswick
28 Duke University
29 U of North Carolina-Chapel Hill
30 University of Rochester
31 State U of New York-Stony Brook
32 Georgia Institute of Technology
33 University of Arizona
34 University of California-Irvine
35 University of Virginia
36 Indiana University
37 Johns Hopkins University
38 Northwestern University
39 Ohio State University
40 University of Utah
41 University of Colorado
42 Oregon Graduate Inst Sci & Tech
43 University of Pittsburgh
44 Syracuse University
45 University of Pennsylvania
46 University of Florida
47 University of Minnesota
48 Univ of California-Santa Barbara
49 Rensselaer Polytechnic Inst
50 Univ of California-Santa Cruz
51 Univ of Illinois at Chicago
52 Washington University
53 Michigan State University
54 CUNY - Grad Sch & Univ Center
55 Pennsylvania State University
56 Dartmouth College
57 State Univ of New York-Buffalo
58 University of California-Davis
59 Boston University
60 North Carolina State University
61 Arizona State University
62 University of Iowa
63 Texas A&M University
64 University of Oregon
65 University of Kentucky
66 Virginia Polytech Inst & State U
67 George Washington University
68 Case Western Reserve Univ
69 University of South Florida
70 Oregon State University

We counted the number of papers published in the selected conferences by NUS and each of the 70 US computer science departments, and checked how well the publication counts agreed with the ranking published by the NRC. To measure the degree of disagreement, we counted the number of pairs of universities for which University A was ranked above University B but University B had the higher paper count. By this measure, NUS's estimated ranking among the US universities was 26th, and the publication-count ranking agreed with 80% of the relative rankings in the NRC study.

To further assess the quality of our method, we used a variant of a standard technique called "cross validation". Our method can be viewed as using the NRC ranking to estimate a weighting on conferences, and then using that weighting to rank the departments again. We performed the following experiment to validate the approach: first, we chose a weighting of the conferences using only the departments with odd-numbered ranks in the NRC list; then we took the resulting weights and counted the number of disagreements they produced on pairs of departments with even-numbered ranks. We found that our method agreed with 80% of the pairs of even-numbered-ranked universities.
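The pairwise-disagreement measure and the odd/even cross-validation split can be sketched as follows. The department names, paper counts, and helper function below are hypothetical illustrations, not the study's actual data or code.

```python
# Sketch of the pairwise-disagreement measure described above.
# All names and counts here are hypothetical, not the study's data.

def count_disagreements(ranked_depts, paper_counts):
    """Count pairs (A, B) where A is ranked above B in the reference
    ranking but B has a strictly higher paper count."""
    disagreements = 0
    total_pairs = 0
    for i in range(len(ranked_depts)):
        for j in range(i + 1, len(ranked_depts)):
            a, b = ranked_depts[i], ranked_depts[j]  # A ranked above B
            total_pairs += 1
            if paper_counts[b] > paper_counts[a]:
                disagreements += 1
    return disagreements, total_pairs

# Hypothetical reference ranking (best first) and paper counts.
reference = ["U1", "U2", "U3", "U4"]
counts = {"U1": 50, "U2": 60, "U3": 20, "U4": 10}

bad, total = count_disagreements(reference, counts)
print(f"{bad} of {total} pairs disagree "
      f"({1 - bad / total:.0%} agreement)")

# Cross-validation split as in the text: odd-numbered ranks are used
# to fit conference weights, even-numbered ranks to evaluate them.
odd_ranks = reference[0::2]   # ranks 1, 3, ...
even_ranks = reference[1::2]  # ranks 2, 4, ...
```

On the toy data above, only the (U1, U2) pair disagrees, so 5 of 6 pairs (about 83%) agree with the reference ranking.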

To address the effect of biases (e.g., large departments, the particular conferences selected), we also tried a variety of other methods, which balanced our prior knowledge about the prestige of conferences against information obtained by looking at where members of well-respected universities publish. Under these methods, our estimate of the department's ranking generally fell somewhere in the 20s. Please refer to the full report for details of the experiments.
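A minimal sketch of the weighted-count idea, assuming a simple linear score per department; the conference weights and paper counts below are entirely hypothetical, not the values fitted in the study:

```python
# Minimal sketch of a weighted publication count. The conference
# weights and paper counts are hypothetical, not the fitted values
# from the study.

# Per-conference weights, e.g. encoding prior beliefs about prestige.
weights = {"ConfA": 2.0, "ConfB": 1.0, "ConfC": 0.5}

# Papers per department and conference (hypothetical).
papers = {
    "DeptX": {"ConfA": 3, "ConfB": 5, "ConfC": 2},
    "DeptY": {"ConfA": 1, "ConfB": 8, "ConfC": 2},
}

def weighted_score(dept):
    """Sum of weighted paper counts for one department."""
    return sum(weights[conf] * n for conf, n in papers[dept].items())

# Rank departments by descending weighted score.
ranking = sorted(papers, key=weighted_score, reverse=True)
for dept in ranking:
    print(dept, weighted_score(dept))
```

A plain paper count is the special case where every conference has weight 1; the weighted variants let prestigious venues count for more.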