This morning, Rochester's city paper, the Democrat & Chronicle, ran a front-page article on grade inflation in local colleges. (Or, rather, the article's subtitle makes it sound as though it's about grade inflation.) The print edition also includes a chart noting the percentage of As & Bs handed out at each school:
SUNY Brockport: 63%
Finger Lakes Community College: 54%
Genesee Community College: 66%
SUNY Geneseo: 72%
Hobart and William Smith: 63%
Monroe Community College: 65%
Nazareth College: 82%
University of Rochester: 80%
Rochester Institute of Technology: 65%
Roberts Wesleyan College: 85%
[One liberal arts college, St. John Fisher, did not report.]
How to make sense of these statistics? Here we have a Ph.D.-granting university (the U of R), a technical school emphasizing mathematics & computer science (RIT), three community colleges, two SUNY four-year colleges, and three liberal arts colleges (of which Hobart and William Smith is the toniest). Some of these things are not like the others. The CCs offer only two-year degrees, while RIT offers both two- and four-year programs. Moreover, the U of R's admissions process is far more selective than that of any other college in the area. At the same time, the two SUNY schools draw many of their students from the three CCs.
Lumping all of these schools together, then, is a little awkward, and certainly makes it difficult to draw overarching conclusions. If selectivity correlates with higher GPAs, for example, then the U of R at first appears to be on safe ground--but shouldn't its GPAs be higher than those at Nazareth and Roberts Wesleyan? Similarly, some have speculated (myself included) that high tuition rates often lead to grade inflation. And yet SUNY Brockport ($4,350/year for in-state tuition) posts the same result as Hobart and William Smith ($30,000/year). For that matter, shouldn't Hobart and William Smith have the same results as Nazareth, Roberts Wesleyan, and the University of Rochester? In addition, given that RIT specializes in subjects that traditionally resist the inflationary impulse, shouldn't its results be markedly different from Brockport's?
The article, purportedly about "questions," actually focuses on explanations: many students take the independent study option (the U of R); students are now allowed to revise their work extensively (Brockport); students are graded on more assignments than they were several years ago (Hobart and William Smith). Only Kathleen Kutolowski, chair of Brockport's history department, notes that professors might be tempted to skew grades in order to get better course evaluations. Overall, however, there's no attempt to explain what the percentages might actually mean. If the U of R's students are superduperultra good, then are Roberts Wesleyan's superduperultramega good? If Brockport's faculty feel some pressure when it comes to teaching evaluations, do the U of R's feel some pressure when it comes to the yearly tuition? (And are Hobart and William Smith faculty just endowed with awesome reserves of virtue?)
Matthew Daneman, "High Grades Given at Area Colleges Spur Questions: Some See Schools Improving; Others See Soft Standards," Democrat & Chronicle 30 Oct. 2005: 14A.