

Community College Chancellor Blasts 'Washington Monthly' Story on CCSF

Scott Lucas | September 9, 2013 | Politics

In late August, we ran an item about Washington Monthly's in-depth story ("America's Worst Community Colleges") criticizing the California Community College system and CCSF in particular for failing on a wide range of metrics. The piece's author, Haley Sweetland Edwards, argued that CCSF was an example of the state's "broken" higher education apparatus and that, according to the magazine's data, CCSF ranked a dismal 842 out of 1,011 community colleges around the country.

As you might expect, the story produced a backlash. As part of that, we received the following letter from California Community Colleges Chancellor Brice W. Harris. Here, in full, is his response, in which he calls Edwards's story "flawed and offensive," followed by the reporter's point-by-point rebuttal:

"[The Washington Monthly's] story and purported “rankings” of community colleges, especially those in the Bay Area, are based on a concocted methodology that wouldn’t pass muster in a community college research or statistics class. The respected researchers at the Center for Community College Student Engagement who collect and maintain the data [it] relied upon have denounced the survey, pointing out that the data have been 'manipulated in ways not transparent to the reader.'

[Its] convenient but discredited metrics ignore facts on the ground at community colleges in California and elsewhere in the country. Nearly a third of our students take just one course—for example, a computer software brush-up class—and don’t return after getting the instruction for which they came. Yet [the magazine] counts these students as intending to pursue a certificate, degree or transfer and incorrectly reckons that they dropped out. [The story's] graduation metric counts only full-time students entering in a fall semester and completing within three years. Seventy-two percent of California community college students are part-time, often juggling jobs, military service, and family responsibilities with academic or vocational pursuits. Any honest and rigorous examination of the metric [used] would have acknowledged the national consensus that has held these criteria in ridicule for years. Nowhere in the nation is community college participation higher than in California, which holds up open access as a bedrock principle of higher education and has been emulated around the world. Yet [the] survey penalizes this approach in favor of a four-year centric mindset.

An accurate view of student outcomes at California community colleges can be found in our Student Success Scorecard, which precisely measures completion and transfer rates for students pursuing those goals. It shows that more than 70 percent of college-prepared students seeking a certificate, degree or transfer achieve their goal.

[The story's] attempt to discredit the teaching and learning that is occurring at City College of San Francisco is especially flawed and offensive. To be sure, the college has challenges, which it will overcome. It may surprise your readers to know that City College of San Francisco actually performs higher than the statewide average in key student success indicators when using 2010-11 data, not the 2007 data your reporting relied on."

We reached out to the author of the original piece, Haley Sweetland Edwards, for her response. Here's what she told us:

"As for the [Center for Community College Student Engagement] complaint: CCSSE has issued a version of this same statement after each of the last two community college rankings. They pretty much have to, because their gig depends on colleges voluntarily choosing to administer the survey. A few points: CCSSE has administered the exact same survey every year since 2005, so there's no reason why results from 2007 can't be compared to later years. In fact, that's more or less the point of issuing identical surveys year to year. For community colleges, scoring in the top ten percent of one of the benchmarks is another way of saying 'A small minority of colleges in the Bay Area did well on a small minority of our measures.' We did not misstate the method for calculating the benchmarks; notice that CCSSE does not explain how our explanation is wrong or what explanation is right—they simply say we 'manipulated' them in 'ways not transparent.' How, exactly? Our two number crunchers here are professors and extremely well-respected policy wonks who study this stuff for a living [...] Neither is from California nor has any interest in punishing that fair state; they are simply reporting the numbers.

Again, there is nothing 'statistically wrong' about our ranking method. In fact, we have in the past shared the precise methodology with CCSSE, right down to the formulas in the spreadsheets, and would be happy to do so again. Notice what they're actually saying is 'undisclosed' and 'not transparent,' which, absent the publication of said formulas in the magazine, is unavoidable. The method CCSSE uses for calculating its benchmarks is also, for the record, non-transparent. They give no explanation on their own website and do not publish the formulas either.

As for the graduation metric: it's true that graduation rates are not an ideal measure. That's why we also looked at completions per 100 full-time equivalent students—a well-respected measure. See the Chronicle of Higher Education's illustration of federal data sets: even by that measure, California CCs are STILL doing nearly 40 percent worse than the national average, with 10.6 completions per 100 FTE students versus the national average of 14.2 for public two-year colleges.

But even richer? The 70 percent graduation rate that Mr. Harris uses in his complaint [...] is based on a cohort of 430 students. The graduation rate metric he says is too narrow is a cohort of 1,372 students (first-time, full-time). So, um—yes. Graduation rate metrics are not ideal, Mr. Harris, but you can't disparage the cake and cite it, too.

My main question to Mr. Harris, and the many critics of this piece, is this: do you really think CCCs are serving the needs of the community? You may object to the (admittedly incendiary) language of 'Worst Community Colleges,' but if you take a step back and imagine yourself a black or Latino young man in California with the goal of getting a degree in, say, computer science—you've got about an eleven percent chance of making it through. Eleven percent. There is some reality in blaming the student for some part of that failure—many are not prepared for college-level work. There is also some reality in pointing at other states and noting that their two-year systems are also bad. But at some point, those who are running the system have to step up and say: Good god, something has got to change. We are simply not serving the student population that needs a post-secondary degree the most. California has always been a leader in higher education. It didn't get there by saying that downright bad is good enough."

We have reached back out to Chancellor Harris for a response to this letter and will update this post with any further communication between the parties.
