Community Trust CEO Mazany cites SchoolCuts

“The very day that CPS announced the school closing list, that evening a group of software coders put up the site Schoolcuts.org,” says Terry Mazany, President and CEO of the Chicago Community Trust, who had also served as interim CEO of the Chicago Public Schools in 2010.

“They had aggregated all the datasets about school performance. And geomapped the schools that are on the list for closing …. That’s the sort of service you would hope that government might provide but these groups did it out of a sense of community service,” he says.

“They had this site up and running — and it is masterful.” 

Read more:
http://www.knightdigitalmediacenter.org/blogs/sduros/2013/04/how-chicago-community-trust-and-opengov-chicago-are-creating-new-type-accountab

Elnaz quoted in Tribune article

In the Trib’s story, “School closings: A closer look at CPS strategy,” Elnaz is quoted in the fourth paragraph of the excerpt below, commenting on the use of Value Added scores in determining which schools to close.

– – – – – –

At the 25 schools whose students are being moved to a school at the same performance level, CPS considered additional criteria.

Adam Anderson, CPS’ officer of portfolio, planning and strategy, said the district looked for the receiving school to outscore the closing school on at least three of four measures: a higher percentage score within the performance rating; composite meets or exceeds score on the ISAT; an improvement metric for reading; and another one for math.

With two of the four measures dealing directly with improvement, schools with solid scores that dipped slightly in some cases fared worse than poorly performing schools that could show improvement.

Elnaz Moshfeghian of Open Data Institute, which helps produce the “Schoolcuts” blog to study the data surrounding CPS’ school closings, said a closer look at that controversial improvement metric shows it changes for schools year to year.

“It’s been called a noisy number,” Moshfeghian said. “It is not a reliable and stable school metric and should not be half the reason why one school stays open and the other closes. You wonder why it’s being used at all.”

Anderson said the improvement, or value-added, score did not carry significant weight as the district decided which schools to close.

“Value-added scores alone would not have made a school higher-performing,” he said. “If you look at total weighting, in our best three of four measure it’s much further weighted toward ISAT measures than value added.”

– – – – – –

Full story at http://trib.in/114lsUl

CPS creates new “better” bad school category

CPS Guidelines For School Actions reads: “The CEO may only propose a [school] closure…if the students impacted…have the option to enroll in a higher performing school.” The Commission on School Utilization says, “The goal must be to enroll all displaced students into higher-performing schools.”

Looking at the list of closing and receiving schools—and the associated data—the question naturally arises, “Is CPS living up to its own standard?” The answer, unfortunately, is “Not always,” or more troubling, “No one really knows.”

For many years, CPS has classified schools as performing at levels 1 (best), 2, or 3 (worst). However, when you examine levels at closing and receiving schools, you find that in 25 cases, students are slated to be sent to a school that is not rated at a higher level.

CPS seems to have temporarily redefined “higher performing” to mean:

. . .higher on the majority [three out of four] of the following metrics for the 2011–2012 school year: percentage of points on the Performance Policy, ISAT composite meets or exceeds score, Value Added reading, and Value Added math.

CPS has created a new category of “better” bad that allows it to claim that all displaced students will be sent to better schools. Yet even this “better” bad standard is not met in the case of closing Owens, whose students are being sent to Gompers.
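The “best three of four” rule described above can be sketched in a few lines of Python. This is only an illustration of the comparison logic; the metric names and the numbers below are placeholders, not actual CPS data.

```python
# Hypothetical sketch of CPS's "best three of four" comparison.
# Metric names and values are illustrative, not actual CPS data.

METRICS = [
    "performance_policy_pct",   # percentage of points on the Performance Policy
    "isat_meets_exceeds",       # ISAT composite meets-or-exceeds score
    "value_added_reading",      # Value Added, reading
    "value_added_math",         # Value Added, math
]

def is_higher_performing(receiving, closing):
    """Return True if the receiving school outscores the closing
    school on at least three of the four metrics."""
    wins = sum(receiving[m] > closing[m] for m in METRICS)
    return wins >= 3

# Illustrative comparison: the receiving school wins 3 of 4 metrics,
# so it counts as "higher performing" under this redefinition.
closing = {"performance_policy_pct": 41.0, "isat_meets_exceeds": 55.2,
           "value_added_reading": 0.4, "value_added_math": -0.1}
receiving = {"performance_policy_pct": 46.5, "isat_meets_exceeds": 58.9,
             "value_added_reading": 0.1, "value_added_math": 0.3}
print(is_higher_performing(receiving, closing))  # True
```

Note that under this rule the two Value Added metrics alone can decide the outcome whenever the other two measures split, which is exactly the concern raised below.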

Using Value Added scores for this purpose, however, is problematic. The mechanics of calculating the scores are complex, and education experts agree that the scores should not be used in this way.

Making matters worse, when you subject the underlying CPS metrics to statistical analysis, they turn out to be shockingly unstable. From 2011 to 2012, 77 schools flipped from the top quarter of schools in Value Added reading to the lowest quarter, or vice versa.

How can CPS justify closing a school because it had a very low Value Added score in the most recent year even though it had a very high score in the previous year? Using their methods, might that school be just as likely to have a very high score next year? Statistically, this extremely “noisy” number is a distinction without a difference and certainly no marker of “better.”

In recent weeks, debates have centered on utilization rates, savings, safety, neighborhood disruption, and racial discrimination. Statistical evidence indicates that—at the very least—many students will not be moving to a higher performing school.