CPS creates new “better” bad school category

CPS Guidelines For School Actions reads: “The CEO may only propose a [school] closure…if the students impacted…have the option to enroll in a higher performing school.” The Commission on School Utilization says, “The goal must be to enroll all displaced students into higher-performing schools.”

Looking at the list of closing and receiving schools—and the associated data—the question naturally arises, “Is CPS living up to its own standard?” The answer, unfortunately, is “Not always,” or more troubling, “No one really knows.”

For many years, CPS has classified schools as performing at levels 1 (best), 2, or 3 (worst). However, when you examine levels at closing and receiving schools, you find that in 25 cases, students are slated to be sent to a school that is not rated at a higher level.

CPS seems to have temporarily redefined “higher performing” to mean:

. . .higher on the majority [three out of four] of the following metrics for the 2011–2012 school year: percentage of points on the Performance Policy, ISAT composite meets or exceeds score, Value Added reading, and Value Added math.

CPS has created a new category of “better” bad that allows it to claim that all displaced students will be sent to better schools. Yet even this “better” bad standard is not met in the case of closing Owens, whose students are slated to be sent to Gompers.
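
To make that redefinition concrete, here is a minimal sketch in Python of the “three out of four metrics” comparison quoted above. The function name and all school figures are invented for illustration; this is not CPS’s code or data.

```python
# Minimal sketch (not CPS's actual code) of the "three out of four metrics"
# test quoted above. All school figures below are hypothetical.

METRICS = [
    "performance_policy_pct",  # percentage of points on the Performance Policy
    "isat_meets_exceeds",      # ISAT composite meets-or-exceeds score
    "value_added_reading",     # Value Added reading
    "value_added_math",        # Value Added math
]

def higher_performing(receiving: dict, closing: dict) -> bool:
    """True if the receiving school beats the closing school on at least
    three of the four 2011-2012 metrics."""
    wins = sum(receiving[m] > closing[m] for m in METRICS)
    return wins >= 3

# Hypothetical pair: the receiving school wins on only two of the four
# metrics, so it would fail even the relaxed standard.
closing = {"performance_policy_pct": 41.0, "isat_meets_exceeds": 55.2,
           "value_added_reading": -0.3, "value_added_math": 0.1}
receiving = {"performance_policy_pct": 44.5, "isat_meets_exceeds": 53.8,
             "value_added_reading": -0.5, "value_added_math": 0.4}

print(higher_performing(receiving, closing))  # False
```

Note the built-in slack: under this rule a receiving school can score lower on one of the four metrics, including ISAT, and still count as “higher performing.”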

Using Value Added scores for this purpose, however, is problematic. The mechanics of calculating the scores are complex and education experts agree that these scores should not be used in this way.

Making matters worse, when you examine the year-to-year stability of the underlying CPS metrics, they turn out to be shockingly unreliable. From 2011 to 2012, 77 schools flipped from the top quarter of schools in Value Added reading to the bottom quarter, or vice versa.
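
For readers who want to try this check themselves, here is a rough sketch of how such a quartile-flip count can be tallied. The column names and the eight sample scores are made up for illustration; the real tally would run over the underlying CPS school-level data.

```python
# Rough sketch of a year-over-year quartile-flip count. Column names and
# sample scores are hypothetical, not CPS's actual files.
import pandas as pd

df = pd.DataFrame({
    "school": ["A", "B", "C", "D", "E", "F", "G", "H"],
    "va_reading_2011": [1.2, -0.9, 0.4, -1.5, 0.8, -0.2, 1.6, -1.1],
    "va_reading_2012": [-1.3, 1.1, 0.2, -0.1, -1.4, 0.9, 0.3, 1.5],
})

# Rank each year's scores into quartiles (1 = lowest, 4 = highest).
df["q2011"] = pd.qcut(df["va_reading_2011"], 4, labels=[1, 2, 3, 4]).astype(int)
df["q2012"] = pd.qcut(df["va_reading_2012"], 4, labels=[1, 2, 3, 4]).astype(int)

# A "flip" is top quarter one year and bottom quarter the next, or vice versa.
flipped = df[((df["q2011"] == 4) & (df["q2012"] == 1)) |
             ((df["q2011"] == 1) & (df["q2012"] == 4))]

# With this made-up data, prints: 2 of 8 schools flipped between extreme quartiles
print(len(flipped), "of", len(df), "schools flipped between extreme quartiles")
```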

How can CPS justify closing a school because it had a very low Value Added score in the most recent year, even though it had a very high score the year before? By CPS’s own methods, might that school be just as likely to post a very high score next year? Statistically, such an extremely “noisy” number yields a distinction without a difference, and it is certainly no marker of “better.”

In recent weeks, the debate has centered on utilization rates, savings, safety, neighborhood disruption, and racial discrimination. The statistical evidence adds another concern: at the very least, many students will not be moving to a higher performing school.

3 thoughts on “CPS creates new “better” bad school category”

  1. Is there any way to find out which educators have a problem with this …
    “Using Value Added scores for this purpose, however, is problematic. The mechanics of calculating the scores are complex and education experts agree that these scores should not be used in this way.”
    and which 77 schools switched places this year?
    We would use it to defend our school at the meetings.

    Thank you

    • Hi Alexander, there are many sources for criticism of value-added modeling (VAM). If you search the internet for “problems with value-added scores” you will find articles by Daniel McCaffrey (RAND), Jack Jennings (Center on Education Policy), Jesse Rothstein (UC Berkeley), and others. Even enthusiastic proponents of VAM caution against using it for the types of purposes that CPS is using it for in these school actions. It is all too tempting to ignore the weaknesses of VAM in the pursuit of an “easier” way to assess schools and teachers. We’ll get back to you on the SchoolCuts blog about the schools that switched places in their VA scores this year.
