New Visions for Public Schools

Unlocking New York City’s High School Progress Report

The New York City Department of Education's high school Progress Report is the set of accountability metrics behind the overall "A" through "F" letter grade that the DOE has issued annually to most public schools since 2007.

Unlocking New York City’s High School Progress Report, a new study by New York University economist Sean Corcoran and his colleague, Grace Pai, represents the most comprehensive analysis of the Progress Report (PR) to date. The study, commissioned by New Visions for Public Schools, provides an in-depth examination of the tool's component parts and the nature of its metrics, as well as insight into its use of peer groupings to benchmark the performance of individual schools.

"Our goal in commissioning this important study is to contribute to the rich and ongoing conversations that the Department of Education has welcomed since the Progress Report's first iteration and which have been critical to its significant improvements over time," said Robert L. Hughes, president of New Visions.

The DOE high school PR evaluates schools on a range of metrics including attendance, credit accumulation, graduation rates, and scores on standardized assessments. Schools are compared to every other school in the city as well as to a group of forty "peer" schools whose incoming students share certain characteristics. In order to account for variation in incoming student populations, the comparison to peer schools is weighted three times as heavily as the citywide comparison.
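
To make the weighting concrete, the following minimal Python sketch shows one way a peer comparison might count three times as much as a citywide comparison. The range-based scoring function, the sample graduation rates, and the function names are illustrative assumptions, not the Progress Report's actual formula.

def relative_score(value, comparison_values):
    """Place a school's value within the range of a comparison group, scaled 0 to 1."""
    lo, hi = min(comparison_values), max(comparison_values)
    if hi == lo:
        return 0.5
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def combined_score(value, peer_values, city_values, peer_weight=0.75):
    """Weight the peer comparison three times as heavily as the citywide one (0.75 vs. 0.25)."""
    return (peer_weight * relative_score(value, peer_values)
            + (1 - peer_weight) * relative_score(value, city_values))

# Hypothetical graduation rates for one school, its peer group, and the city.
school_rate = 0.72
peer_rates = [0.60, 0.65, 0.70, 0.74, 0.80]
city_rates = [0.45, 0.55, 0.68, 0.72, 0.85, 0.95]
print(round(combined_score(school_rate, peer_rates, city_rates), 3))  # 0.585

Under this kind of weighting, a school's result depends mostly on how it fares against schools serving similar incoming students, which is the stated rationale for the peer comparison.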

Using data from the 2010-2011 school year, Corcoran and Pai elaborate on the key features of the PR methodology. Among their findings:

• Peer groupings are formed using an index based on the average proficiency of a school's incoming students, with adjustments for the percentage of students who receive special education services or are over-age. Importantly, peers affect a school's score only to the extent that they shape the range of outcomes against which its performance is benchmarked.

• While the use of peers moderates the correlation between incoming student characteristics and scores, schools' overall PR scores remain associated with many pre-existing risk factors, suggesting that a school's score can be influenced by factors outside of its control.

• The Peer Index has only a modest effect on the overall grade assigned to high schools, in large part because peer groups are themselves quite diverse. When the authors recalculated PR scores using a formula that ignored peers entirely in favor of a citywide comparison, about two-thirds of high schools received the same overall grade.

• The weighted Regents passing rate, a subcategory that accounts for a sizable portion of a school's overall score, is treated differently from other categories such as the graduation rate. It is benchmarked first against expectations based on 8th grade test scores, and second against the peer group and citywide averages. The implication of this double benchmarking is that schools with high-achieving students may be penalized for failing to meet mathematically impossible growth targets, as sketched below.
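
The short sketch below uses made-up numbers to illustrate how a growth target derived from prior achievement can climb above 100 percent for schools whose students enter with high 8th grade scores. The prediction formula, its coefficients, and the sample scores are hypothetical assumptions, not the Progress Report's actual calculation.

def predicted_pass_rate(avg_8th_grade_score, intercept=-0.40, slope=0.45):
    """Hypothetical expected Regents pass rate given average 8th grade scores (1-4 scale)."""
    return intercept + slope * avg_8th_grade_score

for avg_score in (2.5, 3.0, 3.3):
    target = predicted_pass_rate(avg_score)
    status = "attainable" if target <= 1.0 else "impossible (above 100%)"
    print(f"avg 8th grade score {avg_score:.1f}: target {target:.0%} -> {status}")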

In their conclusion, the authors offer several recommendations for improving the high school PR. They recommend that peer groups be based on more than a single-dimensional peer index. Among the options they propose is abandoning peer groups entirely in favor of comparing actual student performance to performance predicted from a broad range of student and school characteristics. If peer groups are maintained, they recommend an alternative method for assigning peers that takes pre-existing student and school characteristics into account.

"With thoughtful modification, the Progress Report can do a better job of measuring school impact," said Hughes.
