A closer look at student performance in city schools


Parents in central cities seeking good educations for their children face the disconcerting reality that relatively few city school children pass standardized tests required by New York. In Rochester, Buffalo, Syracuse and Schenectady, fewer than 20 percent of students passed state-required English language arts and mathematics exams in 2016 and 2018.

John Bacheller

A considerable body of research shows that student performance is strongly related to socioeconomic status. Parents’ incomes and educational status predict almost three-quarters of the variation in performance between schools and school districts where parents are relatively affluent and well-educated and those where they are not. More than 90 percent of students were disadvantaged in Rochester, compared with more than 80 percent in Buffalo, Syracuse, Utica and Schenectady.

But socioeconomic status does not tell the whole story. There are variations in performance that are not accounted for by economic and educational status.

The quality of a child’s education has long-term consequences. Both socioeconomic status and student test performance are strongly related to economic status once students become young adults. According to a 2019 Georgetown University study, only 23 percent of students from low-income families whose math scores were below average achieved above-average socioeconomic status as young adults. By contrast, 47 percent of students from low-income families whose math scores were above the median did so. Among students from high-income families whose performance on 10th-grade math tests was above average, 80 percent had above-average SES as adults.

Adjusted for SES, students in Albany, Rochester and Syracuse perform below expectations. In Rochester, only 9 percent of students in grades three through eight passed the state-required exams in 2018. Based on the percentage of economically disadvantaged students and district educational levels, 17 percent were expected to pass. In Syracuse, only 13 percent passed; 18 percent were expected to pass. In Albany, 18 percent passed, compared with 31 percent predicted by the two factors.
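To make the “expected to pass” comparison concrete, here is a minimal sketch of the kind of two-factor model described above, assuming an ordinary least squares regression of district pass rates on the share of economically disadvantaged students and a district education measure. The variable names and numbers in the sketch are hypothetical placeholders chosen only for illustration; they are not the author’s data or model.

```python
# Illustrative sketch only: assumes an ordinary least squares fit of district pass
# rates on two predictors, the percent of economically disadvantaged students and a
# district education measure. All figures below are hypothetical placeholders.
import numpy as np

pct_disadvantaged = np.array([92.0, 81.0, 45.0, 30.0, 15.0])  # % economically disadvantaged
pct_college = np.array([18.0, 22.0, 35.0, 45.0, 60.0])        # % adults with a college degree
pct_passing = np.array([9.0, 13.0, 38.0, 52.0, 70.0])         # % passing grades 3-8 exams

# Design matrix with an intercept column, then fit the two-factor model.
X = np.column_stack([np.ones_like(pct_disadvantaged), pct_disadvantaged, pct_college])
coef, *_ = np.linalg.lstsq(X, pct_passing, rcond=None)

expected = X @ coef                 # model-predicted ("expected") pass rates
residual = pct_passing - expected   # negative values mean a district passes fewer
                                    # students than its SES profile predicts

# Share of between-district variation explained by the two factors (the same idea as
# SES predicting "almost three-quarters" of the variation mentioned earlier).
r_squared = 1 - np.sum(residual**2) / np.sum((pct_passing - pct_passing.mean())**2)
print(np.round(expected, 1), np.round(residual, 1), round(r_squared, 2))
```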

Though we might infer that unexplained performance variations are the result of differences in school quality, we cannot be certain that school quality is the cause, because only a limited number of variables are available in the data. Poor performance could be the result of differences in classroom instruction, but it also might be the result of other factors, such as student differences that existed in preschool years. We know that as early as third grade—the lowest grade level at which the tests are given—students in the poorly performing school districts did less well than predicted by their educational and economic status. This suggests that differences in early childhood might be important.

There are better measures of the impact of schools on student learning than comparing achievement on tests at a single point in time. One approach is to compare the educational growth of students in different schools and school districts. This approach reduces the effects of non-school-related differences.

When student educational growth is compared, different patterns appear. Differences among urban school districts are smaller under the educational growth measure than under the SES/education model. But differences within school districts are large in some cases. Charter schools, for instance, are more successful in some cities than in others in providing better settings for student educational growth.

Charter schools

Given the bleak academic performance of central city school students, it is not surprising that charter schools were established. The fundamental promise of these schools was to offer learning environments that would be more conducive to student success than existing district-operated schools. Supporters believe that charter schools, free of bureaucratic rules and limits placed on school districts, are able to incentivize teachers to help students improve and thus produce better results for city children.

Charter schools have been controversial. Teachers unions have argued that they divert needed resources from existing public schools and produce little real benefit. Assessed in aggregate across the nation, charter schools have received mixed evaluations.

Defenders point to consistent differences in charter school performance by state, reflecting differences in the way charter schools are authorized and evaluated. For example, a 2017 study by the Center for Research on Educational Outcomes (CREDO) at Stanford University found that, overall, students at charter schools performed significantly better than those at comparable district-operated schools. CREDO compares the educational growth of students at charter and district-operated schools to evaluate differences in outcomes. Overall, students at charter schools in New York performed significantly better in mathematics than those in schools run by school districts, by about one-tenth of a standard deviation (equivalent to a move from the 50th percentile to the 54th percentile). In English, the difference amounts to a change from the 50th percentile to the 51st percentile.

The difference in performance in New York was largely driven by charter school performance in New York City. Upstate differences between charter school and district school performance were not statistically significant. However, CREDO data shows that Uncommon Schools in Rochester was associated with a positive difference of 0.11 standard deviation in English compared with district-operated schools—a difference of 4.37 percentiles. In math, the difference was larger—0.26 standard deviation, or 10.26 percentiles.
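The percentile figures in the last two paragraphs follow directly from the effect sizes, assuming the growth measure is roughly normally distributed. The short sketch below reproduces the cited conversions to within rounding.

```python
# Convert an effect size measured in standard deviations into a percentile shift for
# the median student, assuming growth is approximately normally distributed.
from scipy.stats import norm

for effect_sd in (0.10, 0.11, 0.26):             # effect sizes cited above
    new_percentile = norm.cdf(effect_sd) * 100   # where the 50th-percentile student lands
    print(f"{effect_sd:.2f} s.d. -> {new_percentile:.1f}th percentile "
          f"(+{new_percentile - 50:.2f} percentile points)")
```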

Not all charter schools outperformed their traditional public-school counterparts. There were large variations in charter school performance. Nearly half significantly outperformed district schools, but slightly more than half performed at levels that were not significantly different from schools operated by school districts or performed worse.  

In other studies, CREDO has found that charter schools operated by networks that manage three or more schools generally outperform other charters. Several networks that perform well operate upstate, including Uncommon Schools and KIPP schools.

New York has recently begun publishing data on student growth at schools administering the state’s third through eighth grade exams. The state defines student growth as follows: “[A] Student Growth Percentile (SGP)…measure[s] a student’s improvement or growth relative to other students, considering the students’ prior academic histories…The SGP indicates whether a student grew more than or less than students with similar test histories in the state.” New York defines elementary/middle-level growth as “three years of student-level growth in ELA and mathematics combined.”
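The sketch below illustrates the idea in that definition, ranking each student’s current result against peers with similar prior scores. It is not the state’s actual algorithm, which conditions on fuller test histories; the one-year score bands and the scores themselves are hypothetical.

```python
# Simplified illustration of a Student Growth Percentile: rank each student's current
# score against students with similar prior scores. The banding rule and all records
# here are hypothetical; New York's actual methodology is more elaborate.
from collections import defaultdict

# (student_id, prior_year_score, current_year_score)
records = [("a", 280, 300), ("b", 282, 290), ("c", 279, 310),
           ("d", 305, 315), ("e", 303, 301), ("f", 307, 330)]

def score_band(prior_score, width=10):
    """Bucket prior scores into bands of `width` points ("similar test histories")."""
    return prior_score // width

bands = defaultdict(list)
for sid, prior, current in records:
    bands[score_band(prior)].append((sid, current))

sgp = {}
for peers in bands.values():
    for sid, current in peers:
        outgrown = sum(1 for _, other in peers if other < current)
        sgp[sid] = 100 * outgrown / len(peers)   # percent of similar-history peers outgrown

print(sgp)   # higher values mean more growth than peers with similar prior scores
```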

Analyzing student progress to assess the quality of schools and school districts offers significant advantages over simple comparisons of student test scores. Progress comparisons filter out performance differences that are unrelated to school quality. Even so, the student growth measure has weaknesses unless additional controls are used. In New York’s case, three additional controls are included in constructing the measure: the percentage of English language learners in the classroom, the percentage of students with disabilities, and the percentage of students in poverty.

Educational growth in school districts

Although Albany, Rochester and Syracuse performed relatively poorly compared with what the SES/education model predicted, the state’s student educational growth data did not show the same performance deficit. Every large upstate city except Binghamton showed student educational growth within two percentiles of the average for all school districts in the counties where they are located. Utica did best: the average student in the district was in the 52nd percentile statewide, 2.95 percentiles higher than the county average of roughly the 49th percentile.

In the upstate metropolitan counties studied, the districts that performed the best were a mix of large and small, suburban and rural places. In the best-performing district, Whitney Point, the average student’s performance growth placed the student in the 55th percentile.

The weakest-performing districts were also a mix of district types. In the worst-performing district, the average student’s performance growth placed him or her in the 41st percentile.

Although the gap between the best-performing district and the worst is large—15 percentiles—most differences between districts in the upstate metropolitan counties studied were small—64 percent of the districts were within plus or minus two percentiles of the average for all districts.

At the same time, there are large variations in student educational growth among schools within the same school district.

Educational growth in schools

The following section is based on student growth data published by New York for 2017-2018. Charter schools from three counties—Albany (Albany), Monroe (Rochester) and Erie (Buffalo)—were included in this analysis because data was available for five or more charter schools in each of these counties.

This chart shows the average elementary/middle-level growth percentile at each school in the three cities compared with other schools. Three-quarters of the schools had EM growth percentiles that were within a range of only six points—between the 46th and 52nd percentiles.

Charter school performance also varied in the 2017-2018 New York data. The best-performing charter school (Buffalo Academy of Science, 69th percentile) and the worst (Charter School of Inquiry in Buffalo, 39th percentile) were 30 percentiles apart in student growth. Similarly, there were large variations among district-operated schools in each of the cities. For example, at the best-performing district-operated school (the Montessori School in Albany), average EM student growth was in the 60th percentile, while at the worst (PS 82 in Buffalo), the average student was in the 38th growth percentile compared to students statewide.

School sizes are relatively small. Random variations in performance could occur because of the small number of test subjects in each. To prevent misinterpretations of differences in school performance because of sample variability, I use the method applied in the CREDO study. For schools to be labeled better- or worse-performing than average, a statistical test had to show with 95 percent confidence that the school’s student growth percentile was different from the average (49th percentile) of school districts in the areas examined.
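One way to implement such a 95 percent confidence screen, sketched below, is a one-sample t-test of a school’s student growth percentiles against the 49th-percentile benchmark. The test choice and the school’s scores here are illustrative assumptions, not necessarily the exact procedure used in the CREDO study or in this analysis.

```python
# Sketch of a 95 percent confidence screen: label a school above or below the
# benchmark only if a one-sample t-test rejects equality at the 0.05 level.
# The benchmark and the student growth percentiles below are illustrative.
from scipy.stats import ttest_1samp

BENCHMARK = 49.0   # average growth percentile of districts in the areas examined

def classify_school(student_growth_percentiles, alpha=0.05):
    t_stat, p_value = ttest_1samp(student_growth_percentiles, BENCHMARK)
    if p_value >= alpha:
        return "average"   # cannot be distinguished from the benchmark
    return "above average" if t_stat > 0 else "below average"

# A small school whose students' growth hovers near the benchmark is not flagged.
print(classify_school([55, 47, 60, 52, 44, 58, 49, 63, 51, 46]))   # -> "average"
```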

NYSED data from 2017-2018 shows that in the three cities and counties studied, charter schools did perform better overall than district-operated schools. Thirty-seven percent of charter schools had average EM growth percentiles that were significantly above average, compared with 23 percent of district-operated schools. Even so, 63 percent of charter schools’ average EM growth percentiles were average or below average. For district-operated schools, 77 percent were average or significantly below average. But there were large variations in the performance of district-operated and charter schools in each of the three cities studied.

Rochester and Albany had the smallest percentage of district-operated schools that had EM student growth that was significantly above average. In those cities, charter school performance was better than the performance of district-operated schools, with 55 percent of charter schools in Rochester and 40 percent in Albany having significantly greater student growth than the average. Only 13 percent of district schools were above average in each city. In Albany, nearly half of district-operated schools were below average. None of the charter schools in Albany showed below-average student educational growth. 

Buffalo differed. There, a higher percentage of district-operated schools had significantly above average growth (33 percent) than charter schools (21 percent). But 20 percent of district-operated schools in Buffalo were below average, compared to 7 percent of charter schools.

Rochester

In Rochester, six of nine schools with significantly above-average EM growth percentiles were charter schools, compared to only three district-operated schools. Among schools that had EM growth percentiles that were significantly below average, seven of nine were district-operated schools.

Most schools operated by the Rochester school district had EM growth percentiles that were statistically average. For many of these students, seeking to transfer to a school ranking higher might not offer a significant advantage, given the amount of variation in school performance that could be associated with statistical sample “noise.” For students in below-average schools operated by the Rochester school district, moving to a better-performing school could provide greater educational opportunity.

Maintaining quality

Although differences in student growth between city school districts were not large, within school districts there were relatively large differences between the best- and worst-performing schools. In two of three cities—Albany and Rochester—charter schools as a group outperformed those operated by school districts. But in those cities some charter schools performed poorly, and some district-operated schools performed well.

For students, avoiding schools whose performance is significantly below average could be beneficial to student growth. But geographic accessibility and realities of competition for limited seats in better-performing schools can make it difficult to get into them.

For policy makers there are several challenges. Attempts to turn around poorly performing schools do not have a promising track record. As Brian Backstrom of the Rockefeller Institute at the State University of New York writes: “Over the past half-century, billions of dollars have been spent across the nation on efforts to transform persistently low-performing public schools — most of them urban, most of them low-income, and most of them disproportionately enrolled with students of color — into models of success. It hasn’t worked.”

School turnaround efforts are often frustrated by forces that contribute to organizational inertia: unions that fear members will lose their jobs, uneven implementation stemming from differences in the commitment and ability of senior and middle-level managers, and insufficient long-term financial commitments.

Backstrom points out that the most effective approach to improving student performance has been to close poor-performing schools. But doing so faces practical impediments—most notably that better alternatives must be available to displaced students if efforts are to succeed. This can be difficult, because existing, better-performing schools usually have enrollment constraints that limit the number of additional students that they can accommodate.

Where city school districts have seen improved results, charter schools often play an important role. The challenge here is twofold. One issue is scalability. Evidence suggests that some charter school operators with proven track records—KIPP and Uncommon Schools are two examples—are more likely to provide statistically superior results than independently operated charter schools, but the successful organizations are constrained by their organizational capacity to grow while maintaining quality. 

Charter schools take time to establish and, in many cases, need substantial private financial resources to support their efforts. Quality control is a concern as well. Student performance at some charter schools is significantly worse than at most district-operated schools. These schools do not add meaningful choices for those seeking greater educational growth. Here, the state Education Department should be vigilant in monitoring charter school performance.

John Bacheller, former head of the policy and research division of Empire State Development, is an author of Policy by Numbers, a blog about data and policy at the state level, with a focus on Upstate New York.

