
The Statistics of the EQA Audit Report

Stats play major role in contract renewal dispute

NEWS ANALYSIS

As the issue of renewing Cambridge Superintendent of Schools Thomas Fowler-Finn’s contract takes center stage, both his supporters and detractors have looked to a recent audit of Cambridge schools to further their political cause.

Each side has accused the other of manipulating the 86-page report from the Mass. Office of Education Quality and Accountability (EQA) to suit its own agenda. Ascertaining what the report says, and specifically whether it paints the picture of an improving or a worsening school district, is a matter of critical importance to the schools’ quality and to Fowler-Finn’s contract renewal.

Fowler-Finn, the superintendent for the past three years, is the face of one side of the debate. He has argued that the data proves that the schools are “on the road to improving,” a claim that has been supported by the director of the EQA, Joseph Rappa.

Opposite Fowler-Finn is School Committee member Patricia M. Nolan ’80, a newcomer with years of management consulting experience. Nolan has often challenged the data presented by the school administration.

THE ACHIEVEMENT GAP

Nolan’s first data-based criticisms of the EQA report focus on the “achievement gap,” the gap in test scores among six subgroups of students: blacks, whites, Hispanics, Asians, special education students, and low-income students.

Nolan has pointed out that Cambridge’s subgroups fare differently when compared with the same subgroups’ state averages. Cambridge’s black students score below the state average, for example, while its Hispanic and Asian students score above it. The gaps in relative achievement widened between 2003 and 2005.

But Fowler-Finn points out that the gap has become more pronounced not because scores for some subgroups are rising while others are falling. He has argued that the discrepancy can be accounted for by the fact that scores for some groups are increasing faster than for others, and he says that all groups have seen their scores improve at least somewhat.

By comparing the worst-performing subgroup, special education students, with the best-performing, Asian students, Fowler-Finn showed how “education equity” could decrease even as test scores increased.

“Despite the broad scale improvement of all subgroups, the Asian students improved 9 points in [English Language Arts (ELA)] versus a 1 point improvement for the lowest scoring group, special education students,” Fowler-Finn wrote in an e-mail.

He added: “The EQA’s way of measuring progress, by comparing subgroups to subgroups, is rejected by many who feel that progress toward proficiency is the best way to determine the gap.”

The concern, then, is not that some students are improving while others are not, but that some students are improving faster than others.

PROGRESS TOWARD PROFICIENCY

But have the schools made overall “progress toward proficiency,” ensuring that the scores of all groups improve?

Test scores have improved across the board during Fowler-Finn’s tenure, from 2003 to 2005. But they have clearly improved more slowly than some committee members would like, and they still lag behind state averages.

From 2003 to 2005, overall scores on the ELA test increased by three points, from 74 percent testing proficient to 77 percent, and scores on the math test increased by five points, from 59 percent to 64 percent. All six subgroups improved on both tests from 2003 to 2005, except for low-income students on the ELA test, whose scores remained flat.

Test scores did decline in some subgroups between 2004 and 2005, as Nolan says, and overall ELA test scores declined by one point. Fowler-Finn attributes this to the fact that the state changed its tests between 2004 and 2005, and notes that the entire state’s average also declined by one point.

Still, as Nolan has pointed out, the school system has consistently performed below the state average. In 2005, 83 percent of Massachusetts students scored at the “proficient” level in ELA, compared with 77 percent in Cambridge. In math, the figures were 72 percent and 64 percent.

ADMINISTRATIVE MARKS

The administrative indicators for the Cambridge Public Schools have clearly improved, something Nolan has freely acknowledged. In the 12 areas identified by the EQA, covering everything from student assessment to professional development to financial management, the administration received higher marks than in previous years. In the case of the “curriculum” standard, the school system improved substantially.

These 12 areas were further broken down into a total of 88 subcategories. In 2004, 4 subcategories were deemed “excellent,” 54 “satisfactory,” 24 “poor,” and 6 “unsatisfactory.” In 2006, 5 were deemed “excellent,” 80 “satisfactory,” 1 “poor,” and 2 “unsatisfactory.”

THE REAL DEBATE

There is no doubt that during Fowler-Finn’s tenure, test scores have risen—by three points in ELA and by five points in math. There is also no doubt that they are still below the state average. And finally, there is no doubt that the administrative marks have improved substantially.

The true debate—the one for the School Committee and the public—is whether the speed of the improvements is adequate.

Fowler-Finn has compared the Cambridge Public Schools to an “ocean liner” in trying to highlight the difficulties in changing the system’s course. Nolan has argued that the system is small—with only one high school—and that it can be “turned around on a dime.”

Both sides have even cited influential educational research supporting their positions.

But unlike the debate over the EQA report, this debate cannot be settled by simply trying to make sense of statistics.

—Staff writer Paras D. Bhayani can be reached at pbhayani@fas.harvard.edu.