Charting a Course for School Turnarounds in Boston Public Schools


The challenge:

Boston Public Schools needed a concrete way to quickly progress toward district objectives amid fast-changing conditions, including at 11 low-performing “turnaround” schools.


The intervention:

SDP Fellow Nathan Kuder launched Quarterly Review, a comparative statistics (or CompStat) program in which district and school leaders quickly established data-driven goals and strategies, regular meetings and follow-up, and a structure of shared accountability.


The impact:

Quarterly Review established a new working relationship for turnaround school leaders and the BPS central office, with cross-functional teams working to achieve shared metrics aligned to their central mission.

The Challenge:

Charting a Course for School Turnarounds

In rapid succession, Boston Public Schools (BPS) established a new Office of Accountability to support district goals, and the district was charged with supporting 11 low-performing “turnaround” schools. These schools received $20 million in federal School Improvement Grants and the flexibility to make big changes, and were given three years to improve. If they didn’t, they risked losing their funding and being taken over by the state.

The combination of benefits, attention, expectations, and consequences meant that school leaders felt extreme pressure to dramatically improve their performance. To ensure their success, district leaders needed to understand their challenges, craft strategies for growth, and quickly identify and offer the right kinds of support.

The Intervention:

Establishing a “Stat” Program

Kuder, Boston’s SDP fellow, had recently worked on a CompStat program in the city’s police department. With his leadership and the support of SDP, BPS implemented Quarterly Review, a statistics-focused improvement protocol similar not only to CompStat, but also to Stat programs at urban districts in Washington, D.C., Memphis, and Philadelphia.

Kuder guided BPS and school leaders as they built Quarterly Review based on a highly structured process that includes ongoing data monitoring and links goals with actions and owners. Like other Stat programs, Quarterly Review has four core elements:

• Clarity of organizational mission and purpose
• Statistical analysis of problems and results
• Organizational flexibility and responsiveness
• Internal accountability

In practice, Quarterly Review required clear, mission-focused communication, in-depth preparation for outcomes reporting during regular meetings, immediate follow-up, and adherence to a central task list. The protocol appealed to BPS because it emphasized execution during goal-setting: each goal was linked to strategies whose implementation and outcomes were assigned to responsible parties. BPS was also motivated by the public task list, which held all parties accountable for their assigned tasks on a schedule agreed on by the group, and by Kuder’s expertise.

In building its first-year pilot plan, BPS established new teams from the central office with expertise in academics, budget, operations, and transportation. These teams provided each school with a single point person to call with questions, set dates for four meetings that year, and enlisted enthusiastic staff on the ground to support principals.

The Impact:

Lessons Learned

Leaders in Boston publicly committed to Quarterly Review meetings and held them before the process and metrics were fully fleshed out. In doing so, they found that the act of holding the meeting was more important than getting it right the first time; early meetings actually influenced later revisions to the process.

However, BPS did too little planning around assigning tasks and responsibilities to its teams before these meetings began. Initial planning centered on finalizing the meeting format, communicating the process to participants, and assembling key performance data. That left major issues unaddressed, such as when teams would review data, who would manage the public task list, and how follow-up tasks would be assigned. Boston did resolve these open issues, but only after several iterations. Just as detailed preparation is key during a successful Stat program, it is crucial before one begins.

Organizations should also find ways to interact with participating schools or departments that will reinforce the process. In Boston, for example, district accountability staff conducted walkthroughs and organized a network of school data users to build capacity and aid in interpreting performance metrics. These strategies can enhance a Stat program and address a major challenge such programs face: the perception that they are a compliance exercise or take focus away from the “real work” of education.

SDP Resources:

Find the full report of Kuder’s analysis here.