Albuquerque Public Schools had received a federal grant that called for new teacher evaluations, and district officials were looking for ways to not only fulfill that obligation, but also help teachers improve.
SDP Fellows Sade Bonilla and Richard Bowman designed and conducted a voluntary teacher evaluation and compensation pilot in four schools, winning the support of the local union and school leaders to offer feedback, assess performance, and award bonuses based on individual impact.
Teachers received multiple observations, feedback, and value-added scores as part of the pilot, and earned additional compensation for their participation. Albuquerque also adopted a new teacher observation rubric that was later implemented districtwide, and incorporated a value-added model as a tool for principals reviewing staff performance. The pilot drew media attention for the collaborative roles district and union leaders played in leveraging data to make a difference, and it serves as a model for piloting and measuring new policy innovations.
Designing a New Teacher Evaluation System
Albuquerque Public Schools had won federal grants to kickstart reforms and improve student outcomes. As part of those efforts, the district decided to conduct an experiment: a pilot of a new multiple-measures evaluation system for teacher performance, based on observations, student growth data, student learning goals, and student perception surveys.
SDP Fellows Sade Bonilla and Richard Bowman were initially tasked with reviewing an outside vendor’s proposal for an evaluation pilot. Based on their experience and with the support and resources of the Strategic Data Project, they responded by proposing to design, conduct, and analyze the pilot evaluation system and results themselves. The district agreed.
A Voluntary Pilot to Explore Best Practices
From the design phase on, conducting the pilot required earning goodwill and buy-in from teachers themselves. The SDP fellows met with district and local teachers' union leaders to discuss and finalize aspects of the evaluation program, and incorporated union-requested elements into the final system, such as self-reported assessments of progress toward student learning goals. They also agreed to make the pilot voluntary; although a mandatory pilot would have produced a more representative sample, a voluntary program was more politically viable and palatable to the union.
Even so, the evaluation pilot was met with hostility and skepticism from some teachers at the four schools. Bonilla and Bowman visited each school to answer questions and dispel myths about the evaluations, taking care to explain each component of the multiple-measures system to a non-technical audience, and later returned to each school to field follow-up questions.
Throughout the year, each participating teacher was observed by an administrator three times. Their students completed perception surveys, teachers submitted self-reported student learning goals and progress, and value-added scores based on district benchmark assessments were factored into teachers' evaluations.
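The report does not specify how Albuquerque's value-added model was computed, but the general idea behind such measures can be sketched. A minimal illustrative version (an assumption for exposition, not the district's actual model) regresses students' current benchmark scores on their prior scores, then averages each teacher's residuals: a teacher whose students outperform the scores predicted by their prior achievement receives a positive value-added estimate.

```python
# Hypothetical sketch of a simple value-added calculation: regress
# current benchmark scores on prior scores, then average the residuals
# for each teacher's students. Real models add many more controls.
import numpy as np

def value_added(prior, current, teacher_ids):
    """Return {teacher_id: mean residual} from a prior-score regression."""
    prior = np.asarray(prior, dtype=float)
    current = np.asarray(current, dtype=float)
    # Fit current = a + b * prior by ordinary least squares.
    X = np.column_stack([np.ones_like(prior), prior])
    coef, *_ = np.linalg.lstsq(X, current, rcond=None)
    residuals = current - X @ coef
    # A teacher's estimate is the mean residual of their students.
    return {
        tid: residuals[[t == tid for t in teacher_ids]].mean()
        for tid in set(teacher_ids)
    }

# Toy data: two teachers, two students each.
scores = value_added(
    prior=[50, 60, 70, 80],
    current=[55, 65, 72, 85],
    teacher_ids=["A", "A", "B", "B"],
)
```

Because the regression includes an intercept, the residuals sum to zero across the district, so value-added is inherently relative: roughly half of teachers land above zero and half below.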
The SDP fellows created two interim status reports for participating teachers, with the goal of sharing feedback and making the assessments transparent. At every stage, the fellows actively practiced change management: staying available to teachers, listening to and addressing their concerns, and ensuring that their processes and findings were accessible to a variety of audiences.
Bonilla and Bowman’s efforts illustrate the importance of the non-technical aspects of using data analysis to effect change, especially in politically sensitive work. While the individual elements of their system were strong, they credited its successful implementation to early coalition building and to clear, responsive communication throughout the year.
Find Bonilla and Bowman’s full report here.