Putting New Performance Ratings to Use

Teacher-evaluation systems can show which teachers have the biggest impact. How can we use those insights to improve learning for students?
 


The challenge:

In Pittsburgh, the “Empowering Effective Teachers” initiative and evaluation system had created a lot of data about individual teachers’ performance—but not yet a detailed plan for how to use it to improve student outcomes across the district.


The intervention:

Using SDP diagnostic analyses, SDP Fellows Tara Tucci and Ashley Varrato dove deep into effectiveness ratings, placement data, and test scores to assess the impact of highly effective teachers, determine where and with whom those teachers were working, and identify key predictors of teacher effectiveness. They then designed a secure way to get that information to school leaders.


The impact: 

Among their findings: the highest-need students were most likely to be taught by a low-rated teacher, and low-rated teachers were the most likely to have switched class and school assignments from the previous year. They designed a secure data system to share this information with district leaders and principals, to inform conversations about improving classroom practice and more equitably distributing high-performing teachers across all Pittsburgh schools.

The Challenge:

Putting New Performance Ratings to Use

States and school districts across the country have revamped their teacher-evaluation systems in recent years, creating a wealth of new data about individual educators’ effectiveness. But these performance reports and ratings can’t do much on their own. They need to be portable and actionable, especially at the school level, so school leaders can select appropriate professional development opportunities based on individual needs and ensure effective teachers share their practice with others. 

In Pittsburgh, the district and local teachers’ union had collaborated to create the “Empowering Effective Teachers” initiative, which included a new multiple-measures evaluation system. After its first year, teachers were rated in terms of their effectiveness, including small groups at the very top and very bottom of the scale. Because individual teachers can have very different impacts on their students based on their relative skill in the classroom, it was important for officials in Pittsburgh not merely to identify very high- and low-performing teachers, but to apply this information to improve teacher performance and student learning across the district.

The Intervention:

Mapping Effectiveness and Sharing Insights

SDP Fellows Tucci and Varrato conducted several analyses to learn more about the district’s high- and low-rated teachers, all in service of an urgent goal: maximizing the share of students exposed to highly effective teachers.

They found that the district was retaining its top-performing teachers, with a 98.7 percent retention rate, and that those teachers were far less likely than their low-performing peers to have switched class or school assignments in the past year. As for which characteristics predicted high ratings, the Fellows found that National Board Certification was most strongly associated with a top “distinguished” rating in Pittsburgh, while factors such as education level and years of experience were not.
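Analyses like these amount to comparing simple rates across rating groups. The sketch below illustrates the idea with a handful of made-up teacher records and hypothetical field names; it is not the district’s data or the Fellows’ actual code:

```python
# Hypothetical teacher records: rating group, whether the teacher stayed
# in the district, and whether their assignment changed year over year.
teachers = [
    {"rating": "distinguished", "retained": True,  "switched": False},
    {"rating": "distinguished", "retained": True,  "switched": False},
    {"rating": "distinguished", "retained": True,  "switched": True},
    {"rating": "needs_improvement", "retained": True,  "switched": True},
    {"rating": "needs_improvement", "retained": False, "switched": True},
    {"rating": "needs_improvement", "retained": True,  "switched": False},
]

def rate(records, field):
    """Share of records where `field` is True."""
    return sum(r[field] for r in records) / len(records)

for group in ("distinguished", "needs_improvement"):
    subset = [t for t in teachers if t["rating"] == group]
    print(group,
          f"retention={rate(subset, 'retained'):.0%}",
          f"switched={rate(subset, 'switched'):.0%}")
```

With real evaluation data, the same grouped comparison would be run over thousands of records, but the logic is unchanged: compute each outcome’s rate within each rating group, then compare across groups.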

Tucci and Varrato’s analysis also showed the potential of a highly effective teacher to transform student outcomes: students with the highest-rated teachers were twice as likely to progress from “basic” to “proficient” scores on standardized tests compared to students with the lowest-rated teachers.
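A claim like “twice as likely to progress” is a ratio of progression rates between the two groups of students. A minimal sketch, using invented counts rather than the district’s actual figures:

```python
# Hypothetical counts of students who started the year scoring "basic,"
# split by whether they had a highest- or lowest-rated teacher.
highest_rated = {"started_basic": 200, "reached_proficient": 80}
lowest_rated  = {"started_basic": 200, "reached_proficient": 40}

def progression_rate(group):
    """Fraction of "basic" students who reached "proficient"."""
    return group["reached_proficient"] / group["started_basic"]

ratio = progression_rate(highest_rated) / progression_rate(lowest_rated)
print(f"Relative likelihood of progressing: {ratio:.1f}x")  # 2.0x with these counts
```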

The Fellows then applied their technical expertise to ensure this information could be put to use. As district and union leaders referenced their analyses in goal-setting conversations, school leaders were able to access a new data warehouse to review reports. Other technical improvements included launching and improving data-reporting systems for observations and student surveys, and making class rosters more robust to give teachers more insight into the student performance that figures into their value-added calculations.

The Impact: 

Lessons Learned

The work by the SDP fellows paved the way for on-the-ground improvements in Pittsburgh. Not surprisingly, while their efforts did involve sophisticated statistical analyses, they also involved practical solutions to workaday challenges: keeping data secure and accessible, making data entry a user-friendly experience, and ensuring that the variables weighed in complex analyses are clear and transparent.

Performing such analysis within a fast-moving school or district context makes the urgency of these efforts clear. The major recommendations from the SDP Fellows involve ensuring that system users can access the information and put it to use. Doing that requires timeliness and buy-in, which are most easily achieved when districts prioritize and focus their goals. Rather than changing many things at once, districts are wise to select particular areas for improvement, such as targeted professional development based on evaluation results. Those goals can inform user-experience design and other decisions, ensuring data isn’t merely generated, but used.

SDP Resources:

Find the full capstone report by Tucci and Varrato, which includes contributions by other SDP fellows as well.