Strategic Data Excellence: Elk Grove Unified’s Innovative Approach to Assessing Program Implementation

How can a school district know what helps students succeed? Most often, leaders and analysts look at data like test scores, on-time graduation, and attendance rates. While these measures provide important insights about what a district is achieving, they don’t explain why it is achieving at that level. Without knowing the why, districts may be left with educated guessing as their primary means of improvement planning.

From left to right, the leadership team on this project: Jenifer Avey, Executive Director of School Supports; Christine Hikido, (Retired) Director of Research & Evaluation; Mark Cerutti, Deputy Superintendent of Education Services & Schools; Jeremy Hinshaw, Director of Research & Evaluation; Rebecca Rangel, Senior Research Analyst

A pioneering and holistic approach to measuring the efficacy of educational programs, not just their outcomes, is helping fill in those blanks at Elk Grove Unified School District, which educates about 63,000 students in the southeastern suburbs of Sacramento, Calif. The district won the 2023 Strategic Data Excellence Award from the Strategic Data Project for its unique evaluation system, the Program Implementation Continuum (PIC). With this award, SDP seeks to celebrate education agencies that are using data and evidence with excellence to solve big, thorny problems, and to raise the profile of what’s possible across the education field.

PIC integrates quantitative data with qualitative measures, such as student surveys, to assess how well classroom instruction reflects the district’s instructional framework. 

“We were a hammer in search of nails,” said Jeremy Hinshaw, Director of Research and Evaluation at Elk Grove. “There are these platitudes that inevitably come around, like ‘you can’t always measure what matters.’ I really disagree with that. I think it’s more like, how much effort do you want to put into it? We can find ways to measure things.” 

A Commitment to Continuous Improvement 
The PIC system integrates a broad array of measures to link formative information about classroom practice and school culture with summative outcomes of interest. That includes student surveys on school climate, a teaching and learning survey for teachers, a parent and community engagement survey, school visits, and classroom observations, along with data like the shares of students receiving targeted academic interventions. School performance is measured in five domains, such as Teaching and Learning, Social Emotional Learning, and English Learner Programs, with up to five component scores determining a category 1–4 rating in each domain. 

For example, PIC measures the fidelity with which a school has implemented the Positive Behavioral Interventions and Supports (PBIS) framework, which is designed to improve school culture and climate. A school’s category rating is based on a review of its policies, classroom practices, and use of data, among other factors. Since PIC’s debut in 2016-17, site-based coaching using PIC measures has contributed to rising PBIS implementation scores and an estimated 17 percent drop in student suspensions.  

This serves to illustrate PIC’s theory of action: “If we systematically measure program implementation, we can measurably improve program implementation, which will then lead to improved student outcomes.” 

However, PIC is not an accountability system. “Accountability is always based on outcomes. PIC is our support system, and the way we can direct people toward what they should be doing based on successful implementation elsewhere,” said Hinshaw. 

A Central Role for Student Voice 
Student surveys play an important role in the PIC system, which uses a district-created instrument published in English, Spanish, Hmong, and Vietnamese. Elementary and secondary students take slightly different versions of the survey, which asks students questions about their feelings of safety and belonging, whether teachers explain lessons clearly and help them with their work, and if they consider their classwork interesting and meaningful.  

Hinshaw noted that the district was inspired to include student survey data by the Measures of Effective Teaching (MET) project, a major study that used surveys and other innovative data and assessment strategies to identify high-impact instructional strategies.

When asked how well instructional principles are implemented, teachers and principals tend to report very high levels of implementation, Hinshaw noted. Yet not all students are performing at or above grade level, so not every report can be accurate.

By contrast, student survey response data aligns very strongly with academic outcomes. “The first time I ran the analysis of teaching and learning data with student perspective, I found these tremendous relationships and my initial reaction was, it’s too good to be true. But students really can and should have a meaningful voice in these measures,” Hinshaw said.

Lessons Learned 
A major early win for PIC was the enthusiastic support of the local teachers’ union, said Hinshaw. Any district looking to implement a similar approach should actively collaborate with teacher leadership from the outset.  

“We were very careful in trying to articulate a well-reasoned rationale for how we were going to do this and why it would benefit teachers and the district, and they were totally on board,” he said. “It was a testament to the ongoing good relationship we have with our teachers and the trust they put in us, the people on the other side of the table.” 

Another strength of PIC is its reach and clarity. The system works at scale and an interactive internal website allows district personnel to access school-level data in a simple format. In addition, PIC has established a common vocabulary and shared focus on implementation fidelity across the district. 

However, an enduring challenge has been the timeline for PIC results. Like most summative data reports, PIC ratings were provided to principals months after the measured school year had concluded.  

“We were focused on summative implementation measures, which meant we were giving principals data from last year and telling them that’s what they should work on, but it wasn’t always relevant,” Hinshaw said. “Some principals hadn’t been there the previous year, it wasn’t the same staff, and so forth. It was dead data.” 

This winter, the district began piloting a new formative survey tool for teachers that provides actionable implementation data in real time. Instead of giving students the full survey near the end of the school year, teachers can pick and choose a shorter set of questions to gather confidential insights from students through the “rapid cycle measures” survey tool.

“It’s a flexible tool and teachers can use any part of it any time, and only teachers get to see their data,” said Hinshaw. A teacher concerned about classroom culture, for example, can administer survey questions that capture students’ opinions and experiences on that topic, and get the response data back the following day.

“We’re going to see if this helps teachers and helps lessen their burdens, and if teachers can put these results to work in a cycle of inquiry,” he said. “I’m most excited about that.” 


Learn more about the SDP Award for Strategic Data Excellence