Looking for Early Warning Signs in Los Angeles High Schools

The challenge:

All California students were required to pass the state’s exit exam in order to earn their diploma. To figure out which students needed early, extra help, schools in Los Angeles gave 9th grade students a diagnostic pre-test.

The intervention:

SDP Fellow Hansheng Chen analyzed student performance on the diagnostic and on another annual test, and compared both to students' graduation outcomes four years later. Through his analyses in the SDP program, he found that the official diagnostic pre-test was less predictive of whether students would graduate than the regular, less expensive 9th grade end-of-year exam.

The impact:

Los Angeles stopped administering the more expensive, less accurate diagnostic test.

The Challenge:

Looking for Early Warning Signs in Los Angeles High Schools

In California, high-school students were required to pass the California High School Exit Exam in English and math in order to earn a diploma. Students first took the test in 10th grade, and those who did not pass the first time were given extra chances and support to pass both sections throughout their years in school. Statewide, about 4 out of 5 students passed the test on their first try.

As part of their efforts to increase the local graduation rate, leaders at Los Angeles Unified School District introduced a diagnostic pre-test in 9th grade to see which students were unlikely to pass the test the following year. Students who fell short on the diagnostic test were given additional support over the next year, until they took the official exit exam for the first time in 10th grade.

Students did significantly better on the official exit exam after a year of support than they did on the diagnostic test in 9th grade: 63 percent of students who failed the math pre-test in 9th grade passed the official math exit exam the following year. But the reason for that progress wasn’t clear. Was the diagnostic test too difficult? Was the intervening support especially effective? 

The Intervention:

Testing a Test’s Predictions

SDP Fellow Hansheng Chen dug through the data to determine the source of the progress. 

First, he looked at whether students had participated in an intervention before taking the official exit exam. Among students who had failed the math diagnostic test, 62 percent went on to pass the official exit exam without any additional support. Among students who failed the diagnostic and received extra help, 74 percent passed, a difference of 12 percentage points.
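
The arithmetic behind that first comparison is a simple group-by. Here is a minimal sketch in pandas, assuming a student-level table with hypothetical columns `failed_math_diagnostic`, `got_intervention`, and `passed_math_exit_exam` (none of these file or column names come from Chen's actual data):

```python
import pandas as pd

# Hypothetical student-level records; all names here are illustrative only.
students = pd.read_csv("students.csv")

# Restrict to students who failed the 9th grade math diagnostic pre-test.
failed_diag = students[students["failed_math_diagnostic"]]

# Exit-exam pass rate by whether the student received the extra support.
pass_rates = failed_diag.groupby("got_intervention")["passed_math_exit_exam"].mean()
print(pass_rates)  # reported figures: roughly 0.62 without help, 0.74 with help
```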

Chen then looked at how those students had done on their annual statewide math test. Among students who failed the math diagnostic but had done well on the statewide math test that year, more than 90 percent went on to pass the official exit exam, whether or not they got extra help first. But among students who had earned low scores on the statewide test, those who got extra help were far more likely to pass the exit exam: 68 percent with intervention, compared to 54 percent without.
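
The stratified version of the same calculation is a two-way pivot. Continuing the sketch above, with a hypothetical `statewide_math_score` column and a placeholder cutoff (the report's actual threshold is not given here):

```python
import pandas as pd

students = pd.read_csv("students.csv")  # same hypothetical table as above
failed_diag = students[students["failed_math_diagnostic"]].copy()

# Split the group by performance on the regular statewide 9th grade math test;
# the cutoff below is a placeholder, not the threshold the analysis used.
STATEWIDE_CUTOFF = 350
failed_diag["did_well_statewide"] = (
    failed_diag["statewide_math_score"] >= STATEWIDE_CUTOFF
)

# Exit-exam pass rates, stratified both ways at once.
table = failed_diag.pivot_table(
    values="passed_math_exit_exam",
    index="did_well_statewide",
    columns="got_intervention",
    aggfunc="mean",
)
print(table)  # high scorers pass at ~0.90 either way; low scorers: ~0.68 vs ~0.54
```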

This was an interesting discovery, and it raised a new question: Was the statewide 9th grade math test, which students had to take anyway, actually a better predictor of success on the 10th grade exit exam than the extra diagnostic pre-test? To find out, Chen compared the predictive power of both tests.
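
The report does not specify Chen's exact method, but one standard way to compare two scores' predictive power is to fit a simple classifier on each and compare how well each discriminates outcomes, for example with ROC AUC. A sketch with scikit-learn, all column names hypothetical:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

students = pd.read_csv("students.csv")  # hypothetical student-level table

train, test = train_test_split(students, test_size=0.3, random_state=0)

# Fit one single-predictor model per test score and compare how well each
# separates students who graduated on time from those who did not.
for score in ["diagnostic_math_score", "statewide_math_score"]:
    model = LogisticRegression().fit(train[[score]], train["graduated_on_time"])
    probs = model.predict_proba(test[[score]])[:, 1]
    print(score, roc_auc_score(test["graduated_on_time"], probs))
```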


The Impact:

Lessons Learned

His in-depth analysis found that both tests predicted whether students would graduate on time, but the regular 9th grade test was more accurate. In response, Los Angeles phased out the more expensive diagnostic test, since Chen had demonstrated that the regular exam served the same early-warning purpose.

In addition, Chen built a new data tool for high schools: predictions of students' eventual scores on the math exit exam, based on their 9th grade test results. The tool was designed not only to sound an early alarm, but also to help teachers and school leaders give students the support they need.
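
As a rough illustration of how such a tool could work (the report does not describe its internals), a simple regression could map each 9th grader's statewide score to a predicted exit-exam score and flag likely non-passers early. All file and column names below are hypothetical:

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

history = pd.read_csv("past_cohorts.csv")  # hypothetical historical records

# Learn the mapping from 9th grade statewide math scores to eventual
# exit-exam math scores, using past cohorts with known outcomes.
model = LinearRegression().fit(
    history[["statewide_math_score"]], history["exit_exam_math_score"]
)

# Score the current 9th graders and flag students predicted to fall short.
current = pd.read_csv("current_ninth_graders.csv")
current["predicted_exit_score"] = model.predict(current[["statewide_math_score"]])
at_risk = current[current["predicted_exit_score"] < 350]  # 350 was the CAHSEE passing scale score
```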

SDP Resources:

Read Hansheng Chen's capstone report.