Is your school beating the odds? SDP Fellows make it easier to measure

Former SDP Fellows Aaron Butler and Hannah Poquette pay it forward with a detailed tutorial for beating-the-odds analyses in education.

Former SDP Fellows Aaron Butler and Hannah Poquette wanted a nuanced understanding of how Kentucky’s high-performing schools were reaching or exceeding their goals. Then embedded within the Kentucky Department of Education (KDE), the fellows and their colleagues partnered with Regional Educational Laboratory (REL) Appalachia to unearth and ultimately learn from the state’s high achievers. To do so, they first had to identify those high achievers; they needed a beating-the-odds (BTO) analysis.

BTO analyses use demographic information to predict a school’s performance and then compare that prediction with the school’s actual results. The gap between predicted and actual performance can reveal schools that are doing better or worse than expected.
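At its core, this is a regression-style residual analysis: fit a model that predicts school performance from demographic characteristics, then look at which schools land well above their predictions. The sketch below illustrates that general idea in R, the language the tutorial uses. The simulated data, variable names, and two-standard-deviation cutoff are illustrative assumptions, not code or fields from the actual OpenSDP tutorial.

```r
# Minimal sketch of a beating-the-odds style residual analysis (base R).
# All data are simulated and all column names are hypothetical.
set.seed(1)
n <- 200
pct_frl <- runif(n, 0, 1)     # share of students eligible for free/reduced-price lunch
pct_ell <- runif(n, 0, 0.3)   # share of English language learners
# Simulated school-level outcome so the sketch runs end to end.
mean_score <- 250 - 40 * pct_frl - 20 * pct_ell + rnorm(n, sd = 8)
school_data <- data.frame(school_id = seq_len(n), pct_frl, pct_ell, mean_score)

# Step 1: predict expected performance from demographics alone.
fit <- lm(mean_score ~ pct_frl + pct_ell, data = school_data)

# Step 2: the residual is actual performance minus predicted performance.
school_data$predicted <- predict(fit)
school_data$residual  <- school_data$mean_score - school_data$predicted

# Step 3: flag schools performing well above expectation,
# here using two residual standard deviations as one possible cutoff.
bto_candidates <- subset(school_data, residual > 2 * sd(residual))
bto_candidates <- bto_candidates[order(-bto_candidates$residual), ]
head(bto_candidates)
```

The choice of predictors, outcome measure, and cutoff are all modeling decisions, which is exactly the kind of sensitivity Poquette describes below.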

Yet while a number of resources explain BTO methods in general, Butler and Poquette were surprised to find few, if any, that specifically walked people through the process with education data. So they built one, and it’s now available to the public for free on OpenSDP.

“We learned many lessons through our project with KDE and REL Appalachia, and we wanted to package it in a way that we could share with the community,” Butler explained. “Additionally, we saw working through the tutorial process as a way to enhance our own learning.”

Butler and Poquette had a few goals when developing the guide. One of them was to think critically through each step of a BTO approach and how to make good decisions along the way. “One of the things that surprised me when conducting a BTO was how much the preliminary work of modeling can impact the results you get,” reflected Poquette. “This model can be sensitive, and the results you get on the back end follow from decisions you make at the beginning. The literature is clear that both the quality of the data and the design of the model are critical.”

The context in which BTOs are used is also quite important, and the fellows argue that these analyses shouldn’t be used for accountability. Once a BTO analysis delivers a list of schools performing better than expected, there are many possible reasons to consider when asking why. The initial KDE/REL project built in follow-up qualitative work, such as focus groups and observations, to capture and understand the practices of high performers, making the BTO analysis simply a starting point for deeper conversations.

Fellows are trained by the Strategic Data Project to think through how data are used: to incent, to empower, and to evaluate. Different data should be used for different purposes. BTO analyses are a tool for empowerment, not evaluation: they help identify and explore possible bright spots in practice.

“BTO analysis is an exploratory tool,” said Butler. “It should be used as a conversation starter, not a tool for rewarding or sanctioning schools.”

Jared Knowles, SDP’s curriculum designer for statistical computing, agrees. Knowles was one of the people who helped develop the OpenSDP platform that hosts the BTO tutorial, and he played a critical role in guiding Butler and Poquette during their development of the guide. “BTO analyses give you a snapshot into a particular period of time, but it’s important to zoom out and use that information to look for trends as well. Schools’ performance trajectories may be affected by changing demographics and educational environments, for example. Thus, this tool should be less about accountability and more about finding those successful schools and learning from and replicating their practices.”

Because BTO analyses surface any school that exceeds its predicted performance, a “beating the odds” result most often reflects one of three things: true performance differences, instability in the measures or model, or undesirable “gaming” strategies, such as manipulating who is tested or how tests are administered. Interestingly, Poquette and Butler’s original study with KDE and REL Appalachia surfaced a noticeable outlier school whose scores were eventually invalidated for cheating.

The new BTO tutorial is also unique in that it moves beyond general description to provide practical tools for users. A synthetic “Faketucky” data set based on real student data, along with working code in the R programming language, is available for people to download and use as they learn the BTO technique. Users can also adapt the code to run on their own data. This hands-on component gives the tutorial a level of usability that few comparable resources offer.

“This guide embodies the intention we had when we started OpenSDP,” added Knowles. “While you can find research guides and tutorials for education, many of those resources are very conceptual. Aaron and Hannah’s work did a great job of including both a conceptual framework and steps for technical implementation. We were really pleased that they put so much time and energy into developing this tutorial and that they were willing to go through the process of editing and iterating with us.”

Poquette and Butler both cited their gratitude to SDP and their desire to pay it forward to the network as major motivating factors for developing the BTO guide. “I owe a lot of the success I’ve had to the SDP program and network,” concluded Butler. “We both felt like this was a great opportunity to pay it forward.”