Accelerating Education Through the Information Age

In 2011, SDP Fellows Richard Bowman and Sade Bonilla set out to prove that greater data use was possible in New Mexico.

“The path from data generation to use… is not simple, automatic, or quick. The seemingly straightforward story of information supply, demand, and use is complicated by users’ norms (how they prefer to make decisions), relationships (who they know and trust), and capacities (their confidence and capability to turn data into actionable insights).”

- Toward Data-Driven Education Systems, Brookings 2018.

The appetite for data in education has grown steadily over the past decade. While these gains in the desire and capacity for data hold promise for the field as a whole, that promise wasn’t always visible. The work being done today grows out of earlier efforts, when people first turned to data for help answering education’s big questions.

Among education’s data pathfinders was SDP Fellow Richard Bowman, who in 2011 was exploring how data could be leveraged at Albuquerque Public Schools (APS) alongside research partner Sade Bonilla. Bowman, Bonilla, and the district were trying to determine the effectiveness of the district’s teachers and were charged with building a data-backed evaluation in the budding stages of data use in education. While the project itself gleaned helpful insights that informed decisions in the short term, projects like these served as early exemplars, creating a vision for how data use in education could look.

Setting the Stage in Albuquerque

To pave the way for more advanced use of data, Bowman and Bonilla developed a pilot to identify the factors that support rigorous, transparent, and equitable evaluation systems—a requirement of a federal School Improvement Grant (SIG) that several APS schools had received. The district conducted this teacher evaluation and compensation pilot using multiple observations by administrators, data on student growth, student learning goals, and student perception surveys. The results helped inform stakeholders and let the district explore various evaluation metrics before it was required to implement them.

“At that time,” said Bowman, “we didn’t have any systems that had been used in that manner. We had student information systems (SIS), but getting data out of them was complex and time consuming. Because of this, we didn’t really have the information infrastructure to determine how well teachers were doing and how students were learning as a result.”

Albuquerque Public Schools was a large district, serving approximately 90,000 K-12 students at 13 high schools, 27 middle schools, 89 elementary schools, 10 alternative schools, and 21 locally authorized charter schools. The potential impact of a more data-driven approach to evaluation, leadership, and policy was therefore far-reaching: improving teaching in a district like this would touch tens of thousands of students.

“APS data usage at the time was typically about complying with state and federal requirements,” explained Bonilla. “From the research perspective, it was often retrospective. Programs that required evaluation were evaluated. We arrived amid the Great Recession and the district was struggling to meet basic student needs. Data was being used all the time but not strategically to inform future action. Trying to seed a shift from retrospective to prospective is challenging when faced with limited resources.”

Planting the Seeds of Data

The initial goal of this project was to design a teacher evaluation plan for one middle school, which ultimately expanded into four schools before evolving further into the implementation and administration of a larger teacher evaluation system. At the time, educators were skeptical of using classroom observations as the sole source of evaluation. Concerns about fidelity to the rubric or bias of principals could cast doubt on the evaluation data. Bowman and Bonilla’s project combined multiple measures, which was something that hadn’t been done before.

While the project achieved its goal, the real impact of projects like these was in the way they showed what was possible.

“Back then, we really just planted the seeds that opened people’s eyes to how data could be used,” reflected Bowman. “We showed people how you can put five measures together and use them to describe how teachers are doing in a way that everyone can buy in to. This is something that wasn’t really happening much seven years ago.”

“We had ideas,” added Bonilla. “At the time I did not think they were particularly bold ideas. But by being there, we drew back the curtain, just slightly, to show what might be possible.”

Born from those initial seeds were such efforts as the APS Dashboard, a resource for the APS community that provides usable data reports on enrollment and demographics, academic performance, parent and student engagement, and college readiness. While this system wasn’t a direct outgrowth of Bowman and his partner’s work in 2011, the progress in the use and presentation of data in the district was a continuous, compounding thread. “Our impact was in the ‘yes we can’ factor, and that idea germinated over several years. It was a long arc.”

Bowman and Bonilla, like many early SDP Fellows, set out to prove that something was possible. They started small but carved a path to greater data infrastructures, mindsets, and cultures for the many leaders who would follow. Like the tentative pencil lines of a sketch, this new understanding of data’s role in education is what thrust the field as a whole into the information age.

Key Skills Required for Building the Capacity and Appetite for Data

Pioneers and early adopters, no matter the industry or organization, often bump up against similar hurdles to organizational change. For those still in the early stages of building a data culture, Bowman and Bonilla’s project points to some of the key skills required for change:

  1. Negotiation: Securing the addition of student learning goals, agreement on the weights of the multiple measures in a teacher’s evaluation, and buy-in from stakeholders such as union representatives.
  2. Clear communication: Explaining the pilot components effectively to a non-technical audience, and answering educators’ questions about value-added and survey analysis to alleviate concerns that components were unfair.
  3. Visualizing progress: Developing easy-to-understand reports on value-added model (VAM) and student perception survey results for a non-technical audience, and using technology (Excel and VBA macros) to efficiently create and distribute individualized reports.
  4. Authentic listening: Displaying empathy by actively listening, responding, and validating individuals’ fears and concerns.
  5. Building bridges: Framing the pilot through multiple lenses which allowed various stakeholders to see the shared value in the project.
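The weighted, multi-measure evaluation at the heart of the pilot can be sketched in a few lines of code. Everything below is illustrative: the measure names, weights, and scores are hypothetical stand-ins for the kinds of components described above (observations, student growth, learning goals, perception surveys), not APS’s actual rubric.

```python
# Hypothetical sketch of combining multiple weighted evaluation measures
# into one composite score. Measure names, weights, and scores are
# illustrative assumptions, not the pilot's real rubric.

def composite_score(measures, weights):
    """Weighted average of measure scores (each assumed on a 0-100 scale)."""
    if set(measures) != set(weights):
        raise ValueError("measures and weights must cover the same components")
    total_weight = sum(weights.values())
    return sum(measures[m] * weights[m] for m in measures) / total_weight

# Weights of the kind stakeholders might negotiate (illustrative values).
weights = {
    "observation_1": 0.20,
    "observation_2": 0.20,
    "student_growth": 0.30,
    "learning_goals": 0.15,
    "perception_survey": 0.15,
}

# One teacher's (hypothetical) scores on the five measures.
scores = {
    "observation_1": 82,
    "observation_2": 78,
    "student_growth": 70,
    "learning_goals": 90,
    "perception_survey": 85,
}

print(composite_score(scores, weights))
```

The arithmetic is trivial; the hard part the pilot surfaced was the negotiation behind the numbers, since the weights themselves had to be values that teachers, administrators, and union representatives could all buy into.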