Becoming a Learning Agency: 3 Lessons from Ohio’s Shift Toward Evidence Use in Education
By Sara Kerr
When an unexpected crisis hits, an organization faces it with the systems it has, not necessarily the systems it needs. This is as true for government agencies, like state education agencies (SEAs), as it is for health care systems, corporations, nonprofits, or even families. SEAs that already have systems that promote continuous learning have important resources to aid their short-, medium-, and long-term responses to crises like the current COVID-19 pandemic, including:
- Data systems to track key leading and lagging education indicators;
- Continuous improvement methodologies to help test and improve new practices and policies;
- Partnerships with researchers to help make smarter, evidence-based investments with increasingly limited funds; and, most important of all,
- A learning culture that empowers decision-makers throughout the education system to respond to changing and often ambiguous conditions.
For the past three years, Results for America (RFA) has been working with a dozen leading SEAs to help them build and strengthen their data and evidence capacity. This post spotlights one of them — the Ohio Department of Education (ODE) — which has made significant progress in transforming itself into a learning agency. ODE readily admits this work is still in progress, but three key lessons have already emerged. We hope other SEAs, along with school districts and other government agencies and education organizations, can adapt and apply these lessons to their own contexts during, after, and well beyond the current crisis.
Lesson 1: Federal law can be a catalyst for change.
During the No Child Left Behind Act (NCLB) era, ODE invested in some of the infrastructure needed to function like a learning agency, primarily a statewide longitudinal data system to collect and disaggregate much more student-level data than in the past. But having data and using data are very different things. Like those in many other states, Ohio’s education system remained data rich but insight poor.
When Congress replaced NCLB with the bipartisan Every Student Succeeds Act (ESSA) in 2015, ODE saw in the new law several opportunities to embrace data, evidence, and evaluation. (RFA has catalogued these leverage points here.) While ESSA pushed more decision-making from the federal level to states and districts, Congress put guardrails on those new authorities, including new data reporting requirements, a mandate that improvement plans for struggling schools include evidence-based interventions, and a heavier emphasis on progress monitoring and evaluation.
Seizing the opportunity, ODE developed its ESSA plan in part based on feedback from school districts, school leaders, teachers, and other stakeholders about how to better and more consistently use data and evidence to make smart decisions. The SEA also ensured a tight alignment between its ESSA plan and the state’s strategic plan, which also integrates the use of data and continuous improvement throughout its various initiatives. And it oriented the state’s system of support for districts and schools — especially those struggling the most — around a continuous improvement framework, the Ohio Improvement Process, which brings together educators and leaders at all levels of a district to jointly examine data, consult the evidence base, determine and implement strategies, study results, and use what they learn to improve.
Counterfactuals are hard to prove, but ODE staff point to ESSA as a helpful catalyst in getting them closer to where they want to be: a learning agency, empowered by evidence, that supports local school districts in meeting the needs of their students and communities.
Lesson 2: Learning agencies are better equipped to shift from monitoring compliance to supporting improvement efforts.
Since the start of his tenure at ODE, State Superintendent of Public Instruction Paolo DeMaria has sought new ways to empower people throughout the public education system to support better student outcomes. Internally at ODE, that means encouraging staff to look at relevant data to understand what’s working (or not) and for whom. In interactions with schools and school districts, it means making a real — and difficult — shift from being an SEA focused on compliance to one focused on support. ODE constantly grapples with the following guiding question, which the strategic plan places at the center of internal decision making: “How can the Department provide the best possible supports to schools, districts and educators so they are best positioned to challenge, prepare and empower each child in Ohio?” ODE still has a long way to go, but the more it has embraced data, evidence, and evaluation, the more it has been able to provide meaningful support to schools and districts throughout the state.
DeMaria and ODE know that without providing necessary supports, empowering can become (or at least be perceived as) just setting someone else up for failure. For example, in the school improvement context, ODE chose not to require districts to choose interventions from a state list. At the same time, ODE leaders like Chris Woolard, Senior Executive Director for Performance and Impact, and Heather Boughton, Director of the ODE Office of Research, Evaluation & Advanced Analytics, recognized that most local decision-makers needed resources and guidance to navigate the evidence definitions and requirements in ESSA and make thoughtful decisions about which interventions best meet their schools’ needs. So ODE worked closely with its longstanding research partner — the Ohio Education Research Center — to create Ohio’s Evidence Based Clearinghouse, which “is intended to empower Ohio’s districts with the knowledge, tools and resources that will help them identify, select and implement evidence-based strategies for improving student success.”
No other state in the nation has created a clearinghouse with the breadth and depth of Ohio’s. Most importantly, it is a significant part of ODE’s multifaceted effort to meet school district leaders where they are and to support their capacity to make evidence-based decisions rather than telling them what to do. The Clearinghouse supports this approach by (i) establishing a single point of access to more than 200 evidence-based interventions, (ii) creating a user-friendly crosswalk that makes it easier for educators to understand how each intervention stacks up against ESSA’s standards, and (iii) allowing users to search for interventions based on their own local contexts. ODE supports districts and schools to use the Clearinghouse as part of their comprehensive planning and improvement process — and in the future plans to highlight success stories of Ohio districts, schools, and educators using evidence-based strategies to strengthen student, school, and community outcomes. The Clearinghouse has become an important tool for accelerating the long-awaited shift from using data for compliance to data- and evidence-informed continuous improvement.
Lesson 3: Asking the right questions ensures solutions have their intended impact.
As ODE took advantage of ESSA’s evidence definitions and requirements and began moving toward becoming a learning agency focused on supporting local leaders, ODE staff repeatedly turned to the same low-cost, high-impact strategy: asking the right questions.
Rethinking what questions to ask and how to ask them helps cultivate a culture that values data and evidence as tools for improvement rather than as boxes to check. Too often in the past, data collections and questions from the SEA were received by district and school leaders as disconnected from the core business of better serving students.
In terms of what questions to ask, ODE widened the SEA’s aperture from a narrow set of metrics to a broader set of data aligned with the state’s commitment to a whole child approach, specifically by focusing on four “equal” learning domains: foundational knowledge and skills, well-rounded content, leadership and reasoning skills, and social-emotional learning. The state is now working to align its standards, assessments, and accountability systems to these domains, which will lead to big shifts in what questions the SEA asks, from assessment design to stakeholder surveys to accountability indicators.
ODE is also asking questions that go beyond its own data system, connecting more regularly with partners in other social sectors and looking across datasets relevant to students’ lives to better identify the challenges facing Ohio students and families. This approach is starting to help ODE better coordinate the provision of wraparound supports that remove barriers to learning — for example, a recent partnership with the Ohio Department of Medicaid enabled schools to access aggregate data about the health and wellness of their most vulnerable students.
In terms of how to ask questions, ODE has been working hard to shift its accountability role from what has often been perceived as a compliance exercise. In its place, ODE is striving to put equity at the center of the questions it asks school districts, with accountability as a tool to advance it. This is evident in the state’s recently developed COVID-19 learning guidance for districts, which embeds specific recommendations for centering districts’ distance learning plans on the needs of their most vulnerable students. Partnerships with Harvard University’s Proving Ground and the National Center for Rural Education Research Networks are helping Ohio districts collaborate on asking relevant questions and using data and evidence to test possible answers and then continuously improve them.
ODE knows the shift toward being a learning agency is more of a marathon than a sprint. But it has already made important progress — progress that has in some ways prepared it for the current crisis, including the challenges the crisis has revealed, the challenges it has exacerbated, and the challenges still ahead. Data, evidence, and evaluation aren’t themselves the answers, but ODE will use them to find answers that work for Ohioans.
Sara Kerr is Vice President of Education Policy Implementation for Results for America. Read more about RFA’s Evidence in Education Lab here.