Peer reviewed
ERIC Number: ED663640
Record Type: Non-Journal
Publication Date: 2024-Sep-19
Pages: N/A
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Working with Practitioner Partners to Develop an Approach to Evaluation That Facilitates Decision-Making
Alexandra Lee; Meetal Shah; Justin Long
Society for Research on Educational Effectiveness
An extensive number of studies evaluating educational interventions are conducted and disseminated through peer-reviewed articles and conference presentations annually, but uptake of research among key stakeholders--educators and administrators (henceforth, practitioners)--remains limited (Farley-Ripple et al., 2018; Hollands et al., 2019; Slavin, 2020). Despite federal legislation (i.e., the Every Student Succeeds Act [ESSA]) mandating that schools, districts, and states consider research evidence, practitioners tend to focus on product cost, alignment with district initiatives, and informal teacher feedback when making purchasing decisions (Dexter et al., 2017; Roberts et al., 2017). When it comes to educational technology (edtech) interventions, 89% of K-12 practitioners reported that they "do not" require edtech products to have research evidence when making purchasing decisions, and only 8% insisted that edtech research studies use a randomized controlled trial (RCT) design (Roberts et al., 2017). Low demand for rigorous evidence from schools and districts makes edtech companies less likely to invest in evaluations of their products; only 7% conduct high-quality research on their products (Hulleman et al., 2017). At the same time, practitioners often do not seek research to inform decisions about which edtech products to invest in because it is not timely, relevant, or accessible to them (Hering, 2016). Moreover, there is a need to reconcile differences between practitioners' experiences with edtech products and the available evidence on them. Effectiveness studies conducted in "real-world" settings typically yield lower effect sizes than efficacy studies conducted in controlled environments (Crone et al., 2019). Similarly, interventions implemented by teachers yield lower effect sizes than interventions implemented by researchers (Crone et al., 2019). Given these inconsistencies between published research and practitioners' experiences, and the persistent "file drawer problem" (Gage et al., 2017; Ropovik et al., 2021), practitioners are understandably skeptical of research. Nevertheless, not using evidence for decision-making may lead to adverse educational outcomes, particularly for students from historically marginalized communities (Gray et al., 2023).

In this paper, we share findings from a practitioner-researcher partnership designed to empower practitioners and build capacity within education organizations to make data-driven decisions that improve outcomes for "all" stakeholders in the education process. Our approach is informed by a framework conceptualized by Crone et al. (2019), which highlights the benefits of giving districts ownership throughout the entire research process. It involves close collaboration and support to guide districts in answering questions that directly inform decision-making, and it positions practitioners as shared partners in knowledge mobilization who contribute to both the production and end use of research (Levin, 2013; Rycroft-Smith, 2023). We co-develop research questions, data collection plans, and analytical approaches (e.g., selection of usage metrics, achievement metrics, covariates, and subgroup analyses [e.g., low-achieving students, students on free/reduced lunch, Black and Latinx students, English learners, students with IEPs]). To provide evidence in a timely manner, we work with districts to prepare and analyze data through an automated platform that presents results in an online data dashboard.
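To make the shape of such a co-developed analysis plan concrete, the Python sketch below models a standardized achievement outcome as a function of edtech usage with a prior-achievement covariate, then repeats the model within a practitioner-selected subgroup. This is a minimal illustration under assumed conventions, not the platform's actual code: the column names (usage_minutes, post_score, pre_score, frl_status) and the run_rce helper are hypothetical.

    # Minimal sketch of a co-developed RCE analysis plan (hypothetical schema).
    import pandas as pd
    import statsmodels.formula.api as smf

    def run_rce(df, subgroup_col=None):
        """Fit the usage-outcome model on the full sample or within each subgroup."""
        formula = "post_score ~ usage_minutes + pre_score"  # hypothetical column names
        if subgroup_col is None:
            return {"all_students": smf.ols(formula, data=df).fit()}
        return {f"{subgroup_col}={level}": smf.ols(formula, data=grp).fit()
                for level, grp in df.groupby(subgroup_col)}

    # Usage: inspect the usage coefficient (an association, not a causal effect).
    # df = pd.read_csv("district_export.csv")  # hypothetical district data export
    # for label, fit in run_rce(df, subgroup_col="frl_status").items():
    #     print(label, fit.params["usage_minutes"], fit.pvalues["usage_minutes"])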
After data analysis is completed, we meet with our practitioner partners to answer questions, co-develop interpretations of the data, and plan next steps. We refer to this formative evaluation process using our digital platform as a Rapid Cycle Evaluation (RCE), as the process allows for repeated experimentation and formative insight into the efficacy of edtech products and interventions. For practitioner-researcher partnerships to result in data-driven decision-making, researchers must: a) engage practitioners in constructive dialogue around important issues of interest; b) develop an understanding of the local needs of schools and districts; and c) build tools, resources, and methods that are practical, actionable, and relevant. Therefore, our approach involves the use of automated statistical tools and accessible data dashboards that empower practitioners to conduct repeatable analyses of their data and produce consumable reports specific to their local needs. Furthermore, by giving practitioners access to the tools they need to conduct their own evaluations, the RCE process addresses the "file drawer" problem by making all evidence available, whether it is positive, null, or negative.

We have been using this approach for ten years, partnering with districts to conduct 3,015 distinct RCEs to date, including 867 in the 2023-24 academic year, and have found it to be a successful method for ensuring that research findings are understood, utilized, and valued by education stakeholders. On average, each district conducts 12 RCEs (SD = 16) on 3 edtech products (SD = 2). Seventy-nine percent of the RCEs use a correlational research design that investigates the association between the use of an edtech product and a related, standardized outcome measure. While it is clear that districts have used RCEs to gain insights about the usage and efficacy of edtech products, we do not yet know how they use this information for decision-making. Given this, in the 2023-24 academic year, we introduced an additional layer of data collection to our engagement with districts. Specifically, we used a survey to solicit feedback from district administrators about: 1) how they plan to use their RCE report(s); 2) which stakeholders they intend to share RCE information with; 3) which elements of the RCE report were most important for their decision-making around edtech interventions; and 4) whether there is other information they would like to have in the future. Data collection will be complete in spring 2024, and we look forward to providing the SREE audience with descriptive statistics about the RCEs conducted in the 2023-24 school year and practitioners' perspectives on their use of RCE reports to make decisions about the edtech tools being used in their schools. By illuminating how practitioners use RCE reports to make decisions, this study has implications for other education evaluators: it will provide detailed information about which aspects of evaluation reports are most important to practitioners, which stakeholders are involved in reviewing evaluations, and what information is missing but needed as part of evaluations. In other words, the primary aim of this study is to evaluate the efficacy of our evaluations and, in turn, demonstrate a commitment to knowledge mobilization (Levin, 2013).
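As a hedged illustration of the repeatable, automated correlational analyses described above, the sketch below loops one simple usage-outcome check over district-product pairs and retains every result, positive, null, or negative. The batch_rce function, the minimum cell size, and all column names are assumptions made for illustration, not a description of our production platform.

    # Illustrative sketch (not actual platform code): run the same simple
    # correlational RCE across every district-product pair in a combined export.
    import pandas as pd
    from scipy.stats import pearsonr

    def batch_rce(df):
        """Correlate product usage with a standardized outcome per district-product pair."""
        rows = []
        for (district, product), grp in df.groupby(["district_id", "product"]):
            if len(grp) < 30:  # assumed floor; skip cells too small to report responsibly
                continue
            r, p = pearsonr(grp["usage_minutes"], grp["post_score"])
            rows.append({"district": district, "product": product,
                         "n": len(grp), "r": round(r, 3), "p": round(p, 4)})
        # Every result is kept--positive, null, or negative--mirroring how the
        # RCE process sidesteps the file drawer problem.
        return pd.DataFrame(rows)

    # usage_df = pd.read_csv("all_districts.csv")  # hypothetical combined export
    # print(batch_rce(usage_df).to_string(index=False))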
Society for Research on Educational Effectiveness. 2040 Sheridan Road, Evanston, IL 60208. Tel: 202-495-0920; e-mail: contact@sree.org; Web site: https://www.sree.org/
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Society for Research on Educational Effectiveness (SREE)
Grant or Contract Numbers: N/A