Peer reviewed
ERIC Number: ED663585
Record Type: Non-Journal
Publication Date: 2024-Sep-20
Pages: N/A
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Rapid Cycle Implementation Monitoring and Support
Xin Wei; Jeremy Roschelle; Danae Kamdar; Tiffany Leones; Ximena Dominguez; William Corrin
Society for Research on Educational Effectiveness
Background/Context: Funders, researchers, and developers share an interest in applying rapid cycle evaluation techniques in education (McNall & Foster-Fishman, 2007; Resch, 2016). Both rapid-cycle and other evaluation processes require monitoring the quality of implementation (Moir, 2018). By quickly monitoring and adjusting implementation, fidelity can be improved and study quality can be increased. Specifically regarding implementation, rapid cycle monitoring and support has been conceptualized in health care (Gaidhane et al., 2023) but not in education. Edtech-based interventions collect usage data constantly. By exploring how a framework from healthcare called Rapid Cycle Implementation Monitoring and Support (RCIMS; Figure 1, Gaidhane et al., 2023) applies to the study of edtech interventions, it could become possible to improve implementations more rapidly and thus enable higher-quality rapid cycle evaluations.

Purpose/Objective/Research Question: Overall, our team is evaluating the benefits of using varied adaptive learning algorithms within elementary mathematics classrooms to individualize instruction for learners. Our equity concern relates to students who are many years below expected grade-level performance in mathematics. In the context of this study, we ask: how can rapid cycles of monitoring inform the adjustments necessary to improve implementation of supplementary supports for these students? More specifically, the study team wanted timely insight on these implementation factors: (1) What percentage of total enrolled students, by condition, have taken the diagnostic tests (necessary for adaptivity) and can participate in the intervention? (2) To what extent are students using the platform in their assigned condition, or is there contamination between conditions? (3) Are schools, classrooms, and students using the platform for recommended durations of time (on average, 45 minutes per week)? (4) Are there differences between conditions in expected patterns of use, such as completing or not completing assignments in the product, time allocation across different math activities, and engagement with various content areas and standards? (A sketch of monitoring checks for the first two questions appears below.)

Setting and Population: Research is occurring in 12 elementary schools serving about 4,000 students. Minority student representation varied from 37% to 70%, and the proportion of economically disadvantaged students ranged from 29% to 50% across districts. In addition, a high proportion of students have prior math achievement levels that are two or more years behind grade level.

Intervention & Implementation: We are working with two commonly used math platforms that supplement teachers' mathematics instruction. Each vendor implemented two contrasting algorithms for adaptive learning. Later, we will analyze the comparative benefits of the algorithms; presently, we are examining implementation. Each participating school received training and support from the vendors to use the platforms. Students were first assigned to take a diagnostic test, then randomly assigned to conditions, and thereafter began using the product. Students' expected usage of each product, as communicated to teachers and school leaders, is 60 minutes per week.
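To make implementation questions (1) and (2) concrete, the following is a minimal sketch of the kind of check a biweekly monitoring script might run. The table and column names (student_id, assigned_condition, diagnostic_completed, observed_condition) are illustrative assumptions, not the study's actual data schema:

    import pandas as pd

    def diagnostic_completion_by_condition(roster: pd.DataFrame) -> pd.Series:
        """Question (1): share of enrolled students, per condition,
        who have completed the diagnostic test (a boolean column)."""
        return roster.groupby("assigned_condition")["diagnostic_completed"].mean()

    def contamination_flags(usage: pd.DataFrame, roster: pd.DataFrame) -> pd.DataFrame:
        """Question (2): usage events recorded under a condition other
        than the student's random assignment."""
        merged = usage.merge(roster[["student_id", "assigned_condition"]],
                             on="student_id")
        return merged[merged["observed_condition"] != merged["assigned_condition"]]

In practice each platform exports its own event schema, so a per-vendor mapping step would precede checks like these.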
Our general implementation research approach involves collecting and analyzing data from each platform; interpreting the data with input from experts on each vendor's team and with coordinators in the school districts; and determining support steps to be led by the research team, the vendors' support teams, and/or district or school coordinators.

Research Design: Broadly, we have been able to align our implementation research to RCIMS in Figure 1 as follows: (1) we defined key implementation questions (see above) and then (2) determined how existing data collected within the platform could address each question. To answer these questions, (3) we created data analysis scripts to produce biweekly analytic reports on implementation, which we (4) shared with the project team every two weeks. Findings then informed (5) implementation support actions, which were later (6) re-evaluated based on subsequent findings. Our "at-risk" population of concern has been students whose diagnostic level in mathematics is more than two years below grade level, and we have many students in this at-risk population to observe. However, unlike the RCIMS framework, our process has involved adjusting all steps in the figure to achieve a more coherent, tighter process over time; for example, performance indicators did not stay fixed.

Data Collection and Analysis: Data were collected from the two product platforms and made available for analysis after removing PII. Supplementary data include a weekly educator survey on their primary topic of instruction and discussions with district or school coordinators. Quantitative usage data were analyzed using descriptive statistics, correlational analysis, and longitudinal trends, while qualitative data were analyzed thematically. (An illustrative sketch of one such usage report appears at the end of this abstract.)

Findings/Results: Overall, we have been able to execute biweekly cycles (i.e., "rapid cycles") of implementation monitoring. We have been able to analyze each of the focal questions, but interpretation is easier for some questions than others. We found two factors that complicate some interpretations: (a) the variables available in platform data are not always well aligned to our implementation questions, and (b) platform data are missing information needed for interpretation, and collecting that information from participants by surveys or other means is slower and less comprehensive than collecting data in the platform. We have been able to identify and execute support actions and have seen progress in some variables, such as appropriate usage. But we have also found that platform data reveal fine-grained variability, and we often have to decide how to allocate limited human resources to the most important implementation issues; this involves expert judgment, not just analysis of data.

Conclusion: In parallel to the RCIMS framework from healthcare, rapid cycles can be used with educational technologies to improve implementation during research studies. Despite challenges in data interpretation and variable identification, the approach enabled continuous improvement and helped refine the study for future years. In contrast to the top-down sensibility of RCIMS as described in healthcare, we found it necessary to adjust all elements of the rapid cycle process over time to achieve tighter coherence with our implementation fidelity goals.

Equity Statement: This study embeds equity principles by ensuring that the rapid cycle approach is adaptable to diverse educational settings, and by focusing specifically on a well-defined and large "at-risk" population.
Through inclusive data collection and analysis practices, this research aims to contribute to the development of equitable and effective educational interventions.
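As an illustration of the biweekly analytic reports described under Data Collection and Analysis, here is a minimal sketch of a usage-duration rollup for question (3), checked against the recommended 45 minutes per week. The event schema (classroom_id, student_id, week, minutes) is an assumption made for illustration, not the platforms' actual export format:

    import pandas as pd

    # Per-week usage target cited in implementation question (3).
    RECOMMENDED_WEEKLY_MINUTES = 45

    def weekly_usage_report(events: pd.DataFrame) -> pd.DataFrame:
        """One row per classroom-week: mean student minutes and a below-target flag."""
        # Total minutes per student per week.
        per_student = (events
                       .groupby(["classroom_id", "student_id", "week"], as_index=False)
                       ["minutes"].sum())
        # Classroom-level mean of those weekly student totals.
        per_classroom = (per_student
                         .groupby(["classroom_id", "week"], as_index=False)
                         ["minutes"].mean()
                         .rename(columns={"minutes": "mean_weekly_minutes"}))
        per_classroom["below_target"] = (
            per_classroom["mean_weekly_minutes"] < RECOMMENDED_WEEKLY_MINUTES)
        return per_classroom

A report like this can be regenerated every two weeks and used to target support actions at the classrooms furthest below the recommended duration.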
Society for Research on Educational Effectiveness. 2040 Sheridan Road, Evanston, IL 60208. Tel: 202-495-0920; e-mail: contact@sree.org; Web site: https://www.sree.org/
Publication Type: Reports - Research
Education Level: Elementary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Society for Research on Educational Effectiveness (SREE)
Grant or Contract Numbers: N/A