ERIC Number: ED658591
Record Type: Non-Journal
Publication Date: 2022-Sep-24
Pages: N/A
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
The ASSISTments Effectiveness Study: Understanding Implementation and Impacts during the Time of COVID
Kirk Walters; Rachel Garrett; Dioni Garcia-Piriz; Eban Witherspoon; Max Pardo; Lauren Burr; Melissa Rogers
Society for Research on Educational Effectiveness
Background/Context: Only a minority of U.S. eighth graders reach the proficient level on the National Assessment of Educational Progress (U.S. Department of Education, 2021). Early evidence suggests that the pandemic has made matters worse, especially for underserved student populations (Lewis et al., 2021). Math proficiency is central to advanced high school course-taking opportunities, including STEM pathways, and is thus a popular target of reform. One of the few middle school programs with strong evidence of effectiveness is ASSISTments, a free, online tool that provides students with immediate feedback on their work and provides teachers with a summary report of student performance (see Figure 1). A prior IES-funded efficacy study in rural Maine found that ASSISTments significantly improved student outcomes on a standardized math assessment, with larger effects for low-achieving students (Roschelle et al., 2016). The current effectiveness study is a multi-site, clustered randomized trial with schools in geographically and culturally diverse settings across the U.S. Although COVID-19 disrupted the study in the midst of implementation and data collection in spring 2020, participants agreed to an additional year of implementation and data collection. This provides an unusual opportunity to examine the implementation and impacts of a web-based program when the pandemic forced schools into distance and hybrid learning environments.

Purpose/Objective/Research Questions: The primary purpose of the study was to examine whether the results from the prior efficacy study generalized to a more culturally and geographically diverse set of schools, guided by the following questions. Primary: What is the impact of ASSISTments on seventh-grade student math learning, and to what extent do impacts vary across schools and student subgroups? Exploratory: What is the impact of ASSISTments on seventh-grade students' math mindset, and to what extent do impacts vary across schools and student subgroups?
What is the impact of ASSISTments on seventh-grade math teachers' homework review practices? To what extent is the impact of ASSISTments on seventh-grade student math learning mediated by students' math mindsets and teachers' homework review practices? Implementation: To what extent do schools assigned to receive ASSISTments implement it with fidelity, and was fidelity affected by COVID-19? What is the per-pupil cost of implementing ASSISTments?

Setting: The study includes 54 schools from a mix of rural and urban settings across six states, organized into the two cohorts and timeframes illustrated in Figure 2.

Population/Participants/Subjects: Based on a partial sample of 33 schools and more than 2,400 students, our preliminary results show baseline equivalence at <0.198 for all key characteristics included in Table 1.

Intervention/Program/Practice: ASSISTments is a free, web-based tool that supports independent practice, either in class or for homework. It is designed to be used 2-3 times per week and involves the following four steps:
1. Teachers select problems to create an assignment for their students to work on as independent practice or homework. The ASSISTments team incorporates these practice problems from a teacher's textbook into the tool in advance.
2. Students solve the problems, enter their responses into the tool (either online or offline), and receive immediate feedback.
3. The teacher accesses a simple report that summarizes responses across students and questions, providing actionable data such as common wrong answers.
4. The teacher shares the report with the class to help address and repair common mistakes and misconceptions.

Research Design: The study used school-level randomization, including 54 public and charter schools in urban and rural areas serving seventh-grade students.
As shown in Figure 2, Cohort 1 schools started implementation in the 2018-19 school year and Cohort 2 schools started in 2019-20, with all schools continuing through 2020-21. Teachers in schools randomly assigned to use ASSISTments received initial group training followed by individual, virtual coaching during each school year. Student impact analyses use multilevel models nesting students within teachers and schools, and account for the randomization design in addition to baseline student characteristics.

Data Collection and Analysis: To assess student learning, the study team collected all available math assessment information from spring 2021, which primarily consists of statewide accountability tests and the NWEA MAP assessment. Test scores are standardized using either statewide or national norming information to enable analyses across different tests. Student math mindset and teacher homework review practices were captured in student and teacher surveys administered in spring 2021. The study is also leveraging online usage and teacher survey data from the treatment group to examine implementation fidelity.

Findings/Results: For both cohorts, teachers' use of ASSISTments was much lower than expected in the 2019-20 and 2020-21 school years, and lower than the 2-3 times per week documented in the prior efficacy study. However, for the subsample of teachers with data in both fall and spring of Years 2 and 3, fidelity rates nearly doubled, from 26% implementing with fidelity in 2020 to 43% in 2021. The service contrast was also low: of teachers who use computer-supported programs, 94% of control teachers and 95% of treatment teachers reported that the program provided student performance data reports. Treatment and control teachers were also similar in how often they reviewed these reports.
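The multilevel impact model described under Research Design can be sketched as follows. This is an illustrative specification only; the notation and the exact covariate set are assumptions, not taken from the study's analysis plan:

```latex
Y_{ijk} = \beta_0 + \beta_1\,\mathrm{Treat}_k
        + \boldsymbol{\beta}_2' \mathbf{X}_{ijk}
        + \sum_{b} \gamma_b\,\mathrm{Block}_{bk}
        + u_k + v_{jk} + e_{ijk}
```

where \(Y_{ijk}\) is the standardized spring 2021 test score of student \(i\) taught by teacher \(j\) in school \(k\); \(\mathrm{Treat}_k\) is the school-level random assignment indicator; \(\mathbf{X}_{ijk}\) is a vector of baseline student characteristics; \(\mathrm{Block}_{bk}\) are fixed effects for the randomization blocks; and \(u_k\), \(v_{jk}\), and \(e_{ijk}\) are school-, teacher-, and student-level random components. Under this sketch, \(\beta_1\) is the intent-to-treat estimate of the ASSISTments impact.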
Our current impact analyses with 33 schools and more than 2,400 students show no significant impacts on any of our student attitude or student math performance measures, as illustrated in Tables x and y, respectively.

Conclusions: We hypothesize that low implementation fidelity and weak service contrast contributed to the null results to date. The lack of service contrast was likely due to the fact that more programs like ASSISTments became available as the study progressed and because teachers were forced to use web-based tools to varying degrees during COVID.
Descriptors: COVID-19, Pandemics, Rural Schools, Grade 7, Middle School Mathematics, Computer Assisted Instruction, Urban Schools, Public Schools, Charter Schools, Electronic Learning, Program Implementation, Fidelity, Educational Technology, Feedback (Response), Competency Based Education, Mathematics Achievement, Improvement Programs, Geographic Regions, Diversity, Cultural Differences, Influence of Technology, Mathematics Education, Teaching Methods
Society for Research on Educational Effectiveness. 2040 Sheridan Road, Evanston, IL 60208. Tel: 202-495-0920; e-mail: contact@sree.org; Web site: https://www.sree.org/
Publication Type: Reports - Research
Education Level: Elementary Education; Grade 7; Junior High Schools; Middle Schools; Secondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Society for Research on Educational Effectiveness (SREE)
Grant or Contract Numbers: N/A