Peer reviewed
ERIC Number: ED659736
Record Type: Non-Journal
Publication Date: 2023-Sep-27
Pages: N/A
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Using Mixed Effects Models to Identify Usage in Educational Technology Programs
Weiling Li; Aaron Butler; Amy Dray
Society for Research on Educational Effectiveness
Background/Context: Technology has become a vital part of modern education systems (Turugare & Rudhumbu, 2020; Osterweil et al., 2015; Ghory & Ghafory, 2021). Over the past two decades, it has transformed the way we teach, learn, and assess knowledge (U.S. Department of Education Office of Educational Technology, 2017; Partala & Saari, 2015; Dube & Wen, 2021). The proliferation of EdTech products has created a growing need for researchers and practitioners to evaluate those products' effectiveness (Osterweil et al., 2015). Combined with student performance data, intervention dosage can provide a clearer picture of intervention success (Mason & Smith, 2020). One potential approach is the establishment of usage cutoffs. This study explores how to identify optimal usage cutoffs for EdTech products to facilitate the identification of cause-and-effect relationships. Such cutoffs help researchers, educators, and policymakers make informed decisions regarding the adoption and implementation of technology tools.

Purpose/Objective/Research Question: We present an innovative approach for calculating usage cutoffs for EdTech products by employing mixed effects models. The data structure in an e-learning ecology is inherently hierarchical (Lin et al., 2023), and model-based approaches have proven effective for analyzing intervention effects (Bates et al., 2015). By dichotomizing usage indicators and testing candidate cutoff values in mixed effects models, the model fit statistics identify a threshold at which engaging with an EdTech product has its maximum effect. Quasi-experimental studies can then be designed around the cutoff value to estimate the effectiveness of the product. The What Works Clearinghouse (WWC) defines quasi-experimental designs (QEDs) as those in which intervention and comparison conditions are formed through a non-random process, provided the groups are mutually exclusive (WWC, 2022).
This threshold enables researchers to quickly identify the maximum effectiveness of the product in the sample at hand. The approach is especially meaningful for addressing diversity and equity. One strength of mixed effects models is the flexibility of their model structure, which allows researchers to incorporate equity-related factors and interactions, such as examining how the relationship between usage and product effectiveness may differ by gender or socioeconomic status. By including these factors in the analysis, researchers can better understand and address potential inequities in the implementation and effectiveness of EdTech products.

Research Design: To establish an optimal cutoff for the number of skills completed, the mixed-effects model below was used to model the factors affecting student outcomes: Outcome_ij = α_0 + β_1(Baseline)_ij + β_2(# of completed skills)_ij + β_3(Grade level)_ij + μ_j + e_ij, where subscripts i and j denote student and school, respectively; Outcome represents student achievement; and Baseline represents the baseline measure of the outcome variable. After confirming that "# of completed skills" was a statistically significant factor affecting students' outcomes, we dichotomized "# of completed skills" at candidate cutoff values and tested each in this model. Once the optimal cutoff value was identified, a quasi-experimental study was designed to estimate the intervention effects of the product: students whose usage reached or exceeded the cutoff value formed the treatment group, and students who completed 0 skills formed the control group. We then established baseline equivalence for the treatment and control groups using propensity score matching.
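The cutoff search described above can be sketched in a few lines. This is a minimal illustration on synthetic data, not the authors' code: for simplicity, plain OLS stands in for the mixed-effects model with a school random effect, and AIC is computed from the residual sum of squares under a Gaussian likelihood. The variable names and the simulated "true" cutoff of 5 are assumptions for the demonstration only.

```python
# Minimal sketch of the usage-cutoff search (synthetic data; OLS stands in
# for the paper's mixed-effects model with a school random effect).
import numpy as np

rng = np.random.default_rng(0)
n = 400
baseline = rng.normal(200, 10, n)      # fall MAP score (synthetic)
skills = rng.integers(0, 11, n)        # # of completed skills, 0-10
grade = rng.integers(1, 6, n)          # grade level 1-5
# For the simulation only, assume the benefit appears at >= 5 skills.
outcome = (0.8 * baseline + 8.0 * (skills >= 5) + 0.5 * grade
           + rng.normal(0, 2, n))

def aic_for_cutoff(c):
    """AIC of an OLS fit using the dichotomized indicator I(skills >= c)."""
    X = np.column_stack([np.ones(n), baseline, grade,
                         (skills >= c).astype(float)])
    coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    rss = np.sum((outcome - X @ coef) ** 2)
    k = X.shape[1] + 1                 # coefficients + error variance
    return n * np.log(rss / n) + 2 * k # Gaussian-likelihood AIC

# Model 5 in the paper uses cutoff 1, Model 6 cutoff 2, and so on;
# the cutoff whose model minimizes AIC is taken as optimal.
aics = {c: aic_for_cutoff(c) for c in range(1, 11)}
best_cutoff = min(aics, key=aics.get)
```

In the full analysis, each candidate model would also carry the school random effect μ_j, and BIC would be compared alongside AIC, as in Tables 1 and following.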
After baseline equivalence was achieved, a linear regression model was applied to examine intervention impacts on student outcomes.

Findings and Results: Our outcome was Math MAP scores in Spring 2022 (NWEA, 2022). To account for the nested structure of students within schools, we first estimated Model 1 as the null model, including only school_id to identify the school-level variance. We then added the fall 2021 Math MAP score as a baseline in Model 2, followed by grade level as a covariate in Model 3, and "# of completed skills" in Model 4. Results displayed in Table 1 indicated that "# of completed skills" was a statistically significant variable that improved the models. Further analysis dichotomized the "# of completed skills" variable at each candidate cutoff: Model 5 examined a cutoff of 1 (1 or more vs. 0), Model 6 a cutoff of 2 (2 or more vs. 0 to 1), Model 7 a cutoff of 3 (3 or more vs. 0 to 2), and so on for subsequent models. The minimum values of the model fit statistics (AIC/BIC) were observed for Model 9 (a cutoff of 5 skills), indicating that Model 9 fit the data best among all the cutoff models; the best cutoff value for this math sample was therefore 5 skills. As illustrated in Table 2, baseline equivalence was established on the knowledge baseline and SES (socioeconomic status); both were included as covariates in the outcome estimate (WWC, 2022). The impact analysis results presented in Table 3 indicate statistically significant (p < 0.05) and positive intervention effects in Grades 1, 2, and 5, with students who completed 5 or more skills showing significantly higher MAP scores in those grades.

Conclusions: In the era of AI, more people are questioning what students should learn to prepare for the future. The EdTech revolution is approaching, and we cannot expect students to learn from an EdTech product they have barely touched.
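The quasi-experimental step, propensity score matching followed by a covariate-adjusted impact regression, can also be sketched. Again this is a simplified illustration on synthetic data: the logistic regression, 1:1 nearest-neighbor matching, and the standardized-mean-difference equivalence check are minimal stand-ins, and all variable names and data-generating values are assumptions.

```python
# Sketch of the quasi-experimental step: propensity-score matching on
# baseline covariates, a baseline-equivalence check via standardized mean
# difference, then a covariate-adjusted impact regression (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
n = 600
baseline = rng.normal(200, 10, n)      # fall MAP score (synthetic)
ses = rng.binomial(1, 0.5, n)          # low-SES indicator (synthetic)
# Treatment (reaching the usage cutoff) is more likely at higher baselines.
p_treat = 1 / (1 + np.exp(-(baseline - 200) / 10))
treat = rng.binomial(1, p_treat)
outcome = 0.9 * baseline + 3.0 * treat - 2.0 * ses + rng.normal(0, 5, n)

def logit_fit(X, y, iters=25):
    """Logistic regression via Newton-Raphson (minimal implementation)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-np.clip(X @ beta, -30, 30)))
        W = p * (1 - p)
        H = X.T @ (X * W[:, None]) + 1e-8 * np.eye(X.shape[1])
        beta += np.linalg.solve(H, X.T @ (y - p))
    return beta

X = np.column_stack([np.ones(n), baseline, ses])
pscore = 1 / (1 + np.exp(-np.clip(X @ logit_fit(X, treat), -30, 30)))

# 1:1 nearest-neighbor matching on the propensity score, with replacement.
t_idx = np.where(treat == 1)[0]
c_idx = np.where(treat == 0)[0]
gaps = np.abs(pscore[c_idx][None, :] - pscore[t_idx][:, None])
matches = c_idx[gaps.argmin(axis=1)]

# Baseline equivalence: standardized mean difference on the baseline score.
pooled_sd = np.sqrt((baseline[t_idx].var(ddof=1)
                     + baseline[matches].var(ddof=1)) / 2)
smd = (baseline[t_idx].mean() - baseline[matches].mean()) / pooled_sd

# Impact estimate on the matched sample, adjusting for baseline and SES.
keep = np.concatenate([t_idx, matches])
Xo = np.column_stack([np.ones(keep.size), treat[keep],
                      baseline[keep], ses[keep]])
coef, *_ = np.linalg.lstsq(Xo, outcome[keep], rcond=None)
impact = coef[1]   # estimated effect of reaching the usage cutoff
```

In the actual study, the equivalence check and covariate adjustment follow the WWC (2022) standards, and the impact model is estimated separately by grade, as in Table 3.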
Usage cutoffs play a critical role in examining the effectiveness of EdTech products. Establishing a usage threshold enables consistent and fair comparisons, facilitates data-driven decision making, and yields a more accurate and dependable evaluation process. While this method is useful for identifying thresholds in specific samples, those thresholds may not generalize into best dosage recommendations; a meta-analysis of results from multiple samples may be necessary for that purpose. As EdTech continues to evolve and shape the future of education, it is crucial that researchers, educators, and policymakers embrace usage cutoffs to ensure that the products they endorse and promote are genuinely effective and advantageous for all learners.
Society for Research on Educational Effectiveness. 2040 Sheridan Road, Evanston, IL 60208. Tel: 202-495-0920; e-mail: contact@sree.org; Web site: https://www.sree.org/
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Society for Research on Educational Effectiveness (SREE)
Grant or Contract Numbers: N/A