Peer reviewed
ERIC Number: ED659391
Record Type: Non-Journal
Publication Date: 2023-Sep-30
Pages: N/A
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Approaches to Statistical Efficiency When Comparing the Embedded Adaptive Interventions in a SMART
Timothy Lycurgus; Daniel Almirall
Society for Research on Educational Effectiveness
Background: In educational settings, individuals are often best served by an intervention that is adapted over sequential stages to suit their initial and changing needs. The salience of an adaptive intervention is perhaps most clear in the classroom. Learning itself is a sequential process: mastering a given concept or technique frequently necessitates a thorough understanding of the preceding concepts. Following an initial lesson or assignment, a classroom teacher may begin monitoring each student to identify those meeting or failing to meet criteria for early signs of success, and then offer each student targeted support based on their needs (Dawson et al., 2014; Arendale, 1994; Rowan et al., 2019). Outside the classroom, as well, there are myriad scenarios where it may be necessary to adapt and re-adapt an intervention, both at the school and the school-district level. Increasingly, there is interest among educators in determining how best to make sequences of intervention decisions (Raudenbush, 2008). Such "adaptive interventions" (AIs) determine which treatment should be offered to a student or participant at any given stage of the intervention. Also referred to as dynamic treatment regimens or dynamic instructional regimes (Raudenbush, 2008), these adaptive interventions tailor the treatment to best serve the needs of the participants. To optimize an AI, researchers may utilize sequential, multiple assignment randomized trials, or SMARTs (Lavori and Dawson, 2004; Murphy, 2005). SMARTs are a type of factorial design (Murphy and Bingham, 2009) in which some or all participants are randomized multiple times to one or more treatment options at critical decision points in the AI (Almirall et al., 2018). SMARTs, frequently utilized in the medical and behavioral intervention sciences, are growing in popularity in the education sciences as well. For example, Kim et al. (2019) and Fleury and Towson (2021) use SMARTs to inform the development of AIs aimed, respectively, at personalizing print and digital content for early elementary students and at improving reading in preschool children with autism.

Purpose: We introduce statisticians and methodologists in the education sciences to adaptive interventions and SMART designs. Next, we provide education scientists with a suite of easy-to-implement techniques that may lead to increased statistical efficiency when analyzing data from a SMART. We demonstrate the benefits of the efficiency techniques through a comprehensive simulation study.

Research Design: An adaptive intervention (AI) is a pre-specified set of decision rules that guides how best to serve the baseline and ongoing needs of individuals (Seewald et al., 2020a). These rules tailor the provision of treatment at critical points throughout the intervention. There are four components of an AI: decision points, intervention options, decision rules, and tailoring variables (Seewald et al., 2020a). Decision points are the times at which an intervention decision is made; the set of interventions available at a decision point are the intervention options. These may include, among others, different interventions or different intensities of an intervention. The decision rule determines which option is selected for an individual at a given decision point. The decision rule makes this determination based on the value of one or more tailoring variables, which consist of known information collected prior to or at the current decision point.
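To make these four components concrete, here is a minimal sketch of a second-stage decision rule; it is not taken from the paper, and the function name and intervention-option labels are hypothetical. Early response status serves as the tailoring variable, echoing the classroom example above.

```python
def second_stage_rule(early_response: bool) -> str:
    """Hypothetical decision rule for the second decision point.

    Tailoring variable: early_response, an indicator of whether the student
    met a pre-specified criterion for early success after the first-stage
    intervention.
    """
    if early_response:
        # Intervention option for responders: continue the initial approach.
        return "continue first-stage intervention"
    # Intervention option for non-responders: augment with targeted support.
    return "augment with targeted small-group support"
```

In a prototypical SMART, it is only the non-responder branch of such a rule that is informed by a second randomization.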
In SMARTs, which are used to optimize AIs, participants take part in multiple stages of the intervention; each stage corresponds to a decision point at which individuals may be randomized to two or more intervention options. There are many different SMART designs, but we focus on the prototypical SMART, seen in Figure 1. In the prototypical SMART, all participants are randomized during the first stage of the intervention. At subsequent stages, only non-responders are re-randomized to an adjusted intervention.

Techniques to Increase Efficiency: In this paper, we present four techniques that may be used to improve statistical efficiency when analyzing data from a SMART. Technique 1: It is well known that incorporating baseline covariates may increase efficiency in outcome analyses (Bloom et al., 2007). For example, controlling for a pre-test score will often substantially improve precision. These gains in efficiency should typically remain when analyzing SMART data as well. Technique 2: Rather than using the known inverse probability of assignment weights, it may be possible to realize gains in efficiency by estimating the weights, either through the sample proportions assigned to each treatment or even through modeling (Hernan et al., 2002; Hirano et al., 2003; Brumback, 2009; Almirall et al., 2014); a minimal sketch of the sample-proportion approach appears below. Technique 3: As is standard in many education studies, we are often able to collect longitudinal outcomes when conducting SMARTs. Obtaining longitudinal data (e.g., an intermediate outcome immediately preceding the second-stage randomization) should permit the researcher to realize substantial gains in efficiency. Technique 4: Rather than assuming a constant variance across time, the variance can be modeled as a function of the time point.

Simulations: We present a comprehensive simulation study using modifications of the data-generating models presented in Seewald et al. (2020b) in order to better understand whether, and under what conditions, the four techniques lead to improvements in statistical efficiency. We compare each technique to an analytical approach that adopts none of these techniques, estimating efficiency in terms of ratios of root mean squared errors (rMSE); a toy illustration of this comparison also appears below.

Results: Results are presented in Table 1. When both ρ, the within-person correlation, and ν, the correlation between the outcome Y and the covariate X, are small, the techniques provide marginal gains in efficiency. Increasing ρ leads to greater gains in efficiency for each technique, but particularly for Techniques 3 and 4, which directly incorporate the longitudinal outcomes. Increasing ν generally corresponds to greater relative efficiency for each technique in comparison with the baseline scenario. This benefit is larger for Techniques 1 and 2, the two methods that rely on incorporating information from X into their estimation procedures.

Conclusion: In this paper, we introduced adaptive interventions and SMARTs to education scientists. We presented four techniques that may increase statistical efficiency when analyzing SMART data. We showed through a comprehensive simulation study that all four techniques can improve efficiency.
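To illustrate Technique 2, the following is a minimal sketch, not the authors' code, of estimating inverse probability of assignment weights from sample proportions in a prototypical SMART; the column names and the simulated data are hypothetical.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 500

# Hypothetical prototypical-SMART data: a1 = first-stage arm, r = early response
# indicator, a2 = second-stage arm (only non-responders are re-randomized).
df = pd.DataFrame({"a1": rng.integers(0, 2, size=n),
                   "r": rng.integers(0, 2, size=n)})
df["a2"] = np.where(df["r"] == 0, rng.integers(0, 2, size=n), np.nan)

# Known inverse-probability weights under 1:1 randomization:
# responders are randomized once (weight 1 / 0.5 = 2),
# non-responders twice (weight 1 / (0.5 * 0.5) = 4).
w_known = np.where(df["r"] == 1, 2.0, 4.0)

# Technique 2: estimate the assignment probabilities from sample proportions.
# (For simplicity these are estimated marginally; in practice they are often
# estimated within strata of prior treatment and response status.)
p_a1 = df.groupby("a1")["a1"].transform("size") / len(df)
nonresp = df["r"] == 0
p_a2 = pd.Series(1.0, index=df.index)
p_a2[nonresp] = (df.loc[nonresp].groupby("a2")["a2"].transform("size")
                 / nonresp.sum())
w_est = 1.0 / (p_a1 * p_a2)  # would replace w_known in the weighted analysis
```

In a full analysis of the embedded adaptive interventions, responders' records are typically replicated so that each contributes to every embedded AI with which it is consistent; that weight-and-replicate step is omitted here.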
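As a toy illustration of the rMSE-ratio efficiency metric, the sketch below contrasts an unadjusted difference in means with a covariate-adjusted estimator in the spirit of Technique 1, using a deliberately simplified single-stage randomized trial rather than the paper's SMART generative models; all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_rmse_ratio(n=200, n_sims=2000, nu=0.6, effect=0.5):
    """Toy rMSE-ratio comparison: unadjusted vs. covariate-adjusted estimator."""
    sq_err_unadj, sq_err_adj = [], []
    for _ in range(n_sims):
        a = rng.integers(0, 2, size=n)     # 1:1 randomized assignment
        x = rng.normal(size=n)             # baseline covariate (e.g., a pre-test)
        y = effect * a + nu * x + rng.normal(size=n)

        # Unadjusted estimator: difference in group means.
        est_unadj = y[a == 1].mean() - y[a == 0].mean()

        # Adjusted estimator (Technique 1): OLS of y on a and x.
        X = np.column_stack([np.ones(n), a, x])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        est_adj = beta[1]

        sq_err_unadj.append((est_unadj - effect) ** 2)
        sq_err_adj.append((est_adj - effect) ** 2)

    rmse_unadj = np.sqrt(np.mean(sq_err_unadj))
    rmse_adj = np.sqrt(np.mean(sq_err_adj))
    return rmse_unadj / rmse_adj  # > 1 means the adjusted estimator is more efficient

print(simulate_rmse_ratio())
```

Ratios above one favor the adjusted estimator, and the ratio grows with the outcome-covariate correlation, mirroring the pattern the abstract reports for ν.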
Society for Research on Educational Effectiveness. 2040 Sheridan Road, Evanston, IL 60208. Tel: 202-495-0920; e-mail: contact@sree.org; Web site: https://www.sree.org/
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Society for Research on Educational Effectiveness (SREE)
Grant or Contract Numbers: N/A