Peer reviewed
ERIC Number: ED663045
Record Type: Non-Journal
Publication Date: 2024-Sep-20
Pages: N/A
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Criterion-Related Validity of Youth Mental Health Assessments: Application of Kraemer's Satellite Model
Elizabeth Talbott; Andres De Los Reyes; Junhui Yang; Mo Wang
Society for Research on Educational Effectiveness
Youth in the U.S. face a mental health crisis--one that has steadily worsened over the past 10 years (Centers for Disease Control, 2023). Youth with disabilities may experience increased mental health risk compared to their peers without disabilities. Recent national surveys and systematic reviews reveal that mental health risk may be elevated for youth with dyslexia (Wilmot et al., 2023); intellectual disabilities (Buckley et al., 2020; Comer-HaGans et al., 2020); physical disabilities (Lai et al., 2020); autism spectrum disorder (Thiele-Swift & Dorstyn, 2024); and attention deficit hyperactivity disorder (Sultan et al., 2021). Conducting frequent, formative assessment using multiple informants (parents, teachers, youth) is key to tailoring and personalizing mental health interventions for all youth (Ng & Weisz, 2016). To accomplish this, practitioners need a measurement-based feedback system comparable to those used in academic assessment (i.e., curriculum-based measurement; Hosp et al., 2014). Regardless of whether one seeks to advance youth progress in reading, math, or mental health, such a frequent, formative, non-judgmental system for providing feedback is key to personalizing interventions for individual youth. Why, then, do practitioners lack critical guidance for doing so? As we have pointed out in recent work (De Los Reyes, Talbott et al., 2022; Talbott, De Los Reyes et al., 2023), research in youth mental health assessment has not kept pace with research on interventions (see also Jensen-Doss, 2011; Jensen-Doss & Hawley, 2010). In contrast, the scientific commitment to developing and testing evidence-based interventions for youth mental health has advanced substantially since the 1960s (Weisz et al., 2019), and both the quality and quantity of evidence-based interventions for youth have increased. Yet although researchers have conducted more than 453 randomized controlled trials with nearly 32,000 participants over the past 53 years, the interventions tested in these trials have not become more potent in their outcomes over this period. That is, we see negligible differences between the effects of interventions youth received in the 1970s for such needs as anxiety, depression, attention-deficit/hyperactivity disorder, and conduct problems, and the effects of these same kinds of interventions when tested within the service settings of today (Weisz et al., 2019). This finding, in light of a persistent youth mental health crisis (Centers for Disease Control, 2023), is a clear signal to members of the research and practice communities to accelerate research that provides specific guidance to practitioners in using assessment data to personalize, adapt, and deliver evidence-based interventions to meet youth needs (Weisz et al., 2019). Such an approach is particularly critical for meeting the needs of youth with disabilities under federal law. Not only are data from multiple informants (parents, teachers, youth) key to personalizing interventions, but effective mental health service delivery also involves gathering, interpreting, and using data from multiple sources throughout the assessment process. Yet we know from more than 50 years of research that informants who rate youth mental health are likely to disagree in their ratings. Despite this persistent finding, there is no clear empirical guidance for practitioners about how to integrate data from parents, teachers, and youth in the mental health assessment process.
The satellite model, introduced by Kraemer and colleagues in 2003, is a promising and practical approach to integrating data from different informants. In the satellite model, an individual informant's data functions like the data contributed by a single satellite within a global positioning system. That is, each informant within a "satellite array" of different informants provides incrementally valuable data about a target youth's mental health (De Los Reyes et al., 2023). As a result, the satellite model allows researchers and practitioners to integrate the ratings of different informants about the unique contexts (e.g., home, school, community) in which they observe behavior and their unique perspectives (e.g., parents, teachers, youth) on the behavior they do observe. In this study, we conducted a Monte Carlo simulation comparing the evidence-based "satellite" model (Kraemer et al., 2003) for integrating results from youth mental health assessments against common procedures practitioners might use in the field. The goal of the simulation was to determine which approaches performed best in simulated predictions of validity criteria. We compared the satellite model to four common procedures practitioners might use: (a) practitioners know the best informant to make intervention decisions; (b) practitioners make a random guess about which is the best informant for intervention decisions; (c) practitioners are wrong about the best informant and use that information to make decisions; and (d) practitioners are wrong about the best informant but use data from that informant when weighting different informant ratings. Under these different data conditions, we found that Kraemer's satellite model consistently outperformed the four common approaches in simulated predictions of validity criteria. This innovative use of Monte Carlo simulation informs future directions for our work in youth mental health assessment. Our next step is to add a cost analysis to these various approaches, including the costs of practitioner time, training, tools, and expertise. Our goal is to identify those approaches to integrating ratings from different informants that are not only evidence-based (e.g., the satellite model), but also yield the best "bang for the buck" (see Yeh, 2007).
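To make the simulation logic concrete, the sketch below illustrates one way such a Monte Carlo comparison could be set up. It is not the authors' implementation: the rating model (a shared latent trait plus an informant-specific context component plus noise), the equal-weight standardized composite standing in for the satellite-style integration, the simplified single-informant comparison strategies, and all parameter values and function names are assumptions introduced here for illustration. Criterion-related validity is approximated as the correlation between a score and the simulated latent trait.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ratings(n_youth, trait_load=0.6, context_load=0.4, noise_sd=0.5):
    """Simulate ratings from three informants (parent, teacher, youth).

    Each rating = shared latent trait + informant-specific context component
    + measurement noise. All loadings are illustrative assumptions.
    """
    trait = rng.normal(size=n_youth)  # latent mental health status (validity criterion)
    ratings = {}
    for informant in ("parent", "teacher", "youth"):
        context = rng.normal(size=n_youth)  # what only this informant observes
        ratings[informant] = (trait_load * trait
                              + context_load * context
                              + noise_sd * rng.normal(size=n_youth))
    return trait, ratings

def validity(score, criterion):
    """Criterion-related validity, approximated as a Pearson correlation."""
    return np.corrcoef(score, criterion)[0, 1]

n_reps, n_youth = 2000, 500
results = {k: [] for k in ("satellite-style composite", "best informant",
                           "random informant", "wrong informant")}

for _ in range(n_reps):
    trait, ratings = simulate_ratings(n_youth)
    informants = list(ratings)
    single = {i: validity(ratings[i], trait) for i in informants}

    # Stand-in for the satellite integration: equal-weight average of
    # standardized ratings from all informants in the "array."
    z = [(ratings[i] - ratings[i].mean()) / ratings[i].std() for i in informants]
    results["satellite-style composite"].append(validity(np.mean(z, axis=0), trait))

    # Simplified single-informant strategies a practitioner might use.
    results["best informant"].append(max(single.values()))            # knows the best
    results["random informant"].append(
        single[informants[rng.integers(len(informants))]])            # random guess
    results["wrong informant"].append(min(single.values()))           # picks the worst

for strategy, vals in results.items():
    print(f"{strategy:>26}: mean simulated validity = {np.mean(vals):.3f}")
```

In this toy setup, the multi-informant composite generally tracks the latent trait better than any single informant, which is consistent with the direction of the reported finding; the study's actual satellite model, data conditions, and validity criteria are more elaborate than this sketch.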
Society for Research on Educational Effectiveness. 2040 Sheridan Road, Evanston, IL 60208. Tel: 202-495-0920; e-mail: contact@sree.org; Web site: https://www.sree.org/
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Society for Research on Educational Effectiveness (SREE)
Grant or Contract Numbers: N/A