ERIC Number: ED657019
Record Type: Non-Journal
Publication Date: 2021-Sep-29
Pages: N/A
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
What Happens after the Program Ends? A Synthesis of Post-Program Effects in Higher Education
Michael Weiss; Rebecca Unterman; Dorota Biedzio
Society for Research on Educational Effectiveness
Background: Some education programs' early positive effects disappear over time. Other programs have unanticipated positive long-term effects. Foundations have warned of the dangers of putting too much weight on in-program effects, which "often fade over time." The U.S. Department of Education's Institute of Education Sciences (IES) even has a special funding category dedicated to continued follow-up so that such issues can be explored.
Objective: This research begins to tackle the topic of post-program effects in an unexamined context: postsecondary education. Are in-program effects--that is, the effects observed while the intervention was active--maintained once the program ends? Do they grow? Or do they fade out? Theorists from various education-related fields have hypothesized about what may happen to effects after a program ends. The most prominent set of hypotheses comes from early education researchers, who outline three processes by which program effects may be sustained, grow, or fade over time: development of trifecta skills, "foot-in-the-door" programs, and sustaining environments. Though initially developed around a different transition point, these theories may also apply to postsecondary post-program effects. Trifecta skills are "malleable, fundamental, and would not have developed eventually in the absence of the program." In postsecondary education, trifecta skills may be fostered by something like a student success course that teaches study and time management skills at college entry. Foot-in-the-door programs leverage sensitive periods to "avoid imminent risks" or "seize emerging opportunities," as emergency financial aid is intended to do. Finally, sustaining environments are experienced by students who, after a program ends, "move into high-quality environments that support their continued growth," such as a high-quality postsecondary institution after a summer bridge program.
Setting/Participants/Intervention/Data: This investigation capitalizes on two decades of rigorous program evaluations conducted by MDRC, covering 29 postsecondary interventions. Each evaluation was high quality, using a randomized controlled trial (RCT) design to estimate program effectiveness. The sample includes over 40,000 students who participated in the 29 RCTs at more than 35 institutions. This work draws on the THE-RCT dataset, a restricted access file (RAF) containing de-identified student-level data from 31 of MDRC's higher education RCTs, involving 45 institutions and 67,400 students. Data include demographics (such as gender, race, and ethnicity), outcomes (such as enrollment, credits earned, and credentials), and study-related variables. The programs range from "light touch" to comprehensive interventions; they vary in their features (financial supports, advising, tutoring, learning communities, and so on), their duration (one semester to three years), the populations they served, and their contexts. Several of the studies had multiple intervention arms, so the RAF includes 41 unique programs.
Results: Figure 1 presents findings from each of the 29 programs. The y-axis represents the estimated effect of the program on cumulative credits earned, an indicator of progress toward a degree. These effects are the average number of additional credits students earned because of the program--credits they would not have earned in its absence. The x-axis represents time in years, centered on the final program semester. Negative time values (including zero) are "in-program" semesters; positive time values are "post-program" semesters, the emphasis of this research. Because most studies have at least one year of post-program follow-up, that time frame--the period between the two vertical dashed lines--is highlighted. The results are striking: during the year after these programs ended, effects on academic progress (as measured by credit accumulation) were consistently maintained. Specifically, when pooled across all studies, the change in the effect on cumulative credits earned during the first post-program year is just +0.02 credits, a change that is neither practically meaningful nor statistically significantly different from zero. Surprisingly, despite the many differences across the 29 RCTs, the evidence points to a consistent pattern of post-program maintenance of effects in every study. There is no clear evidence of post-program growth (that is, improved effects) or of fade-out for any of the programs examined.
Conclusion: The finding that effects on credit accumulation are broadly maintained after postsecondary programs end should encourage education reformers concerned about fade-out. While in-program effects are sometimes important on their own, benefits that are maintained into the future are especially powerful.
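Note on pooling: the record does not specify how the study-level estimates were combined. The sketch below is a minimal illustration of one common approach, a fixed-effect (inverse-variance weighted) average of each study's estimated change in effect between the final in-program semester and one year post-program; the study labels and numbers are hypothetical placeholders, not data from the paper.

```python
# Illustrative sketch only: assumes a fixed-effect (inverse-variance weighted)
# pooling of per-study changes in the effect on cumulative credits earned.
# All values below are hypothetical placeholders.
import math

# (change_in_effect_on_credits, standard_error) for each study -- hypothetical
study_estimates = [
    (0.10, 0.25),   # hypothetical Study A
    (-0.05, 0.30),  # hypothetical Study B
    (0.02, 0.20),   # hypothetical Study C
]

def pool_fixed_effect(estimates):
    """Return the inverse-variance weighted mean and its standard error."""
    weights = [1.0 / (se ** 2) for _, se in estimates]
    total_w = sum(weights)
    pooled = sum(w * est for (est, _), w in zip(estimates, weights)) / total_w
    pooled_se = math.sqrt(1.0 / total_w)
    return pooled, pooled_se

pooled_change, pooled_se = pool_fixed_effect(study_estimates)
z = pooled_change / pooled_se  # rough z-test of "change in effect = 0"
print(f"Pooled change in effect: {pooled_change:+.2f} credits "
      f"(SE {pooled_se:.2f}, z = {z:.2f})")
```

A pooled change near zero with a small z-statistic, as reported in the abstract (+0.02 credits), is consistent with in-program effects being maintained rather than growing or fading.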
Descriptors: High School Graduates, Post High School Guidance, Transitional Programs, Program Effectiveness, Individual Development, Sustainability, Achievement Gains, College Credits
Society for Research on Educational Effectiveness. 2040 Sheridan Road, Evanston, IL 60208. Tel: 202-495-0920; e-mail: contact@sree.org; Web site: https://www.sree.org/
Publication Type: Reports - Research
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Society for Research on Educational Effectiveness (SREE)
Grant or Contract Numbers: N/A