Showing 1 to 15 of 44 results
Peer reviewed
Ke Yu – International Journal of Bilingual Education and Bilingualism, 2024
Research syntheses that evaluate bilingual program effectiveness have grown exponentially since the 1980s. Contrary to earlier anti-bilingual findings, these research syntheses, including statistical meta-analyses, have converged on findings supporting L1 teaching. This study examines the methodological soundness of eight statistical…
Descriptors: Bilingual Education Programs, Program Effectiveness, Meta Analysis, Standards
Peer reviewed
Shepley, Collin; Grisham-Brown, Jennifer; Lane, Justin D. – Topics in Early Childhood Special Education, 2022
Multitiered systems of support provide a framework for matching the needs of a struggling student with an appropriate intervention. Experimental evaluations of tiered support systems in grade schools have been conducted for decades but have been less frequently examined in early childhood contexts. A recent meta-analysis of multitiered systems of…
Descriptors: Positive Behavior Supports, Preschool Children, Preschool Education, Program Evaluation
Hedges, Larry V.; Schauer, Jacob M. – Journal of Educational and Behavioral Statistics, 2019
The problem of assessing whether experimental results can be replicated is becoming increasingly important in many areas of science. It is often assumed that assessing replication is straightforward: All one needs to do is repeat the study and see whether the results of the original and replication studies agree. This article shows that the…
Descriptors: Replication (Evaluation), Research Design, Research Methodology, Program Evaluation
Hedges, Larry V.; Schauer, Jacob M. – Grantee Submission, 2019
The problem of assessing whether experimental results can be replicated is becoming increasingly important in many areas of science. It is often assumed that assessing replication is straightforward: All one needs to do is repeat the study and see whether the results of the original and replication studies agree. This article shows that the…
Descriptors: Replication (Evaluation), Research Design, Research Methodology, Program Evaluation
Peer reviewed
Barrett, Paula M.; Cooper, Marita; Stallard, Paul; Zeggio, Larissa; Gallegos-Guajardo, Julia – Education and Treatment of Children, 2017
This response aims to critically evaluate the methodology and aims of the meta-analytic review written by Maggin and Johnson (2014). The present authors systematically provide responses for each of the original criticisms and highlight concerns regarding Maggin and Johnson's methodology, while objectively describing the current state of evidence…
Descriptors: Anxiety, Prevention, Program Effectiveness, Program Evaluation
Peer reviewed
Kulik, James A.; Fletcher, J. D. – Review of Educational Research, 2016
This review describes a meta-analysis of findings from 50 controlled evaluations of intelligent computer tutoring systems. The median effect of intelligent tutoring in the 50 evaluations was to raise test scores 0.66 standard deviations over conventional levels, or from the 50th to the 75th percentile. However, the amount of improvement found in…
Descriptors: Intelligent Tutoring Systems, Meta Analysis, Computer Assisted Instruction, Statistical Analysis
Peer reviewed
Reeves, Barnaby C.; Higgins, Julian P. T.; Ramsay, Craig; Shea, Beverley; Tugwell, Peter; Wells, George A. – Research Synthesis Methods, 2013
Background: Methods need to be further developed to include non-randomised studies (NRS) in systematic reviews of the effects of health care interventions. NRS are often required to answer questions about harms and interventions for which evidence from randomised controlled trials (RCTs) is not available. Methods used to review randomised…
Descriptors: Research Methodology, Research Design, Health Services, Workshops
Peer reviewed
Valentine, Jeffrey C.; Thompson, Simon G. – Research Synthesis Methods, 2013
Background: Confounding caused by selection bias is often a key difference between non-randomized studies (NRS) and randomized controlled trials (RCTs) of interventions. Key methodological issues: In this third paper of the series, we consider issues relating to the inclusion of NRS in systematic reviews on the effects of interventions. We discuss…
Descriptors: Research Design, Randomized Controlled Trials, Intervention, Bias
Peer reviewed
Suggate, Sebastian P. – Journal of Learning Disabilities, 2016
Much is known about the short-term effects of reading interventions, but very little about their long-term effects. To rectify this, a detailed analysis of follow-up effects as a function of intervention, sample, and methodological variables was conducted. A total of 71 intervention-control groups were selected (N = 8,161 at posttest) from studies reporting…
Descriptors: Reading Instruction, Intervention, Program Effectiveness, Meta Analysis
Peer reviewed
Miles, Eleanor; Sheeran, Paschal; Webb, Thomas L. – Psychological Bulletin, 2013
Augustine and Hemenover (2013) were right to state that meta-analyses should be accurate and generalizable. However, we disagree that our meta-analysis of emotion regulation strategies (Webb, Miles, & Sheeran, 2012) fell short in these respects. Augustine and Hemenover's concerns appear to have accrued from misunderstandings of our inclusion…
Descriptors: Effect Size, Meta Analysis, Accuracy, Self Control
Peer reviewed
PDF on ERIC
Jamaludin, Khairul Azhar; Alias, Norlidah; DeWitt, Dorothy – Turkish Online Journal of Educational Technology - TOJET, 2015
The practice of homeschooling still receives contrasting responses regarding its relevance and effectiveness. The current study aims to map the trends in eleven selected studies from various educational journals. The analysis focuses on mapping the trends in: a) research settings, b) target sample, c) method or instrument used, d) common focus or…
Descriptors: Journal Articles, Home Schooling, Educational Practices, Educational Research
Peer reviewed
Hawkins, Alan J.; Stanley, Scott M.; Cowan, Philip A.; Fincham, Frank D.; Beach, Steven R. H.; Cowan, Carolyn Pape; Rhoades, Galena K.; Markman, Howard J.; Daire, Andrew P. – American Psychologist, 2013
In the past decade, the federal government, some states, and numerous communities have initiated programs to help couples form and sustain healthy marriages and relationships in order to increase family stability for children. Thus, the authors value the attention given to this emerging policy area by the "American Psychologist" in a recent…
Descriptors: Marriage, Interpersonal Relationship, Low Income Groups, Federal Aid
Peer reviewed
Swanson, Elizabeth; Wanzek, Jeanne; Haring, Christa; Ciullo, Stephen; McCulley, Lisa – Journal of Special Education, 2013
Treatment fidelity reporting practices are described for journals that published general and special education intervention research with high impact factors from 2005 through 2009. The authors reviewed research articles, reported the proportion of intervention studies that described fidelity measurement, detailed the components of fidelity…
Descriptors: Intervention, Research Methodology, Fidelity, Journal Articles
Peer reviewed
PDF on ERIC
Institute of Education Sciences, 2013
In January 2011, a Joint Committee of representatives from the U.S. Department of Education (ED) and the U.S. National Science Foundation (NSF) began work to establish cross-agency guidelines for improving the quality, coherence, and pace of knowledge development in science, technology, engineering and mathematics (STEM) education. Although the…
Descriptors: STEM Education, Research and Development, Intervention, Educational Improvement
Peer reviewed
Stufflebeam, Daniel L. – Journal of MultiDisciplinary Evaluation, 2011
Good evaluation requires that evaluation efforts themselves be evaluated. Many things can and often do go wrong in evaluation work. Accordingly, it is necessary to check evaluations for problems such as bias, technical error, administrative difficulties, and misuse. Such checks are needed both to improve ongoing evaluation activities and to assess…
Descriptors: Program Evaluation, Evaluation Criteria, Evaluation Methods, Definitions