Showing 1 to 15 of 93 results
Joshua B. Gilbert; James Soland – Annenberg Institute for School Reform at Brown University, 2024
Differences in effect sizes between researcher developed (RD) and independently developed (ID) outcome measures are widely documented but poorly understood in education research. We conduct a meta-analysis using item-level outcome data to test potential mechanisms that explain differences in effects by RD or ID outcome type. Our analysis of 45…
Descriptors: Effect Size, Research Design, Research Methodology, Meta Analysis
Peer reviewed
Menglin Xu; Jessica A. R. Logan – Educational and Psychological Measurement, 2024
Research designs that include planned missing data are gaining popularity in applied education research. These methods have traditionally relied on introducing missingness into data collections using the missing completely at random (MCAR) mechanism. This study assesses whether planned missingness can also be implemented when data are instead…
Descriptors: Research Design, Research Methodology, Monte Carlo Methods, Statistical Analysis
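The planned-missingness mechanism Xu and Logan study can be illustrated with a minimal sketch. Assuming a simple simulated score matrix (all names and values here are hypothetical, not from the study), MCAR missingness means each cell is deleted independently of both observed and unobserved values:

```python
import numpy as np

rng = np.random.default_rng(0)

def impose_mcar(data, miss_rate):
    """Set each cell to NaN independently with probability miss_rate.

    Because the deletion probability does not depend on any value in the
    data, the resulting missingness is MCAR by construction."""
    data = data.astype(float).copy()
    mask = rng.random(data.shape) < miss_rate
    data[mask] = np.nan
    return data

# Hypothetical example: 500 students, 4 test forms, 30% planned missingness.
scores = rng.normal(loc=100, scale=15, size=(500, 4))
observed = impose_mcar(scores, miss_rate=0.3)
```

The question the article raises is what happens when the real mechanism departs from this idealized one; the sketch only shows the MCAR baseline.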
Peer reviewed
Clintin P. Davis-Stober; Jason Dana; David Kellen; Sara D. McMullin; Wes Bonifay – Grantee Submission, 2023
Conducting research with human subjects can be difficult because of limited sample sizes and small empirical effects. We demonstrate that this problem can yield patterns of results that are practically indistinguishable from flipping a coin to determine the direction of treatment effects. We use this idea of random conclusions to establish a…
Descriptors: Research Methodology, Sample Size, Effect Size, Hypothesis Testing
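The "random conclusions" idea above can be made concrete with a small simulation (a sketch under assumed parameters, not the authors' actual analysis): with small samples and a small true effect, the observed direction of a two-group comparison is only slightly more reliable than a coin flip.

```python
import numpy as np

rng = np.random.default_rng(1)

def sign_of_effect(n, d):
    """One two-group experiment: +1 if the treatment mean exceeds control."""
    treat = rng.normal(d, 1.0, n)
    control = rng.normal(0.0, 1.0, n)
    return 1 if treat.mean() > control.mean() else -1

# Hypothetical setting: n = 10 per group, true standardized effect d = 0.1.
# The observed direction matches the true (positive) effect only a little
# more often than 50% of the time.
reps = 5000
correct = sum(sign_of_effect(10, 0.1) == 1 for _ in range(reps)) / reps
```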
Peer reviewed
Ledford, Jennifer R.; Zimmerman, Kathleen N. – Remedial and Special Education, 2023
A number of resources are available for evaluating the rigor of single-case designs, including the commonly used multiple baseline design. In this article, we discuss two characteristics commonly cited as necessary for the highest rigor in multiple baseline designs--concurrence and response-guided baseline condition duration. We suggest that both…
Descriptors: Research Methodology, Research Design, Behavior Change, Intervention
Declercq, Lies; Jamshidi, Laleh; Fernández-Castilla, Belen; Moeyaert, Mariola; Beretvas, S. Natasha; Ferron, John M.; Van den Noortgate, Wim – Grantee Submission, 2020
To conduct a multilevel meta-analysis of multiple single-case experimental design (SCED) studies, the individual participant data (IPD) can be analyzed in one or two stages. In the one-stage approach, a multilevel model is estimated based on the raw data. In the two-stage approach, an effect size is calculated for each participant and these effect…
Descriptors: Research Design, Data Analysis, Effect Size, Models
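The two-stage approach described in the abstract can be sketched as follows. This is a simplified illustration with hypothetical data and a basic standardized-mean-shift effect size, not the multilevel model the authors estimate:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical raw SCED data: baseline and intervention observations
# for three participants with different true phase shifts.
participants = []
for true_shift in (1.0, 1.5, 0.5):
    baseline = rng.normal(0.0, 1.0, 8)
    intervention = rng.normal(true_shift, 1.0, 8)
    participants.append((baseline, intervention))

def stage_one(baseline, intervention):
    """Stage 1: one effect size (standardized mean shift) per participant."""
    pooled_sd = np.sqrt((baseline.var(ddof=1) + intervention.var(ddof=1)) / 2)
    return (intervention.mean() - baseline.mean()) / pooled_sd

effects = [stage_one(b, i) for b, i in participants]

# Stage 2: combine the per-participant effects (here, an unweighted mean;
# the one-stage alternative would model the raw data directly).
overall = float(np.mean(effects))
```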
Peer reviewed
PDF on ERIC
Jane E. Miller – Numeracy, 2023
Students often believe that statistical significance is the only determinant of whether a quantitative result is "important." In this paper, I review traditional null hypothesis statistical testing to identify what questions inferential statistics can and cannot answer, including statistical significance, effect size and direction,…
Descriptors: Statistical Significance, Holistic Approach, Statistical Inference, Effect Size
Moeyaert, Mariola; Akhmedjanova, Diana; Ferron, John; Beretvas, S. Natasha; Van den Noortgate, Wim – Grantee Submission, 2020
The methodology of single-case experimental designs (SCED) has been expanding its efforts toward rigorous design tactics to address a variety of research questions related to intervention effectiveness. Effect size indicators appropriate to quantify the magnitude and the direction of interventions have been recommended and intensively studied for…
Descriptors: Effect Size, Research Methodology, Research Design, Hierarchical Linear Modeling
Peer reviewed
Hong, Sanghyun; Reed, W. Robert – Research Synthesis Methods, 2021
The purpose of this study is to show how Monte Carlo analysis of meta-analytic estimators can be used to select estimators for specific research situations. Our analysis conducts 1620 individual experiments, where each experiment is defined by a unique combination of sample size, effect size, effect size heterogeneity, publication selection…
Descriptors: Monte Carlo Methods, Meta Analysis, Research Methodology, Experiments
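The kind of experiment Hong and Reed describe can be sketched in miniature: simulate many meta-analytic data sets under a chosen sample size, effect size, and heterogeneity, then compare estimators by a criterion such as RMSE. The parameter values and the two estimators below are illustrative assumptions, not the authors' design:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_meta(k, mu, tau, n_range=(20, 200)):
    """Simulate k study effect sizes: true effects drawn with heterogeneity
    tau, observed with sampling error that shrinks with study size."""
    n = rng.integers(*n_range, size=k)
    v = 2.0 / n                      # rough sampling variance of a std. mean diff
    theta = rng.normal(mu, tau, k)   # study-specific true effects
    return rng.normal(theta, np.sqrt(v)), v

def rmse(estimates, mu):
    return np.sqrt(np.mean((np.asarray(estimates) - mu) ** 2))

mu, tau = 0.3, 0.2
unweighted, weighted = [], []
for _ in range(2000):
    d, v = simulate_meta(k=15, mu=mu, tau=tau)
    unweighted.append(d.mean())                    # simple average
    weighted.append(np.average(d, weights=1 / v))  # inverse-variance weighted
```

Tabulating `rmse(unweighted, mu)` against `rmse(weighted, mu)` across a grid of such conditions is the essence of using Monte Carlo results to pick an estimator for a given research situation.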
Peer reviewed
Toste, Jessica R.; Logan, Jessica A. R.; Shogren, Karrie A.; Boyd, Brian A. – Exceptional Children, 2023
Group design research studies can provide evidence to draw conclusions about "what works," "for whom," and "under what conditions" in special education. The quality indicators introduced by Gersten and colleagues (2005) have contributed to increased rigor in group design research, which has provided substantial…
Descriptors: Research Design, Educational Research, Special Education, Educational Indicators
Peer reviewed
López-López, José A.; Page, Matthew J.; Lipsey, Mark W.; Higgins, Julian P. T. – Research Synthesis Methods, 2018
Systematic reviews often encounter primary studies that report multiple effect sizes based on data from the same participants. These have the potential to introduce statistical dependency into the meta-analytic data set. In this paper, we provide a tutorial on dealing with effect size multiplicity within studies in the context of meta-analyses of…
Descriptors: Effect Size, Literature Reviews, Meta Analysis, Research Methodology
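One simple strategy for the within-study dependency problem described above is to aggregate effect sizes within each study before pooling, so every study contributes a single independent estimate. The data and the naive unweighted averaging below are illustrative assumptions; the tutorial covers more sophisticated options:

```python
from collections import defaultdict

# Hypothetical extracted effects: (study_id, effect_size) pairs, where some
# studies contribute several correlated effects from the same participants.
effects = [("s1", 0.42), ("s1", 0.35), ("s1", 0.50),
           ("s2", 0.10), ("s3", 0.28), ("s3", 0.22)]

def aggregate_within_study(pairs):
    """Average effects within each study so each study enters the
    meta-analysis exactly once, removing within-study dependency."""
    by_study = defaultdict(list)
    for study, es in pairs:
        by_study[study].append(es)
    return {s: sum(v) / len(v) for s, v in by_study.items()}

pooled = aggregate_within_study(effects)
```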
Hedges, Larry V.; Schauer, Jacob M. – Journal of Educational and Behavioral Statistics, 2019
The problem of assessing whether experimental results can be replicated is becoming increasingly important in many areas of science. It is often assumed that assessing replication is straightforward: All one needs to do is repeat the study and see whether the results of the original and replication studies agree. This article shows that the…
Descriptors: Replication (Evaluation), Research Design, Research Methodology, Program Evaluation
Hedges, Larry V.; Schauer, Jacob M. – Grantee Submission, 2019
The problem of assessing whether experimental results can be replicated is becoming increasingly important in many areas of science. It is often assumed that assessing replication is straightforward: All one needs to do is repeat the study and see whether the results of the original and replication studies agree. This article shows that the…
Descriptors: Replication (Evaluation), Research Design, Research Methodology, Program Evaluation
Peer reviewed
Manolov, Rumen; Guilera, Georgina; Solanas, Antonio – Remedial and Special Education, 2017
The current text comments on three systematic reviews published in the special section "Issues and Advances in the Systematic Review of Single-Case Research: An Update and Exemplars." The commentary is provided in relation to the need to combine the assessment of the methodological quality of the studies included in systematic reviews,…
Descriptors: Research Design, Meta Analysis, Research Methodology, Functional Behavioral Assessment
Peer reviewed
Jamshidi, Laleh; Heyvaert, Mieke; Van den Noortgate, Wim – AERA Online Paper Repository, 2017
Based on the increasing interest in systematic reviews and meta-analyses of Single-Subject Experimental Designs (SSEDs), the aim of the present review is to determine the general characteristics of these meta-analyses, including design characteristics of the primary studies and the meta-analyses, the kind of data, and the kind of analysis. After a…
Descriptors: Research Design, Experiments, Effect Size, Meta Analysis
Peer reviewed
Griffith, Catherine; Mariani, Melissa; McMahon, H. George; Zyromski, Brett; Greenspan, Scott B. – Professional School Counseling, 2019
Authors performed a content analysis of school counseling-related intervention research in 21 journals affiliated with the American Counseling Association and the American School Counselor Association across the 10-year span of 2006-2016. Results indicated that minimal school counseling intervention research articles were published (N = 53) in…
Descriptors: School Counselors, School Counseling, Intervention, Educational Research