Showing all 15 results
Peer reviewed
Aydin, Orhan; Tanious, René – Journal of Applied Behavior Analysis, 2022
Visual analysis and nonoverlap-based effect sizes are predominantly used in analyzing single case experimental designs (SCEDs). Although they are popular analytical methods for SCEDs, they have certain limitations. In this study, a new effect size calculation model for SCEDs, named performance criteria-based effect size (PCES), is proposed…
Descriptors: Evaluation Criteria, Effect Size, Research Design, Data Analysis
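The PCES model itself is not reproduced in this snippet; as a point of reference for the nonoverlap-based effect sizes it is compared against, here is a minimal sketch of one widely used index, Nonoverlap of All Pairs (NAP), for a single hypothetical A-B case (the phase data are invented for illustration):

```python
# Nonoverlap of All Pairs (NAP): compare every baseline (A) observation with
# every intervention (B) observation; ties count as half an overlap.
# The data values below are invented for illustration.

def nap(baseline, intervention, increase_is_improvement=True):
    if not increase_is_improvement:          # flip so "higher is better"
        baseline = [-x for x in baseline]
        intervention = [-x for x in intervention]
    pairs = [(a, b) for a in baseline for b in intervention]
    improved = sum(1.0 for a, b in pairs if b > a)
    ties = sum(0.5 for a, b in pairs if b == a)
    return (improved + ties) / len(pairs)    # 0.5 = chance level, 1.0 = complete nonoverlap

baseline_phase = [3, 4, 2, 5, 3]             # hypothetical A-phase observations
intervention_phase = [6, 7, 5, 8, 7, 9]      # hypothetical B-phase observations
print(f"NAP = {nap(baseline_phase, intervention_phase):.2f}")
```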
Peer reviewed
Prathiba Natesan Batley; Erica B. McClure; Brandy Brewer; Ateka A. Contractor; Nicholas John Batley; Larry Vernon Hedges; Stephanie Chin – Grantee Submission, 2023
N-of-1 trials, a special case of Single Case Experimental Designs (SCEDs), are prominent in clinical medical research and specifically psychiatry due to the growing significance of precision/personalized medicine. It is imperative that these clinical trials be conducted, and their data analyzed, using the highest standards to guard against threats…
Descriptors: Medical Research, Research Design, Data Analysis, Effect Size
Declercq, Lies; Jamshidi, Laleh; Fernández-Castilla, Belen; Moeyaert, Mariola; Beretvas, S. Natasha; Ferron, John M.; Van den Noortgate, Wim – Grantee Submission, 2020
To conduct a multilevel meta-analysis of multiple single-case experimental design (SCED) studies, the individual participant data (IPD) can be analyzed in one or two stages. In the one-stage approach, a multilevel model is estimated based on the raw data. In the two-stage approach, an effect size is calculated for each participant and these effect…
Descriptors: Research Design, Data Analysis, Effect Size, Models
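A minimal sketch of the two-stage idea described above, assuming each participant's raw data have already been reduced in stage one to an effect size and sampling variance (the numbers are invented); the one-stage route would instead fit a single multilevel model directly to the raw observations:

```python
import numpy as np

# Stage 2 of a two-stage IPD meta-analysis: pool per-participant effect sizes,
# here with a DerSimonian-Laird random-effects model. Values are invented.
effects = np.array([0.40, 0.75, 0.10, 0.55])    # per-participant effect sizes
variances = np.array([0.05, 0.08, 0.04, 0.06])  # their sampling variances

w_fixed = 1 / variances
pooled_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)
Q = np.sum(w_fixed * (effects - pooled_fixed) ** 2)
df = len(effects) - 1
C = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / C)                   # between-participant heterogeneity

w_random = 1 / (variances + tau2)
pooled = np.sum(w_random * effects) / np.sum(w_random)
se = np.sqrt(1 / np.sum(w_random))
print(f"pooled effect = {pooled:.2f}, SE = {se:.2f}, tau^2 = {tau2:.3f}")
```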
Peer reviewed
Brunner, Martin; Keller, Lena; Stallasch, Sophie E.; Kretschmann, Julia; Hasl, Andrea; Preckel, Franzis; Lüdtke, Oliver; Hedges, Larry V. – Research Synthesis Methods, 2023
Descriptive analyses of socially important or theoretically interesting phenomena and trends are a vital component of research in the behavioral, social, economic, and health sciences. Such analyses yield reliable results when using representative individual participant data (IPD) from studies with complex survey designs, including educational…
Descriptors: Meta Analysis, Surveys, Research Design, Educational Research
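One concrete complication of the complex survey designs mentioned above is that students are sampled within schools, so naive standard errors are too small; a minimal sketch of the usual design-effect adjustment, with an assumed intraclass correlation and cluster size:

```python
import math

# Design effect for a clustered survey sample (all numbers are assumed):
# deff = 1 + (average cluster size - 1) * intraclass correlation.
n_total = 5000       # total students sampled
cluster_size = 25    # average students per school
icc = 0.20           # assumed intraclass correlation

deff = 1 + (cluster_size - 1) * icc
n_effective = n_total / deff               # effective sample size
se_naive = 1 / math.sqrt(n_total)          # SE of a standardized mean, ignoring clustering
se_adjusted = se_naive * math.sqrt(deff)   # clustering-aware SE
print(f"deff = {deff:.1f}, effective n = {n_effective:.0f}, "
      f"SE inflates from {se_naive:.4f} to {se_adjusted:.4f}")
```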
Peer reviewed
Toste, Jessica R.; Logan, Jessica A. R.; Shogren, Karrie A.; Boyd, Brian A. – Exceptional Children, 2023
Group design research studies can provide evidence to draw conclusions about "what works," "for whom," and "under what conditions" in special education. The quality indicators introduced by Gersten and colleagues (2005) have contributed to increased rigor in group design research, which has provided substantial…
Descriptors: Research Design, Educational Research, Special Education, Educational Indicators
Steenbergen-Hu, Saiying; Olszewski-Kubilius, Paula – Gifted Child Quarterly, 2016
This methodological brief introduces basic procedures and issues for conducting a high-quality meta-analysis in gifted education. Specifically, we discuss issues such as how to select a topic and formulate research problems, search for and identify qualified studies, code studies and extract data, choose and calculate effect sizes, analyze data,…
Descriptors: Meta Analysis, Academically Gifted, Research Methodology, Research Problems
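For the "choose and calculate effect sizes" step, a minimal sketch of a standardized mean difference (Hedges' g) computed from summary statistics; the group means, SDs, and sample sizes are invented:

```python
import math

# Hedges' g with its sampling variance, from summary statistics.
def hedges_g(m1, sd1, n1, m2, sd2, n2):
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                       # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # small-sample correction factor
    g = j * d
    var_g = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))
    return g, var_g

g, var_g = hedges_g(m1=82.0, sd1=10.0, n1=30, m2=76.0, sd2=12.0, n2=28)
print(f"g = {g:.2f}, SE = {math.sqrt(var_g):.2f}")
```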
Peer reviewed
Gorard, Stephen – International Journal of Research & Method in Education, 2013
Experimental designs involving the randomization of cases to treatment and control groups are powerful and under-used in many areas of social science and social policy. This paper reminds readers of the pre- and post-test, and the post-test only, designs, before explaining briefly how measurement errors propagate according to error theory. The…
Descriptors: Pretests Posttests, Research Design, Comparative Analysis, Data Analysis
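A one-line illustration of the error-propagation point: if pre- and post-test measurement errors are independent, the error variance of a gain score is the sum of the two, so gain scores are noisier than either test alone. A minimal numeric sketch with assumed standard errors of measurement:

```python
import math

# Error propagation for a pre/post gain score (values are assumed):
# with independent errors, Var(post - pre) = Var(post) + Var(pre),
# so the measurement error of the gain exceeds that of either test.
sem_pre = 3.0    # assumed standard error of measurement, pre-test
sem_post = 3.0   # assumed standard error of measurement, post-test

sem_gain = math.sqrt(sem_pre**2 + sem_post**2)
print(f"SEM of gain score = {sem_gain:.2f} (vs {sem_post:.2f} for the post-test alone)")
```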
Peer reviewed
Haber, Mason G.; Mazzotti, Valerie L.; Mustian, April L.; Rowe, Dawn A.; Bartholomew, Audrey L.; Test, David W.; Fowler, Catherine H. – Review of Educational Research, 2016
Students with disabilities experience poorer post-school outcomes compared with their peers without disabilities. Existing experimental literature on "what works" for improving these outcomes is rare; however, a rapidly growing body of research investigates correlational relationships between experiences in school and post-school…
Descriptors: Meta Analysis, Predictor Variables, Success, Postsecondary Education
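Because the literature reviewed here is largely correlational, synthesizing it typically means pooling correlation coefficients; a minimal sketch using Fisher's z transformation (the correlations and sample sizes are invented):

```python
import numpy as np

# Pooling correlational evidence via Fisher's z (invented example values).
r = np.array([0.25, 0.32, 0.18])   # correlations from three studies
n = np.array([120, 85, 200])       # their sample sizes

z = np.arctanh(r)                  # Fisher's z transform
w = n - 3                          # inverse of Var(z) = 1 / (n - 3)
z_pooled = np.sum(w * z) / np.sum(w)
se = np.sqrt(1 / np.sum(w))
r_pooled = np.tanh(z_pooled)       # back-transform to the correlation metric
lo, hi = np.tanh(z_pooled - 1.96 * se), np.tanh(z_pooled + 1.96 * se)
print(f"pooled r = {r_pooled:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```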
Peer reviewed
Wester, Kelly L.; Borders, L. DiAnne; Boul, Steven; Horton, Evette – Journal of Counseling & Development, 2013
The purpose of this study was to examine the quality of quantitative articles published in the "Journal of Counseling & Development." Quality concerns arose in regard to omissions of psychometric information of instruments, effect sizes, and statistical power. Type VI and II errors were found. Strengths included stated research…
Descriptors: Periodicals, Journal Articles, Counseling, Research
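On the statistical-power point, a minimal sketch of an a priori power calculation for a two-group comparison using the normal approximation; the effect size and per-group sample size are assumed for illustration:

```python
from scipy.stats import norm

# Approximate power of a two-sample comparison (normal approximation).
def approx_power(d, n_per_group, alpha=0.05):
    se = (2 / n_per_group) ** 0.5        # SE of the mean difference in d units
    z_crit = norm.ppf(1 - alpha / 2)     # two-sided critical value
    return 1 - norm.cdf(z_crit - d / se) + norm.cdf(-z_crit - d / se)

print(f"power ≈ {approx_power(d=0.5, n_per_group=64):.2f}")  # ≈ 0.80
```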
Peer reviewed
Citkowicz, Martyna; Hedges, Larry V. – Society for Research on Educational Effectiveness, 2013
In some instances, intentionally or not, study designs are such that there is clustering in one group but not in the other. This paper describes methods for computing effect size estimates and their variances when there is clustering in only one group and the analysis has not taken that clustering into account. The authors provide the effect size…
Descriptors: Multivariate Analysis, Effect Size, Sampling, Sample Size
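The paper's formulas for clustering in only one group are not reproduced in this snippet; as an illustration of the general issue, the sketch below applies the better-known Hedges (2007) correction for designs clustered in both groups, with an assumed intraclass correlation:

```python
import math

# Why clustering matters for effect sizes. This applies the Hedges (2007)
# adjustment for designs clustered in *both* groups, not the one-group-only
# formulas this paper derives; the ICC and sample sizes are assumed.
d_naive = 0.40       # effect size computed ignoring clustering
icc = 0.15           # assumed intraclass correlation
n_total = 200        # total sample size across both groups
cluster_size = 20    # average cluster size

d_adjusted = d_naive * math.sqrt(1 - (2 * (cluster_size - 1) * icc) / (n_total - 2))
print(f"naive d = {d_naive:.2f}, cluster-adjusted d = {d_adjusted:.2f}")
```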
Peer reviewed
Byiers, Breanne J.; Reichle, Joe; Symons, Frank J. – American Journal of Speech-Language Pathology, 2012
Purpose: Single-subject experimental designs (SSEDs) represent an important tool in the development and implementation of evidence-based practice in communication sciences and disorders. The purpose of this article is to review the strategies and tactics of SSEDs and their application in speech-language pathology research. Method: The authors…
Descriptors: Evidence, Research Design, Speech Language Pathology, Intervention
Plonsky, Luke – ProQuest LLC, 2011
I began this study with two assumptions. Assumption 1: Study quality matters. If the means by which researchers design, carry out, and report on their studies lack in rigor or transparency, theory and practice are likely to be misguided or at least decelerated. Assumption 2 is an implication of Assumption 1: Quality should be measured rather than…
Descriptors: Evidence, Conferences (Gatherings), Investigations, Applied Linguistics
Peer reviewed
Harvey, Shane T.; Boer, Diana; Meyer, Luanna H.; Evans, Ian M. – Journal of Intellectual & Developmental Disability, 2009
Background: This meta-analysis of interventions with challenging behaviour in children with disabilities updates a comprehensive meta-analysis that previously addressed reported standards of practice and effectiveness of different strategies. Method: Four effect-size algorithms were calculated for published intervention cases, and results analysed…
Descriptors: Research Design, Intervention, Meta Analysis, Disabilities
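One of the simplest effect-size algorithms used with single-case intervention data of this kind is the percentage of nonoverlapping data (PND); whether it was among the four algorithms used in this meta-analysis is not stated in the snippet above. A minimal sketch with invented phase data, where lower scores mean improvement:

```python
# Percentage of nonoverlapping data (PND) for a single A-B case.
# Phase data are invented; lower counts of challenging behaviour are better.
def pnd(baseline, intervention, decrease_is_improvement=True):
    if decrease_is_improvement:
        threshold = min(baseline)   # best (lowest) baseline point
        nonoverlap = [x for x in intervention if x < threshold]
    else:
        threshold = max(baseline)
        nonoverlap = [x for x in intervention if x > threshold]
    return 100.0 * len(nonoverlap) / len(intervention)

baseline_phase = [9, 7, 8, 10]            # hypothetical baseline counts
intervention_phase = [6, 4, 5, 3, 4, 2]   # hypothetical intervention counts
print(f"PND = {pnd(baseline_phase, intervention_phase):.0f}%")
```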
Peer reviewed
Byrd, Jimmy K. – Educational Administration Quarterly, 2007
Purpose: The purpose of this study was to review research published by Educational Administration Quarterly (EAQ) during the past 10 years to determine if confidence intervals and effect sizes were being reported as recommended by the American Psychological Association (APA) Publication Manual. Research Design: The author examined 49 volumes of…
Descriptors: Research Design, Intervals, Statistical Inference, Effect Size
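As a concrete version of the reporting practice examined here, a minimal sketch of a 95% confidence interval around a standardized mean difference; the effect size and group sizes are invented:

```python
import math

# 95% confidence interval for a standardized mean difference (Cohen's d),
# using the usual large-sample standard error.
d, n1, n2 = 0.45, 40, 40
se_d = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
lo, hi = d - 1.96 * se_d, d + 1.96 * se_d
print(f"d = {d:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```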
Peer reviewed
Osgood, D. Wayne; Smith, Gail L. – Evaluation Review, 1995
Strategies are presented for analyzing longitudinal research designs with many waves of data using hierarchical linear modeling. The approach defines well-focused parameters that yield meaningful effect size estimates and significance tests. It is illustrated with data from the Boys Town Follow-Up Study. (SLD)
Descriptors: Data Analysis, Effect Size, Estimation (Mathematics), Evaluation Methods
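A minimal sketch of the kind of multiwave growth model the abstract describes, fit as a two-level (observations within persons) mixed model; the data frame and variable names (`id`, `time`, `y`) are assumptions for illustration:

```python
import statsmodels.formula.api as smf

# Two-level growth model for multiwave longitudinal data: observations nested
# within persons, with a random intercept and a random slope for time.
# `df` is assumed to hold columns: person id, wave (time), and outcome y.
def fit_growth_model(df):
    model = smf.mixedlm("y ~ time", data=df, groups=df["id"], re_formula="~time")
    result = model.fit()
    # The fixed effect of `time` is the average growth rate; the random-effect
    # variances describe how intercepts and slopes differ across persons.
    return result

# Example usage (df assumed to exist):
# print(fit_growth_model(df).summary())
```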