Showing all 4 results
Peer reviewed
Zhu, Pei; Jacob, Robin; Bloom, Howard; Xu, Zeyu – Educational Evaluation and Policy Analysis, 2012
This paper provides practical guidance for researchers who are designing and analyzing studies that randomize schools--which comprise three levels of clustering (students in classrooms in schools)--to measure intervention effects on student academic outcomes when information on the middle level (classrooms) is missing. This situation arises…
Descriptors: Educational Research, Educational Researchers, Research Methodology, Multivariate Analysis
Peer reviewed
Jacob, Robin; Armstrong, Catherine; Bowden, A. Brooks; Pan, Yilin – Journal of Research on Educational Effectiveness, 2016
This study evaluates the impacts and costs of the Reading Partners program, which uses community volunteers to provide one-on-one tutoring to struggling readers in under-resourced elementary schools. The evaluation uses an experimental design. Students were randomly assigned within 19 different Reading Partners sites to a program or control…
Descriptors: Volunteers, Tutorial Programs, Randomized Controlled Trials, Tutors
Peer reviewed
Jacob, Robin; Zhu, Pei; Bloom, Howard – Journal of Research on Educational Effectiveness, 2010
This article provides practical guidance for researchers who are designing studies that randomize groups to measure the impacts of educational interventions. The article (a) provides new empirical information about the values of parameters that influence the precision of impact estimates (intraclass correlations and R² values) and…
Descriptors: Research Design, Research Methodology, Educational Research, Intervention
Somers, Marie-Andrée; Zhu, Pei; Jacob, Robin; Bloom, Howard – MDRC, 2013
In this paper, we examine the validity and precision of two nonexperimental study designs (NXDs) that can be used in educational evaluation: the comparative interrupted time series (CITS) design and the difference-in-difference (DD) design. In a CITS design, program impacts are evaluated by looking at whether the treatment group deviates from its…
Descriptors: Research Design, Educational Assessment, Time, Intervals