Showing all 10 results
Zhu, Pei; Jacob, Robin; Bloom, Howard; Xu, Zeyu – MDRC, 2011
This paper provides practical guidance for researchers who are designing and analyzing studies that randomize schools--which comprise three levels of clustering (students in classrooms in schools)--to measure intervention effects on student academic outcomes when information on the middle level (classrooms) is missing. This situation arises…
Descriptors: Intervention, Academic Achievement, Research Methodology, Research Design
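As a hedged illustration of the three levels of clustering the paper works with, the variance of a school-randomized impact estimator in a balanced design can be written as a sum of school-, classroom-, and student-level components (the notation and equal-allocation setup below are assumptions for exposition, not taken from the paper):

$$
\mathrm{Var}(\hat{\beta}) = \frac{1}{P(1-P)}\left[\frac{\tau_S^2}{J} + \frac{\tau_C^2}{JK} + \frac{\sigma^2}{JKn}\right]
$$

Here J schools are randomized with proportion P assigned to treatment, each school contains K classrooms of n students, and τ_S², τ_C², and σ² are the school-, classroom-, and student-level variance components. The middle term depends on classroom membership, which is exactly the information that is missing in the situation the paper addresses.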
Peer reviewed
Zhu, Pei; Jacob, Robin; Bloom, Howard; Xu, Zeyu – Educational Evaluation and Policy Analysis, 2012
This paper provides practical guidance for researchers who are designing and analyzing studies that randomize schools--which comprise three levels of clustering (students in classrooms in schools)--to measure intervention effects on student academic outcomes when information on the middle level (classrooms) is missing. This situation arises…
Descriptors: Educational Research, Educational Researchers, Research Methodology, Multivariate Analysis
Jacob, Robin; Zhu, Pei; Somers, Marie-Andrée; Bloom, Howard – MDRC, 2012
Regression discontinuity (RD) analysis is a rigorous nonexperimental approach that can be used to estimate program impacts in situations in which candidates are selected for treatment based on whether their value for a numeric rating exceeds a designated threshold or cut-point. Over the last two decades, the regression discontinuity approach has…
Descriptors: Regression (Statistics), Research Design, Graphs, Computation
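As a rough sketch of the core idea (the data, cut-point, and bandwidth below are fabricated for illustration and are not from the paper), a sharp RD impact can be estimated by fitting a local linear regression with separate slopes on each side of the cut-point and reading off the jump at the threshold:

```python
# Minimal sharp regression discontinuity sketch; all values are
# illustrative assumptions, not results from the paper.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
rating = rng.uniform(-10, 10, 2000)       # numeric rating score
treated = (rating >= 0).astype(float)     # treatment assigned at cut-point 0
outcome = 1.5 * treated + 0.3 * rating + rng.normal(0, 1, 2000)

h = 5.0                                   # assumed bandwidth around cut-point
w = np.abs(rating) <= h
X = sm.add_constant(np.column_stack([
    treated[w],                           # jump at the cut-point (the impact)
    rating[w],                            # rating, already centered at 0
    treated[w] * rating[w],               # allows different slopes per side
]))
fit = sm.OLS(outcome[w], X).fit()
print(f"Estimated impact at the cut-point: {fit.params[1]:.2f}")
```

With these simulated data the printed estimate should fall near the true jump of 1.5.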
Peer reviewed
Jacob, Robin; Armstrong, Catherine; Bowden, A. Brooks; Pan, Yilin – Journal of Research on Educational Effectiveness, 2016
This study evaluates the impacts and costs of the Reading Partners program, which uses community volunteers to provide one-on-one tutoring to struggling readers in under-resourced elementary schools. The evaluation uses an experimental design. Students were randomly assigned within 19 different Reading Partners sites to a program or control…
Descriptors: Volunteers, Tutorial Programs, Randomized Controlled Trials, Tutors
Peer reviewed
Jacob, Robin; Zhu, Pei; Bloom, Howard – Journal of Research on Educational Effectiveness, 2010
This article provides practical guidance for researchers who are designing studies that randomize groups to measure the impacts of educational interventions. The article (a) provides new empirical information about the values of parameters that influence the precision of impact estimates (intraclass correlations and R² values) and…
Descriptors: Research Design, Research Methodology, Educational Research, Intervention
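To make concrete how intraclass correlations and R² values enter precision calculations, one commonly used minimum detectable effect size (MDES) expression for a design that randomizes J groups of n students each is sketched below (the notation is assumed here, not quoted from the article):

$$
\mathrm{MDES} = M_{J-2}\sqrt{\frac{\rho\,(1-R_2^2)}{P(1-P)\,J} + \frac{(1-\rho)\,(1-R_1^2)}{P(1-P)\,J\,n}}
$$

where ρ is the intraclass correlation, P is the proportion of groups assigned to treatment, R₂² and R₁² are the shares of group- and individual-level variance explained by covariates, and M_{J-2} is a multiplier, roughly 2.8 for a two-tailed test at α = .05 with 80 percent power when J is large. Larger ρ inflates the group-level term, while strong covariates (high R² values) shrink it.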
Peer reviewed
Hill, Heather C.; Beisiegel, Mary; Jacob, Robin – Educational Researcher, 2013
Commentaries regarding appropriate methods for researching professional development have been a frequent topic in recent issues of "Educational Researcher" as well as other venues. In this article, the authors extend this discussion by observing that randomized trials of specific professional development programs have not enhanced our…
Descriptors: Faculty Development, Educational Policy, Educational Research, Program Evaluation
Somers, Marie-Andrée; Zhu, Pei; Jacob, Robin; Bloom, Howard – MDRC, 2013
In this paper, we examine the validity and precision of two nonexperimental study designs (NXDs) that can be used in educational evaluation: the comparative interrupted time series (CITS) design and the difference-in-difference (DD) design. In a CITS design, program impacts are evaluated by looking at whether the treatment group deviates from its…
Descriptors: Research Design, Educational Assessment, Time, Intervals
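The DD logic can be shown in a few lines (the numbers below are hypothetical, not findings from the paper): the impact estimate is the treatment group's pre-post change minus the comparison group's pre-post change, which nets out any fixed difference between the groups and any common trend.

```python
# Minimal difference-in-difference sketch; the means are hypothetical
# illustrations, not results from the paper.
treat_pre, treat_post = 240.0, 252.0   # treatment group mean outcomes
comp_pre, comp_post = 238.0, 244.0     # comparison group mean outcomes

impact = (treat_post - treat_pre) - (comp_post - comp_pre)
print(f"DD impact estimate: {impact:.1f} points")  # prints 6.0
```

A CITS design extends the same idea by using several pre-treatment time points to estimate each group's baseline trend rather than relying on a single pre-period mean.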
Bloom, Howard; Zhu, Pei; Jacob, Robin; Raudenbush, Stephen; Martinez, Andres; Lin, Fen – MDRC, 2008
This paper provides practical guidance for researchers who are designing studies that randomize groups to measure the impacts of interventions on children. To do so, the paper: (1) provides new empirical information about the values of parameters that influence the precision of impact estimates (intra-class correlations and R-squares); (2)…
Descriptors: Pilot Projects, Research Methodology, Intervention, Sampling
Peer reviewed
Rowan, Brian; Jacob, Robin; Correnti, Richard – New Directions for Youth Development, 2009
When attempting to identify educational settings that are most effective in improving student achievement, classroom process (that is, the way in which a teacher interacts with his or her students) is a key feature of interest. Unfortunately, high-quality assessment of the student-teacher interaction occurs all too infrequently, despite the…
Descriptors: Teaching Methods, Educational Quality, Classroom Observation Techniques, Teacher Surveys
Moss, Marc; Jacob, Robin; Boulay, Beth; Horst, Megan; Poulos, Jennifer – US Department of Education, 2006
In October 2003, the US Department of Education contracted with Abt Associates to design and conduct the Reading First Implementation Evaluation. This report focuses on the following questions: (1) How is the Reading First program implemented in districts and schools? and (2) How does reading instruction differ in Reading First schools and…
Descriptors: Economically Disadvantaged, Program Implementation, Program Evaluation, Reading Programs