ERIC Number: ED562331
Record Type: Non-Journal
Publication Date: 2015
Pages: 7
Abstractor: ERIC
ISBN: N/A
ISSN: N/A
EISSN: N/A
Available Date: N/A
Methodological Foundations for the Empirical Evaluation of Non-Experimental Methods in Field Settings
Wong, Vivian C.; Steiner, Peter M.
Society for Research on Educational Effectiveness
Across the disciplines of economics, political science, public policy, and, now, education, the randomized controlled trial (RCT) is the preferred methodology for establishing causal inference about program impacts. But randomized experiments are not always feasible because of ethical, political, or practical considerations, so non-experimental methods are also needed for identifying "what works." Given the widespread use of non-experimental approaches for assessing program, policy, and intervention impacts, there is a strong need to know whether non-experimental approaches are likely to yield unbiased treatment effects, and to identify the contexts and conditions under which these methods perform well. Over the last three decades, a research design has emerged for evaluating the performance of non-experimental designs in field settings: the within-study comparison (WSC) design, or design replication study. In the traditional WSC design, treatment effects from an RCT are compared to those produced by a non-experimental (NE) approach that shares the same target population. The non-experiment may be a quasi-experimental (QE) design, such as a regression-discontinuity (RD) or an interrupted time series (ITS) design, or an observational study (OS) approach such as matching, standard regression adjustment, or difference-in-differences methods. The goals of the WSC are to determine (1) whether the non-experiment can replicate results from a randomized experiment (which provides the causal benchmark estimate), and (2) the contexts and conditions under which these methods work in practice. Because applications of the WSC design are scattered across the social and health sciences, important WSC methodological innovations and findings remain unknown to, and underutilized by, evaluators and researchers. This paper addresses this issue by "developing methodological foundations for within-study comparison designs that evaluate non-experimental methods." It presents a coherent framework that addresses design and analysis issues of WSCs for evaluating non-experimental methods. One figure is appended.
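To make the WSC logic concrete, the following is a minimal simulation sketch, not drawn from the paper itself: it contrasts an RCT benchmark estimate with a naive and a regression-adjusted non-experimental estimate computed on a confounded sample from the same target population. All variable names, parameter values, and the data-generating process are illustrative assumptions.

```python
# Sketch of the within-study comparison (WSC) logic: estimate a causal
# benchmark from an RCT, then check how well a non-experimental (NE)
# estimator from the same target population reproduces it. Assumed setup,
# not the paper's method.
import numpy as np

rng = np.random.default_rng(0)
n = 5_000
true_effect = 2.0

# Shared target population: covariate x affects the outcome and, in the
# observational arm, selection into treatment (a confounder).
x = rng.normal(size=n)

# --- RCT arm: treatment assigned at random (the causal benchmark) ---
t_rct = rng.integers(0, 2, size=n)
y_rct = true_effect * t_rct + 1.5 * x + rng.normal(size=n)
benchmark = y_rct[t_rct == 1].mean() - y_rct[t_rct == 0].mean()

# --- NE arm: take-up depends on x, so groups are not comparable ---
p = 1 / (1 + np.exp(-1.2 * x))
t_obs = rng.binomial(1, p)
y_obs = true_effect * t_obs + 1.5 * x + rng.normal(size=n)

# Naive difference in means is biased by confounding on x.
naive = y_obs[t_obs == 1].mean() - y_obs[t_obs == 0].mean()

# Regression adjustment for x (one OS approach the abstract lists).
X = np.column_stack([np.ones(n), t_obs, x])
beta, *_ = np.linalg.lstsq(X, y_obs, rcond=None)
adjusted = beta[1]

print(f"RCT benchmark:        {benchmark: .3f}")
print(f"Naive NE estimate:    {naive: .3f}")
print(f"Adjusted NE estimate: {adjusted: .3f}")
# The WSC criterion: how far is the NE estimate from the benchmark?
print(f"Estimated NE bias:    {adjusted - benchmark: .3f}")
```

In this toy setup the adjusted estimate tracks the benchmark because the lone confounder is observed; the naive estimate does not. Real WSCs probe exactly this question, which contexts and conditions let an NE method recover the experimental result, under far less tidy assumptions.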
Descriptors: Research Methodology, Research Design, Comparative Analysis, Replication (Evaluation), Randomized Controlled Trials, Statistical Analysis, Feasibility Studies, Educational Research
Society for Research on Educational Effectiveness. 2040 Sheridan Road, Evanston, IL 60208. Tel: 202-495-0920; Fax: 202-640-4401; e-mail: inquiries@sree.org; Web site: http://www.sree.org
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Society for Research on Educational Effectiveness (SREE)
Grant or Contract Numbers: N/A
Author Affiliations: N/A