Methodological Foundations for the Empirical Evaluation of Non-Experimental Methods in Field Settings







Society for Research on Educational Effectiveness

Across the disciplines of economics, political science, public policy, and now education, the randomized controlled trial (RCT) is the preferred methodology for establishing causal inference about program impacts. But randomized experiments are not always feasible because of ethical, political, or practical considerations, so non-experimental methods are also needed for identifying "what works." Given the widespread use of non-experimental approaches for assessing program, policy, and intervention impacts, there is a strong need to know whether non-experimental approaches are likely to yield unbiased treatment effects, and the contexts and conditions under which non-experimental methods perform well. Over the last three decades, a research design has emerged to evaluate the performance of non-experimental designs in field settings: the within-study comparison (WSC) design, also known as the design replication study. In the traditional WSC design, treatment effects from an RCT are compared to those produced by a non-experimental (NE) approach that shares the same target population. The non-experiment may be a quasi-experimental (QE) design, such as a regression-discontinuity (RD) or an interrupted time series (ITS) design, or an observational study (OS) approach that includes matching methods, standard regression adjustments, and difference-in-differences methods. The goals of the WSC are to determine (1) whether the non-experiment can replicate results from a randomized experiment (which provides the causal benchmark estimate), and (2) the contexts and conditions under which these methods work in practice. Because applications of the WSC design are scattered across publications throughout the social and health sciences, important WSC methodological innovations and findings remain unknown and underutilized by evaluators and researchers.
This paper will address this issue by "developing methodological foundations for within-study comparison designs that evaluate non-experimental methods." It will present a coherent framework that addresses design and analysis issues of WSCs for evaluating non-experimental methods. One figure is appended.
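To make the comparison logic concrete, the following is a minimal simulated sketch of a within-study comparison (not from the paper itself): an RCT arm yields the causal benchmark, an observational arm with covariate-driven selection yields naive and regression-adjusted NE estimates, and the difference from the benchmark serves as the bias measure. All data, effect sizes, and variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
TRUE_EFFECT = 2.0  # assumed true treatment effect in this simulation

# --- Experimental arm: treatment is randomly assigned ---
x_rct = rng.normal(size=n)                      # baseline covariate
t_rct = rng.integers(0, 2, size=n)              # randomized assignment
y_rct = TRUE_EFFECT * t_rct + 1.5 * x_rct + rng.normal(size=n)
benchmark = y_rct[t_rct == 1].mean() - y_rct[t_rct == 0].mean()

# --- Non-experimental arm: selection into treatment depends on x ---
x_ne = rng.normal(size=n)
p_treat = 1 / (1 + np.exp(-x_ne))               # higher x -> more likely treated
t_ne = rng.binomial(1, p_treat)
y_ne = TRUE_EFFECT * t_ne + 1.5 * x_ne + rng.normal(size=n)

# Naive difference in means ignores selection and is biased upward
naive = y_ne[t_ne == 1].mean() - y_ne[t_ne == 0].mean()

# Regression adjustment: OLS of y on [intercept, treatment, covariate]
X = np.column_stack([np.ones(n), t_ne, x_ne])
beta, *_ = np.linalg.lstsq(X, y_ne, rcond=None)
adjusted = beta[1]

print(f"RCT benchmark:         {benchmark:.2f}")
print(f"Naive NE estimate:     {naive:.2f}")
print(f"Adjusted NE estimate:  {adjusted:.2f}")
print(f"Bias after adjustment: {adjusted - benchmark:.2f}")
```

In a real WSC, the benchmark and the NE estimate would come from the same target population rather than parallel simulated samples, and the adjustment could be matching or difference-in-differences instead of the regression shown here.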

Descriptors: Research Methodology, Research Design, Comparative Analysis, Replication (Evaluation), Randomized Controlled Trials, Statistical Analysis, Feasibility Studies, Educational Research

Society for Research on Educational Effectiveness. 2040 Sheridan Road, Evanston, IL 60208. Tel: 202-495-0920; Fax: 202-640-4401; e-mail: inquiries[at]sree.org; Web site: http://www.sree.org





Author: Wong, Vivian C.; Steiner, Peter M.

Source: https://eric.ed.gov/?q=a&ft=on&ff1=dtySince_1992&pg=1188&id=ED562331






