Peer reviewed
ERIC Number: ED658612
Record Type: Non-Journal
Publication Date: 2022-Sep-22
Pages: N/A
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Using Mental Rehearsal Techniques among Pre-Service Teacher Candidates during a Simulated Teaching Opportunity
Anandita Krishnamachari
Society for Research on Educational Effectiveness
A growing body of research shows that findings from studies often fail to replicate (Open Science Collaboration, 2015; Simmons, Nelson, & Simonsohn, 2011). Deviations from study protocols, sampling error, reduced statistical power, and unethical reporting procedures have been cited as reasons for replication failure (Gilbert, King, Pettigrew, & Wilson, 2016). However, instability in how outcomes are measured across studies may be another reason why findings often fail to replicate. At present, most replication efforts assume that outcomes are measured using the same instrument, in the same context, at the same time across studies, and that researchers use appropriate modeling techniques when estimating treatment effects. This is concerning, since prior work has shown that both systematic and random sources of measurement error (arising from inconsistent measurement approaches or incorrect modeling techniques) can lead to differences in effect estimates, often of large magnitude (Flake & Fried, 2019; Soland & Schweig, 2021). Even in cases where the true effect of an intervention or policy is consistent across studies, measurement instability across settings, populations, or timing, as well as poor decisions about how to model outcomes, can lead to erroneous conclusions regarding the replicability of results. This highlights the need for careful consideration of measurement decisions in the design, implementation, and analysis of high-quality replication studies (Simmons et al., 2011). In this paper, we examine methodological considerations for assessing assumptions related to measurement stability in the direct replication of results (Steiner, Wong, & Anglin, 2019). In particular, we discuss two types of replication designs in which researchers may wish to evaluate assumptions related to measurement stability. In the first case, the researcher is conducting a direct replication study, which requires an assumption of measurement stability for comparing effects across studies. Here we recommend open and transparent descriptions of outcome measures in study protocols, as well as diagnostic approaches for evaluating and reporting results related to measurement stability across studies (i.e., the use of measurement models for scoring outcomes and tests of measurement invariance across subpopulations of units and settings). In the second case, the researcher is conducting a conceptual replication study that seeks to evaluate the robustness and replicability of effects when systematic sources of variation in the outcome measure are introduced. The goal here is to identify the extent to which differences in measurement may amplify or dampen effect estimates across replication studies, which is essential for generalizing effects across measures. For example, a researcher may be interested in assessing the replicability of effects across proximal and distal measures of an outcome, or across different administrations of a measure (online vs. in-person) believed to assess the same underlying construct. We demonstrate replication designs for testing assumptions related to measurement stability across studies, and discuss diagnostics for assessing the extent to which these assumptions hold in field settings. Methods are demonstrated using results from a series of systematic replication studies examining the effects of a coaching intervention on teachers' learning of pedagogical skills.
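To make the measurement-instability mechanism concrete, below is a minimal simulation sketch in Python (all loadings, effect sizes, and sample sizes are illustrative assumptions, not values from the paper). It shows how a hypothetical violation of metric invariance across two replication sites, here weaker item loadings at the second site, can make an identical latent treatment effect look attenuated when outcomes are scored as raw sum scores:

    # Minimal sketch (illustrative parameters, not from the paper): an
    # identical latent treatment effect looks smaller at a site whose
    # items load more weakly on the construct, when sum scores are used.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000
    true_effect = 0.30  # identical latent effect at both sites

    def simulate_site(loadings):
        """Simulate one site: latent outcome -> observed items -> sum score."""
        treat = rng.integers(0, 2, n)                      # randomized treatment
        eta = true_effect * treat + rng.normal(0.0, 1.0, n)  # latent outcome
        items = np.column_stack(
            [lam * eta + rng.normal(0.0, 1.0, n) for lam in loadings]
        )
        return treat, items.sum(axis=1)                    # naive sum score

    def ols_slope(x, y):
        """Slope from a simple regression of y on x."""
        return np.polyfit(x, y, 1)[0]

    # Site A: strong, uniform loadings; Site B: two weaker items
    # (a hypothetical violation of metric invariance across sites).
    for name, lams in [("Site A", [1.0, 1.0, 1.0]), ("Site B", [1.0, 0.4, 0.4])]:
        treat, score = simulate_site(lams)
        print(f"{name}: sum-score treatment effect = {ols_slope(treat, score):.3f}")

Run as written, the sum-score estimate is roughly 0.90 at Site A but only about 0.54 at Site B (each site's summed loadings times the 0.30 latent effect), so a naive comparison of effect estimates would wrongly suggest a replication failure; scoring outcomes with a measurement model and testing invariance first, as the paper recommends, would flag the discrepancy instead.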
Society for Research on Educational Effectiveness. 2040 Sheridan Road, Evanston, IL 60208. Tel: 202-495-0920; e-mail: contact@sree.org; Web site: https://www.sree.org/
Publication Type: Reports - Research
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Society for Research on Educational Effectiveness (SREE)
Grant or Contract Numbers: N/A