ERIC Number: EJ1182190
Record Type: Journal
Publication Date: 2014
Pages: 24
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: EISSN-2196-0739
Available Date: N/A
Using Response Time to Investigate Students' Test-Taking Behaviors in an NAEP Computer-Based Study
Lee, Yi-Hsuan; Jia, Yue
Large-scale Assessments in Education, v2 Article 8 2014
Background: Large-scale survey assessments have been used for decades to monitor what students know and can do. Such assessments aim to provide group-level scores for various populations, with little or no consequence to individual students for their test performance. Students' test-taking behaviors in survey assessments, particularly their level of test-taking effort and its effect on performance, have long been an open question. This paper presents a procedure for examining test-taking behaviors using response time collected from a National Assessment of Educational Progress (NAEP) computer-based study, referred to as MCBS.

Methods: A five-step procedure was proposed to identify rapid-guessing behavior in a more systematic manner. It involves a non-model-based approach that classifies student-item pairs as reflecting either solution behavior or rapid-guessing behavior. Three validity checks were incorporated in the validation step to ensure the reasonableness of the time boundaries before further investigation. Results of the behavior classification were summarized by three measures to investigate whether and how students' test-taking behaviors related to student characteristics, item characteristics, or both.

Results: In the MCBS, the validity checks offered compelling evidence that the recommended threshold-identification method was effective in separating rapid-guessing behavior from solution behavior. A very low percentage of rapid-guessing behavior was identified compared with existing results for other assessments. For this dataset, rapid-guessing behavior had minimal impact on parameter estimation in the IRT modeling. However, students clearly exhibited different behaviors when they received items that did not match their performance level. We also found disagreement between students' response-time effort and their self-reports, but based on the observed data it is unclear whether this disagreement was related to how the students interpreted the background questions.

Conclusions: The paper provides a way to address the issue of identifying rapid-guessing behavior, and sheds light on the extent of students' engagement in NAEP and its impact, without relying on students' self-evaluations or incurring additional test-design costs. It reveals information about test-taking behaviors in an NAEP assessment setting that has not previously been available in the literature. The procedure is applicable to future standard NAEP assessments, as well as to other tests, whenever timing data are available.
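The abstract does not spell out the five steps, but the core of such non-model-based approaches is to flag a student-item pair as rapid guessing when its response time falls below an item-specific time boundary, then summarize each student's effort as the proportion of pairs classified as solution behavior (the response-time-effort index in the rapid-guessing literature). The sketch below is an illustrative reconstruction under those assumptions, not the authors' actual procedure; the thresholds, data, and function names are hypothetical.

```python
# Hypothetical sketch of threshold-based rapid-guessing classification.
# Item-level time boundaries and toy data are illustrative only; the paper
# derives and validates its own thresholds in a five-step procedure.

def classify_behavior(response_times, thresholds):
    """Classify each (student, item) pair.

    Returns a dict mapping (student, item) -> True for solution behavior
    (response time at or above the item's boundary) or False for
    rapid-guessing behavior (response time below the boundary).
    """
    return {
        (student, item): rt >= thresholds[item]
        for (student, item), rt in response_times.items()
    }

def response_time_effort(classified, student):
    """Proportion of a student's responses classified as solution behavior
    (an RTE-style effort index)."""
    flags = [v for (s, _), v in classified.items() if s == student]
    return sum(flags) / len(flags)

# Toy data: response times in seconds for (student, item) pairs.
rts = {("A", 1): 12.0, ("A", 2): 1.5, ("B", 1): 9.0, ("B", 2): 8.0}
thresholds = {1: 3.0, 2: 3.0}  # illustrative per-item time boundaries

classified = classify_behavior(rts, thresholds)
print(response_time_effort(classified, "A"))  # 0.5
print(response_time_effort(classified, "B"))  # 1.0
```

A student with an effort index well below 1.0 would be examined further; the paper additionally applies validity checks to the boundaries themselves before any such summary measures are interpreted.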
Descriptors: Measurement, Test Wiseness, Student Surveys, Response Style (Tests), Reaction Time, Student Behavior, Guessing (Tests), Validity, Classification, Student Characteristics, Test Items, National Competency Tests, Item Response Theory
Springer. Available from: Springer Nature. 233 Spring Street, New York, NY 10013. Tel: 800-777-4643; Tel: 212-460-1500; Fax: 212-348-4505; e-mail: customerservice@springernature.com; Web site: https://link.springer.com/
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: National Center for Education Statistics (ED)
Authoring Institution: N/A
Identifiers - Assessments and Surveys: National Assessment of Educational Progress
IES Funded: Yes
Grant or Contract Numbers: ED07CO0107
Author Affiliations: N/A