Publication Date
In 2025 | 0 |
Since 2024 | 1 |
Since 2021 (last 5 years) | 4 |
Since 2016 (last 10 years) | 8 |
Since 2006 (last 20 years) | 11 |
Source
Educational Assessment | 11 |
Author
Ann S. Rosebery | 1 |
Bulut, Hatice Cigdem | 1 |
Bulut, Okan | 1 |
Cormier, Damien C. | 1 |
Craig S. Wells | 1 |
Custer, Michael | 1 |
Dawadi, Saraswati | 1 |
DeMars, Christine E. | 1 |
H. Li | 1 |
Ilgun Dibek, Munevver | 1 |
J. A. Bialo | 1 |
Publication Type
Journal Articles | 11 |
Reports - Research | 8 |
Reports - Evaluative | 3 |
Tests/Questionnaires | 1 |
Education Level
Elementary Education | 3 |
Higher Education | 3 |
Grade 5 | 2 |
Intermediate Grades | 2 |
Middle Schools | 2 |
Postsecondary Education | 2 |
Primary Education | 2 |
Secondary Education | 2 |
Early Childhood Education | 1 |
Grade 10 | 1 |
Grade 3 | 1 |
Location
Massachusetts | 1 |
Nepal | 1 |
United Kingdom (England) | 1 |
Assessments and Surveys
Massachusetts Comprehensive… | 1 |
Program for International… | 1 |
Bialo, J. A.; Li, H. – Educational Assessment, 2024
This study evaluated differential item functioning (DIF) in achievement motivation items before and after using anchoring vignettes as a statistical tool to account for group differences in response styles across gender and ethnicity. We applied the nonparametric scoring of the vignettes to motivation items from the 2015 Programme for…
Descriptors: Test Bias, Student Motivation, Achievement Tests, Secondary School Students
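The Bialo and Li entry above refers to nonparametric scoring of anchoring vignettes, in which a respondent's self-rating is relocated relative to that same respondent's ratings of ordered vignettes (in the style of King et al., 2004) to adjust for differences in response style. The sketch below illustrates only that general rescaling idea, not the study's actual implementation; the function name, the example ratings, and the assumption of consistently ordered vignette ratings are illustrative assumptions.

```python
def rescale_with_vignettes(self_rating, vignette_ratings):
    """Nonparametric anchoring-vignette rescaling (after King et al., 2004).

    self_rating: the respondent's self-report on the original Likert scale.
    vignette_ratings: the same respondent's ratings of the vignettes,
        listed from the lowest-intensity vignette to the highest.

    Returns a score on a 1..(2J + 1) scale (J = number of vignettes) that
    locates the self-report relative to the respondent's own vignette
    ratings. Assumes the vignette ratings are consistently ordered; ties
    or order violations would require interval-valued scores.
    """
    score = 1
    for z in vignette_ratings:
        if self_rating < z:
            return score          # self-report falls below this vignette
        if self_rating == z:
            return score + 1      # self-report ties this vignette
        score += 2                # self-report exceeds it; check the next

    return score                  # above the highest-rated vignette


# Illustrative only: a 1-5 motivation item with two vignettes.
# A self-rating of 4 falls between vignette ratings of 3 and 5, so it
# maps to 3 on the 1-5 anchored scale.
print(rescale_with_vignettes(4, [3, 5]))  # -> 3
```

In the full procedure, respondents whose vignette ratings are tied or out of order receive interval-valued scores rather than a single value; the simple version above ignores that complication.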
Ulitzsch, Esther; Penk, Christiane; von Davier, Matthias; Pohl, Steffi – Educational Assessment, 2021
Identifying and considering test-taking effort is of utmost importance for drawing valid inferences on examinee competency in low-stakes tests. Different approaches exist for doing so. The speed-accuracy+engagement model aims at identifying non-effortful test-taking behavior in terms of nonresponse and rapid guessing based on responses and…
Descriptors: Response Style (Tests), Guessing (Tests), Reaction Time, Measurement Techniques
Bulut, Okan; Bulut, Hatice Cigdem; Cormier, Damien C.; Ilgun Dibek, Munevver; Sahin Kursad, Merve – Educational Assessment, 2023
Some statewide testing programs allow students to receive corrective feedback and revise their answers during testing. Despite its pedagogical benefits, the effects of providing revision opportunities remain unknown in the context of alternate assessments. Therefore, this study examined student data from a large-scale alternate assessment that…
Descriptors: Error Correction, Alternative Assessment, Feedback (Response), Multiple Choice Tests
Noble, Tracy; Wells, Craig S.; Rosebery, Ann S. – Educational Assessment, 2023
This article reports on two quantitative studies of English learners' (ELs) interactions with constructed-response items from a Grade 5 state science test. Study 1 investigated the relationships between the constructed-response item-level variables of English Reading Demand, English Writing Demand, and Background Knowledge Demand and the…
Descriptors: Grade 5, State Standards, Standardized Tests, Science Tests
Pastor, Dena A.; Ong, Thai Q.; Strickman, Scott N. – Educational Assessment, 2019
The trustworthiness of low-stakes assessment results largely depends on examinee effort, which can be measured by the amount of time examinees devote to items using solution behavior (SB) indices. Because SB indices are calculated for each item, they can be used to understand how examinee motivation changes across items within a test. Latent class…
Descriptors: Behavior Patterns, Test Items, Time, Response Style (Tests)
Soland, James; Kuhfeld, Megan – Educational Assessment, 2019
Considerable research has examined the use of rapid guessing measures to identify disengaged item responses. However, little is known about students who rapidly guess over the course of several tests. In this study, we use achievement test data from six administrations over three years to investigate whether rapid guessing is a stable trait-like…
Descriptors: Testing, Guessing (Tests), Reaction Time, Achievement Tests
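The Ulitzsch et al. and Soland and Kuhfeld entries above (and the DeMars entry further down) rely on identifying rapid guessing from item response times, typically by flagging responses faster than an item-level time threshold and summarizing the result as response time effort, the proportion of items answered with solution behavior (Wise and Kong's RTE index). The sketch below shows only that generic threshold approach; the 10% normative threshold, the array layout, and the function names are illustrative assumptions, not the specific procedures used in these studies.

```python
import numpy as np


def flag_rapid_guesses(response_times, threshold_frac=0.10):
    """Flag rapid guesses with a simple normative time threshold.

    response_times: (examinees x items) array of response times in seconds.
    threshold_frac: each item's threshold as a fraction of its mean
        response time (10% is a common convention, used here only as an
        illustrative default).

    Returns a boolean array in which True marks a rapid guess.
    """
    thresholds = threshold_frac * response_times.mean(axis=0)  # per item
    return response_times < thresholds


def response_time_effort(rapid_flags):
    """Per-examinee proportion of items answered with solution behavior,
    i.e. items that were NOT rapidly guessed."""
    return 1.0 - rapid_flags.mean(axis=1)


# Fabricated response times (seconds) for 3 examinees on 4 items.
rt = np.array([[35.0, 42.0, 28.0, 50.0],
               [ 2.0, 40.0, 30.0,  3.0],
               [33.0, 39.0, 27.0, 45.0]])

rapid = flag_rapid_guesses(rt)
print(rapid)                         # examinee 2 is flagged on items 1 and 4
print(response_time_effort(rapid))   # [1.0, 0.5, 1.0]
```

Studies such as Soland and Kuhfeld's apply this kind of flagging to each administration separately and then examine how stable the per-examinee rates are across tests.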
Dawadi, Saraswati; Shrestha, Prithvi N. – Educational Assessment, 2018
There has been steady interest in investigating the validity of language tests over the last few decades. Despite numerous studies on construct validity in language testing, few have examined the construct validity of a reading test. This paper reports on a study that explored the construct validity of the English reading test in…
Descriptors: Foreign Countries, Construct Validity, Reading Tests, English (Second Language)
Roohr, Katrina Crotts; Sireci, Stephen G. – Educational Assessment, 2017
Test accommodations for English learners (ELs) are intended to reduce the language barrier and level the playing field, allowing ELs to better demonstrate their true proficiencies. Computer-based accommodations for ELs show promising results for leveling that field while also providing us with additional data to more closely investigate the…
Descriptors: Testing Accommodations, English Language Learners, Second Language Learning, Computer Assisted Testing
Petridou, Alexandra; Williams, Julian – Educational Assessment, 2010
The person-fit literature assumes that aberrant response patterns could be a sign of person mismeasurement, but this assumption has rarely, if ever, been empirically investigated before. We explore the validity of test responses and measures of 10-year-old examinees whose response patterns on a commercial standardized paper-and-pencil mathematics…
Descriptors: Validity, Measurement, Response Style (Tests), Scores
DeMars, Christine E. – Educational Assessment, 2007
A series of 8 tests was administered to university students over 4 weeks for program assessment purposes. The stakes of these tests were low for students; they received course points based on test completion, not test performance. Tests were administered in a counterbalanced order across 2 administrations. Response time effort, a measure of the…
Descriptors: Reaction Time, Guessing (Tests), Testing Programs, College Students
Pomplun, Mark; Ritchie, Timothy; Custer, Michael – Educational Assessment, 2006
This study investigated factors related to score differences on computerized and paper-and-pencil versions of a series of primary K-3 reading tests. Factors studied included item and student characteristics. The results suggest that the score differences were more related to student than item characteristics. These student characteristics include…
Descriptors: Reading Tests, Student Characteristics, Response Style (Tests), Socioeconomic Status