Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 1
Since 2016 (last 10 years): 2
Since 2006 (last 20 years): 3
Source
Grantee Submission: 3
Author
Adrea J. Truckenmiller: 1
DeBoer, George E.: 1
Eunsoo Cho: 1
Gary A. Troia: 1
Hardcastle, Joseph: 1
Herrmann-Abell, Cari F.: 1
Kachchaf, Rachel: 1
Noble, Tracy: 1
O'Connor, Mary Catherine: 1
Rosebery, Ann: 1
Wang, Yang: 1
Publication Type
Reports - Research: 3
Speeches/Meeting Papers: 2
Education Level
Grade 5: 3
Elementary Education: 2
Grade 8: 2
Intermediate Grades: 2
Middle Schools: 2
Elementary Secondary Education: 1
Grade 10: 1
Grade 11: 1
Grade 12: 1
Grade 4: 1
Grade 6: 1
Location
Michigan: 1
Adrea J. Truckenmiller; Eunsoo Cho; Gary A. Troia – Grantee Submission, 2022
Although educators frequently use assessment to identify who needs supplemental instruction and whether that instruction is working, there is a lack of research investigating assessment that informs what instruction students need. The purpose of the current study was to determine whether a brief (approximately 20 min) task that reflects a common middle…
Descriptors: Middle School Teachers, Middle School Students, Test Validity, Writing (Composition)
Herrmann-Abell, Cari F.; Hardcastle, Joseph; DeBoer, George E. – Grantee Submission, 2018
We compared students' performance on a paper-based test (PBT) and three computer-based tests (CBTs). The three computer-based tests used different test navigation and answer selection features, allowing us to examine how these features affect student performance. The study sample consisted of 9,698 fourth through twelfth grade students from across…
Descriptors: Evaluation Methods, Tests, Computer Assisted Testing, Scores
Kachchaf, Rachel; Noble, Tracy; Rosebery, Ann; Wang, Yang; Warren, Beth; O'Connor, Mary Catherine – Grantee Submission, 2014
Most research on linguistic features of test items negatively impacting English language learners' (ELLs') performance has focused on lexical and syntactic features, rather than discourse features that operate at the level of the whole item. This mixed-methods study identified two discourse features in 162 multiple-choice items on a standardized…
Descriptors: English Language Learners, Science Tests, Test Items, Discourse Analysis