Showing all 5 results
Abedi, Jamal; Bayley, Robert; Ewers, Nancy; Mundhenk, Kimberly; Leon, Seth; Kao, Jenny; Herman, Joan – International Journal of Disability, Development and Education, 2012
Assessments developed and field tested for the mainstream student population may not be accessible for students with disabilities (SWDs) because of extraneous variables, including cognitive features such as the depth of knowledge required, grammatical and lexical complexity, lexical density, and textual/visual features. This study…
Descriptors: Test Items, Disabilities, Cognitive Ability, Cognitive Psychology
Abedi, Jamal; Leon, Seth; Kao, Jenny; Bayley, Robert; Ewers, Nancy; Herman, Joan; Mundhenk, Kimberly – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2011
The purpose of this study was to examine the characteristics of reading test items that may differentially impede the performance of students with disabilities. The findings suggest that certain revisions can be made to current assessments to make them more accessible for students with disabilities. Features such as words per page,…
Descriptors: Test Items, Reading Tests, Disabilities, Student Evaluation
Abedi, Jamal; Kao, Jenny C.; Leon, Seth; Sullivan, Lisa; Herman, Joan L.; Pope, Rita; Nambiar, Veena; Mastergeorge, Ann M. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2008
This study sought to explore factors that affect the accessibility of reading comprehension assessments for students with disabilities. The study consisted of testing students using reading comprehension passages that were broken down into shorter "segments" or "chunks." The results of the segmenting study indicated that: (a)…
Descriptors: Reading Comprehension, Disabilities, Reading Tests, Test Reliability
Abedi, Jamal; Leon, Seth; Kao, Jenny C. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2008
This study examines the incorrect response choices, or distractors, selected by students with disabilities on standardized reading assessments. Differential distractor functioning (DDF) analysis differs from differential item functioning (DIF) analysis, which treats all incorrect answers alike, examining them collectively against the correct answer. DDF analysis…
Descriptors: Test Bias, Disabilities, Grade 9, Grade 3
Abedi, Jamal; Leon, Seth; Kao, Jenny C. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2008
This study examines performance differences between students with disabilities and students without disabilities using differential item functioning (DIF) analyses in a high-stakes reading assessment. Results indicated that for Grade 9, many items exhibited DIF. Items that exhibited DIF were more likely to be located in the second half…
Descriptors: Test Bias, Test Items, Student Evaluation, Disabilities