ERIC Number: ED400293
Record Type: Non-Journal
Publication Date: 1996-Apr
Pages: 23
Abstractor: N/A
ISBN: N/A
ISSN: N/A
EISSN: N/A
Differential Facet Functioning Detection in Direct Writing Assessment.
Du, Yi; And Others
In performance assessment, because many facets are involved, the development of ways to detect differential item functioning or differential facet functioning (DFF) has lagged behind the practical needs of test developers. To monitor the validity and fairness of an assessment, it is critical to have a method that can detect multiple sources of potential DFF, including raters, items, topics, and other facets. Many-faceted Rasch modeling with the FACETS software provides a powerful way to detect DFF in performance assessment. This study focuses on raters and topic types as two sources of DFF using the FACETS model. Data came from 1,734 essays written by 867 students in grades 6, 8, and 10 as part of the Illinois Goal Assessment Program. A measurement model of eight facets was used. With the FACETS model, DFF analysis of raters identified biased raters. Evidence was also found that bias on the part of these raters affected students' writing ability estimates. DFF statistics for topic types and student demography showed effects of topic types on the performance of student subgroups and provided evidence of gender and age effects across topic types. (Contains 3 figures and 12 tables.) (SLD)
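Note: the many-facet Rasch model referenced in the abstract, in its standard rating-scale form (a sketch in standard notation, not the paper's own eight-facet specification, which adds further facet terms), can be written as

\[ \ln\!\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right) = B_n - D_i - C_j - F_k \]

where B_n is the ability of student n, D_i the difficulty of item (topic) i, C_j the severity of rater j, and F_k the difficulty of the kth rating-scale step. DFF analysis then examines interaction (bias) terms, for example rater-by-group, flagging combinations whose estimated bias departs significantly from zero.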
Descriptors: Age Differences, Elementary School Students, Elementary Secondary Education, Essay Tests, High School Students, Identification, Interrater Reliability, Item Bias, Item Response Theory, Performance Based Assessment, Sex Differences, Test Construction, Writing (Composition), Writing Tests
Publication Type: Reports - Evaluative; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A