ERIC Number: ED498482
Record Type: Non-Journal
Publication Date: 2007-Jun
Pages: 16
Abstractor: Author
ISBN: N/A
ISSN: N/A
EISSN: N/A
Examining the Generalizability of Direct Writing Assessment Tasks. CSE Technical Report 718
Chen, Eva; Niemi, David; Wang, Jia; Wang, Haiwen; Mirocha, Jim
National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
This study investigated the generalizability of a small number of high-quality assessment tasks and the validity of measuring student writing ability with a limited number of essay tasks. More specifically, the research team explored how well writing prompts could measure students' general writing ability and whether student performance on one writing task could be generalized to other, similar writing tasks. Four writing prompts were used in the study; three were literature-based and one was based on a short story. A total of 397 students participated, and each student was randomly assigned to complete two of the four tasks. The research team found that three to five essays were required to make a reliable judgment of student writing performance. (Contains 1 figure and 4 tables.)
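The "three to five essays" finding reflects the kind of decision (D) study typically run under generalizability theory, in which the generalizability coefficient is extrapolated over the number of tasks averaged. The sketch below illustrates that extrapolation for a persons-by-tasks design; the variance components are illustrative placeholders, not the report's estimates, and the function name is an assumption for this example.

# Minimal sketch of a D-study extrapolation under a crossed persons-by-tasks
# (p x t) design. Variance components are hypothetical, NOT taken from the report.

def g_coefficient(var_person, var_residual, n_tasks):
    """Generalizability coefficient for relative decisions based on n_tasks tasks."""
    return var_person / (var_person + var_residual / n_tasks)

# Hypothetical variance components for illustration only.
var_person = 0.40    # person (universe-score) variance
var_residual = 0.60  # person-by-task interaction plus error variance

# Show how reliability grows as more essay tasks are averaged per student.
for n in range(1, 7):
    print(n, round(g_coefficient(var_person, var_residual, n), 3))

With placeholder components like these, the coefficient passes conventional reliability benchmarks only once several tasks are averaged, which is consistent with the report's conclusion that a single essay is not sufficient for a dependable judgment of writing ability.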
Descriptors: Program Effectiveness, Writing Ability, Writing Tests, Writing Evaluation, Writing Achievement, Generalizability Theory, Writing Research, Performance Based Assessment, Task Analysis, Construct Validity, Program Validation, Statistical Inference
National Center for Research on Evaluation, Standards, and Student Testing (CRESST). 300 Charles E Young Drive N, GSE&IS Building 3rd Floor, Mailbox 951522, Los Angeles, CA 90095-1522. Tel: 310-206-1532; Fax: 310-825-3883; Web site: http://www.cresst.org
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: Institute of Education Sciences (ED), Washington, DC.
Authoring Institution: National Center for Research on Evaluation, Standards, and Student Testing, Los Angeles, CA.
Grant or Contract Numbers: N/A