ERIC Number: ED585785
Record Type: Non-Journal
Publication Date: 2014
Pages: 6
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: N/A
Writing Quality, Knowledge, and Comprehension Correlates of Human and Automated Essay Scoring
Roscoe, Rod D.; Crossley, Scott A.; Snow, Erica L.; Varner, Laura K.; McNamara, Danielle S.
Grantee Submission, Paper presented at the International Florida Artificial Intelligence Research Society Conference (27th, 2014)
Automated essay scoring tools are often criticized on the basis of construct validity. Specifically, it has been argued that computational scoring algorithms may not align with higher-level indicators of quality writing, such as writers' demonstrated knowledge and understanding of the essay topics. In this paper, we consider how and whether the scoring algorithms within an intelligent writing tutor correlate with measures of writing proficiency and students' general knowledge, reading comprehension, and vocabulary skill. Results indicate that the computational algorithms, although less attuned to knowledge and comprehension factors than human raters, were marginally related to such variables. Implications for improving automated scoring and intelligent tutoring of writing are briefly discussed. [This paper was published in: "Proceedings of the Twenty-Seventh International Florida Artificial Intelligence Research Society Conference" (pp. 393-398). Association for the Advancement of Artificial Intelligence, 2014.]
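The correlational analysis described in the abstract can be illustrated with a minimal sketch, which is not the authors' code: it assumes hypothetical column names (auto_score, human_score, reading_comp, vocab, prior_knowledge) and simply computes Pearson correlations of each knowledge or comprehension measure against automated and human essay scores.

    # Illustrative sketch only; column names are hypothetical placeholders,
    # not variables or data from the study.
    import pandas as pd
    from scipy import stats

    def score_correlations(df: pd.DataFrame) -> pd.DataFrame:
        """Pearson r and p-value of each measure against each score type."""
        measures = ["reading_comp", "vocab", "prior_knowledge"]
        rows = []
        for score_col in ["auto_score", "human_score"]:
            for m in measures:
                r, p = stats.pearsonr(df[score_col], df[m])
                rows.append({"score": score_col, "measure": m, "r": r, "p": p})
        return pd.DataFrame(rows)

    if __name__ == "__main__":
        # Toy data standing in for essay-level records.
        df = pd.DataFrame({
            "auto_score": [3, 4, 2, 5, 3, 4],
            "human_score": [3, 5, 2, 5, 2, 4],
            "reading_comp": [21, 30, 15, 33, 18, 27],
            "vocab": [25, 34, 19, 36, 22, 30],
            "prior_knowledge": [10, 14, 8, 15, 9, 13],
        })
        print(score_correlations(df))

Comparing the correlation magnitudes for auto_score against those for human_score mirrors the kind of comparison the abstract describes, where automated scores were less attuned to knowledge and comprehension factors than human ratings.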
Publication Type: Speeches/Meeting Papers; Reports - Research
Education Level: High Schools
Audience: N/A
Language: English
Sponsor: Institute of Education Sciences (ED)
Authoring Institution: N/A
Identifiers - Assessments and Surveys: Gates-MacGinitie Reading Tests
IES Funded: Yes
Grant or Contract Numbers: R305A080589