ERIC Number: ED418101
Record Type: RIE
Publication Date: 1997-Dec
Pages: 39
Abstractor: N/A
ISBN: N/A
ISSN: N/A
EISSN: N/A
Methodological Approaches to Online Scoring of Essays.
Chung, Gregory K. W. K.; O'Neil, Harold F., Jr.
This report examines the feasibility of scoring essays using computer-based techniques. Essays have been incorporated into many standardized testing programs, and issues of validity and reliability must be addressed before automated scoring approaches can be fully deployed. Two approaches that have been used to classify documents, surface-based and word-based techniques, are reviewed, and the discussion then turns to how these approaches could be used to achieve the overarching goal of automated essay scoring. The two candidate approaches are Project Essay Grade (PEG) (A. Daigon, 1966; E. Page, 1966, 1968, 1994; E. Page and N. Peterson, 1995) and latent semantic analysis (LSA) (P. Foltz, 1996; T. Landauer, D. Laham, B. Rehder, and M. Schreiner, 1997). PEG uses a regression model in which the independent variables are surface features of the text (document length, word length, and punctuation) and the dependent variable is the essay score. LSA is based on a factor-analytic model of word co-occurrences. Following the review of PEG and LSA, additional uses of automated scoring of text-based data are explored. The final section outlines a plan for a feasibility study of the automated processing of text-based data. One assumption of this report, and of both scoring approaches, is that the human rating is the best estimate of the true essay score and that some portion of the documents will almost always need to be scored by multiple trained raters. The value of the proposed investigation lies more in its potential for practical spin-offs than in any theoretical contribution to writing, education, assessment, or cognition. (Contains 5 figures, 2 tables, and 50 references.) (SLD)
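To make the PEG-style approach concrete, the following is a minimal sketch, not the report's implementation: it fits an ordinary least-squares regression from surface features (word count, mean word length, punctuation count) to human-assigned scores. The training essays, scores, and exact feature definitions are hypothetical and purely illustrative.

# A minimal sketch of a PEG-style surface-feature regression.
# All data, scores, and feature choices below are hypothetical.
import re
import numpy as np

def surface_features(essay):
    """Compute PEG-style surface features: word count, mean word
    length, and punctuation count."""
    words = essay.split()
    n_words = len(words)
    mean_word_len = sum(len(w) for w in words) / max(n_words, 1)
    n_punct = len(re.findall(r"[.,;:!?]", essay))
    return [n_words, mean_word_len, n_punct]

# Hypothetical training set: essays with human-assigned scores
# (1-6 scale); the human rating is treated as the best estimate
# of the true score, as the report assumes.
essays = [
    "Short essay. Few words.",
    "A somewhat longer essay, with several clauses; it begins to develop a point.",
    "A longer essay that states a thesis, offers support, and draws a conclusion.",
    "A fully developed essay elaborates its thesis, supports each claim with evidence, and concludes by restating the argument.",
]
human_scores = np.array([2.0, 3.0, 4.0, 5.0])

# Fit ordinary least squares: score ~ intercept + surface features.
X = np.array([[1.0] + surface_features(e) for e in essays])
coef, *_ = np.linalg.lstsq(X, human_scores, rcond=None)

# Predict a score for an unseen essay from its surface features alone.
new_essay = "Another essay of moderate length, punctuated; it makes one point."
prediction = np.array([1.0] + surface_features(new_essay)) @ coef
print(f"predicted score: {prediction:.2f}")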
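The LSA approach can likewise be sketched under the standard truncated-SVD formulation of latent semantic analysis: a term-by-document co-occurrence matrix is decomposed, new essays are folded into the latent space, and similarity to pre-scored essays drives the score. The corpus, vocabulary, dimension k, and nearest-neighbor scoring scheme here are assumptions for illustration, not details taken from the report.

# A minimal sketch of LSA-based scoring via truncated SVD.
# Corpus, k, and the scoring scheme are illustrative assumptions.
import numpy as np

# Toy corpus standing in for pre-graded reference essays.
docs = [
    "the cell membrane regulates transport into the cell",
    "membrane proteins control transport across the cell membrane",
    "the economy depends on trade between regions",
]
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}

# Term-by-document matrix of word occurrences.
A = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        A[index[w], j] += 1.0

# Truncated SVD: keep the k strongest latent dimensions.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2

def fold_in(text):
    """Project a new essay into the k-dimensional latent space."""
    v = np.zeros(len(vocab))
    for w in text.split():
        if w in index:
            v[index[w]] += 1.0
    return (v @ U[:, :k]) / s[:k]

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Score a new essay by its similarity to each reference essay; a
# simple scheme assigns the score of the most similar graded essay.
new = fold_in("proteins regulate transport across the membrane")
for j, d in enumerate(docs):
    print(f"similarity {cosine(new, Vt[:k, j]):.3f}: {d}")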
Publication Type: Reports - Evaluative
Education Level: N/A
Audience: N/A
Language: English
Sponsor: Office of Educational Research and Improvement (ED), Washington, DC.
Authoring Institution: National Center for Research on Evaluation, Standards, and Student Testing, Los Angeles, CA.
Grant or Contract Numbers: N/A