Peer reviewed
ERIC Number: EJ1439655
Record Type: Journal
Publication Date: 2024
Pages: 16
Abstractor: As Provided
ISBN: N/A
ISSN: 0895-7347
EISSN: 1532-4818
Available Date: N/A
Automated Scoring of Short-Answer Questions: A Progress Report
Brian E. Clauser; Victoria Yaneva; Peter Baldwin; Le An Ha; Janet Mee
Applied Measurement in Education, v37 n3 p209-224 2024
Multiple-choice questions have become ubiquitous in educational measurement because the format allows for efficient and accurate scoring. Nonetheless, there remains continued interest in constructed-response formats. This interest has driven efforts to develop computer-based scoring procedures that can accurately and efficiently score these items. Early procedures were typically based on surface features of the responses or simple matching procedures, but recent developments in natural language processing have allowed for much more sophisticated approaches. This paper reports on a state-of-the-art methodology for scoring short-answer questions supported by a large language model. Responses were collected in the context of a high-stakes test for medical students. More than 35,000 responses were collected across 71 studied items. Aggregated across all responses, the proportion of agreement with human scores ranged from 0.97 to 0.99 (depending on specifics such as training sample size). In addition to reporting detailed results, the paper discusses practical issues that require consideration when adopting this type of scoring system.
Routledge. Available from: Taylor & Francis, Ltd. 530 Walnut Street Suite 850, Philadelphia, PA 19106. Tel: 800-354-1420; Tel: 215-625-8900; Fax: 215-207-0050; Web site: http://www.tandf.co.uk/journals
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A