Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 1
Since 2006 (last 20 years): 2
Descriptor
High School Students: 2
Intelligent Tutoring Systems: 2
Natural Language Processing: 2
Scoring: 2
Accuracy: 1
Automation: 1
Chemistry: 1
Educational Technology: 1
Essays: 1
Evaluation Methods: 1
Reading Tests: 1
Author
Crossley, Scott: 2
McNamara, Danielle S.: 2
Allen, Laura K.: 1
Davenport, Jodi: 1
Kyle, Kristopher: 1
Snow, Erica L.: 1
Publication Type
Reports - Research: 2
Speeches/Meeting Papers: 2
Education Level
High Schools: 2
Secondary Education: 2
Location
Arizona (Phoenix): 1
California: 1
Assessments and Surveys
Gates MacGinitie Reading Tests: 1
Writing Apprehension Test: 1
Crossley, Scott; Kyle, Kristopher; Davenport, Jodi; McNamara, Danielle S. – International Educational Data Mining Society, 2016
This study introduces the Constructed Response Analysis Tool (CRAT), a freely available tool for automatically assessing student responses in online tutoring systems. The study tests CRAT on a dataset of chemistry responses collected in ChemVLab+. The findings indicate that CRAT can differentiate and classify student responses based on semantic…
Descriptors: Intelligent Tutoring Systems, Chemistry, Natural Language Processing, High School Students
Crossley, Scott; Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2015
This study investigates a new approach to automatically assessing essay quality that combines traditional methods based on textual features with newer methods that measure student attributes such as demographic information, standardized test scores, and survey results. The results demonstrate that combining both text features and…
Descriptors: Automation, Scoring, Essays, Evaluation Methods