Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 0 |
Since 2016 (last 10 years) | 2 |
Since 2006 (last 20 years) | 8 |
Descriptor
Essays | 8 |
Computer Assisted Testing | 5 |
Scoring | 5 |
Student Evaluation | 5 |
Interrater Reliability | 4 |
Reliability | 4 |
Scoring Rubrics | 4 |
Automation | 3 |
Comparative Analysis | 3 |
Elementary Secondary Education | 3 |
Foreign Countries | 3 |
Source
West Virginia Department of Education | 2 |
Council of Chief State School Officers | 1 |
Educational Sciences: Theory and Practice | 1 |
Journal of Interactive Learning Research | 1 |
Journal of Technology, Learning, and Assessment | 1 |
Learning Policy Institute | 1 |
ProQuest LLC | 1 |
Author
Darling-Hammond, Linda | 2 |
Hixson, Nate | 2 |
Rhudy, Vaughn | 2 |
Alexander, R. Curby | 1 |
Dikli, Semire | 1 |
Ferster, Bill | 1 |
Hammond, Thomas C. | 1 |
Jeffery, Jill V. | 1 |
Lyman, Hunt | 1 |
Oguz, Aytunga | 1 |
Publication Type
Reports - Research | 4 |
Journal Articles | 3 |
Reports - Descriptive | 3 |
Dissertations/Theses -… | 1 |
Numerical/Quantitative Data | 1 |
Education Level
Elementary Secondary Education | 8 |
Higher Education | 3 |
Postsecondary Education | 3 |
Secondary Education | 3 |
High Schools | 2 |
Middle Schools | 1 |
Location
Australia | 2 |
Connecticut | 2 |
New Hampshire | 2 |
New York | 2 |
Rhode Island | 2 |
United Kingdom (England) | 2 |
Vermont | 2 |
West Virginia | 2 |
Singapore | 1 |
Turkey | 1 |
Laws, Policies, & Programs
Every Student Succeeds Act 2015 | 2 |
Assessments and Surveys
National Assessment of… | 2 |
New York State Regents… | 2 |
Hixson, Nate; Rhudy, Vaughn – West Virginia Department of Education, 2012
To provide an opportunity for teachers to better understand the automated scoring process used by the state of West Virginia on our annual West Virginia Educational Standards Test 2 (WESTEST 2) Online Writing Assessment, the West Virginia Department of Education (WVDE) Office of Assessment and Accountability and the Office of Research conduct an…
Descriptors: Writing Tests, Computer Assisted Testing, Automation, Scoring
Darling-Hammond, Linda – Learning Policy Institute, 2017
After passage of the Every Student Succeeds Act (ESSA) in 2015, states assumed greater responsibility for designing their own accountability and assessment systems. ESSA requires states to measure "higher order thinking skills and understanding" and encourages the use of open-ended performance assessments, which are essential for…
Descriptors: Performance Based Assessment, Accountability, Portfolios (Background Materials), Task Analysis
Hixson, Nate; Rhudy, Vaughn – West Virginia Department of Education, 2013
Student responses to the West Virginia Educational Standards Test (WESTEST) 2 Online Writing Assessment are scored by a computer-scoring engine. The scoring method is not widely understood among educators, and there exists a misperception that it is not comparable to hand scoring. To address these issues, the West Virginia Department of Education…
Descriptors: Scoring Formulas, Scoring Rubrics, Interrater Reliability, Test Scoring Machines
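Comparability between an automated engine and hand scoring is typically summarized with an agreement statistic such as quadratic weighted kappa. The Python sketch below shows one standard way to compute it; the 1-6 rubric scale and the sample score pairs are hypothetical and are not drawn from the WESTEST 2 data.

```python
# Illustrative sketch: quadratic weighted kappa, a common agreement statistic
# for comparing automated essay scores with human hand scores.
# The 1-6 score scale and the sample data below are hypothetical and are not
# taken from the WESTEST 2 study itself.

def quadratic_weighted_kappa(human, machine, min_score, max_score):
    """Agreement between two raters on an ordinal score scale."""
    n_cats = max_score - min_score + 1

    # Observed confusion matrix of score pairs.
    observed = [[0.0] * n_cats for _ in range(n_cats)]
    for h, m in zip(human, machine):
        observed[h - min_score][m - min_score] += 1.0

    # Marginal histograms for each rater.
    hist_h = [sum(row) for row in observed]
    hist_m = [sum(observed[i][j] for i in range(n_cats)) for j in range(n_cats)]
    total = float(len(human))

    numerator = 0.0    # weighted observed disagreement
    denominator = 0.0  # weighted disagreement expected by chance
    for i in range(n_cats):
        for j in range(n_cats):
            weight = ((i - j) ** 2) / ((n_cats - 1) ** 2)
            expected = hist_h[i] * hist_m[j] / total
            numerator += weight * observed[i][j]
            denominator += weight * expected

    return 1.0 - numerator / denominator


if __name__ == "__main__":
    # Hypothetical scores for ten essays on a 1-6 rubric.
    human_scores   = [4, 3, 5, 2, 4, 6, 3, 4, 5, 2]
    machine_scores = [4, 3, 4, 2, 5, 6, 3, 4, 5, 3]
    print(round(quadratic_weighted_kappa(human_scores, machine_scores, 1, 6), 3))
```

Values near 1.0 indicate close agreement between the two sets of scores; the quadratic weighting penalizes large score discrepancies more heavily than adjacent ones.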
Darling-Hammond, Linda – Council of Chief State School Officers, 2017
The Every Student Succeeds Act (ESSA) opened up new possibilities for how student and school success are defined and supported in American public education. States have greater responsibility for designing and building their assessment and accountability systems. These new opportunities to develop performance assessments are critically important…
Descriptors: Performance Based Assessment, Accountability, Portfolios (Background Materials), Task Analysis
Oguz, Aytunga – Educational Sciences: Theory and Practice, 2013
The aim of the present study is to develop a scale to determine how necessary primary and secondary school teachers consider learner autonomy support behaviours to be and how often they perform these behaviours. The study group was composed of 324 primary and secondary school teachers. The process of developing the scale involved a literature scan,…
Descriptors: Elementary School Students, Secondary School Students, Personal Autonomy, Student Evaluation
Ferster, Bill; Hammond, Thomas C.; Alexander, R. Curby; Lyman, Hunt – Journal of Interactive Learning Research, 2012
The hurried pace of the modern classroom does not permit formative feedback on writing assignments at the frequency or quality recommended by the research literature. One solution for increasing individual feedback to students is to incorporate some form of computer-generated assessment. This study explores the use of automated assessment of…
Descriptors: Feedback (Response), Scripts, Formative Evaluation, Essays
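As a rough illustration of what computer-generated feedback can look like at its simplest, the hypothetical Python sketch below flags overly long sentences and immediate word repetition in a draft. It is a toy stand-in for illustration, not the automated assessment tool examined in this study.

```python
# An illustrative sketch of very simple computer-generated formative feedback
# on a draft: flag overly long sentences and immediate word repetition.
# This is a hypothetical toy, not the tool studied by Ferster et al.
import re

def simple_feedback(draft: str, max_sentence_words: int = 25) -> list:
    notes = []
    sentences = [s.strip() for s in re.split(r"[.!?]+", draft) if s.strip()]
    for i, sentence in enumerate(sentences, start=1):
        words = sentence.split()
        if len(words) > max_sentence_words:
            notes.append(f"Sentence {i} has {len(words)} words; consider splitting it.")
        for a, b in zip(words, words[1:]):
            if a.lower() == b.lower():
                notes.append(f"Sentence {i} repeats the word '{a.lower()}'.")
    return notes

if __name__ == "__main__":
    draft = ("The the experiment worked. "
             "Automated feedback can point students toward revisions that a "
             "busy teacher might not have time to mark on every single draft "
             "in a large class during the regular school week at all.")
    for note in simple_feedback(draft):
        print(note)
```

Rule-based checks like these are cheap enough to run on every draft, which is the appeal the abstract points to, although real systems go well beyond surface patterns.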
Jeffery, Jill V. – ProQuest LLC, 2010
"Voice" is widely considered to be a feature of effective writing. It's no surprise, then, that voice criteria frequently appear on rubrics used to score student essays in large-scale writing assessments. However, composition theorists hold vastly different views regarding voice and how it should be applied in the evaluation of student writing, if…
Descriptors: Expository Writing, Evaluators, Writing Evaluation, Writing Tests
Dikli, Semire – Journal of Technology, Learning, and Assessment, 2006
Automated Essay Scoring (AES) is defined as the computer technology that evaluates and scores the written prose (Shermis & Barrera, 2002; Shermis & Burstein, 2003; Shermis, Raymat, & Barrera, 2003). AES systems are mainly used to overcome time, cost, reliability, and generalizability issues in writing assessment (Bereiter, 2003; Burstein,…
Descriptors: Scoring, Writing Evaluation, Writing Tests, Standardized Tests
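Many AES systems of this era follow a feature-and-regression recipe: measurable properties are extracted from each essay and a model is fit to approximate human raters' scores. The Python sketch below illustrates that general idea; the surface features, training essays, and scores are hypothetical placeholders, and production engines rely on far richer linguistic features and proprietary models.

```python
# A minimal sketch of the feature-and-regression idea behind many automated
# essay scoring systems: extract simple surface features from each essay and
# fit a linear model against human-assigned scores. The features, essays, and
# scores here are hypothetical placeholders for illustration only.
import numpy as np

def surface_features(text: str) -> list:
    words = text.split()
    n_words = len(words)
    n_unique = len(set(w.lower() for w in words))
    avg_word_len = sum(len(w) for w in words) / max(n_words, 1)
    # word count, type-token ratio, average word length
    return [float(n_words), n_unique / max(n_words, 1), avg_word_len]

# Hypothetical training set: (essay text, human score on a 1-6 rubric).
training = [
    ("The dog ran.", 1.0),
    ("The quick brown fox jumps over the lazy dog near the river.", 2.0),
    ("Students write better when they receive timely, specific feedback "
     "on the organization and clarity of their drafts.", 4.0),
    ("Automated scoring engines estimate essay quality from measurable "
     "properties of the text, such as length, vocabulary, and syntax, "
     "and are trained to approximate the judgments of human raters.", 5.0),
]

# Design matrix with an intercept column, and the target scores.
X = np.array([surface_features(text) + [1.0] for text, _ in training])
y = np.array([score for _, score in training])

# Ordinary least squares fit of score ~ features.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

new_essay = "Writing assessment benefits from fast, consistent scoring."
predicted = float(np.array(surface_features(new_essay) + [1.0]) @ coef)
print(f"Predicted score: {predicted:.2f}")
```

Ordinary least squares is used here only to keep the sketch short; the point is the pipeline of text to features to predicted score, not any particular model.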