Publication Date
In 2025 (0)
Since 2024 (0)
Since 2021, last 5 years (0)
Since 2016, last 10 years (0)
Since 2006, last 20 years (43)
Source
Assessing Writing (53)
Laws, Policies, & Programs
No Child Left Behind Act 2001 (1)
Showing 1 to 15 of 53 results
Peer reviewed
Serviss, Tricia – Assessing Writing, 2012
Drawing upon archival materials, I describe the history, design, and assessment of literacy tests from early 20th century New York state. Practitioners working with these early standardized writing tests grappled with tensions created by public Nativist sentiment, the legislation of "literacy," and calls to score the tests in…
Descriptors: Literacy, Writing Tests, Standardized Tests, Scoring
Peer reviewed
Bridgeman, Brent; Trapani, Catherine; Bivens-Tatum, Jennifer – Assessing Writing, 2011
Writing task variants can increase test security in high-stakes essay assessments by substantially increasing the pool of available writing stimuli and by making the specific writing task less predictable. A given prompt (parent) may be used as the basis for one or more different variants. Six variant types based on argument essay prompts from a…
Descriptors: Writing Evaluation, Writing Tests, Tests, Writing Instruction
Peer reviewed
Slomp, David H. – Assessing Writing, 2012
This article discusses three sets of challenges involved in the assessment of writing from a developmental perspective. These challenges include defining a workable theory of development, developing a suitable construct, and overcoming limitations in technocentric approaches to writing assessment. In North America in recent years, a burgeoning…
Descriptors: Writing (Composition), Writing Evaluation, Writing Tests, Writing Ability
Peer reviewed
Wardle, Elizabeth; Roozen, Kevin – Assessing Writing, 2012
This article offers one potential response to Yancey's (1999) call for a fourth wave of writing assessment able to capture writing development in all of its complexity. Based on an ecological perspective of literate development that situates students' growth as writers across multiple engagements with writing, including those outside of school,…
Descriptors: Writing Evaluation, Writing Tests, Ecology, Writing Instruction
Peer reviewed
Johnson, David; VanBrackle, Lewis – Assessing Writing, 2012
Raters of Georgia's (USA) state-mandated college-level writing exam, which is intended to ensure a minimal university-level writing competency, are trained to grade holistically when assessing these exams. A guiding principle in holistic grading is to not focus exclusively on any one aspect of writing but rather to give equal weight to style,…
Descriptors: Writing Evaluation, Linguistics, Writing Tests, English (Second Language)
Peer reviewed
Zainal, Azlin – Assessing Writing, 2012
The present study was conducted with a twofold purpose. First, I aim to apply the socio-cognitive framework by Shaw and Weir (2007) in order to validate a summative writing test used in a Malaysian ESL secondary school context. Secondly, by applying the framework I also aim to illustrate practical ways in which teachers can gather validity…
Descriptors: Foreign Countries, Student Evaluation, Writing Tests, Test Validity
Peer reviewed
Knoch, Ute – Assessing Writing, 2011
Rating scales act as the de facto test construct in a writing assessment, although inevitably as a simplification of the construct (North, 2003). However, it is often not reported how rating scales are constructed. Unless the underlying framework of a rating scale takes some account of linguistic theory and research in the definition of…
Descriptors: Writing Evaluation, Writing Tests, Rating Scales, Linguistic Theory
Peer reviewed
Condon, William – Assessing Writing, 2013
Automated Essay Scoring (AES) has garnered a great deal of attention from the rhetoric and composition/writing studies community since the Educational Testing Service began using e-rater[R] and the "Criterion"[R] Online Writing Evaluation Service as products in scoring writing tests, and most of the responses have been negative. While the…
Descriptors: Measurement, Psychometrics, Evaluation Methods, Educational Testing
Peer reviewed
Huang, Jinyan – Assessing Writing, 2012
Using generalizability (G-) theory, this study examined the accuracy and validity of the writing scores assigned to secondary school ESL students in the provincial English examinations in Canada. The major research question that guided this study was: Are there any differences between the accuracy and construct validity of the analytic scores…
Descriptors: Foreign Countries, Generalizability Theory, Writing Evaluation, Writing Tests
Peer reviewed
Good, Jennifer M.; Osborne, Kevin; Birchfield, Kelly – Assessing Writing, 2012
Writing is complex, and assessment of writing is equally complex, particularly when considering the need to measure outcomes at the institutional level while providing meaningful data that informs curriculum reform and supports learning at the discipline-level. Using a multi-layered assessment that incorporates standardized measures of writing…
Descriptors: Curriculum Development, Writing Evaluation, Student Evaluation, Writing Tests
Peer reviewed
East, Martin – Assessing Writing, 2009
The demand for valid and reliable methods of assessing second and foreign language writing has grown in significance in recent years. One such method is the timed writing test which has a central place in many testing contexts internationally. The reliability of this test method is heavily influenced by the scoring procedures, including the rating…
Descriptors: Scoring Rubrics, Reliability, Second Languages, Writing Tests
Peer reviewed
Chen, Jing; White, Sheida; McCloskey, Michael; Soroui, Jaleh; Chun, Young – Assessing Writing, 2011
This study investigated the comparability of paper and computer versions of a functional writing assessment administered to adults 16 and older. Three writing tasks were administered in both paper and computer modes to volunteers in the field test of an assessment of adult literacy in 2008. One set of analyses examined mode effects on scoring by…
Descriptors: Writing Evaluation, Writing Tests, Computer Assisted Testing, Educational Technology
Peer reviewed
Behizadeh, Nadia; Engelhard, George, Jr. – Assessing Writing, 2011
The purpose of this study is to examine the interactions among measurement theories, writing theories, and writing assessments in the United States from an historical perspective. The assessment of writing provides a useful framework for examining how theories influence, and in some cases fail to influence actual practice. Two research traditions…
Descriptors: Writing (Composition), Intellectual Disciplines, Writing Evaluation, Writing Tests
Peer reviewed
McCurry, Doug – Assessing Writing, 2010
This article considers the claim that machine scoring of writing test responses agrees with human readers as much as humans agree with other humans. These claims about the reliability of machine scoring of writing are usually based on specific and constrained writing tasks, and there is reason for asking whether machine scoring of writing requires…
Descriptors: Writing Tests, Scoring, Interrater Reliability, Computer Assisted Testing
Peer reviewed
Parr, Judy M.; Timperley, Helen S. – Assessing Writing, 2010
Traditionally, feedback to writing is written on drafts or given orally in roving or more formal conferences and is considered a significant part of instruction. This paper locates written response within an assessment for learning framework in the writing classroom. Within this framework, quality of response was defined in terms of providing…
Descriptors: Feedback (Response), Pedagogical Content Knowledge, Writing Evaluation, Writing Instruction