Publication Date
  In 2025 (0)
  Since 2024 (0)
  Since 2021, last 5 years (0)
  Since 2016, last 10 years (2)
  Since 2006, last 20 years (6)
Source
  Grantee Submission (3)
  Journal of Research in Reading (1)
  Reading and Writing: An… (1)
  Written Communication (1)
Publication Type
  Journal Articles (6)
  Reports - Research (5)
  Tests/Questionnaires (2)
  Reports - Evaluative (1)
Education Level
  High Schools (3)
  Higher Education (2)
  Postsecondary Education (1)
  Secondary Education (1)
Location
  Hong Kong (1)
Assessments and Surveys
  Test of English as a Foreign… (1)
Crossley, Scott A.; Rose, Dani Francuz; Danekes, Cassondra; Rose, Charles Wesley; McNamara, Danielle S. – Reading and Writing: An Interdisciplinary Journal, 2017
This paper examines the effects of attended and unattended demonstratives on text processing, comprehension, and writing quality in two studies. In the first study, participants (n = 45) read 64 mini-stories in a self-paced reading task and identified the main referent in the clauses. The sentences varied in the type of demonstratives (i.e., this,…
Descriptors: Nouns, Phrase Structure, Form Classes (Languages), Connected Discourse
Crossley, Scott A.; McNamara, Danielle S. – Grantee Submission, 2016
This study examines links between essay quality and text elaboration and text cohesion. For this study, 35 students wrote two essays (on two different prompts) and, for each, were given 15 minutes to elaborate on their original text. An expert in discourse comprehension then modified the original and elaborated essays to increase cohesion,…
Descriptors: Essays, Writing Assignments, Writing Skills, Connected Discourse
Crossley, Scott A.; Kyle, Kristopher; Allen, Laura K.; Guo, Liang; McNamara, Danielle S. – Grantee Submission, 2014
This study investigates the potential for linguistic microfeatures related to length, complexity, cohesion, relevance, topic, and rhetorical style to predict L2 writing proficiency. Computational indices were calculated by two automated text analysis tools (Coh-Metrix and the Writing Assessment Tool) and used to predict human essay ratings in a…
Descriptors: Computational Linguistics, Essays, Scoring, Writing Evaluation
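The modeling approach this abstract describes, using computational indices to predict human essay ratings, can be illustrated with a minimal sketch. The features, data, and regression model below are illustrative placeholders under that assumption, not output from Coh-Metrix or the Writing Assessment Tool and not the authors' actual analysis.

```python
# Minimal sketch: predicting human essay ratings from linguistic indices
# with multiple regression. All feature values and ratings are synthetic
# placeholders, not real Coh-Metrix or Writing Assessment Tool output.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical per-essay indices: text length, mean sentence length,
# lexical diversity, mean word frequency (all synthetic).
X = rng.normal(size=(100, 4))

# Hypothetical 1-6 holistic ratings loosely tied to those indices.
y = np.clip(3.5 + X @ np.array([0.8, 0.3, 0.4, -0.2])
            + rng.normal(scale=0.5, size=100), 1, 6)

# Cross-validated variance explained by the linguistic indices.
model = LinearRegression()
r2_scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("Mean cross-validated R^2:", r2_scores.mean())
```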
Roscoe, Rod D.; Varner, Laura K.; Crossley, Scott A.; McNamara, Danielle S. – Grantee Submission, 2013
Various computer tools have been developed to support educators' assessment of student writing, including automated essay scoring and automated writing evaluation systems. Research demonstrates that these systems exhibit relatively high scoring accuracy but uncertain instructional efficacy. Students' writing proficiency does not necessarily…
Descriptors: Writing Instruction, Intelligent Tutoring Systems, Computer Assisted Testing, Writing Evaluation
Crossley, Scott A.; McNamara, Danielle S. – Journal of Research in Reading, 2012
This study addresses research gaps in predicting second language (L2) writing proficiency using linguistic features. Key to this analysis is the inclusion of linguistic measures at the surface, textbase, and situation model levels that assess text cohesion and linguistic sophistication. The results of this study demonstrate that five variables…
Descriptors: Writing Instruction, Familiarity, Second Language Learning, Word Frequency
McNamara, Danielle S.; Crossley, Scott A.; McCarthy, Philip M. – Written Communication, 2010
In this study, a corpus of expert-graded essays, based on a standardized scoring rubric, is computationally evaluated to distinguish essays rated as high from those rated as low. The automated tool, Coh-Metrix, is used to examine the degree to which high- and low-proficiency essays can be predicted by…
Descriptors: Essays, Undergraduate Students, Educational Quality, Computational Linguistics
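Distinguishing high- from low-rated essays on the basis of computational indices, as this abstract describes, is typically framed as a classification problem. The sketch below shows one generic way to do that with discriminant analysis; the feature values and labels are synthetic placeholders, not real Coh-Metrix output or graded essays from the study.

```python
# Minimal sketch: separating high- vs. low-rated essays from
# cohesion/sophistication indices with linear discriminant analysis.
# All data below are synthetic placeholders for illustration only.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Hypothetical indices (e.g., referential overlap, word frequency,
# syntactic complexity) for 60 low-rated and 60 high-rated essays.
low = rng.normal(loc=0.0, scale=1.0, size=(60, 3))
high = rng.normal(loc=0.7, scale=1.0, size=(60, 3))
X = np.vstack([low, high])
y = np.array([0] * 60 + [1] * 60)  # 0 = low proficiency, 1 = high

# Cross-validated accuracy of the discriminant classifier.
lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5, scoring="accuracy")
print("Mean cross-validated classification accuracy:", acc.mean())
```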