Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 1 |
Since 2016 (last 10 years) | 6 |
Since 2006 (last 20 years) | 13 |
Source
Grantee Submission | 9 |
International Educational Data Mining Society | 1 |
Journal of Educational Data Mining | 1 |
Journal of Educational Psychology | 1 |
Journal of Learning Analytics | 1 |
Publication Type
Reports - Research | 13 |
Speeches/Meeting Papers | 6 |
Journal Articles | 5 |
Tests/Questionnaires | 3 |
Education Level
High Schools | 13 |
Secondary Education | 11 |
Higher Education | 3 |
Postsecondary Education | 3 |
Grade 10 | 1 |
Audience
Researchers | 1 |
Teachers | 1 |
Location
Arizona | 1 |
Arizona (Phoenix) | 1 |
Assessments and Surveys
Gates MacGinitie Reading Tests | 8 |
Writing Apprehension Test | 1 |
Sonia, Allison N.; Magliano, Joseph P.; McCarthy, Kathryn S.; Creer, Sarah D.; McNamara, Danielle S.; Allen, Laura K. – Grantee Submission, 2022
The constructed responses individuals generate while reading can provide insights into their coherence-building processes. The current study examined how the cohesion of constructed responses relates to performance on an integrated writing task. Participants (N = 95) completed a multiple document reading task wherein they were prompted to think…
Descriptors: Natural Language Processing, Connected Discourse, Reading Processes, Writing Skills
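A minimal sketch of the coherence idea in the entry above, assuming cohesion can be approximated as lexical overlap between adjacent constructed responses; the tokenizer, stopword list, and Jaccard measure are illustrative stand-ins, not the study's actual cohesion indices.

```python
# Hypothetical sketch: cohesion of a reader's constructed responses estimated as
# lexical overlap between adjacent responses. Illustrative only.
import re

STOPWORDS = {"the", "a", "an", "and", "or", "but", "of", "to", "in", "is", "it", "that", "this"}

def content_words(text: str) -> set[str]:
    """Lowercase tokens with a small stopword list removed."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return {t for t in tokens if t not in STOPWORDS}

def adjacent_overlap(responses: list[str]) -> float:
    """Mean Jaccard overlap between each pair of adjacent constructed responses."""
    overlaps = []
    for prev, curr in zip(responses, responses[1:]):
        a, b = content_words(prev), content_words(curr)
        if a or b:
            overlaps.append(len(a & b) / len(a | b))
    return sum(overlaps) / len(overlaps) if overlaps else 0.0

responses = [
    "The author argues that sleep loss impairs memory consolidation.",
    "Memory consolidation also depends on sleep quality, the second text claims.",
    "Both sources link sleep to how well memories are stored.",
]
print(f"Mean adjacent overlap: {adjacent_overlap(responses):.2f}")
```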
Allen, Laura K.; Likens, Aaron D.; McNamara, Danielle S. – Grantee Submission, 2018
The assessment of writing proficiency generally includes analyses of the specific linguistic and rhetorical features contained in the singular essays produced by students. However, researchers have recently proposed that an individual's ability to flexibly adapt the linguistic properties of their writing might more closely capture writing skill.…
Descriptors: Writing Evaluation, Writing Tests, Computer Assisted Testing, Writing Skills
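A rough sketch of the "flexibility" notion in the entry above, assuming it can be proxied by within-writer variability of simple text indices across essays; the two indices here are placeholders for the much richer linguistic features the authors analyze.

```python
# Hypothetical sketch: flexibility as the standard deviation of linguistic
# indices across one student's essays. Indices and data are toy stand-ins.
import re
import statistics

def indices(essay: str) -> dict[str, float]:
    words = re.findall(r"[A-Za-z']+", essay)
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    return {
        "mean_word_len": sum(len(w) for w in words) / len(words),
        "mean_sent_len": len(words) / len(sentences),
    }

def flexibility(essays: list[str]) -> dict[str, float]:
    """Within-writer standard deviation of each index across essays."""
    per_essay = [indices(e) for e in essays]
    out = {}
    for key in per_essay[0]:
        vals = [d[key] for d in per_essay]
        out[key] = statistics.stdev(vals) if len(vals) > 1 else 0.0
    return out

student_essays = [
    "Short sentences. Plain words. Clear points.",
    "A considerably longer and more elaborate sentence demonstrates a different register entirely.",
]
print(flexibility(student_essays))
```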
Allen, Laura K.; McNamara, Danielle S. – International Educational Data Mining Society, 2015
The current study investigates the degree to which the lexical properties of students' essays can inform stealth assessments of their vocabulary knowledge. In particular, we used indices calculated with the natural language processing tool, TAALES, to predict students' performance on a measure of vocabulary knowledge. To this end, two corpora were…
Descriptors: Vocabulary, Knowledge Level, Models, Natural Language Processing
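A toy sketch of the modeling step described above: deriving a lexical index from essays and regressing a vocabulary score on it. The single index (mean word length), the hand-rolled least-squares fit, and the data are assumptions for illustration; TAALES computes far richer lexical-sophistication indices.

```python
# Hypothetical sketch: predicting a vocabulary-knowledge score from one crude
# lexical index via simple least-squares regression. Illustrative only.
import re

def mean_word_length(essay: str) -> float:
    words = re.findall(r"[A-Za-z']+", essay)
    return sum(len(w) for w in words) / len(words)

def fit_ols(x: list[float], y: list[float]) -> tuple[float, float]:
    """Return (slope, intercept) for simple least-squares regression."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Toy corpus: (essay text, vocabulary test score) pairs.
data = [
    ("The cat sat on the mat and then ran away.", 41.0),
    ("The protagonist's ambivalence complicates the narrative considerably.", 67.0),
    ("Students frequently underestimate the cumulative burden of revision.", 59.0),
]
x = [mean_word_length(text) for text, _ in data]
y = [score for _, score in data]
slope, intercept = fit_ols(x, y)
print(f"Predicted score for a new essay: {slope * 5.2 + intercept:.1f}")
```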
Allen, Laura K.; Likens, Aaron D.; McNamara, Danielle S. – Grantee Submission, 2018
The assessment of argumentative writing generally includes analyses of the specific linguistic and rhetorical features contained in the individual essays produced by students. However, researchers have recently proposed that an individual's ability to flexibly adapt the linguistic properties of their writing may more accurately capture their…
Descriptors: Writing (Composition), Persuasive Discourse, Essays, Language Usage
Crossley, Scott; Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2015
This study investigates a new approach to automatically assessing essay quality that combines traditional approaches based on assessing textual features with new approaches that measure student attributes such as demographic information, standardized test scores, and survey results. The results demonstrate that combining both text features and…
Descriptors: Automation, Scoring, Essays, Evaluation Methods
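A hedged sketch of the combination strategy described above, assuming text-based features and writer attributes (here, a single prior test score) are simply concatenated into one design matrix before fitting a scoring model; the features, toy data, and NumPy least-squares fit are illustrative, not the study's pipeline.

```python
# Hypothetical sketch: combining text features with writer attributes for essay
# scoring by concatenating them into one design matrix. Illustrative only.
import re
import numpy as np

def text_features(essay: str) -> list[float]:
    words = re.findall(r"[A-Za-z']+", essay)
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    return [float(len(words)), len(words) / len(sentences)]  # length, mean sentence length

# Each record: (essay, prior standardized test score, human essay rating 1-6).
records = [
    ("Brief response. Few ideas.", 410.0, 2.0),
    ("A developed argument with several supporting examples and a clear thesis statement.", 560.0, 4.0),
    ("An elaborate, well organized essay that weaves evidence, counterarguments, and a conclusion.", 640.0, 5.5),
    ("Some ideas are present but the organization drifts and evidence is thin.", 480.0, 3.0),
]

X = np.array([text_features(e) + [prior] for e, prior, _ in records])  # text + attribute features
X = np.hstack([X, np.ones((len(X), 1))])                               # intercept column
y = np.array([rating for _, _, rating in records])

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("Fitted weights (word count, sent. length, prior score, intercept):", np.round(coef, 3))
```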
Allen, Laura K.; Crossley, Scott A.; McNamara, Danielle S. – Grantee Submission, 2015
We investigated linguistic factors that relate to misalignment between students' and teachers' ratings of essay quality. Students (n = 126) wrote essays and rated the quality of their work. Teachers then provided their own ratings of the essays. Results revealed that students who were less accurate in their self-assessments produced essays that…
Descriptors: Essays, Scores, Natural Language Processing, Interrater Reliability
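A small sketch of one way to quantify the misalignment discussed above: the absolute gap between self- and teacher ratings, correlated with a single essay feature. The toy data and the choice of essay length as that feature are assumptions, and Python 3.10+ is assumed for statistics.correlation.

```python
# Hypothetical sketch: self-assessment accuracy as |self rating - teacher rating|,
# correlated with one essay feature. Illustrative only.
import statistics

# (self_rating, teacher_rating, essay_word_count) per student -- toy values.
students = [
    (5.0, 3.0, 220),
    (4.0, 4.0, 480),
    (6.0, 3.5, 310),
    (3.0, 3.0, 520),
    (5.5, 4.0, 300),
]

gaps = [abs(self_r - teacher_r) for self_r, teacher_r, _ in students]
lengths = [float(n) for _, _, n in students]

r = statistics.correlation(gaps, lengths)  # Pearson r (Python 3.10+)
print(f"Correlation between rating gap and essay length: {r:.2f}")
```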
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2014
In the current study, we utilize natural language processing techniques to examine relations between the linguistic properties of students' self-explanations and their reading comprehension skills. Linguistic features of students' aggregated self-explanations were analyzed using the Linguistic Inquiry and Word Count (LIWC) software. Results…
Descriptors: Natural Language Processing, Reading Comprehension, Linguistics, Predictor Variables
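An illustrative sketch of LIWC-style analysis as described above: word-category proportions computed over a student's aggregated self-explanations. The tiny category word lists are stand-ins for LIWC's licensed dictionaries.

```python
# Hypothetical sketch: LIWC-style category proportions over aggregated
# self-explanations. Category lists are illustrative stand-ins.
import re

CATEGORIES = {
    "cognitive": {"because", "think", "know", "reason", "therefore", "means"},
    "affect": {"happy", "sad", "afraid", "enjoy", "worried"},
    "negation": {"not", "never", "no", "nothing"},
}

def category_proportions(self_explanations: list[str]) -> dict[str, float]:
    """Share of tokens falling in each category across all self-explanations."""
    tokens = re.findall(r"[a-z']+", " ".join(self_explanations).lower())
    total = len(tokens)
    return {
        name: sum(t in words for t in tokens) / total
        for name, words in CATEGORIES.items()
    }

aggregated = [
    "I think the cell divides because it needs to grow.",
    "That means the energy is not lost, it changes form.",
]
print(category_proportions(aggregated))
```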
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2015
This study builds upon previous work aimed at developing a student model of reading comprehension ability within the intelligent tutoring system, iSTART. Currently, the system evaluates students' self-explanation performance using a local, sentence-level algorithm and does not adapt content based on reading ability. The current study leverages…
Descriptors: Reading Comprehension, Reading Skills, Natural Language Processing, Intelligent Tutoring Systems
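A speculative sketch of the contrast drawn in the entry above: moving from local, sentence-level self-explanation scores to a running student-level estimate an intelligent tutoring system could adapt on. The 0-3 scale and the exponential-moving-average update are assumptions, not iSTART's actual algorithm.

```python
# Hypothetical sketch: aggregating sentence-level self-explanation scores (0-3)
# into a running student-level ability estimate. Illustrative only.

def update_estimate(current: float, sentence_score: int, weight: float = 0.2) -> float:
    """Blend the newest sentence-level score into the running estimate."""
    return (1 - weight) * current + weight * sentence_score

def student_estimate(scores: list[int], prior: float = 1.5) -> float:
    estimate = prior
    for s in scores:
        estimate = update_estimate(estimate, s)
    return estimate

# One student's sentence-level self-explanation scores over a practice session.
session_scores = [1, 2, 2, 3, 2, 3, 3]
ability = student_estimate(session_scores)
print(f"Running ability estimate: {ability:.2f}")
print("Assign challenging text" if ability >= 2.0 else "Assign scaffolded text")
```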
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2016
A commonly held belief among educators, researchers, and students is that high-quality texts are easier to read than low-quality texts, as they contain more engaging narrative and story-like elements. Interestingly, these assumptions have typically failed to be supported by the literature on writing. Previous research suggests that higher quality…
Descriptors: Role, Writing (Composition), Natural Language Processing, Hypothesis Testing
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Journal of Educational Psychology, 2016
A commonly held belief among educators, researchers, and students is that high-quality texts are easier to read than low-quality texts, as they contain more engaging narrative and story-like elements. Interestingly, these assumptions have typically failed to be supported by the literature on writing. Previous research suggests that higher quality…
Descriptors: Role, Writing (Composition), Natural Language Processing, Hypothesis Testing
Crossley, Scott A.; Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Journal of Educational Data Mining, 2016
This study investigates a novel approach to automatically assessing essay quality that combines natural language processing approaches that assess text features with approaches that assess individual differences in writers such as demographic information, standardized test scores, and survey results. The results demonstrate that combining text…
Descriptors: Essays, Scoring, Writing Evaluation, Natural Language Processing
Snow, Erica L.; Allen, Laura K.; Jacovina, Matthew E.; Crossley, Scott A.; Perret, Cecile A.; McNamara, Danielle S. – Journal of Learning Analytics, 2015
Writing researchers have suggested that students who are perceived as strong writers (i.e., those who generate texts rated as high quality) demonstrate flexibility in their writing style. While anecdotally this has been a commonly held belief among researchers and educators, there is little empirical research to support this claim. This study…
Descriptors: Writing (Composition), Writing Strategies, Hypothesis Testing, Essays
Snow, Erica L.; Allen, Laura K.; Jacovina, Matthew E.; Crossley, Scott A.; Perret, Cecile A.; McNamara, Danielle S. – Grantee Submission, 2015
Writing researchers have suggested that students who are perceived as strong writers (i.e., those who generate texts rated as high quality) demonstrate flexibility in their writing style. While anecdotally this has been a commonly held belief among researchers and educators, there is little empirical research to support this claim. This study…
Descriptors: Writing (Composition), Writing Strategies, Hypothesis Testing, Essays