Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 3
Since 2016 (last 10 years): 10
Since 2006 (last 20 years): 19
Source
Grantee Submission: 19
Author
McNamara, Danielle S.: 14
Allen, Laura K.: 9
Snow, Erica L.: 6
Crossley, Scott A.: 4
Katz, Sandra: 4
Albacete, Patricia: 3
Jordan, Pamela: 3
Varner, Laura K.: 3
Creer, Sarah D.: 2
Jacovina, Matthew E.: 2
Likens, Aaron D.: 2
Publication Type
Reports - Research: 17
Speeches/Meeting Papers: 11
Journal Articles: 4
Tests/Questionnaires: 2
Reports - Descriptive: 1
Reports - Evaluative: 1
Education Level
High Schools: 19
Secondary Education: 15
Higher Education: 4
Postsecondary Education: 4
Elementary Education: 2
Grade 7: 2
Grade 8: 2
Junior High Schools: 2
Middle Schools: 2
Grade 10: 1
Grade 4: 1
Audience
Researchers: 1
Teachers: 1
Assessments and Surveys
Gates MacGinitie Reading Tests: 6
Writing Apprehension Test: 1
Allen, Laura Kristen; Magliano, Joseph P.; McCarthy, Kathryn S.; Sonia, Allison N.; Creer, Sarah D.; McNamara, Danielle S. – Grantee Submission, 2021
The current study examined the extent to which the cohesion detected in readers' constructed responses to multiple documents was predictive of persuasive, source-based essay quality. Participants (N=95) completed multiple-documents reading tasks wherein they were prompted to think-aloud, self-explain, or evaluate the sources while reading a set of…
Descriptors: Reading Comprehension, Connected Discourse, Reader Response, Natural Language Processing
Sonia, Allison N.; Magliano, Joseph P.; McCarthy, Kathryn S.; Creer, Sarah D.; McNamara, Danielle S.; Allen, Laura K. – Grantee Submission, 2022
The constructed responses individuals generate while reading can provide insights into their coherence-building processes. The current study examined how the cohesion of constructed responses relates to performance on an integrated writing task. Participants (N = 95) completed a multiple document reading task wherein they were prompted to think…
Descriptors: Natural Language Processing, Connected Discourse, Reading Processes, Writing Skills
Banawan, Michelle P.; Shin, Jinnie; Arner, Tracy; Balyan, Renu; Leite, Walter L.; McNamara, Danielle S. – Grantee Submission, 2023
Academic discourse communities and learning circles are characterized by collaboration, sharing commonalities in terms of social interactions and language. The discourse of these communities is composed of jargon, common terminologies, and similarities in how they construe and communicate meaning. This study examines the extent to which discourse…
Descriptors: Algebra, Discourse Analysis, Semantics, Syntax
Albacete, Patricia; Jordan, Pamela; Katz, Sandra; Chounta, Irene-Angelica; McLaren, Bruce M. – Grantee Submission, 2019
This paper describes an initial pilot study of Rimac, a natural-language tutoring system for physics. Rimac uses a student model to guide decisions about "what content to discuss next" during reflective dialogues that are initiated after students solve quantitative physics problems, and "how much support to provide" during…
Descriptors: Natural Language Processing, Teaching Methods, Educational Technology, Technology Uses in Education
Jordan, Pamela; Albacete, Patricia; Katz, Sandra – Grantee Submission, 2016
We explore the effectiveness of a simple algorithm for adaptively deciding whether to further decompose a step in a line of reasoning during tutorial dialogue. We compare two versions of a tutorial dialogue system, Rimac: one that always decomposes a step to its simplest sub-steps and one that adaptively decides to decompose a step based on a…
Descriptors: Algorithms, Decision Making, Intelligent Tutoring Systems, Scaffolding (Teaching Technique)
Allen, Laura K.; Likens, Aaron D.; McNamara, Danielle S. – Grantee Submission, 2018
The assessment of writing proficiency generally includes analyses of the specific linguistic and rhetorical features contained in the singular essays produced by students. However, researchers have recently proposed that an individual's ability to flexibly adapt the linguistic properties of their writing might more closely capture writing skill.…
Descriptors: Writing Evaluation, Writing Tests, Computer Assisted Testing, Writing Skills
Katz, Sandra; Albacete, Patricia; Jordan, Pamela – Grantee Submission, 2016
This poster reports on a study that compared three types of summaries at the end of natural-language tutorial dialogues and a no-dialogue control, to determine which type of summary, if any, best predicted learning gains. Although we found no significant differences between conditions, analyses of gender differences indicate that female students…
Descriptors: Natural Language Processing, Intelligent Tutoring Systems, Reflection, Dialogs (Language)
Allen, Laura K.; Likens, Aaron D.; McNamara, Danielle S. – Grantee Submission, 2018
The assessment of argumentative writing generally includes analyses of the specific linguistic and rhetorical features contained in the individual essays produced by students. However, researchers have recently proposed that an individual's ability to flexibly adapt the linguistic properties of their writing may more accurately capture their…
Descriptors: Writing (Composition), Persuasive Discourse, Essays, Language Usage
Crossley, Scott A.; Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2015
This study investigates a new approach to automatically assessing essay quality that combines traditional approaches based on assessing textual features with new approaches that measure student attributes such as demographic information, standardized test scores, and survey results. The results demonstrate that combining both text features and…
Descriptors: Automation, Scoring, Essays, Evaluation Methods
Allen, Laura K.; Crossley, Scott A.; McNamara, Danielle S. – Grantee Submission, 2015
We investigated linguistic factors that relate to misalignment between students' and teachers' ratings of essay quality. Students (n = 126) wrote essays and rated the quality of their work. Teachers then provided their own ratings of the essays. Results revealed that students who were less accurate in their self-assessments produced essays that…
Descriptors: Essays, Scores, Natural Language Processing, Interrater Reliability
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2014
In the current study, we utilize natural language processing techniques to examine relations between the linguistic properties of students' self-explanations and their reading comprehension skills. Linguistic features of students' aggregated self-explanations were analyzed using the Linguistic Inquiry and Word Count (LIWC) software. Results…
Descriptors: Natural Language Processing, Reading Comprehension, Linguistics, Predictor Variables
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2015
This study builds upon previous work aimed at developing a student model of reading comprehension ability within the intelligent tutoring system, iSTART. Currently, the system evaluates students' self-explanation performance using a local, sentence-level algorithm and does not adapt content based on reading ability. The current study leverages…
Descriptors: Reading Comprehension, Reading Skills, Natural Language Processing, Intelligent Tutoring Systems
Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2016
A commonly held belief among educators, researchers, and students is that high-quality texts are easier to read than low-quality texts, as they contain more engaging narrative and story-like elements. Interestingly, these assumptions have typically failed to be supported by the literature on writing. Previous research suggests that higher quality…
Descriptors: Role, Writing (Composition), Natural Language Processing, Hypothesis Testing
Varner, Laura K.; Jackson, G. Tanner; Snow, Erica L.; McNamara, Danielle S. – Grantee Submission, 2013
This study expands upon an existing model of students' reading comprehension ability within an intelligent tutoring system. The current system evaluates students' natural language input using a local student model. We examine the potential to expand this model by assessing the linguistic features of self-explanations aggregated across entire…
Descriptors: Reading Comprehension, Intelligent Tutoring Systems, Natural Language Processing, Reading Ability
Snow, Erica L.; Allen, Laura K.; Jacovina, Matthew E.; Crossley, Scott A.; Perret, Cecile A.; McNamara, Danielle S. – Grantee Submission, 2015
Writing researchers have suggested that students who are perceived as strong writers (i.e., those who generate texts rated as high quality) demonstrate flexibility in their writing style. While anecdotally this has been a commonly held belief among researchers and educators, there is little empirical research to support this claim. This study…
Descriptors: Writing (Composition), Writing Strategies, Hypothesis Testing, Essays