ERIC Number: ED662774
Record Type: Non-Journal
Publication Date: 2024
Pages: 279
Abstractor: As Provided
ISBN: 979-8-3840-4339-3
ISSN: N/A
EISSN: N/A
Prompt and Circumstance: Investigating the Relationship between College Writing and Postsecondary Policy
ProQuest LLC, Ph.D. Dissertation, University of Michigan
In US-based postsecondary education, first-year students commonly have their compositional ability consequentially assessed on the basis of standardized tests. As a result, students who score above or below certain thresholds on ACT, SAT, or AP exams are often placed into honors or remedial courses, receive credit remissions, and/or test out of general education classes such as first-year composition. While the thresholds and applicable tests vary from institution to institution, over 2,000 Title IV schools implement policies based on such tests. However, there is little evidence that the linguistic patterns that correlate with success on timed, high-stakes tests carry forward to college-level writing tasks. Consequently, contemporary composition scholars call for research that centers examinations of student writing itself rather than assessments of writing quality such as standardized tests. This dissertation responds to that call by answering two questions: How do linguistic features observed in college-level writing relate to institutionally sanctioned measures of writing quality? And what are the implications for policy levers based on those measures? To answer these questions, I leverage a longitudinal corpus (2009-2019) of approximately 47,000 student essays, matched with data on test scores. Together, these data allow me to investigate whether the test scores, implemented as Boolean policy levers, meaningfully distinguish between students who write using measurably distinct linguistic patterns. To measure such distinctions, this study employs natural language processing, incorporating language models designed for text classification tasks: BERT, RoBERTa, and XLNet. These models achieve a quadratic weighted kappa of 0.43, indicating that they classify student essays better than random assignment; nevertheless, the relationship between student writing and test scores remains minimal. 
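The quadratic weighted kappa (QWK) cited above measures agreement between predicted and actual ordinal labels, penalizing each disagreement by the square of its distance on the rating scale. A minimal pure-Python sketch of the metric (the rating scale and labels here are illustrative, not drawn from the dissertation's corpus):

```python
# Quadratic weighted kappa (QWK): 1 - sum(W*O) / sum(W*E), where O is the
# observed label-confusion matrix, E the matrix expected under chance
# agreement, and W the quadratic disagreement weights.
def quadratic_weighted_kappa(actual, predicted, n_classes):
    n = len(actual)
    # Observed confusion matrix O
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for a, p in zip(actual, predicted):
        O[a][p] += 1
    # Marginal histograms -> chance-expected matrix E, scaled to total n
    hist_a = [sum(1 for a in actual if a == i) for i in range(n_classes)]
    hist_p = [sum(1 for p in predicted if p == i) for i in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = ((i - j) ** 2) / ((n_classes - 1) ** 2)  # quadratic weight
            num += w * O[i][j]
            den += w * hist_a[i] * hist_p[j] / n
    return 1.0 - num / den

# Perfect agreement on a 3-point scale
print(quadratic_weighted_kappa([0, 1, 2, 2], [0, 1, 2, 2], 3))  # 1.0
```

A value of 0 corresponds to chance-level agreement and 1 to perfect agreement; scikit-learn's `cohen_kappa_score(weights='quadratic')` computes the same quantity.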
Ideally, educational policy that consequentially sorts students into different educational tracks at the most vulnerable point of their college career would bear more than a weak relationship to their college-level performance. To uncover which linguistic features are most correlated with higher scores, I employ OLS, multiple, and logistic regression. These models find significant differences between the essays of students with high and low test scores. Across most models, students with higher test scores write, on average, fewer clauses per sentence; more prepositions, adverbs, colons, and adjectives; and the same number of personal pronouns. While these findings are statistically significant, they only weakly describe the differences between high- and low-scoring students, such that distinguishing between essays of students who are near common policy thresholds would be an error-prone task for any human or algorithm. Additionally, while the logistic regression based on the existing policy threshold at the University of Michigan had the greatest explanatory power (pseudo-R² = 0.09), linear regressions based on a normalized ACT-SAT score had more explanatory power (R² = 0.161). While these metrics cannot be directly compared, the difference in their relative strength nonetheless reveals a disparity in goodness-of-fit, demonstrating that educational policy based on a Boolean threshold from one test is functionally less discriminating than a metric based on multiple measures. Significance notwithstanding, the overall weak correlation between standardized test scores and college-level writing demonstrates the inability of a timed, high-stakes writing test to predict writing in other circumstances, including college-level writing tasks. These results evidence the brittleness of these test scores as measures of writing quality and cast doubt on their utility as policy levers. 
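The gap in explanatory power between a Boolean threshold and a continuous score can be illustrated with a one-predictor OLS fit: dichotomizing a continuous predictor at a cutoff discards within-group variance and so attenuates R². A sketch on synthetic data (the score scale, cutoff, and noise model are assumptions for illustration, not the dissertation's corpus or results):

```python
# Compare R^2 of a simple OLS fit using (a) a continuous test score and
# (b) the same score dichotomized at a policy cutoff. Data are synthetic:
# a writing-quality outcome linear in the score plus Gaussian noise.
import random

def ols_r2(x, y):
    """R^2 of a simple y = a + b*x least-squares fit."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

random.seed(0)
scores = [random.uniform(14, 36) for _ in range(2000)]    # ACT-like scale
quality = [0.1 * s + random.gauss(0, 1) for s in scores]  # synthetic outcome
cutoff = [1.0 if s >= 27 else 0.0 for s in scores]        # Boolean lever

# The dichotomized predictor explains strictly less variance
print(ols_r2(scores, quality) > ols_r2(cutoff, quality))  # True
```

The same attenuation logic is why a Boolean policy lever built on a single test threshold is a coarser instrument than a metric that retains the underlying continuous measures.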
[The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone: 1-800-521-0600. Web page: http://bibliotheek.ehb.be:2222/en-US/products/dissertations/individuals.shtml.]
Descriptors: Freshman Composition, Postsecondary Education, Standardized Tests, College Entrance Examinations, Advanced Placement Programs, Writing Ability, Educational Policy, Educational Legislation, Elementary Secondary Education, Federal Legislation, Outcome Based Education, Test Coaching, High Stakes Tests, Teaching Methods, Secondary Education, College Readiness, Quality Assurance, Essays
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://bibliotheek.ehb.be:2222/en-US/products/dissertations/individuals.shtml
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: Higher Education; Postsecondary Education; Elementary Secondary Education; Secondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Laws, Policies, & Programs: Elementary and Secondary Education Act Title IV
Identifiers - Assessments and Surveys: ACT Assessment; SAT (College Admission Test); Advanced Placement Examinations (CEEB)
Grant or Contract Numbers: N/A