Showing 31 to 45 of 240 results
Peer reviewed
PDF on ERIC Download full text
Bejar, Isaac I.; VanWinkle, Waverely; Madnani, Nitin; Lewis, William; Steier, Michael – ETS Research Report Series, 2013
The paper applies a natural language computational tool to study a potential construct-irrelevant response strategy, namely the use of "shell language." Although the study is motivated by the impending increase in the volume of scoring of student responses from assessments to be developed in response to the Race to the Top initiative,…
Descriptors: Responses, Language Usage, Natural Language Processing, Computational Linguistics
Peer reviewed
PDF on ERIC Download full text
Breyer, F. Jay; Attali, Yigal; Williamson, David M.; Ridolfi-McCulla, Laura; Ramineni, Chaitanya; Duchnowski, Matthew; Harris, April – ETS Research Report Series, 2014
In this research, we investigated the feasibility of implementing the "e-rater"® scoring engine as a check score in place of all-human scoring for the "Graduate Record Examinations"® ("GRE"®) revised General Test (rGRE) Analytical Writing measure. This report provides the scientific basis for the use of e-rater as a…
Descriptors: Computer Software, Computer Assisted Testing, Scoring, College Entrance Examinations
Peer reviewed
Direct link
Mya Poe; Norbert Elliot; John Aloysius Cogan; Tito G. Nurudeen – College Composition and Communication, 2014
In this article, we investigate disparate impact analysis as a validation tool for understanding the local effects of writing assessment on diverse groups of students. Using a case study data set from a university that we call Brick City University, we explain how Brick City's writing program undertook a self-study of its placement exam using the…
Descriptors: Writing Instruction, Writing Evaluation, Writing Tests, Student Placement
Peer reviewed
PDF on ERIC Download full text
Liaghat, Farahnaz; Biria, Reza – International Journal of Instruction, 2018
This study aimed to explore the impact of mentor text modelling on Iranian English as a Foreign Language (EFL) learners' accuracy and fluency in writing tasks of differing cognitive complexity, in comparison with two conventional approaches to teaching writing, namely the process-based and product-based approaches. To this end, 60 Iranian EFL…
Descriptors: Foreign Countries, Comparative Analysis, Teaching Methods, Writing Instruction
Navarro, Maria V. – Montgomery County Public Schools, 2015
This memorandum describes the SAT participation and performance for the Montgomery County Public Schools (MCPS) Class of 2015 compared with the graduating seniors in Maryland and the nation. The attachment provides the detailed results of SAT and ACT by high school and student group for graduates in 2011-2015. The mean SAT combined score for the…
Descriptors: College Entrance Examinations, Student Participation, Public Schools, High School Graduates
Peer reviewed
Direct link
Wiley, Edward W.; Shavelson, Richard J.; Kurpius, Amy A. – Educational and Psychological Measurement, 2014
The name "SAT" has become synonymous with college admissions testing; it has been dubbed "the gold standard." Numerous studies on its reliability and predictive validity show that the SAT predicts college performance beyond high school grade point average. Surprisingly, studies of the factorial structure of the current version…
Descriptors: College Readiness, College Admission, College Entrance Examinations, Factor Analysis
Peer reviewed
Direct link
Müller, Amanda – Higher Education Research and Development, 2015
This paper attempts to demonstrate the differences in writing between International English Language Testing System (IELTS) bands 6.0, 6.5 and 7.0. An analysis of exemplars provided by the IELTS test makers reveals that IELTS 6.0, 6.5 and 7.0 writers can make a minimum of 206, 96 and 35 errors per 1000 words, respectively. The following section…
Descriptors: English (Second Language), Second Language Learning, Language Tests, Scores
Wilson, Janet S. – Montgomery County Public Schools, 2016
This memorandum describes SAT participation and performance for the Montgomery County Public Schools (MCPS) Class of 2016 compared with the graduating seniors in Maryland and the nation. MCPS students continued to outperform their peers in the state of Maryland and the nation on the SAT according to the…
Descriptors: College Entrance Examinations, Scores, Student Participation, High School Graduates
Wilson, Janet S. – Montgomery County Public Schools, 2017
The purpose of this memorandum is to provide SAT participation and performance for the 2017 graduates in Montgomery County Public Schools (MCPS). The College Board redesigned the SAT with the goal of measuring essential skills for college and career as well as relating the test content to everyday learning in classrooms. The redesigned SAT was…
Descriptors: College Entrance Examinations, Scores, Student Participation, High School Graduates
Peer reviewed
Direct link
Sato, Takanori; Ikeda, Naoki – Language Testing in Asia, 2015
Background: High-stakes tests exert an immense washback effect, shaping the content of what students learn. However, if students fail to recognize the abilities that the test developers intend to measure, they are less likely to learn what the test developers wish them to learn. This study aims to investigate test-taker…
Descriptors: High Stakes Tests, Testing Problems, Test Items, College Students
Peer reviewed
Direct link
Engelhard, George, Jr.; Kobrin, Jennifer L.; Wind, Stefanie A. – International Journal of Testing, 2014
The purpose of this study is to explore patterns in model-data fit related to subgroups of test takers from a large-scale writing assessment. Using data from the SAT, the researchers randomly selected a calibration group to represent test takers who reported that English was their best language from the total population of test takers (N = 322,011). A…
Descriptors: College Entrance Examinations, Writing Tests, Goodness of Fit, English
Looby, Karen – Online Submission, 2013
The SAT is an assessment of reading, math, and writing administered by the College Board. SAT results for Austin ISD students in the 2012-2013 school year remained stable compared with results from past years. See full report for details.
Descriptors: College Entrance Examinations, School Districts, Trend Analysis, Scores
Peer reviewed
PDF on ERIC Download full text
Attali, Yigal; Sinharay, Sandip – ETS Research Report Series, 2015
The "e-rater"® automated essay scoring system is used operationally in the scoring of the argument and issue tasks that form the Analytical Writing measure of the "GRE"® General Test. For each of these tasks, this study explored the value added of reporting 4 trait scores for each of these 2 tasks over the total e-rater score.…
Descriptors: Scores, Computer Assisted Testing, Computer Software, Grammar
Patricia Anders Jones – ProQuest LLC, 2012
Using a causal-comparative design, this quantitative study investigated whether or not the curriculum integration of academic subjects with career and technical education classes affected secondary students' academic performance as assessed by scores on standardized tests. The purposive sample was drawn from students in Trade and Industry classes…
Descriptors: Integrated Curriculum, Academic Achievement, Secondary School Students, Standardized Tests
Peer reviewed
Direct link
Shaw, Emily J.; Mattern, Krista D.; Patterson, Brian F. – Educational Assessment, 2011
Despite the similarities that researchers note between the cognitive processes and knowledge involved in reading and writing, there are students who are much stronger readers than writers and those who are much stronger writers than readers. The addition of the writing section to the SAT provides an opportunity to examine whether certain groups of…
Descriptors: College Entrance Examinations, Critical Reading, Reading Tests, Writing Tests