Showing all 9 results
Lee, Eunjung; Lee, Won-Chan; Brennan, Robert L. – College Board, 2012
In almost all high-stakes testing programs, test equating is necessary to ensure that test scores across multiple test administrations are equivalent and can be used interchangeably. Test equating becomes even more challenging in mixed-format tests, such as Advanced Placement Program® (AP®) Exams, that contain both multiple-choice and constructed…
Descriptors: Test Construction, Test Interpretation, Test Norms, Test Reliability
Kim, YoungKoung; Hendrickson, Amy; Patel, Priyank; Melican, Gerald; Sweeney, Kevin – College Board, 2013
The purpose of this report is to describe the procedure for revising the ReadiStep™ score scale using the field trial data, and to provide technical information about the development of the new ReadiStep scale score. In doing so, this report briefly introduces the three assessments--ReadiStep, PSAT/NMSQT®, and SAT®--in the College Board Pathway…
Descriptors: College Entrance Examinations, Educational Assessment, High School Students, Scores
Mattern, Krista D.; Packman, Sheryl – College Board, 2009
A disconnect between the educational requirements of secondary institutions and postsecondary institutions often results in a large percentage of first-year college students requiring remediation (Moss & Bordelon, 2007). As such, postsecondary institutions administer tests to incoming students for placement into courses of the appropriate…
Descriptors: Student Placement, College Students, College Entrance Examinations, Academic Achievement
Hendrickson, Amy; Patterson, Brian; Ewing, Maureen – College Board, 2010
The psychometric considerations and challenges associated with including constructed response items on tests are discussed along with how these issues affect the form assembly specifications for mixed-format exams. Reliability and validity, security and fairness, pretesting, content and skills coverage, test length and timing, weights, statistical…
Descriptors: Multiple Choice Tests, Test Format, Test Construction, Test Validity
Schmitt, Neal; Billington, Abigail; Keeney, Jessica; Reeder, Matthew; Pleskac, Timothy J.; Sinha, Ruchi; Zorzie, Mark – College Board, 2011
Noncognitive attributes, as the researchers have measured them, do correlate with college GPA, but the incremental validity associated with these measures is relatively small. The noncognitive measures are correlated with other valued dimensions of student performance beyond the achievement reflected in college grades. There were much smaller…
Descriptors: College Students, Gender Differences, Ethnic Groups, Correlation
Shaw, Emily J.; Mattern, Krista D. – College Board, 2009
This study examined the relationship between students' self-reported high school grade point average (HSGPA) from the SAT Questionnaire and their HSGPA provided by the colleges and universities they attend. The purpose of this research was to offer updated information on the relatedness of self-reported (by the student) and school-reported (by the…
Descriptors: High School Students, Grade Point Average, Accuracy, Aptitude Tests
Mattern, Krista; Camara, Wayne; Kobrin, Jennifer L. – College Board, 2007
The focus of this report is to summarize the research that has been conducted thus far on the new SAT Writing section. The evidence provided reveals that the new writing section has satisfactory psychometric quality in that its reliability is acceptable; it is significantly related to first-year college GPA and college English grades; it has been…
Descriptors: College Entrance Examinations, Writing Tests, Educational Research, Psychometrics
Ewing, Maureen; Huff, Kristen; Andrews, Melissa; King, Kinda – College Board, 2005
In connection with the new SAT that was introduced in March 2005, research has been under way to investigate the feasibility of providing examinees with score reports that contain feedback on skills measured by the critical reading, mathematics, and writing sections of the test. The main purpose of this study was to estimate the reliability of the…
Descriptors: College Entrance Examinations, Scores, Feedback (Response), Reading Skills
Kobrin, Jennifer L.; Kimmel, Ernest W. – College Board, 2006
Based on statistics from the first few administrations of the SAT writing section, the test is performing as expected. The reliability of the writing section is very similar to that of other writing assessments. Based on preliminary validity research, the writing section is expected to add modestly to the prediction of college performance when…
Descriptors: Test Construction, Writing Tests, Cognitive Tests, College Entrance Examinations