Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 0
Since 2006 (last 20 years): 21
Source
College Board: 18
Educational and Psychological Measurement: 2
Educational Assessment: 1
Journal of Advanced Academics: 1
Publication Type
Reports - Research: 18
Numerical/Quantitative Data: 7
Non-Print Media: 5
Reference Materials - General: 5
Journal Articles: 4
Speeches/Meeting Papers: 1
Tests/Questionnaires: 1
Education Level
Higher Education: 21
Postsecondary Education: 20
Secondary Education: 11
High Schools: 10
Elementary Secondary Education: 1
Location
United States: 1
Assessments and Surveys
SAT (College Admission Test): 20
Shaw, Emily J.; Kobrin, Jennifer L. – College Board, 2013
This study examines the relationship between students' SAT essay scores and college outcomes, including first-year grade point average (FYGPA) and first-year English course grade average (FY EngGPA), overall and by various demographic and academic performance subgroups. Results showed that the SAT essay score has a positive relationship with both…
Descriptors: College Entrance Examinations, Scores, Essays, Writing Skills
Kobrin, Jennifer L.; Patterson, Brian F.; Wiley, Andrew; Mattern, Krista D. – College Board, 2012
In 2011, the College Board released its SAT college and career readiness benchmark, which represents the level of academic preparedness associated with a high likelihood of college success and completion. The goal of this study, which was conducted in 2008, was to establish college success criteria to inform the development of the benchmark. The…
Descriptors: College Entrance Examinations, Standard Setting, College Readiness, Career Readiness
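The benchmark-setting work summarized above lends itself to a logistic-regression illustration. The sketch below is not the report's actual procedure or cut point; the column names (fygpa, sat_total) and the B- threshold of 2.67 are assumptions used only to show how a probability-of-success model can inform a score benchmark.

```python
# Illustrative sketch only -- not the report's actual criteria or cut points.
# Assumes a DataFrame with hypothetical columns 'fygpa' and 'sat_total'.
import pandas as pd
import statsmodels.formula.api as smf

def fit_success_model(df: pd.DataFrame, gpa_threshold: float = 2.67):
    """Model the probability of 'college success' (first-year GPA at or
    above a chosen threshold) as a function of SAT score."""
    data = df.assign(success=(df["fygpa"] >= gpa_threshold).astype(int))
    model = smf.logit("success ~ sat_total", data=data).fit()
    return model
```

A benchmark score can then be read off the fitted curve as the point at which the predicted probability of success reaches a chosen target.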
Kobrin, Jennifer L.; Sinharay, Sandip; Haberman, Shelby J.; Chajewski, Michael – College Board, 2011
This study examined the adequacy of a multiple linear regression model for predicting first-year college grade point average (FYGPA) using SAT® scores and high school grade point average (HSGPA). A variety of techniques, both graphical and statistical, were used to examine whether it is possible to improve on the linear regression model. The results…
Descriptors: Multiple Regression Analysis, Goodness of Fit, College Entrance Examinations, Test Validity
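As a point of reference for the model described above, here is a minimal sketch (not the authors' code) of regressing FYGPA on SAT scores and HSGPA and extracting the quantities used in graphical adequacy checks; the column names are hypothetical.

```python
# Minimal sketch, assuming a DataFrame with hypothetical columns
# 'fygpa', 'sat_total', and 'hsgpa'.
import pandas as pd
import statsmodels.formula.api as smf

def fit_fygpa_regression(df: pd.DataFrame):
    """Multiple linear regression of first-year GPA on SAT and HSGPA."""
    model = smf.ols("fygpa ~ sat_total + hsgpa", data=df).fit()
    # Plotting residuals against fitted values is a standard graphical
    # check of whether the linear model is adequate.
    return model, model.fittedvalues, model.resid
```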
Mattern, Krista D.; Shaw, Emily J.; Kobrin, Jennifer L. – Educational and Psychological Measurement, 2011
This study examined discrepant high school grade point average (HSGPA) and SAT performance as measured by the difference between a student's standardized SAT composite score and standardized HSGPA. The SAT-HSGPA discrepancy measure was used to examine whether certain students are more likely to exhibit discrepant performance and in what direction.…
Descriptors: Grade Point Average, College Entrance Examinations, Predictive Validity, College Admission
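The discrepancy measure described in the abstract above is the difference of two standardized scores; a minimal sketch follows, with hypothetical column names.

```python
# Minimal sketch of an SAT-HSGPA discrepancy score: standardize each measure
# (z-scores), then subtract. Columns 'sat_composite' and 'hsgpa' are hypothetical.
import pandas as pd

def sat_hsgpa_discrepancy(df: pd.DataFrame) -> pd.Series:
    z_sat = (df["sat_composite"] - df["sat_composite"].mean()) / df["sat_composite"].std()
    z_hsgpa = (df["hsgpa"] - df["hsgpa"].mean()) / df["hsgpa"].std()
    # Positive values: SAT performance exceeds what HSGPA would suggest;
    # negative values: the reverse.
    return z_sat - z_hsgpa
```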
Kobrin, Jennifer L.; Patterson, Brian F. – Educational Assessment, 2011
Prior research has shown that there is substantial variability in the degree to which the SAT and high school grade point average (HSGPA) predict 1st-year college performance at different institutions. This article demonstrates the usefulness of multilevel modeling as a tool to uncover institutional characteristics that are associated with this…
Descriptors: College Entrance Examinations, Scores, Grade Point Average, High School Students
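A multilevel model of the kind the article describes, with students nested within institutions and an institution-specific SAT slope, might look like the sketch below; the column names are hypothetical and institution-level predictors are omitted.

```python
# Minimal sketch, not the authors' code: random intercept and random SAT slope
# by institution. Columns 'fygpa', 'sat_total', 'hsgpa', and 'institution'
# are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

def fit_multilevel_model(df: pd.DataFrame):
    model = smf.mixedlm(
        "fygpa ~ sat_total + hsgpa",
        data=df,
        groups=df["institution"],
        re_formula="~sat_total",
    ).fit()
    return model
```

Institution-level characteristics can then be added as level-2 predictors to see which of them account for the between-institution variability in the SAT slope.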
Shaw, Emily J.; Kobrin, Jennifer L.; Patterson, Brian F.; Mattern, Krista D. – College Board, 2012
The current study examined the differential validity of the SAT for predicting cumulative GPA (cGPA) through the second year of college by college major, as well as the differential prediction of cGPA by college major across student subgroups. The relationship between the SAT and cGPA varied somewhat by major, as well as by major and subgroup…
Descriptors: College Entrance Examinations, Predictive Validity, Test Validity, Scores
Kobrin, Jennifer L.; Kim, YoungKoung; Sackett, Paul R. – Educational and Psychological Measurement, 2012
There is much debate on the merits and pitfalls of standardized tests for college admission, with questions regarding the format (multiple-choice vs. constructed response), cognitive complexity, and content of these assessments (achievement vs. aptitude) at the forefront of the discussion. This study addressed these questions by investigating the…
Descriptors: Grade Point Average, Standardized Tests, Predictive Validity, Predictor Variables
Patterson, Brian F.; Kobrin, Jennifer L. – College Board, 2011
This study presents a case for applying a transformation (Box and Cox, 1964) of the criterion used in predictive validity studies. The goals of the transformation were to better meet the assumptions of the linear regression model and to reduce the residual variance of fitted (i.e., predicted) values. Using data for the 2008 cohort of first-time,…
Descriptors: Predictive Validity, Evaluation Criteria, Regression (Statistics), College Freshmen
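A Box-Cox transformation of the criterion, as described above, can be sketched with SciPy; the small positive shift and the input name are illustrative assumptions, not details from the study.

```python
# Minimal sketch: Box-Cox transform of the criterion (e.g., FYGPA) before
# fitting the prediction model. Box-Cox requires strictly positive values,
# so a small shift is applied; the shift size is an illustrative choice.
import numpy as np
from scipy import stats

def transform_criterion(fygpa):
    shifted = np.asarray(fygpa, dtype=float) + 0.01
    transformed, lam = stats.boxcox(shifted)  # lambda chosen by maximum likelihood
    return transformed, lam
```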
Kobrin, Jennifer L.; Patterson, Brian F. – College Board, 2012
This study examines student performance on the SAT and SAT Subject Tests in order to identify groups of students who score differently on these two tests, and to determine whether certain demographic groups score higher on one test compared to the other. Discrepancy scores were created to capture individuals' performance differences on the…
Descriptors: College Entrance Examinations, Scores, Performance, Standardized Tests
Mattern, Krista D.; Patterson, Brian F.; Kobrin, Jennifer L. – College Board, 2012
This study examined the validity of the SAT for predicting performance in first-year English and mathematics courses. Results reveal a significant positive relationship between SAT scores and course grades, with slightly higher correlations for mathematics courses compared to English courses. Correlations were estimated by student characteristics…
Descriptors: College Entrance Examinations, Predictive Validity, Test Validity, Scores
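The correlation comparison described above (SAT scores against first-year English and mathematics course grades, overall and within student subgroups) is straightforward to express; the sketch below uses hypothetical column names.

```python
# Minimal sketch: correlation between SAT total and a course grade, overall
# or within subgroups. Column names are hypothetical.
import pandas as pd

def sat_grade_correlation(df: pd.DataFrame, grade_col, group_col=None):
    if group_col is None:
        return df["sat_total"].corr(df[grade_col])
    return df.groupby(group_col).apply(lambda g: g["sat_total"].corr(g[grade_col]))

# Example: sat_grade_correlation(df, "math_grade", group_col="gender")
```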
Patterson, Brian F.; Packman, Sheryl; Kobrin, Jennifer L. – College Board, 2011
The purpose of this study was to examine the effects of Advanced Placement® (AP®) exam participation and performance on college grades for courses taken in the same subject area as students' AP Exam(s). Students' first-year college subject area grade point averages (SGPAs) were examined in nine subject areas: mathematics, computer science,…
Descriptors: Advanced Placement, Intellectual Disciplines, Engineering, Natural Sciences
Kobrin, Jennifer L.; Patterson, Brian F. – College Board, 2010
There is substantial variability in the degree to which the SAT and high school grade point average (HSGPA) predict first-year college performance at different institutions. This paper demonstrates the usefulness of multilevel modeling as a tool to uncover institutional characteristics that are associated with this variability. In a model that…
Descriptors: Scores, Validity, Prediction, College Freshmen
Shaw, Emily J.; Kobrin, Jennifer L.; Patterson, Brian F.; Mattern, Krista D. – College Board, 2011
Presented at the Annual Meeting of the American Educational Research Association (AERA) in New Orleans, LA, in April 2011. The current study examined the differential validity of the SAT for predicting cumulative GPA through the second year of college by college major, as well as the differential prediction of cumulative GPA by college major among…
Descriptors: College Entrance Examinations, Predictive Validity, Grade Point Average, College Students
Kobrin, Jennifer L.; Kim, Rachel; Sackett, Paul – College Board, 2011
There is much debate on the merits and pitfalls of standardized tests for college admission, with questions regarding the format (multiple-choice versus constructed response), cognitive complexity, and content of these assessments (achievement versus aptitude) at the forefront of the discussion. This study addressed these questions by…
Descriptors: College Entrance Examinations, Mathematics Tests, Test Items, Predictive Validity
Mattern, Krista D.; Shaw, Emily J.; Kobrin, Jennifer L. – College Board, 2010
Presented at the national conference of the American Educational Research Association (AERA) in 2010. This presentation describes an alternative way of presenting the unique information provided by the SAT beyond HSGPA, namely examining students with discrepant SAT-HSGPA performance.
Descriptors: College Entrance Examinations, Grade Point Average, High School Students, Scores