Showing 1 to 15 of 36 results
Peer reviewed
Gilbert, Joshua B.; Kim, James S.; Miratrix, Luke W. – Journal of Educational and Behavioral Statistics, 2023
Analyses that reveal how treatment effects vary allow researchers, practitioners, and policymakers to better understand the efficacy of educational interventions. In practice, however, standard statistical methods for addressing heterogeneous treatment effects (HTE) fail to address the HTE that may exist "within" outcome measures. In…
Descriptors: Test Items, Item Response Theory, Computer Assisted Testing, Program Effectiveness
Peer reviewed
Li, Feifei – ETS Research Report Series, 2017
An information-correction method for testlet-based tests is introduced. This method takes advantage of both generalizability theory (GT) and item response theory (IRT). The measurement error for the examinee proficiency parameter is often underestimated when a unidimensional conditional-independence IRT model is specified for a testlet dataset. By…
Descriptors: Item Response Theory, Generalizability Theory, Tests, Error of Measurement
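Illustrative note (not from the cited report): local dependence within a testlet is commonly represented by adding a person-by-testlet effect to a standard IRT model, for example

P(y_{ij} = 1 \mid \theta_i) = \frac{\exp\{a_j[\theta_i - b_j - \gamma_{i,d(j)}]\}}{1 + \exp\{a_j[\theta_i - b_j - \gamma_{i,d(j)}]\}}

where d(j) indexes the testlet containing item j and \gamma_{i,d(j)} is the person-specific testlet effect. Ignoring \gamma, as a unidimensional conditional-independence model does, is what understates the measurement error for \theta that the abstract describes.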
Peer reviewed
Joshua B. Gilbert; James S. Kim; Luke W. Miratrix – Applied Measurement in Education, 2024
Longitudinal models typically emphasize between-person predictors of change but ignore how growth varies "within" persons because each person contributes only one data point at each time. In contrast, modeling growth with multi-item assessments allows evaluation of how relative item performance may shift over time. While traditionally…
Descriptors: Vocabulary Development, Item Response Theory, Test Items, Student Development
Peer reviewed
Vista, Alvin; Alahmadi, Maisaa Taleb – Journal of Psychoeducational Assessment, 2022
The relationship between latent trait and test-taking speed is an important area of study in assessment research. In addition to the contributions of such studies to psychometrics, the factors that affect both ability and speed have implications for test development and policy consequences, especially when the tests are high stakes. This study…
Descriptors: Cognitive Ability, Foreign Countries, Academically Gifted, Gifted Education
Schoen, Robert C.; Liu, Sicong; Yang, Xiaotong; Paek, Insu – Grantee Submission, 2017
The Early Fractions Test is a paper-pencil test designed to measure mathematics achievement of third- and fourth-grade students in the domain of fractions. The purpose, or intended use, of the Early Fractions Test is to serve as a student pretest covariate and a test of baseline equivalence in the larger study. In this report, we discuss our…
Descriptors: Mathematics Achievement, Fractions, Mathematics Tests, Grade 3
Peer reviewed
Liao, Xiangyi; Bolt, Daniel M. – Journal of Educational and Behavioral Statistics, 2021
Four-parameter models have received increasing psychometric attention in recent years, as a reduced upper asymptote for item characteristic curves can be appealing for measurement applications such as adaptive testing and person-fit assessment. However, applications can be challenging due to the large number of parameters in the model. In this…
Descriptors: Test Items, Models, Mathematics Tests, Item Response Theory
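Illustrative note (not from the cited article): the four-parameter logistic item characteristic curve can be written as

P(y_{ij} = 1 \mid \theta_i) = c_j + (d_j - c_j)\,\frac{1}{1 + \exp[-a_j(\theta_i - b_j)]}

with discrimination a_j, difficulty b_j, lower asymptote c_j, and upper asymptote d_j < 1, the reduced upper asymptote the abstract refers to; estimating c_j and d_j in addition to a_j and b_j for every item is what makes the model demanding to fit.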
Peer reviewed
Woodcock, Stuart; Howard, Steven J.; Ehrich, John – School Psychology, 2020
Standardized testing is ubiquitous in educational assessment, but questions have been raised about the extent to which these test scores accurately reflect students' genuine knowledge and skills. To more rigorously investigate this issue, the current study employed a within-subject experimental design to examine item format effects on primary…
Descriptors: Elementary School Students, Grade 3, Test Items, Test Format
Peer reviewed
Ngware, Moses W.; Hungi, Njora; Mutisya, Maurice – Assessment in Education: Principles, Policy & Practice, 2019
This paper examines a procedure for measuring student competencies in numeracy using school-based assessments and demonstrates how the procedure informs the school system on quality improvement. The sample consisted of 7,648 students attending three different types of urban schools, including government, formal private, and low-cost private schools in poor…
Descriptors: Student Evaluation, Assessment Literacy, Numeracy, Urban Schools
Peer reviewed
Herzog, Moritz; Ehlert, Antje; Fritz, Annemarie – African Journal of Research in Mathematics, Science and Technology Education, 2017
Although the general development of mathematical abilities in primary school has been the focus of many researchers, the development of place value understanding has rarely been investigated to date. This is possibly due to the lack of conceptual approaches and empirical studies related to this topic. To fill this gap, a theory-driven and…
Descriptors: Models, Number Concepts, Foreign Countries, Grade 2
Peer reviewed
Arce, Alvaro J.; Wang, Ze – International Journal of Testing, 2012
The traditional approach to scaling modified-Angoff cut scores transfers the raw cuts to an existing raw-to-scale score conversion table. Under this approach, cut scores and conversion-table raw scores are not only seen as interchangeable but also as originating from a common scaling process. In this article, we propose an alternative…
Descriptors: Generalizability Theory, Item Response Theory, Cutting Scores, Scaling
Schoen, Robert C.; Yang, Xiaotong; Liu, Sicong; Paek, Insu – Grantee Submission, 2017
The Early Fractions Test v2.2 is a paper-pencil test designed to measure mathematics achievement of third- and fourth-grade students in the domain of fractions. The purpose, or intended use, of the Early Fractions Test v2.2 is to serve as a measure of student outcomes in a randomized trial designed to estimate the effect of an educational…
Descriptors: Psychometrics, Mathematics Tests, Mathematics Achievement, Fractions
Schoen, Robert C.; Anderson, Daniel; Riddell, Claire M.; Bauduin, Charity – Online Submission, 2018
This report provides a description of the development process, field testing, and psychometric properties of the fall 2015 grades 3-5 Elementary Mathematics Student Assessment (EMSA), a student mathematics test designed to be administered in a whole-group setting to students in grades 3, 4, and 5. The test was administered to 2,614 participating…
Descriptors: Elementary School Students, Elementary School Mathematics, Grade 3, Grade 4
Liu, Junhui; Brown, Terran; Chen, Jianshen; Ali, Usama; Hou, Likun; Costanzo, Kate – Partnership for Assessment of Readiness for College and Careers, 2016
The Partnership for Assessment of Readiness for College and Careers (PARCC) is a state-led consortium working to develop next-generation assessments that measure student progress toward college and career readiness more accurately than previous assessments. The PARCC assessments include both English Language Arts/Literacy (ELA/L) and…
Descriptors: Testing, Achievement Tests, Test Items, Test Bias
Peer reviewed
Ye, Meng; Xin, Tao – Educational and Psychological Measurement, 2014
The authors explored the effects of drifting common items on vertical scaling within the higher-order framework of item parameter drift (IPD). The results showed that if IPD occurred between a pair of test levels, the scaling performance started to deviate from the ideal state, as indicated by scaling bias. When there were two items drifting…
Descriptors: Scaling, Test Items, Equated Scores, Achievement Gains
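Illustrative note (not from the cited article): drift in a common (linking) item is often represented as a shift in its parameters between calibrations, e.g. b_j^{(2)} = b_j^{(1)} + \delta_j with \delta_j \neq 0. Because the common items carry the link between adjacent test levels, nonzero \delta_j on those items propagates into the vertical scale, consistent with the scaling bias reported in the abstract.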
Steedle, Jeffrey; McBride, Malena; Johnson, Marc; Keng, Leslie – Partnership for Assessment of Readiness for College and Careers, 2016
The first operational administration of the Partnership for Assessment of Readiness for College and Careers (PARCC) took place during the 2014-2015 school year. In addition to the traditional paper-and-pencil format, the assessments were available for administration on a variety of electronic devices, including desktop computers, laptop computers,…
Descriptors: Computer Assisted Testing, Difficulty Level, Test Items, Scores