Showing all 5 results
Peer reviewed
Sparks, Jesse R.; van Rijn, Peter W.; Deane, Paul – Educational Assessment, 2021
Effectively evaluating the credibility and accuracy of multiple sources is critical for college readiness. We developed 24 source evaluation tasks spanning four predicted difficulty levels of a hypothesized learning progression (LP) and piloted these tasks to evaluate the utility of an LP-based approach to designing formative literacy assessments.…
Descriptors: Middle School Students, Information Sources, Grade 6, Grade 7
Peer reviewed
Bartholomew, Scott R.; Nadelson, Louis S.; Goodridge, Wade H.; Reeve, Edward M. – Educational Assessment, 2018
We investigated the use of adaptive comparative judgment to evaluate middle school students' learning, engagement, and experience with the design process in an open-ended problem assigned in a technology and engineering education course. Our results indicate that the adaptive comparative judgment tool effectively facilitated the grading of the…
Descriptors: Middle School Students, Evaluative Thinking, Learner Engagement, Design
Peer reviewed
Tindal, Gerald; Nese, Joseph F. T.; Stevens, Joseph J. – Educational Assessment, 2017
For the past decade, the accountability model associated with No Child Left Behind (NCLB) emphasized proficiency on end-of-year tests; under the Every Student Succeeds Act (ESSA), the emphasis on proficiency within statewide testing programs, though now integrated with other measures of student learning, nevertheless remains a primary metric for…
Descriptors: Testing Programs, Middle School Students, Models, State Standards
Peer reviewed
Liu, Ou Lydia; Lee, Hee-Sun; Linn, Marcia C. – Educational Assessment, 2011
Both multiple-choice and constructed-response items have known advantages and disadvantages in measuring scientific inquiry. In this article we explore the function of explanation multiple-choice (EMC) items and examine how EMC items differ from traditional multiple-choice and constructed-response items in measuring scientific reasoning. A group…
Descriptors: Science Tests, Multiple Choice Tests, Responses, Test Items
Peer reviewed
Ketterlin-Geller, Leanne R.; Yovanoff, Paul; Jung, EunJu; Liu, Kimy; Geller, Josh – Educational Assessment, 2013
In this article, we highlight the need for a precisely defined construct in score-based validation and discuss the contribution of cognitive theories to accurately and comprehensively defining the construct. We propose a framework for integrating cognitively based theoretical and empirical evidence to specify and evaluate the construct. We apply…
Descriptors: Test Validity, Construct Validity, Scores, Evidence