Showing 1 to 15 of 51 results
Peer reviewed
Direct link
Katharina Meitinger; Tanja Kunz – Sociological Methods & Research, 2024
Previous research reveals that the visual design of open-ended questions should match the response task so that respondents can infer the expected response format. Based on a web survey including specific probes in a list-style open-ended question format, we experimentally tested the effects of varying numbers of answer boxes on several indicators…
Descriptors: Visual Aids, Design, Cognitive Processes, Test Items
Peer reviewed
Direct link
Musa Adekunle Ayanwale; Jamiu Oluwadamilare Amusa; Adekunle Ibrahim Oladejo; Funmilayo Ayedun – Interchange: A Quarterly Review of Education, 2024
The study focuses on assessing the proficiency levels of higher education students through the physics achievement test (PHY 101) at the National Open University of Nigeria (NOUN). This test, like others, evaluates various aspects of knowledge and skills simultaneously. However, relying on traditional models for such tests can result in…
Descriptors: Item Response Theory, Difficulty Level, Item Analysis, Test Items
Peer reviewed
Direct link
Roelofs, Erik C.; Emons, Wilco H. M.; Verschoor, Angela J. – International Journal of Testing, 2021
This study reports on an Evidence Centered Design (ECD) project in the Netherlands, involving the theory exam for prospective car drivers. In particular, we illustrate how cognitive load theory, task-analysis, response process models, and explanatory item-response theory can be used to systematically develop and refine task models. Based on a…
Descriptors: Foreign Countries, Psychometrics, Test Items, Evidence Based Practice
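The explanatory item-response theory mentioned in this abstract is often operationalized with a linear logistic test model (LLTM), in which item difficulty is decomposed into weighted cognitive task features. A minimal sketch under that assumption; the feature pattern and weights below are illustrative, not taken from the paper:

```python
import math

def lltm_difficulty(q_features, eta_weights):
    # LLTM-style decomposition: item difficulty is a weighted sum of the
    # cognitive task features the item requires (illustrative weights only).
    return sum(q * eta for q, eta in zip(q_features, eta_weights))

def p_correct(theta, b):
    # Rasch-type response probability given ability theta and difficulty b.
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical item requiring two of three task features (e.g., hazard
# recognition and rule retrieval in a driving-theory item).
b = lltm_difficulty(q_features=[1, 0, 1], eta_weights=[0.4, 0.9, 0.7])
print(round(p_correct(theta=0.5, b=b), 3))
```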
Peer reviewed
Direct link
Moon, Jung Aa; Sinharay, Sandip; Keehner, Madeleine; Katz, Irvin R. – International Journal of Testing, 2020
The current study examined the relationship between test-taker cognition and psychometric item properties in multiple-selection multiple-choice and grid items. In a study with content-equivalent mathematics items in alternative item formats, adult participants' tendency to respond to an item was affected by the presence of a grid and variations of…
Descriptors: Computer Assisted Testing, Multiple Choice Tests, Test Wiseness, Psychometrics
Peer reviewed
Direct link
Yildirim, Hüseyin H. – Educational Assessment, Evaluation and Accountability, 2021
From a sociocognitive perspective, item parameters in a test represent regularities in examinees' item responses. These regularities originate from shared experiences among individuals in interacting with their environment. Theories explaining the relationship between culture and cognition also acknowledge these shared experiences as the…
Descriptors: Educational Assessment, Test Items, Item Response Theory, Psychometrics
Peer reviewed
Direct link
Embretson, Susan E. – Educational Measurement: Issues and Practice, 2016
Examinees' thinking processes have become an increasingly important concern in testing. The response processes aspect is a major component of validity, and contemporary tests increasingly involve specifications about the cognitive complexity of examinees' response processes. Yet, empirical research findings on examinees' cognitive processes are…
Descriptors: Testing, Cognitive Processes, Test Construction, Test Items
Peer reviewed
Direct link
Schulz, Andreas; Leuders, Timo; Rangel, Ulrike – Journal of Psychoeducational Assessment, 2020
We provide evidence of validity for a newly developed diagnostic competence model of operation sense, by both (a) describing the theoretically substantiated development of the competence model in close association with its use within a large-scale formative assessment and (b) providing empirical evidence for the theoretically described cognitive…
Descriptors: Diagnostic Tests, Models, Criterion Referenced Tests, Cognitive Measurement
Peer reviewed
Direct link
Morrison, Kristin M.; Schwartz, Robert Andrew – AERA Online Paper Repository, 2016
Certain item features can be used to explain differences in item difficulty. These features can relate to the steps necessary to solve the problems or the ways in which an increased understanding of the material is acquired. This study will examine the relationship between item difficulty and the process skills associated with these…
Descriptors: Student Evaluation, Alternative Assessment, Standardized Tests, Test Items
Peer reviewed
Direct link
Baldonado, Angela Argo; Svetina, Dubravka; Gorin, Joanna – Applied Measurement in Education, 2015
Applications of traditional unidimensional item response theory models to passage-based reading comprehension assessment data have been criticized based on potential violations of local independence. However, simple rules for determining dependency, such as including all items associated with a particular passage, may overestimate the dependency…
Descriptors: Reading Tests, Reading Comprehension, Test Items, Item Response Theory
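Local independence, the assumption at issue in this abstract, states that item responses are conditionally independent given ability, so the joint probability of a response pattern factors into a product over items. A minimal sketch assuming Rasch-type items; the difficulties and response pattern are illustrative:

```python
import math

def p_item(theta, b):
    # Rasch-type probability of a correct response.
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def joint_under_local_independence(theta, difficulties, responses):
    # Under local independence the pattern probability is a simple product;
    # passage-based (testlet) items may violate this factorization.
    prob = 1.0
    for b, x in zip(difficulties, responses):
        p = p_item(theta, b)
        prob *= p if x == 1 else 1.0 - p
    return prob

# Hypothetical: three items tied to one passage, answered 1, 0, 1.
print(round(joint_under_local_independence(0.2, [-0.5, 0.3, 1.1], [1, 0, 1]), 4))
```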
Peer reviewed
Direct link
Choi, Kyong Mi; Lee, Young-Sun; Park, Yoon Soo – EURASIA Journal of Mathematics, Science & Technology Education, 2015
International trend assessments have long attempted to provide instructional information to educational researchers and classroom teachers. Studies have shown that traditional methods of item analysis have not provided specific information that can be directly applicable to improve student performance. To this end, cognitive diagnosis models…
Descriptors: International Assessment, Mathematics Tests, Grade 8, Models
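Cognitive diagnosis models classify examinees by mastery of discrete attributes rather than placing them on a single continuum. The abstract is truncated before naming the specific model used; as one common example, a DINA-type response probability can be sketched as follows (the attribute pattern, guessing, and slipping values are illustrative):

```python
def dina_p_correct(alpha, q_row, guess, slip):
    # DINA logic: the examinee is "capable" on an item only if they master
    # every attribute the item's Q-matrix row requires; capable examinees
    # answer correctly unless they slip, others succeed only by guessing.
    capable = all(a >= q for a, q in zip(alpha, q_row))
    return 1.0 - slip if capable else guess

# Hypothetical: the item requires attributes 1 and 3; the examinee masters 1 and 2.
print(dina_p_correct(alpha=[1, 1, 0], q_row=[1, 0, 1], guess=0.2, slip=0.1))
```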
Mark Smith – ProQuest LLC, 2014
Learning standards across the United States have increasingly called for history students to engage in aspects of "historical thinking," a term used to describe the complex disciplinary processes that historians use to make sense of the past. Although students are expected to learn these complex processes, little is known about how to…
Descriptors: History Instruction, Thinking Skills, Validity, National Competency Tests
Xiang, Rui – ProQuest LLC, 2013
A key issue of cognitive diagnostic models (CDMs) is the correct identification of the Q-matrix, which indicates the relationship between attributes and test items. Previous CDMs typically assumed a known Q-matrix provided by domain experts such as those who developed the questions. However, misspecifications of the Q-matrix have been discovered in the past…
Descriptors: Diagnostic Tests, Cognitive Processes, Matrices, Test Items
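The Q-matrix referred to in this abstract is a binary item-by-attribute matrix: entry (i, k) is 1 when item i requires attribute k. A minimal illustrative example, with hypothetical items and attributes:

```python
# Rows are test items, columns are cognitive attributes; a 1 means the item
# requires that attribute. Misspecifying any entry changes which attribute
# profiles a cognitive diagnostic model credits for a correct response.
Q = [
    [1, 0, 0],  # item 1: attribute A only
    [1, 1, 0],  # item 2: attributes A and B
    [0, 1, 1],  # item 3: attributes B and C
]
```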
Peer reviewed
Direct link
Ketterlin-Geller, Leanne R.; Yovanoff, Paul; Jung, EunJu; Liu, Kimy; Geller, Josh – Educational Assessment, 2013
In this article, we highlight the need for a precisely defined construct in score-based validation and discuss the contribution of cognitive theories to accurately and comprehensively defining the construct. We propose a framework for integrating cognitively based theoretical and empirical evidence to specify and evaluate the construct. We apply…
Descriptors: Test Validity, Construct Validity, Scores, Evidence
Peer reviewed
Direct link
Schnotz, Wolfgang; Ludewig, Ulrich; Ullrich, Mark; Horz, Holger; McElvany, Nele; Baumert, Jürgen – Journal of Educational Psychology, 2014
Reading for learning frequently requires integrating text and picture information into coherent knowledge structures. This article presents an experimental study aimed at analyzing the strategies used by students for integrating text and picture information. Four combinations of texts and pictures (text-picture units) were selected from textbooks…
Descriptors: Teaching Methods, Pictorial Stimuli, Learning Strategies, Printed Materials
Peer reviewed
PDF on ERIC (full text available)
Baghaei, Purya; Carstensen, Claus H. – Practical Assessment, Research & Evaluation, 2013
Standard unidimensional Rasch models assume that persons with the same ability parameters are comparable. That is, the same interpretation applies to persons with identical ability estimates as regards the underlying mental processes triggered by the test. However, research in cognitive psychology shows that persons at the same trait level may…
Descriptors: Item Response Theory, Models, Reading Comprehension, Reading Tests
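In the standard unidimensional Rasch model described in this abstract, two persons with the same ability estimate receive identical predicted response probabilities on every item. One common way to model qualitatively different response processes at the same trait level (the abstract is truncated before naming the authors' approach) is a mixture-type extension in which item difficulties differ by latent class. A minimal sketch with hypothetical classes:

```python
import math

def rasch_p(theta, b):
    # Standard Rasch model: the probability depends only on theta - b.
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def mixture_rasch_p(theta, b_by_class, latent_class):
    # Mixture-type extension: item difficulty varies by latent class, so two
    # persons with the same theta can differ if they use different processes.
    return rasch_p(theta, b_by_class[latent_class])

# Hypothetical: same ability, different solution strategies on one item.
b_by_class = {"strategy_A": 0.2, "strategy_B": 1.0}
print(round(mixture_rasch_p(0.5, b_by_class, "strategy_A"), 3),
      round(mixture_rasch_p(0.5, b_by_class, "strategy_B"), 3))
```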