Showing all 7 results
Peer reviewed
Download full text (PDF on ERIC)
Stanke, Luke; Bulut, Okan – International Journal of Assessment Tools in Education, 2019
Item response theory is a widely used framework for the design, scoring, and scaling of measurement instruments. Item response models are typically used for dichotomously scored questions that have only two score points (e.g., multiple-choice items). However, given the increasing use of instruments that include questions with multiple response…
Descriptors: Item Response Theory, Test Items, Responses, College Freshmen
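The abstract above contrasts dichotomous item response models (two score points per item) with models for multiple response categories. As a minimal illustration of the dichotomous case, here is a sketch of the two-parameter logistic (2PL) model; the parameter values are hypothetical, and the specific model used in the study is not stated in the abstract:

```python
import math

def p_correct(theta, a, b):
    """2PL IRT model: probability of a correct (dichotomous) response
    for an examinee with ability theta, given item discrimination a
    and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An average-ability examinee (theta = 0) on an average-difficulty
# item (b = 0) has a 50% chance of a correct response.
print(p_correct(0.0, 1.0, 0.0))  # 0.5
```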
Peer reviewed
Direct link
Shin, Jinnie; Bulut, Okan; Gierl, Mark J. – Journal of Experimental Education, 2020
The arrangement of response options in multiple-choice (MC) items, especially the location of the most attractive distractor, is considered critical in constructing high-quality MC items. In the current study, a sample of 496 undergraduate students taking an educational assessment course was given three test forms consisting of the same items but…
Descriptors: Foreign Countries, Undergraduate Students, Multiple Choice Tests, Item Response Theory
Peer reviewed
Direct link
Arce-Ferrer, Alvaro J.; Bulut, Okan – Journal of Experimental Education, 2019
This study investigated the performance of four widely used data-collection designs in detecting test-mode effects (i.e., computer-based versus paper-based testing). The experimental conditions included four data-collection designs, two test-administration modes, and the availability of an anchor assessment. The test-level and item-level results…
Descriptors: Data Collection, Test Construction, Test Format, Computer Assisted Testing
Peer reviewed
Direct link
Kan, Adnan; Bulut, Okan; Cormier, Damien C. – Educational Assessment, 2019
Item stem formats can alter the cognitive complexity as well as the type of abilities required for solving mathematics items. Consequently, it is possible that item stem formats can affect the dimensional structure of mathematics assessments. This empirical study investigated the relationship between item stem format and the dimensionality of…
Descriptors: Mathematics Tests, Test Items, Test Format, Problem Solving
Peer reviewed
Direct link
Kan, Adnan; Bulut, Okan – Education Research and Perspectives, 2015
This study investigates whether word problems and mathematically expressed items can be used interchangeably regardless of their linguistic complexities. A sample of sixth grade students was given two forms of a mathematics assessment. The first form included mathematics items with mathematical terms, expressions, and equations whereas the second…
Descriptors: Foreign Countries, Middle School Students, Grade 6, Secondary School Mathematics
Peer reviewed
Direct link
Kan, Adnan; Bulut, Okan – International Journal of Testing, 2014
This study investigated whether the linguistic complexity of items leads to gender differential item functioning (DIF) on mathematics assessments. Two forms of a mathematics test were developed. The first form consisted of algebra items based on mathematical expressions, terms, and equations. In the second form, the same items were written as word…
Descriptors: Gender Differences, Test Bias, Difficulty Level, Test Items
Peer reviewed
Download full text (PDF on ERIC)
Bulut, Okan; Kan, Adnan – Eurasian Journal of Educational Research, 2012
Problem Statement: Computerized adaptive testing (CAT) is a sophisticated and efficient way of delivering examinations. In CAT, items for each examinee are selected from an item bank based on the examinee's responses to the items. In this way, the difficulty level of the test is adjusted based on the examinee's ability level. Instead of…
Descriptors: Adaptive Testing, Computer Assisted Testing, College Entrance Examinations, Graduate Students
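The abstract above describes the core CAT mechanism: each item is drawn from a bank based on the examinee's responses so far, so test difficulty tracks the ability estimate. A common way to realize this is maximum-information item selection under a 2PL model; the sketch below assumes that selection rule and a hypothetical item bank, since the abstract does not specify either:

```python
import math

def p_correct(theta, a, b):
    # 2PL response probability for ability theta, discrimination a, difficulty b
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    # Fisher information of a 2PL item at ability theta
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, bank, administered):
    # Maximum-information selection: among unadministered items, pick the
    # one most informative at the current ability estimate.
    candidates = [i for i in range(len(bank)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, *bank[i]))

# Hypothetical item bank of (discrimination, difficulty) pairs.
bank = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.0, 2.0)]
theta = 0.0                                    # provisional ability estimate
chosen = next_item(theta, bank, administered=set())
```

After each response, the ability estimate is updated (e.g., by maximum likelihood) and `next_item` is called again, which is what adjusts the difficulty level to the examinee as the abstract describes.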