Showing all 7 results
Peer reviewed
Lee, HyeSun; Smith, Weldon; Martinez, Angel; Ferris, Heather; Bova, Joe – Applied Measurement in Education, 2021
The aim of the current research was to provide recommendations to facilitate the development and use of anchoring vignettes (AVs) for cross-cultural comparisons in education. Study 1 identified six factors leading to order violations and ties in AV responses based on cognitive interviews with 15-year-old students. The factors were categorized into…
Descriptors: Vignettes, Test Items, Equated Scores, Nonparametric Statistics
Peer reviewed
Park, Minjeong; Wu, Amery D. – Educational and Psychological Measurement, 2019
Item response tree (IRTree) models were recently introduced as an approach to modeling response data from Likert-type rating scales. IRTree models are particularly useful for capturing a variety of individuals' behaviors involved in item responding. This study employed IRTree models to investigate response styles, which are individuals' tendencies to…
Descriptors: Item Response Theory, Models, Likert Scales, Response Style (Tests)
Madni, Ayesha; Kao, Jenny C.; Rivera, Nichole M.; Baker, Eva L.; Cai, Li – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2018
This report is the first in a series of five reports considering career-readiness features within high school assessments. Utilizing feature analysis and cognitive lab interviews, the primary objective of this study was to verify and validate the existence of specific career-readiness features in select math and English language arts (ELA) test…
Descriptors: Career Readiness, High School Students, Test Items, Student Evaluation
Peer reviewed
Dube, Chad; Rotello, Caren M.; Heit, Evan – Psychological Review, 2010
A belief bias effect in syllogistic reasoning (Evans, Barston, & Pollard, 1983) is observed when subjects accept more valid than invalid arguments and more believable than unbelievable conclusions and show greater overall accuracy in judging arguments with unbelievable conclusions. The effect is measured with a contrast of contrasts, comparing…
Descriptors: Response Style (Tests), Item Analysis, Error of Measurement, Replication (Evaluation)
Peer reviewed
Marshall, Sandra P. – Journal for Research in Mathematics Education, 1983
Assessment test data from 1976-79 on California children in grade six were analyzed. Loglinear models were used to evaluate the consistency of response of each sex. A significant interaction between sex and choice of distractor occurred for a large majority of the items. (MNS)
Descriptors: Educational Research, Elementary Education, Elementary School Mathematics, Grade 6
Jakwerth, Pamela R.; Stancavage, Frances B.; Reed, Ellen D. – National Center for Education Statistics, 2003
Over the past decade, developers of the National Assessment of Educational Progress (NAEP) have substantially changed the mix of item types on the NAEP assessments by decreasing the number of multiple-choice questions and increasing the number of questions requiring short or extended constructed responses. These changes have been motivated…
Descriptors: National Competency Tests, Response Style (Tests), Test Validity, Qualitative Research
Jensen, Arthur R. – 1972
Contrary to popular opinion, it is very difficult to find any objective evidence of culture bias that could account for social class and racial differences in performance on current standard tests of intelligence, even those like the Peabody Picture Vocabulary Test (PPVT), which give the appearance of being highly culture-loaded. They may be…
Descriptors: Academic Achievement, Cultural Influences, Educational Diagnosis, Factor Analysis