Publication Date
  In 2025: 0
  Since 2024: 6
  Since 2021 (last 5 years): 33
  Since 2016 (last 10 years): 64
  Since 2006 (last 20 years): 87
Author
  Huggins-Manley, Anne Corinne: 3
  Kam, Chester Chun Seng: 3
  Plake, Barbara S.: 3
  Ames, Allison J.: 2
  Bejar, Isaac I.: 2
  Belov, Dmitry I.: 2
  Benson, Jeri: 2
  Bolt, Daniel M.: 2
  Braeken, Johan: 2
  Bulut, Okan: 2
  Cheng, Ying: 2
Education Level
  Secondary Education: 21
  Higher Education: 13
  Elementary Education: 10
  Postsecondary Education: 9
  Middle Schools: 8
  Intermediate Grades: 7
  Grade 3: 5
  Grade 4: 5
  Grade 5: 5
  High Schools: 5
  Primary Education: 5
Audience
  Researchers: 10
  Practitioners: 3
  Teachers: 1
Location
  Germany: 8
  Canada: 6
  Australia: 5
  United States: 4
  California: 3
  Finland: 3
  Netherlands: 3
  United Kingdom: 3
  Belgium: 2
  China: 2
  Denmark: 2
Vésteinsdóttir, Vaka; Asgeirsdottir, Ragnhildur Lilja; Reips, Ulf-Dietrich; Thorsdottir, Fanney – International Journal of Social Research Methodology, 2021
The purpose of this study was to evaluate the role of socially desirable responding in an item-pair measure of acquiescence from the Big Five Inventory. If both items in an item-pair have desirable content, the likelihood of agreeing with both items is increased, and consequently, the type of responding that would be taken to indicate…
Descriptors: Social Desirability, Response Style (Tests), Personality Measures, Test Items
Nana Kim – ProQuest LLC, 2022
In educational and psychological assessments, attending to the item response process can be useful for understanding and improving the validity of measurement. This dissertation consists of three studies, each of which proposes and applies item response theory (IRT) methods for modeling and understanding cognitive/psychological response processes in…
Descriptors: Psychometrics, Item Response Theory, Test Items, Cognitive Tests
Huang, Hung-Yu – Educational and Psychological Measurement, 2023
Forced-choice (FC) item formats used in noncognitive tests typically present a set of response options that measure different traits and instruct respondents to judge among these options according to their preference, in order to control the response biases commonly observed in normative tests. Diagnostic classification models (DCMs)…
Descriptors: Test Items, Classification, Bayesian Statistics, Decision Making
Ulitzsch, Esther; von Davier, Matthias; Pohl, Steffi – Educational and Psychological Measurement, 2020
So far, modeling approaches for not-reached items have considered one single underlying process. However, missing values at the end of a test can occur for a variety of reasons. On the one hand, examinees may not reach the end of a test due to time limits and lack of working speed. On the other hand, examinees may not attempt all items and quit…
Descriptors: Item Response Theory, Test Items, Response Style (Tests), Computer Assisted Testing
Liu, Yue; Cheng, Ying; Liu, Hongyun – Educational and Psychological Measurement, 2020
The responses of non-effortful test-takers may have serious consequences, as non-effortful responses can impair model calibration and latent trait inferences. This article introduces a mixture model, using both response accuracy and response time information, to help differentiate non-effortful from effortful individuals and to improve item…
Descriptors: Item Response Theory, Test Wiseness, Response Style (Tests), Reaction Time
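The Liu, Cheng, and Liu entry describes a mixture model that uses response time alongside accuracy to separate effortful from non-effortful test-takers. As a minimal illustrative sketch (not the authors' model), a two-component Gaussian mixture fitted to log response times with EM can flag a fast, possibly rapid-guessing class; the function names and initialization here are hypothetical.

```python
import math
import random

def _std(v):
    m = sum(v) / len(v)
    return math.sqrt(sum((xi - m) ** 2 for xi in v) / len(v))

def _norm_pdf(xi, m, s):
    return math.exp(-0.5 * ((xi - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def em_two_gaussians(x, iters=200):
    """Fit a two-component 1-D Gaussian mixture with EM.

    Returns (weights, means, stds, responsibilities), where
    responsibilities[i] is P(component 1 | x[i]); component 1 is
    initialized on the upper half of the data (slower responses).
    """
    xs = sorted(x)
    n = len(xs)
    lo, hi = xs[: n // 2], xs[n // 2:]          # crude init: split at the median
    mu = [sum(lo) / len(lo), sum(hi) / len(hi)]
    sd = [max(1e-3, _std(lo)), max(1e-3, _std(hi))]
    w = [0.5, 0.5]
    r = [0.5] * n
    for _ in range(iters):
        # E-step: posterior probability of the slow component for each point
        r = []
        for xi in x:
            p0 = w[0] * _norm_pdf(xi, mu[0], sd[0])
            p1 = w[1] * _norm_pdf(xi, mu[1], sd[1])
            r.append(p1 / (p0 + p1))
        # M-step: re-estimate weights, means, and standard deviations
        n1 = sum(r)
        n0 = n - n1
        mu[0] = sum((1 - ri) * xi for ri, xi in zip(r, x)) / n0
        mu[1] = sum(ri * xi for ri, xi in zip(r, x)) / n1
        sd[0] = math.sqrt(sum((1 - ri) * (xi - mu[0]) ** 2 for ri, xi in zip(r, x)) / n0) or 1e-3
        sd[1] = math.sqrt(sum(ri * (xi - mu[1]) ** 2 for ri, xi in zip(r, x)) / n1) or 1e-3
        w = [n0 / n, n1 / n]
    return w, mu, sd, r

# Usage sketch: feed log response times in seconds; the lower component
# is interpreted as rapid, possibly non-effortful responding.
```

Published approaches typically also use response accuracy and model parameters jointly with the IRT model; this sketch isolates only the timing side of the idea.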
Höhne, Jan Karem; Yan, Ting – International Journal of Social Research Methodology, 2020
Web surveys are an established data collection mode that uses written language to convey information. The written language is accompanied by visual elements, such as presentation formats and shapes. However, research has shown that visual elements influence response behavior because respondents sometimes use interpretive heuristics to make sense…
Descriptors: Heuristics, Visual Aids, Online Surveys, Response Style (Tests)
Zebing Wu – ProQuest LLC, 2024
Response style, a common aberrancy in noncognitive assessments in psychological fields, is problematic: it yields inaccurate estimates of item and person parameters, which leads to serious reliability, validity, and fairness issues (Baumgartner & Steenkamp, 2001; Bolt & Johnson, 2009; Bolt & Newton, 2011). Response style refers to…
Descriptors: Response Style (Tests), Accuracy, Preferences, Psychological Testing
Lee, HyeSun; Smith, Weldon; Martinez, Angel; Ferris, Heather; Bova, Joe – Applied Measurement in Education, 2021
The aim of the current research was to provide recommendations to facilitate the development and use of anchoring vignettes (AVs) for cross-cultural comparisons in education. Study 1 identified six factors leading to order violations and ties in AV responses based on cognitive interviews with 15-year-old students. The factors were categorized into…
Descriptors: Vignettes, Test Items, Equated Scores, Nonparametric Statistics
Scanlon, Paul J. – Field Methods, 2019
Web, or online, probing has the potential to supplement existing questionnaire design processes by providing structured cognitive data on a wider sample than typical qualitative-only question evaluation methods can achieve. One of the practical impediments to the further integration of web probing is the concern of survey managers about how the…
Descriptors: Online Surveys, Questionnaires, Response Style (Tests), Test Items
Babcock, Ben; Siegel, Zachary D. – Practical Assessment, Research & Evaluation, 2022
Research on repeated testing has revealed that retaking the same exam form generally neither advantages nor disadvantages failing candidates in selected-response credentialing exams. Feinberg, Raymond, and Haist (2015) found a contributing factor to this phenomenon: people answering items incorrectly on both attempts give the same incorrect…
Descriptors: Multiple Choice Tests, Item Analysis, Test Items, Response Style (Tests)
Ivanova, Militsa; Michaelides, Michalis; Eklöf, Hanna – Educational Research and Evaluation, 2020
Collecting process data in computer-based assessments provides opportunities to describe examinee behaviour during a test-taking session. In this context, the number of actions students take while interacting with an item is a variable that has been gaining attention. The present study aims to investigate how the number of actions performed on…
Descriptors: Foreign Countries, Secondary School Students, Achievement Tests, International Assessment
OECD Publishing, 2019
Log files from computer-based assessment can help better understand respondents' behaviours and cognitive strategies. Analysis of timing information from Programme for the International Assessment of Adult Competencies (PIAAC) reveals large differences in the time participants take to answer assessment items, as well as large country differences…
Descriptors: Adults, Computer Assisted Testing, Test Items, Reaction Time
Steinmann, Isa; Sánchez, Daniel; van Laar, Saskia; Braeken, Johan – Assessment in Education: Principles, Policy & Practice, 2022
Mixed-worded questionnaire scales, i.e., those including both positively and negatively worded items, often suffer from issues such as low reliability and more complex latent structures than intended. Part of the problem may be that some responders fail to respond consistently to the mixed-worded items. We investigated the prevalence and impact of…
Descriptors: Response Style (Tests), Test Items, Achievement Tests, Foreign Countries
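The Steinmann et al. entry concerns responders who answer positively and negatively worded items inconsistently. A minimal screening sketch of that idea (hypothetical item names and tolerance; not the authors' method): for each reverse-worded pair, a consistent respondent's answer to the negative item should roughly mirror the answer to the positive item.

```python
def inconsistency_rate(responses, pairs, scale_max=5, tol=1):
    """Fraction of positively/negatively worded item pairs a respondent
    answers in the same direction instead of as mirror images.

    responses: dict mapping item id -> Likert response in 1..scale_max
    pairs:     list of (positive_item, negative_item) tuples intended
               to measure the same trait with reversed wording
    tol:       allowed deviation from the exact mirrored response
    """
    flagged = 0
    for pos, neg in pairs:
        mirrored = scale_max + 1 - responses[pos]  # expected reverse-keyed answer
        if abs(responses[neg] - mirrored) > tol:
            flagged += 1
    return flagged / len(pairs)
```

A high rate would only flag a respondent for closer inspection; substantive disagreement with one item of a pair is possible, which is why model-based approaches like the one in the entry above go beyond simple mirroring checks.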
Bulut, Okan; Bulut, Hatice Cigdem; Cormier, Damien C.; Ilgun Dibek, Munevver; Sahin Kursad, Merve – Educational Assessment, 2023
Some statewide testing programs allow students to receive corrective feedback and revise their answers during testing. Despite its pedagogical benefits, the effects of providing revision opportunities remain unknown in the context of alternate assessments. Therefore, this study examined student data from a large-scale alternate assessment that…
Descriptors: Error Correction, Alternative Assessment, Feedback (Response), Multiple Choice Tests
Liu, Yuan; Hau, Kit-Tai – Educational and Psychological Measurement, 2020
In large-scale low-stakes assessments such as the Programme for International Student Assessment (PISA), students may skip items (missingness) that are within their ability to complete. Detecting and accounting for these noneffortful responses, as a measure of test-taking motivation, is an important issue in modern psychometric models.…
Descriptors: Response Style (Tests), Motivation, Test Items, Statistical Analysis