Publication Date
In 2025: 1
Since 2024: 30
Since 2021 (last 5 years): 100
Descriptor
Response Style (Tests): 100
Item Response Theory: 38
Foreign Countries: 36
Test Items: 33
Reaction Time: 19
Achievement Tests: 16
International Assessment: 14
Models: 14
Accuracy: 13
Scores: 13
Secondary School Students: 13
Author
Ames, Allison J.: 3
Bolt, Daniel M.: 3
Hsieh, Shu-Hui: 3
Leventhal, Brian C.: 3
Braeken, Johan: 2
Bulut, Hatice Cigdem: 2
Bulut, Okan: 2
Gummer, Tobias: 2
Kim, Nana: 2
Tijmstra, Jesper: 2
Ulitzsch, Esther: 2
Publication Type
Journal Articles: 93
Reports - Research: 88
Dissertations/Theses -…: 5
Reports - Descriptive: 3
Reports - Evaluative: 3
Tests/Questionnaires: 3
Speeches/Meeting Papers: 2
Information Analyses: 1
Audience
Practitioners: 2
Researchers: 2
Location
Germany: 13
Czech Republic: 4
Greece: 4
Taiwan: 4
Australia: 3
China: 3
Italy: 3
Lithuania: 3
New Zealand: 3
Norway: 3
South Korea: 3
Martin, Silke; Lechner, Clemens; Kleinert, Corinna; Rammstedt, Beatrice – International Journal of Social Research Methodology, 2021
Selective nonresponse can introduce bias in longitudinal surveys. The present study examines the role of cognitive skills (more specifically, literacy skills), as measured in large-scale assessment surveys, in selective nonresponse in longitudinal surveys. We assume that low-skilled respondents perceive the cognitive assessment as a higher burden…
Descriptors: Literacy, Response Style (Tests), Longitudinal Studies, Foreign Countries
Steven R. Hiner – ProQuest LLC, 2023
The purpose of this study was to determine whether there were statistically significant differences between scores on constructed-response and computer-scorable questions on an accelerated middle school math placement test in a large urban school district in Ohio, and to ensure that all students have an opportunity to take the test. Five questions on a…
Descriptors: Scores, Middle Schools, Mathematics Tests, Placement Tests
Liu, Yue; Liu, Hongyun – Journal of Educational and Behavioral Statistics, 2021
The prevalence and serious consequences of noneffortful responses from unmotivated examinees are well-known in educational measurement. In this study, we propose to apply an iterative purification process based on a response time residual method with fixed item parameter estimates to detect noneffortful responses. The proposed method is compared…
Descriptors: Response Style (Tests), Reaction Time, Test Items, Accuracy
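The response-time residual idea described above lends itself to a compact illustration. The sketch below is a simplified stand-in rather than the authors' method: it replaces the fixed IRT-based item parameter estimates with plain per-item means and standard deviations of log response times, and the cut-off value and function name are assumptions made for illustration.

```python
import numpy as np

def flag_noneffortful(log_rt, z_cut=-1.96, max_iter=20):
    """Iterative purification sketch: flag responses whose log response time
    is far below the item's typical level, refit without them, repeat."""
    flags = np.zeros(log_rt.shape, dtype=bool)
    for _ in range(max_iter):
        clean = np.where(flags, np.nan, log_rt)       # exclude flagged responses
        item_mean = np.nanmean(clean, axis=0)         # per-item RT level
        item_sd = np.nanstd(clean, axis=0, ddof=1)    # per-item RT spread
        resid = (log_rt - item_mean) / item_sd        # standardized residuals
        new_flags = resid < z_cut                     # unusually fast = suspect
        if np.array_equal(new_flags, flags):          # purification converged
            return flags
        flags = new_flags
    return flags
```

Each pass re-estimates the item response-time summaries without the responses flagged so far, so estimates contaminated by rapid responding in the first pass do not mask additional noneffortful responses later on.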
Gummer, Tobias; Roßmann, Joss; Silber, Henning – Sociological Methods & Research, 2021
Identifying inattentive respondents in self-administered surveys is a challenging goal for survey researchers. Instructed response items (IRIs) provide an easy-to-implement measure of inattentiveness in grid questions. The present article adds to the sparse research on the use and implementation of attention checks by addressing three…
Descriptors: Online Surveys, Attention, Response Style (Tests), Context Effect
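Instructed response items are simple enough to score directly. The sketch below flags respondents who miss any instructed item; the column names, codings, and the any-miss rule are illustrative assumptions, since the article compares several implementation choices rather than prescribing one.

```python
import pandas as pd

# Hypothetical grid data with two instructed response items:
# iri_1 asks for "strongly agree" (coded 5), iri_2 asks for "disagree" (coded 2).
responses = pd.DataFrame({
    "iri_1": [5, 5, 3, 5],
    "iri_2": [2, 4, 2, 2],
})
expected = {"iri_1": 5, "iri_2": 2}

# Count missed instructed items per respondent and flag any miss as inattentive.
missed = pd.DataFrame({item: responses[item] != answer
                       for item, answer in expected.items()})
responses["n_missed_iri"] = missed.sum(axis=1)
responses["inattentive"] = responses["n_missed_iri"] > 0
print(responses)
```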
Vésteinsdóttir, Vaka; Asgeirsdottir, Ragnhildur Lilja; Reips, Ulf-Dietrich; Thorsdottir, Fanney – International Journal of Social Research Methodology, 2021
The purpose of this study was to evaluate the role of socially desirable responding in an item-pair measure of acquiescence from the Big Five Inventory. If both items in an item-pair have desirable content, the likelihood of agreeing with both items is increased, and consequently, the type of responding that would be taken to indicate…
Descriptors: Social Desirability, Response Style (Tests), Personality Measures, Test Items
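An item-pair acquiescence index of the kind examined above can be computed by counting agreement with both items of content-reversed pairs. The sketch below uses made-up Big Five Inventory column names and an assumed 1-5 response scale; it does not reproduce the specific item pairs used in the study.

```python
import pandas as pd

# Hypothetical 1-5 Likert responses; each tuple pairs an item with its
# content-reversed counterpart (e.g., "is talkative" vs. "tends to be quiet").
pairs = [("bfi_01", "bfi_06r"), ("bfi_11", "bfi_16r")]
data = pd.DataFrame({
    "bfi_01":  [5, 4, 2],
    "bfi_06r": [4, 2, 2],
    "bfi_11":  [4, 3, 1],
    "bfi_16r": [5, 3, 2],
})

# Agreeing (4 or 5) with both items of an opposite-content pair is logically
# inconsistent, so each such pair adds one point to the acquiescence index.
agree = data >= 4
data["acquiescence"] = sum((agree[a] & agree[b]).astype(int) for a, b in pairs)
print(data)
```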
Nana Kim – ProQuest LLC, 2022
In educational and psychological assessments, attending to the item response process can be useful in understanding and improving the validity of measurement. This dissertation consists of three studies, each of which proposes and applies item response theory (IRT) methods for modeling and understanding the cognitive/psychological response process in…
Descriptors: Psychometrics, Item Response Theory, Test Items, Cognitive Tests
Boško, Martin; Vonková, Hana; Papajoanu, Ondrej; Moore, Angie – Bulgarian Comparative Education Society, 2023
International large-scale assessments, such as the Programme for International Student Assessment (PISA), are a crucial source of information for education researchers and policymakers. The assessment also includes a student questionnaire; however, the data can be biased by differences in reporting behavior between students. In this paper, we…
Descriptors: Comparative Analysis, Response Style (Tests), Foreign Countries, Institutional Characteristics
Huang, Hung-Yu – Educational and Psychological Measurement, 2023
Forced-choice (FC) item formats used for noncognitive tests typically present a set of response options that measure different traits and instruct respondents to judge among these options according to their preference, in order to control the response biases commonly observed in normative tests. Diagnostic classification models (DCMs)…
Descriptors: Test Items, Classification, Bayesian Statistics, Decision Making
D'Urso, E. Damiano; Tijmstra, Jesper; Vermunt, Jeroen K.; De Roover, Kim – Educational and Psychological Measurement, 2023
Assessing the measurement model (MM) of self-report scales is crucial to obtain valid measurements of individuals' latent psychological constructs. This entails evaluating the number of measured constructs and determining which construct is measured by which item. Exploratory factor analysis (EFA) is the most-used method to evaluate these…
Descriptors: Factor Analysis, Measurement Techniques, Self Evaluation (Individuals), Psychological Patterns
Fabiola Reiber; Donna Bryce; Rolf Ulrich – Sociological Methods & Research, 2024
Randomized response techniques (RRTs) are applied to reduce response biases in self-report surveys on sensitive research questions (e.g., on socially undesirable characteristics). However, there is evidence that they cannot completely eliminate self-protecting response strategies. To address this problem, there are RRTs specifically designed to…
Descriptors: Foreign Countries, Family Violence, COVID-19, Pandemics
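For readers unfamiliar with randomized response techniques, the classic Warner (1965) design illustrates the core idea: the randomization makes any individual answer uninterpretable, while the prevalence of the sensitive attribute remains estimable. The sketch below implements that textbook estimator only; it is not the specific RRT variant evaluated in the article, and the example numbers are invented.

```python
import math

def warner_estimate(n_yes, n, p):
    """Warner (1965) randomized response estimator.

    Each respondent answers the sensitive statement with probability p and its
    negation with probability 1 - p, so the observed 'yes' rate is
    lambda = p * pi + (1 - p) * (1 - pi). Solving for pi gives the estimator;
    the standard error follows from lambda being a binomial proportion.
    """
    lam = n_yes / n
    pi_hat = (lam - (1 - p)) / (2 * p - 1)
    se = math.sqrt(lam * (1 - lam) / (n * (2 * p - 1) ** 2))
    return pi_hat, se

# Invented example: 230 'yes' answers from 600 respondents, with a 70% chance
# of receiving the sensitive statement rather than its negation.
pi_hat, se = warner_estimate(230, 600, 0.70)
print(f"estimated prevalence: {pi_hat:.3f} (SE {se:.3f})")
```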
Zebing Wu – ProQuest LLC, 2024
Response style, a common aberrancy in non-cognitive assessments in psychological fields, is problematic because it leads to inaccurate estimation of item and person parameters, which in turn raises serious reliability, validity, and fairness issues (Baumgartner & Steenkamp, 2001; Bolt & Johnson, 2009; Bolt & Newton, 2011). Response style refers to…
Descriptors: Response Style (Tests), Accuracy, Preferences, Psychological Testing
Rios, Joseph A. – Educational and Psychological Measurement, 2021
Low test-taking effort as a validity threat is common when examinees perceive an assessment context to have minimal personal value. Prior research has shown that in such contexts, subgroups may differ in their effort, which raises two concerns when making subgroup mean comparisons. First, it is unclear how differential effort could influence…
Descriptors: Response Style (Tests), Statistical Analysis, Measurement, Comparative Analysis
Lee, HyeSun; Smith, Weldon; Martinez, Angel; Ferris, Heather; Bova, Joe – Applied Measurement in Education, 2021
The aim of the current research was to provide recommendations to facilitate the development and use of anchoring vignettes (AVs) for cross-cultural comparisons in education. Study 1 identified six factors leading to order violations and ties in AV responses based on cognitive interviews with 15-year-old students. The factors were categorized into…
Descriptors: Vignettes, Test Items, Equated Scores, Nonparametric Statistics
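The nonparametric anchoring-vignette recode that underlies such analyses can be sketched compactly. Assuming a King et al. (2004)-style recode and hypothetical ratings, the function below places a self-rating relative to the respondent's own vignette ratings and flags exactly the order violations and ties that the study examines.

```python
def av_recode(self_rating, vignette_ratings):
    """Nonparametric anchoring-vignette recode (King et al., 2004 style).

    vignette_ratings must be given from the lowest to the highest intended
    level. Returns (score, order_violation, tie): score runs from 1 to
    2 * J + 1 for J vignettes, with even values marking ties with a vignette.
    """
    order_violation = any(a > b for a, b in
                          zip(vignette_ratings, vignette_ratings[1:]))
    tie = (self_rating in vignette_ratings
           or any(a == b for a, b in zip(vignette_ratings, vignette_ratings[1:])))
    score = 1
    for v in vignette_ratings:
        if self_rating > v:
            score += 2      # clearly above this vignette
        elif self_rating == v:
            score += 1      # tied with this vignette
            break
        else:
            break           # below this vignette
    return score, order_violation, tie

# Self-rating of 4 against three vignettes rated 2, 3, 5 (low to high).
print(av_recode(4, [2, 3, 5]))   # -> (5, False, False)
```

Order violations and ties leave the recoded score interval-valued rather than a single category, which is why the factors behind them matter for vignette development.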
Vonkova, Hana; Hrabak, Jan; Kralova, Katerina; Papajoanu, Ondrej – Field Methods, 2021
Self-assessment measures are commonly used in questionnaire surveys. However, one of the problems with self-reports is that they may be prone to differences in scale usage among respondents. The anchoring vignette method addresses this issue. It relies on two assumptions: response consistency and vignette equivalence. Here we aim to develop a…
Descriptors: Vignettes, Interviews, Self Evaluation (Individuals), Reliability
Ulitzsch, Esther; Penk, Christiane; von Davier, Matthias; Pohl, Steffi – Educational Assessment, 2021
Identifying and considering test-taking effort is of utmost importance for drawing valid inferences on examinee competency in low-stakes tests. Different approaches exist for doing so. The speed-accuracy+engagement model aims at identifying non-effortful test-taking behavior in terms of nonresponse and rapid guessing based on responses and…
Descriptors: Response Style (Tests), Guessing (Tests), Reaction Time, Measurement Techniques
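The speed-accuracy+engagement model itself is an IRT mixture model, but the underlying idea of separating rapid guesses from effortful responses via response times can be illustrated with a much simpler normative-threshold screen in the spirit of Wise and Kong's response-time effort index. The threshold fraction and the data below are assumptions for illustration, not part of the model discussed in the article.

```python
import numpy as np

def response_time_effort(rt, threshold_frac=0.10):
    """Normative-threshold rapid-guessing screen (Wise & Kong style sketch).

    rt : 2-D array (persons x items) of response times in seconds. A response
    counts as a rapid guess when it is faster than a fixed fraction of the
    item's median response time. Returns per-person response-time effort
    (the share of responses that are NOT rapid guesses) and the flag matrix.
    """
    thresholds = threshold_frac * np.median(rt, axis=0)   # one threshold per item
    rapid = rt < thresholds                               # rapid-guess flags
    return 1.0 - rapid.mean(axis=1), rapid

# Invented response times (seconds) for 3 examinees on 4 items.
rt = np.array([[35.0, 42.0, 28.0, 50.0],
               [ 2.0,  3.0, 30.0, 48.0],
               [33.0, 40.0, 27.0, 46.0]])
rte, rapid = response_time_effort(rt)
print(rte)   # the second examinee's two very fast responses lower their RTE
```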