Publication Date
  In 2025: 1
  Since 2024: 30
  Since 2021 (last 5 years): 100
  Since 2016 (last 10 years): 214
  Since 2006 (last 20 years): 414
Author
  Weiss, David J.: 12
  Wise, Steven L.: 9
  Bolt, Daniel M.: 7
  Benson, Jeri: 6
  Fiske, Donald W.: 6
  Holden, Ronald R.: 6
  Jackson, Douglas N.: 6
  Adkins, Dorothy C.: 5
  Birenbaum, Menucha: 5
  Crocker, Linda: 5
  Greve, Kevin W.: 5
Audience
  Researchers: 58
  Practitioners: 17
  Teachers: 6
  Administrators: 3
  Counselors: 2
  Students: 1
Location
  Germany: 27
  Canada: 20
  Australia: 17
  United States: 12
  South Korea: 10
  United Kingdom: 10
  China: 9
  Denmark: 9
  France: 9
  Italy: 9
  Norway: 9
Laws, Policies, & Programs
  Elementary and Secondary…: 1
Gummer, Tobias; Roßmann, Joss; Silber, Henning – Sociological Methods & Research, 2021
Identifying inattentive respondents in self-administered surveys is a challenging goal for survey researchers. Instructed response items (IRIs) provide a measure for inattentiveness in grid questions that is easy to implement. The present article adds to the sparse research on the use and implementation of attention checks by addressing three…
Descriptors: Online Surveys, Attention, Response Style (Tests), Context Effect
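The instructed response items (IRIs) described above can be illustrated with a minimal sketch: a hypothetical grid question contains one item that instructs respondents to select a specific option, and anyone who deviates from the instruction is flagged as potentially inattentive. The function and data below are illustrative assumptions, not the authors' implementation.

```python
def flag_inattentive(responses, iri_index, instructed_value):
    """Return indices of respondents whose answer to the instructed response
    item (e.g., "Please select 'strongly agree' here") deviates from the
    instructed option."""
    return [i for i, row in enumerate(responses) if row[iri_index] != instructed_value]

# Each row is one respondent's answers to a 4-item grid; the IRI sits at index 2
# and instructs respondents to answer 5.
data = [
    [4, 5, 5, 4],  # followed the instruction
    [2, 3, 1, 2],  # failed the IRI -> flagged
    [5, 4, 5, 5],  # followed the instruction
]
print(flag_inattentive(data, iri_index=2, instructed_value=5))  # [1]
```

In practice a single failed check is weak evidence on its own; surveys often combine several IRIs or pair them with other indicators such as response time.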
Vésteinsdóttir, Vaka; Asgeirsdottir, Ragnhildur Lilja; Reips, Ulf-Dietrich; Thorsdottir, Fanney – International Journal of Social Research Methodology, 2021
The purpose of this study was to evaluate the role of socially desirable responding in an item-pair measure of acquiescence from the Big Five Inventory. If both items in an item-pair have desirable content, the likelihood of agreeing with both items is increased, and consequently, the type of responding that would be taken to indicate…
Descriptors: Social Desirability, Response Style (Tests), Personality Measures, Test Items
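The item-pair logic behind such acquiescence measures can be sketched as follows: agreeing with both items of a pair with opposite content is logically inconsistent and is counted toward an acquiescence index. The function, cutoff, and item names below are hypothetical; as the abstract notes, this simple count can be confounded when both items in a pair have socially desirable content.

```python
def acquiescence_count(responses, pairs, agree_cutoff=4):
    """Count item pairs with opposite content where the respondent agrees
    (rating >= agree_cutoff on a 1-5 scale) with BOTH items."""
    return sum(
        1 for i, j in pairs
        if responses[i] >= agree_cutoff and responses[j] >= agree_cutoff
    )

# One respondent's ratings on two oppositely worded item pairs.
resp = {"talkative": 5, "quiet": 4, "organized": 2, "careless": 5}
pairs = [("talkative", "quiet"), ("organized", "careless")]
print(acquiescence_count(resp, pairs))  # 1
```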
Nana Kim – ProQuest LLC, 2022
In educational and psychological assessments, attending to the item response process can be useful for understanding and improving the validity of measurement. This dissertation consists of three studies, each of which proposes and applies item response theory (IRT) methods for modeling and understanding cognitive/psychological response processes in…
Descriptors: Psychometrics, Item Response Theory, Test Items, Cognitive Tests
Boško, Martin; Vonková, Hana; Papajoanu, Ondrej; Moore, Angie – Bulgarian Comparative Education Society, 2023
International large-scale assessments, such as the Programme for International Student Assessment (PISA), are a crucial source of information for education researchers and policymakers. The assessment also includes a student questionnaire; however, the data can be biased by differences in reporting behavior between students. In this paper, we…
Descriptors: Comparative Analysis, Response Style (Tests), Foreign Countries, Institutional Characteristics
Huang, Hung-Yu – Educational and Psychological Measurement, 2023
The forced-choice (FC) item formats used in noncognitive tests typically present a set of response options that measure different traits and instruct respondents to make judgments among these options in terms of their preference, in order to control the response biases commonly observed in normative tests. Diagnostic classification models (DCMs)…
Descriptors: Test Items, Classification, Bayesian Statistics, Decision Making
D'Urso, E. Damiano; Tijmstra, Jesper; Vermunt, Jeroen K.; De Roover, Kim – Educational and Psychological Measurement, 2023
Assessing the measurement model (MM) of self-report scales is crucial to obtain valid measurements of individuals' latent psychological constructs. This entails evaluating the number of measured constructs and determining which construct is measured by which item. Exploratory factor analysis (EFA) is the most-used method to evaluate these…
Descriptors: Factor Analysis, Measurement Techniques, Self Evaluation (Individuals), Psychological Patterns
Nichols, Austin Lee; Edlund, John E. – International Journal of Social Research Methodology, 2020
Although careless respondents have wreaked havoc on research for decades, the prevalence and implications of these participants have likely increased due to many new methodological techniques currently in use. Across three studies, we examined the prevalence of careless responding in participants, several means of predicting careless respondents,…
Descriptors: Response Style (Tests), Incidence, Geographic Location, Foreign Countries
Ulitzsch, Esther; von Davier, Matthias; Pohl, Steffi – Educational and Psychological Measurement, 2020
So far, modeling approaches for not-reached items have considered one single underlying process. However, missing values at the end of a test can occur for a variety of reasons. On the one hand, examinees may not reach the end of a test due to time limits and lack of working speed. On the other hand, examinees may not attempt all items and quit…
Descriptors: Item Response Theory, Test Items, Response Style (Tests), Computer Assisted Testing
Chylíková, Johana – International Journal of Social Research Methodology, 2020
This study explores the acquiescent response style (ARS) among respondents in the Czech Republic. To analyse ARS, confirmatory factor analysis (CFA) was employed and the response style (RS) was modelled as a latent variable. The RS factor in the CFA model must be validated by its relationship to education and age, i.e. proxies of cognitive…
Descriptors: Foreign Countries, Response Style (Tests), Age Differences, Educational Attainment
Liu, Yue; Cheng, Ying; Liu, Hongyun – Educational and Psychological Measurement, 2020
The responses of non-effortful test-takers may have serious consequences, as non-effortful responses can impair model calibration and latent trait inferences. This article introduces a mixture model, using both response accuracy and response time information, to help differentiate non-effortful from effortful individuals, and to improve item…
Descriptors: Item Response Theory, Test Wiseness, Response Style (Tests), Reaction Time
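The response-time idea behind such approaches can be illustrated with a much simpler heuristic than the authors' mixture model: flag a response as likely non-effortful (rapid guessing) when its response time falls below some fraction of the item's median time. The function and threshold fraction below are illustrative assumptions only.

```python
from statistics import median

def flag_noneffortful(times, fraction=0.10):
    """times: one list of response times (seconds) per item, across examinees.
    Returns parallel lists of booleans marking responses faster than
    `fraction` of that item's median time as likely non-effortful."""
    flags = []
    for item_times in times:
        threshold = fraction * median(item_times)
        flags.append([t < threshold for t in item_times])
    return flags

# One item, four examinees; examinee 1 answered in 0.8 s (median is 11.5 s,
# so the threshold is 1.15 s).
rts = [[12.0, 0.8, 15.0, 11.0]]
print(flag_noneffortful(rts))  # [[False, True, False, False]]
```

A fixed threshold is crude compared with a mixture model, which lets the data determine the boundary between effortful and non-effortful responding jointly from accuracy and speed.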
Spratto, Elisabeth M.; Bandalos, Deborah L. – Journal of Experimental Education, 2020
Research suggests that certain characteristics of survey items may impact participants' responses. In this study we investigated the impact of several of these characteristics: vague wording, question-versus-statement phrasing, and full-versus-partial labeling of response options. We manipulated survey items per these characteristics and randomly…
Descriptors: Attitude Measures, Test Format, Test Construction, Factor Analysis
Höhne, Jan Karem; Yan, Ting – International Journal of Social Research Methodology, 2020
Web surveys are an established data collection mode that uses written language to provide information. The written language is accompanied by visual elements, such as presentation formats and shapes. However, research has shown that visual elements influence response behavior because respondents sometimes use interpretive heuristics to make sense…
Descriptors: Heuristics, Visual Aids, Online Surveys, Response Style (Tests)
Fabiola Reiber; Donna Bryce; Rolf Ulrich – Sociological Methods & Research, 2024
Randomized response techniques (RRTs) are applied to reduce response biases in self-report surveys on sensitive research questions (e.g., on socially undesirable characteristics). However, there is evidence that they cannot completely eliminate self-protecting response strategies. To address this problem, there are RRTs specifically designed to…
Descriptors: Foreign Countries, Family Violence, COVID-19, Pandemics
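As background to the entry above, the classic randomized response technique (Warner, 1965) can be sketched in a few lines: each respondent answers the sensitive statement with probability p and its negation otherwise, so individual answers are uninformative, yet the prevalence is recoverable from the aggregate "yes" rate via pi = (lambda - (1 - p)) / (2p - 1). The function below is a sketch of that basic estimator, not the extended RRT design the entry refers to.

```python
def warner_estimate(n_yes, n, p):
    """Warner's (1965) randomized response estimator of sensitive-trait
    prevalence. Each respondent answers the sensitive statement with
    probability p (p != 0.5) and its negation with probability 1 - p;
    only the total count of 'yes' answers is observed."""
    lam = n_yes / n  # observed 'yes' rate
    return (lam - (1 - p)) / (2 * p - 1)

# If the true prevalence were 0.2 and p = 0.7, the expected 'yes' rate is
# 0.7 * 0.2 + 0.3 * 0.8 = 0.38, and the estimator recovers the prevalence:
print(round(warner_estimate(n_yes=380, n=1000, p=0.7), 6))  # 0.2
```

The privacy protection comes precisely from the randomization; the self-protecting strategies mentioned in the entry arise when respondents disobey the randomization device, which this basic estimator cannot detect.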
Zebing Wu – ProQuest LLC, 2024
Response style, a common aberrancy in non-cognitive assessments in psychological fields, is problematic because it yields inaccurate estimates of item and person parameters, which leads to serious reliability, validity, and fairness issues (Baumgartner & Steenkamp, 2001; Bolt & Johnson, 2009; Bolt & Newton, 2011). Response style refers to…
Descriptors: Response Style (Tests), Accuracy, Preferences, Psychological Testing
Rios, Joseph A. – Educational and Psychological Measurement, 2021
Low test-taking effort as a validity threat is common when examinees perceive an assessment context to have minimal personal value. Prior research has shown that in such contexts, subgroups may differ in their effort, which raises two concerns when making subgroup mean comparisons. First, it is unclear how differential effort could influence…
Descriptors: Response Style (Tests), Statistical Analysis, Measurement, Comparative Analysis