Sijia Huang; Seungwon Chung; Carl F. Falk – Journal of Educational Measurement, 2024
In this study, we introduced a cross-classified multidimensional nominal response model (CC-MNRM) to account for various response styles (RS) in the presence of cross-classified data. The proposed model allows slopes to vary across items and can explore impacts of observed covariates on latent constructs. We applied a recently developed variant of…
Descriptors: Response Style (Tests), Classification, Data, Models
Papanastasiou, Elena C.; Stylianou-Georgiou, Agni – Assessment in Education: Principles, Policy & Practice, 2022
A frequently used indicator to reflect student performance is that of a test score. However, although tests are designed to assess students' knowledge or skills, other factors can also affect test results such as test-taking strategies. Therefore, the purpose of this study was to model the interrelationships among test-taking strategy instruction…
Descriptors: Test Wiseness, Metacognition, Multiple Choice Tests, Response Style (Tests)
Courey, Karyssa A.; Lee, Michael D. – AERA Open, 2021
Student evaluations of teaching are widely used to assess instructors and courses. Using a model-based approach and Bayesian methods, we examine how the direction of the scale, labels on scales, and the number of options affect the ratings. We conduct a within-participants experiment in which respondents evaluate instructors and lectures using…
Descriptors: Student Evaluation of Teacher Performance, Rating Scales, Response Style (Tests), College Students
Rocconi, Louis M.; Dumford, Amber D.; Butler, Brenna – Research in Higher Education, 2020
Researchers, assessment professionals, and faculty in higher education increasingly depend on survey data from students to make pivotal curricular and programmatic decisions. The surveys collecting these data often require students to judge frequency (e.g., how often), quantity (e.g., how much), or intensity (e.g., how strongly). The response…
Descriptors: Student Surveys, College Students, Rating Scales, Response Style (Tests)
Iannario, Maria; Manisera, Marica; Piccolo, Domenico; Zuccolotto, Paola – Sociological Methods & Research, 2020
In analyzing data from attitude surveys, it is common to consider the "don't know" responses as missing values. In this article, we present a statistical model commonly used for the analysis of responses/evaluations expressed on Likert scales and extended to take into account the presence of don't know responses. The main objective is to…
Descriptors: Response Style (Tests), Likert Scales, Statistical Analysis, Models
Vésteinsdóttir, Vaka; Asgeirsdottir, Ragnhildur Lilja; Reips, Ulf-Dietrich; Thorsdottir, Fanney – International Journal of Social Research Methodology, 2021
The purpose of this study was to evaluate the role of socially desirable responding in an item-pair measure of acquiescence from the Big Five Inventory. If both items in an item-pair have desirable content, the likelihood of agreeing with both items is increased, and consequently, the type of responding that would be taken to indicate…
Descriptors: Social Desirability, Response Style (Tests), Personality Measures, Test Items
Ulitzsch, Esther; Penk, Christiane; von Davier, Matthias; Pohl, Steffi – Educational Assessment, 2021
Identifying and considering test-taking effort is of utmost importance for drawing valid inferences on examinee competency in low-stakes tests. Different approaches exist for doing so. The speed-accuracy+engagement model aims at identifying non-effortful test-taking behavior in terms of nonresponse and rapid guessing based on responses and…
Descriptors: Response Style (Tests), Guessing (Tests), Reaction Time, Measurement Techniques
Magraw-Mickelson, Zoe; Wang, Harry H.; Gollwitzer, Mario – International Journal of Testing, 2022
Much psychological research depends on participants' diligence in filling out materials such as surveys. However, not all participants are motivated to respond attentively, which leads to unintended issues with data quality, known as careless responding. Our question is: how do different modes of data collection--paper/pencil, computer/web-based,…
Descriptors: Response Style (Tests), Surveys, Data Collection, Test Format
Rios, Joseph A.; Guo, Hongwen – Applied Measurement in Education, 2020
The objective of this study was to evaluate whether differential noneffortful responding (identified via response latencies) was present in four countries administered a low-stakes college-level critical thinking assessment. Results indicated significant differences (as large as 0.90 "SD") between nearly all country pairings in the…
Descriptors: Response Style (Tests), Cultural Differences, Critical Thinking, Cognitive Tests
Höhne, Jan Karem; Schlosser, Stephan – International Journal of Social Research Methodology, 2019
Participation in web surveys via smartphones has increased continuously in recent years. The reasons for this increase are a growing proportion of smartphone owners and an increase in mobile Internet access. However, research has shown that smartphone respondents are frequently distracted and/or multitasking, which might affect completion and response…
Descriptors: Online Surveys, Handheld Devices, Response Rates (Questionnaires), Response Style (Tests)
Kam, Chester Chun Seng – Sociological Methods & Research, 2018
The item wording (or keying) effect is respondents' differential response style to positively and negatively worded items. Despite decades of research, the nature of the effect is still unclear. This article proposes a potential reason; namely, that the item wording effect is scale-specific, and thus findings are applicable only to a particular…
Descriptors: Response Style (Tests), Test Items, Language Usage, College Students
Hedlefs-Aguilar, Maria Isolde; Morales-Martinez, Guadalupe Elizabeth; Villarreal-Lozano, Ricardo Jesus; Moreno-Rodriguez, Claudia; Gonzalez-Rodriguez, Erick Alejandro – European Journal of Educational Research, 2021
This study explored the cognitive mechanism behind information integration in the test anxiety judgments in 140 engineering students. An experiment was designed to test four factors combined (test goal orientation, test cognitive functioning level, test difficulty and test mode). The experimental task required participants to read 36 scenarios,…
Descriptors: Test Anxiety, Engineering Education, Algebra, College Students
Kolski, Tammi; Weible, Jennifer L. – Community College Journal of Research and Practice, 2019
eLearning instruction has become an accepted means of delivering a quality education to higher education students, with community college online learning enrollment rates rising annually. Consistent with the desires of eLearning students for convenience and flexibility, educators utilize virtual proctored exams to safeguard against academic…
Descriptors: Community Colleges, Two Year College Students, College Students, Student Behavior
Montagni, Ilaria; Cariou, Tanguy; Tzourio, Christophe; González-Caballero, Juan-Luis – International Journal of Social Research Methodology, 2019
Online surveys are increasingly used to investigate health conditions, especially among young people. However, this methodology presents some limitations including item nonresponse concerning sensitive and knowledge-related questions. This study aimed to suggest latent class analysis as the methodology to statistically deal with item nonresponse…
Descriptors: Online Surveys, Response Style (Tests), Multivariate Analysis, Health
Kam, Chester Chun Seng – Educational and Psychological Measurement, 2016
To measure the response style of acquiescence, researchers recommend the use of at least 15 items with heterogeneous content. Such an approach is consistent with its theoretical definition and is a substantial improvement over traditional methods. Nevertheless, measurement of acquiescence can be enhanced by two additional considerations: first, to…
Descriptors: Test Items, Response Style (Tests), Test Content, Measurement