Showing 46 to 60 of 214 results
Peer reviewed
Direct link
Silber, Henning; Roßmann, Joss; Gummer, Tobias – Field Methods, 2022
Attention checks detect inattentiveness by instructing respondents to perform a specific task. However, while respondents may correctly process the task, they may choose to not comply with the instructions. We investigated the issue of noncompliance in attention checks in two web surveys. In Study 1, we measured respondents' attitudes toward…
Descriptors: Compliance (Psychology), Attention, Task Analysis, Online Surveys
Peer reviewed
Direct link
Papanastasiou, Elena C.; Stylianou-Georgiou, Agni – Assessment in Education: Principles, Policy & Practice, 2022
A frequently used indicator to reflect student performance is that of a test score. However, although tests are designed to assess students' knowledge or skills, other factors can also affect test results such as test-taking strategies. Therefore, the purpose of this study was to model the interrelationships among test-taking strategy instruction…
Descriptors: Test Wiseness, Metacognition, Multiple Choice Tests, Response Style (Tests)
Peer reviewed
Direct link
J. A. Bialo; H. Li – Educational Assessment, 2024
This study evaluated differential item functioning (DIF) in achievement motivation items before and after using anchoring vignettes as a statistical tool to account for group differences in response styles across gender and ethnicity. We applied the nonparametric scoring of the vignettes to motivation items from the 2015 Programme for…
Descriptors: Test Bias, Student Motivation, Achievement Tests, Secondary School Students
Peer reviewed
Direct link
Viola Merhof; Caroline M. Böhm; Thorsten Meiser – Educational and Psychological Measurement, 2024
Item response tree (IRTree) models are a flexible framework to control self-reported trait measurements for response styles. To this end, IRTree models decompose the responses to rating items into sub-decisions, which are assumed to be made on the basis of either the trait being measured or a response style, whereby the effects of such person…
Descriptors: Item Response Theory, Test Interpretation, Test Reliability, Test Validity
Peer reviewed
Direct link
Wim J. van der Linden; Luping Niu; Seung W. Choi – Journal of Educational and Behavioral Statistics, 2024
A test battery with two different levels of adaptation is presented: a within-subtest level for the selection of the items in the subtests and a between-subtest level to move from one subtest to the next. The battery runs on a two-level model consisting of a regular response model for each of the subtests extended with a second level for the joint…
Descriptors: Adaptive Testing, Test Construction, Test Format, Test Reliability
Peer reviewed
PDF on ERIC Download full text
Bulut, Hatice Cigdem – International Journal of Assessment Tools in Education, 2021
Several studies have been published on disengaged test respondents, and others have analyzed disengaged survey respondents separately. For many large-scale assessments, students answer questionnaire and test items in succession. This study examines the percentage of students who continuously engage in disengaged responding behaviors across…
Descriptors: Reaction Time, Response Style (Tests), Foreign Countries, International Assessment
Peer reviewed
PDF on ERIC Download full text
Courey, Karyssa A.; Lee, Michael D. – AERA Open, 2021
Student evaluations of teaching are widely used to assess instructors and courses. Using a model-based approach and Bayesian methods, we examine how the direction of the scale, labels on scales, and the number of options affect the ratings. We conduct a within-participants experiment in which respondents evaluate instructors and lectures using…
Descriptors: Student Evaluation of Teacher Performance, Rating Scales, Response Style (Tests), College Students
Peer reviewed
Direct link
Jacobs, Laura; Loosveldt, Geert; Beullens, Koen – Field Methods, 2020
A growing body of literature points to the possibilities offered by postsurvey interviewer observations as a source of paradata to obtain insights into data quality. However, their utility in predicting actual behavior of respondents has attracted limited scholarly attention so far. Using data from Round 7 of the European Social Survey, we aim to…
Descriptors: Interviews, Accuracy, Observation, Response Style (Tests)
Peer reviewed
Direct link
Cannon, Edmund; Cipriani, Giam Pietro – Assessment & Evaluation in Higher Education, 2022
Student evaluations of teaching may be subject to halo effects, where answers to one question are contaminated by answers to the other questions. Quantifying halo effects is difficult since correlation between answers may be due to underlying correlation of the items being tested. We use a novel identification procedure to test for a halo effect…
Descriptors: Student Evaluation of Teacher Performance, Bias, Response Style (Tests), Foreign Countries
Peer reviewed
Direct link
He, Qingping; Meadows, Michelle; Black, Beth – Research Papers in Education, 2022
A potential negative consequence of high-stakes testing is inappropriate test behaviour involving individuals and/or institutions. Inappropriate test behaviour and test collusion can result in aberrant response patterns and anomalous test scores and invalidate the intended interpretation and use of test results. A variety of statistical techniques…
Descriptors: Statistical Analysis, High Stakes Tests, Scores, Response Style (Tests)
Peer reviewed
PDF on ERIC Download full text
Herwin, Herwin; Dahalan, Shakila Che – Pegem Journal of Education and Instruction, 2022
This study aims to analyze and describe the response patterns of school exam participants based on the person fit method. This research is a quantitative study with a focus on research on social science elementary school examinations as many as 15 multiple choice items and 137 participant answer sheets. Data collection techniques were carried out…
Descriptors: Response Style (Tests), Multiple Choice Tests, Emotional Response, Psychological Patterns
Peer reviewed
Direct link
Leventhal, Brian C.; Gregg, Nikole; Ames, Allison J. – Measurement: Interdisciplinary Research and Perspectives, 2022
Response styles introduce construct-irrelevant variance as a result of respondents systematically responding to Likert-type items regardless of content. Methods to account for response styles through data analysis as well as approaches to mitigating the effects of response styles during data collection have been well-documented. Recent approaches…
Descriptors: Response Style (Tests), Item Response Theory, Test Items, Likert Scales
Peer reviewed
Direct link
Faran, Yifat; Zanbar, Lea – International Journal of Social Research Methodology, 2019
The present study is the first to examine empirically whether required fields in online surveys impair reliability and response pattern, as participants forced to respond to all items may provide arbitrary answers. Two hundred and thirteen participants completed a survey consisting of six questionnaires testing personal and social issues and…
Descriptors: Online Surveys, Test Reliability, Response Style (Tests), Questionnaires
Peer reviewed
Direct link
Esther Ulitzsch; Steffi Pohl; Lale Khorramdel; Ulf Kroehne; Matthias von Davier – Journal of Educational and Behavioral Statistics, 2024
Questionnaires are by far the most common tool for measuring noncognitive constructs in psychology and educational sciences. Response bias may pose an additional source of variation between respondents that threatens validity of conclusions drawn from questionnaire data. We present a mixture modeling approach that leverages response time data from…
Descriptors: Item Response Theory, Response Style (Tests), Questionnaires, Secondary School Students
Peer reviewed
Direct link
Clariana, Roy B.; Park, Eunsung – Educational Technology Research and Development, 2021
Cognitive and metacognitive processes during learning depend on accurate monitoring; this investigation examines the influence of immediate item-level knowledge of correct response feedback on cognition monitoring accuracy. In an optional end-of-course computer-based review lesson, participants (n = 68) were randomly assigned to groups to receive…
Descriptors: Feedback (Response), Cognitive Processes, Accuracy, Difficulty Level