Lee, HyeSun; Smith, Weldon; Martinez, Angel; Ferris, Heather; Bova, Joe – Applied Measurement in Education, 2021
The aim of the current research was to provide recommendations to facilitate the development and use of anchoring vignettes (AVs) for cross-cultural comparisons in education. Study 1 identified six factors leading to order violations and ties in AV responses based on cognitive interviews with 15-year-old students. The factors were categorized into…
Descriptors: Vignettes, Test Items, Equated Scores, Nonparametric Statistics
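The order violations and ties this abstract refers to can be counted mechanically once the vignettes' intended ordering is known. A minimal sketch (the function name and sample ratings are illustrative, not from the study):

```python
def count_violations_and_ties(ratings):
    """Count order violations and ties in one respondent's
    anchoring-vignette ratings, assuming `ratings` lists the
    vignettes from lowest to highest intended level."""
    pairs = list(zip(ratings, ratings[1:]))
    violations = sum(1 for a, b in pairs if a > b)  # rated out of order
    ties = sum(1 for a, b in pairs if a == b)       # rated equal
    return violations, ties

# Three vignettes rated on a 1-5 scale:
print(count_violations_and_ties([2, 4, 3]))  # (1, 0): one violation
print(count_violations_and_ties([2, 2, 5]))  # (0, 1): one tie
```

Respondents with many violations or ties are the cases the vignette-development recommendations aim to reduce.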
Vonkova, Hana; Hrabak, Jan; Kralova, Katerina; Papajoanu, Ondrej – Field Methods, 2021
Self-assessment measures are commonly used in questionnaire surveys. However, one of the problems with self-reports is that they may be prone to differences in scale usage among respondents. The anchoring vignette method addresses this issue. It relies on two assumptions: response consistency and vignette equivalence. Here we aim to develop a…
Descriptors: Vignettes, Interviews, Self Evaluation (Individuals), Reliability
Ulitzsch, Esther; Penk, Christiane; von Davier, Matthias; Pohl, Steffi – Educational Assessment, 2021
Identifying and considering test-taking effort is of utmost importance for drawing valid inferences on examinee competency in low-stakes tests. Different approaches exist for doing so. The speed-accuracy+engagement model aims at identifying non-effortful test-taking behavior in terms of nonresponse and rapid guessing based on responses and…
Descriptors: Response Style (Tests), Guessing (Tests), Reaction Time, Measurement Techniques
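Rapid-guessing identification of the kind this abstract describes is often operationalized, in its simplest form, as a response-time threshold rule. A minimal sketch (the fixed threshold and data are illustrative; the speed-accuracy+engagement model itself is model-based, not a fixed cutoff):

```python
def flag_rapid_guesses(times, threshold=3.0):
    """Flag responses faster than `threshold` seconds as rapid
    guesses; a simple stand-in for model-based identification
    of non-effortful test-taking behavior."""
    return [t < threshold for t in times]

times = [1.2, 14.8, 2.9, 22.0]  # seconds per item
print(flag_rapid_guesses(times))  # [True, False, True, False]
```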
Stiglbauer, Barbara; Zuber, Julia – Educational Psychology, 2019
Regulatory focus is a strong predictor for a person's behaviour in signal detection tasks. While a promotion focus is related to a risky response strategy (hits, false alarms), a prevention focus is associated with a conservative strategy (correct rejections, misses). The present research is based on the assumption that multiple-choice (MC)…
Descriptors: Multiple Choice Tests, Response Style (Tests), Bias, Test Wiseness
The Impact of Students' Test-Taking Effort on Growth Estimates in Low-Stakes Educational Assessments
Yildirim-Erbasli, Seyma Nur; Bulut, Okan – Educational Research and Evaluation, 2020
This study investigated the impact of students' test-taking effort on their growth estimates in reading. The sample consisted of 7,602 students (Grades 1 to 4) in the United States who participated in the fall and spring administrations of a computer-based reading assessment. First, a new response dataset was created by flagging both…
Descriptors: Response Style (Tests), Reading Tests, Guessing (Tests), Reaction Time
Ames, Allison J.; Myers, Aaron J. – Educational and Psychological Measurement, 2021
Contamination of responses due to extreme and midpoint response style can confound the interpretation of scores, threatening the validity of inferences made from survey responses. This study incorporated person-level covariates in the multidimensional item response tree model to explain heterogeneity in response style. We include an empirical…
Descriptors: Response Style (Tests), Item Response Theory, Longitudinal Studies, Adolescents
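Item response tree (IRTree) models of the kind referenced here typically recode each Likert response into binary pseudo-items before modeling. A sketch for a 5-point scale (this node decomposition is one common parameterization, not necessarily the authors'):

```python
def irtree_recode(resp):
    """Recode a 5-point Likert response (1-5) into three pseudo-items:
    M = midpoint chosen, D = direction (agree side), E = extreme
    category chosen. None marks nodes not reached after a midpoint."""
    m = 1 if resp == 3 else 0
    d = None if m else (1 if resp > 3 else 0)
    e = None if m else (1 if resp in (1, 5) else 0)
    return {"M": m, "D": d, "E": e}

for r in range(1, 6):
    print(r, irtree_recode(r))
```

The M and E pseudo-items carry the midpoint and extreme response-style information that person-level covariates can then help explain.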
Scanlon, Paul J. – Field Methods, 2019
Web, or online, probing has the potential to supplement existing questionnaire design processes by providing structured cognitive data on a wider sample than typical qualitative-only question evaluation methods can achieve. One of the practical impediments to the further integration of web probing is the concern of survey managers about how the…
Descriptors: Online Surveys, Questionnaires, Response Style (Tests), Test Items
OECD Publishing, 2019
Computer-based administration of large-scale assessments makes it possible to collect a rich set of information on test takers, through analysis of the log files recording interactions between the computer interface and the server. This report examines timing and engagement indicators from the Survey of Adult Skills, a product of the Programme for…
Descriptors: Adults, Surveys, International Assessment, Responses
Babcock, Ben; Siegel, Zachary D. – Practical Assessment, Research & Evaluation, 2022
Research about repeated testing has revealed that retaking the same exam form generally does not advantage or disadvantage failing candidates in selected-response-style credentialing exams. Feinberg, Raymond, and Haist (2015) found a contributing factor to this phenomenon: people answering items incorrectly on both attempts give the same incorrect…
Descriptors: Multiple Choice Tests, Item Analysis, Test Items, Response Style (Tests)
Magraw-Mickelson, Zoe; Wang, Harry H.; Gollwitzer, Mario – International Journal of Testing, 2022
Much psychological research depends on participants' diligence in filling out materials such as surveys. However, not all participants are motivated to respond attentively, which leads to unintended issues with data quality, known as careless responding. Our question is: how do different modes of data collection--paper/pencil, computer/web-based,…
Descriptors: Response Style (Tests), Surveys, Data Collection, Test Format
Thompson, James J. – Measurement: Interdisciplinary Research and Perspectives, 2022
With the use of computerized testing, ordinary assessments can capture both answer accuracy and answer response time. For the Canadian Programme for the International Assessment of Adult Competencies (PIAAC) numeracy and literacy subtests, person ability, person speed, question difficulty, question time intensity, fluency (rate), person fluency…
Descriptors: Foreign Countries, Adults, Computer Assisted Testing, Network Analysis
Rios, Joseph A.; Guo, Hongwen – Applied Measurement in Education, 2020
The objective of this study was to evaluate whether differential noneffortful responding (identified via response latencies) was present in four countries administered a low-stakes college-level critical thinking assessment. Results indicated significant differences (as large as 0.90 "SD") between nearly all country pairings in the…
Descriptors: Response Style (Tests), Cultural Differences, Critical Thinking, Cognitive Tests
Ivanova, Militsa; Michaelides, Michalis; Eklöf, Hanna – Educational Research and Evaluation, 2020
Collecting process data in computer-based assessments provides opportunities to describe examinee behaviour during a test-taking session. The number of actions taken by students while interacting with an item is in this context a variable that has been gaining attention. The present study aims to investigate how the number of actions performed on…
Descriptors: Foreign Countries, Secondary School Students, Achievement Tests, International Assessment
Abbakumov, Dmitry; Desmet, Piet; Van den Noortgate, Wim – Applied Measurement in Education, 2020
Formative assessments are an important component of massive open online courses (MOOCs), online courses with open access and unlimited student participation. Accurate conclusions on students' proficiency via formative assessments, however, face several challenges: (a) students are typically allowed to make several attempts; and (b) student performance might…
Descriptors: Item Response Theory, Formative Evaluation, Online Courses, Response Style (Tests)
Zehner, Fabian; Harrison, Scott; Eichmann, Beate; Deribo, Tobias; Bengs, Daniel; Andersen, Nico; Hahnel, Carolin – International Educational Data Mining Society, 2020
The "2nd Annual WPI-UMASS-UPENN EDM Data Mining Challenge" required contestants to predict efficient testtaking based on log data. In this paper, we describe our theory-driven and psychometric modeling approach. For feature engineering, we employed the Log-Normal Response Time Model for estimating latent person speed, and the Generalized…
Descriptors: Data Analysis, Competition, Classification, Prediction
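The Log-Normal Response Time Model used for feature engineering above assumes log T_ij = beta_j - tau_i + error, where beta_j is item time intensity and tau_i is person speed. A minimal moment-based sketch (estimation by simple means over complete data; the challenge entry would use proper likelihood-based estimation):

```python
import math

def lnrt_estimates(times):
    """times: {person: {item: seconds}}, complete data assumed.
    Estimate item time intensity beta_j as the mean log time on
    item j, and person speed tau_i as the mean of beta_j - log T_ij
    (larger tau = faster person)."""
    persons = list(times)
    items = list(next(iter(times.values())))
    logs = {p: {j: math.log(times[p][j]) for j in items} for p in persons}
    beta = {j: sum(logs[p][j] for p in persons) / len(persons) for j in items}
    tau = {p: sum(beta[j] - logs[p][j] for j in items) / len(items)
           for p in persons}
    return beta, tau

# Person A works twice as fast as person B on both items:
beta, tau = lnrt_estimates({"A": {"q1": 5.0, "q2": 10.0},
                            "B": {"q1": 20.0, "q2": 40.0}})
print(round(tau["A"], 3), round(tau["B"], 3))  # 0.693 -0.693 (= ±ln 2)
```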