Showing 151 to 165 of 1,389 results
Peer reviewed
Pastor, Dena A.; Ong, Thai Q.; Strickman, Scott N. – Educational Assessment, 2019
The trustworthiness of low-stakes assessment results largely depends on examinee effort, which can be measured by the amount of time examinees devote to items using solution behavior (SB) indices. Because SB indices are calculated for each item, they can be used to understand how examinee motivation changes across items within a test. Latent class…
Descriptors: Behavior Patterns, Test Items, Time, Response Style (Tests)
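The solution-behavior (SB) indices described above classify each response as solution behavior or rapid guessing by comparing its response time to an item-level threshold. A minimal sketch, assuming a simple fixed threshold (real applications derive thresholds per item); the function name, times, and the 5-second cutoff are illustrative, not from the study:

```python
# Hypothetical sketch: per-item SB index = fraction of examinees whose
# response time meets or exceeds a rapid-guessing threshold.
def sb_index(response_times, threshold):
    """Return the proportion of examinees showing solution behavior."""
    if not response_times:
        raise ValueError("response_times must be non-empty")
    solution = sum(1 for rt in response_times if rt >= threshold)
    return solution / len(response_times)

# Illustrative times (seconds) on one item; 5-second threshold is assumed.
times = [12.4, 3.1, 8.7, 1.9, 15.0, 6.2]
print(round(sb_index(times, 5.0), 2))  # 4 of 6 meet the threshold -> 0.67
```

Computing this index item by item, as the abstract suggests, yields a profile of examinee effort across the test.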
Peer reviewed
Sachse, Karoline A.; Mahler, Nicole; Pohl, Steffi – Educational and Psychological Measurement, 2019
Mechanisms causing item nonresponses in large-scale assessments are often said to be nonignorable. Parameter estimates can be biased if nonignorable missing data mechanisms are not adequately modeled. In trend analyses, it is plausible for the missing data mechanism and the percentage of missing values to change over time. In this article, we…
Descriptors: International Assessment, Response Style (Tests), Achievement Tests, Foreign Countries
Peer reviewed
Cetin-Berber, Dee Duygu; Sari, Halil Ibrahim; Huggins-Manley, Anne Corinne – Educational and Psychological Measurement, 2019
Routing examinees to modules based on their ability level is a very important aspect in computerized adaptive multistage testing. However, the presence of missing responses may complicate estimation of examinee ability, which may result in misrouting of individuals. Therefore, missing responses should be handled carefully. This study investigated…
Descriptors: Computer Assisted Testing, Adaptive Testing, Error of Measurement, Research Problems
Peer reviewed
Yousefi-Nooraie, Reza; Marin, Alexandra; Hanneman, Robert; Pullenayegum, Eleanor; Lohfeld, Lynne; Dobbins, Maureen – Sociological Methods & Research, 2019
Using randomly ordered name generators, we tested the effect of name generators' relative position on the likelihood of respondents' declining to respond or satisficing in their response. An online survey of public health staff elicited names of information sources, information seekers, perceived experts, and friends. Results show that when name…
Descriptors: Online Surveys, Response Style (Tests), Test Format, Health Personnel
Peer reviewed
Vijver, Fons J. R. – Educational Measurement: Issues and Practice, 2018
A conceptual framework of measurement bias in cross-cultural comparisons, distinguishing between construct, method, and item bias (differential item functioning), is used to describe a methodological framework addressing assessment of noncognitive variables in international large-scale studies. It is argued that the treatment of bias, coming from…
Descriptors: Educational Assessment, Achievement Tests, Foreign Countries, International Assessment
Peer reviewed
Shukla, Kathan; Konold, Timothy – Journal of Experimental Education, 2018
Insincere respondents can have an adverse impact on the validity of substantive inferences arising from self-administered questionnaires (SAQs). The current study introduces a new method for identifying potentially invalid respondents from their atypical response patterns. The two-step procedure involves generating a response inconsistency (RI)…
Descriptors: Measurement Techniques, Validity, Response Style (Tests), Reliability
Peer reviewed
Plieninger, Hansjörg – Educational and Psychological Measurement, 2017
Even though there is an increasing interest in response styles, the field lacks a systematic investigation of the bias that response styles potentially cause. Therefore, a simulation was carried out to study this phenomenon with a focus on applied settings (reliability, validity, scale scores). The influence of acquiescence and extreme response…
Descriptors: Response Style (Tests), Test Bias, Item Response Theory, Correlation
Peer reviewed
Halpin, Peter F. – Measurement: Interdisciplinary Research and Perspectives, 2017
The target paper, "Rethinking Traditional Methods of Survey Validation" (Andrew Maul), raises some interesting critical ideas, both old and new, about the validation of self-report surveys. As indicated by Dr. Maul, recent policy initiatives in the United States (e.g., ESSA) have led to a demand for assessments of…
Descriptors: Self Evaluation (Individuals), Evaluation Methods, Measurement Techniques, Response Style (Tests)
Peer reviewed
Valencia, Edgar – Assessment & Evaluation in Higher Education, 2020
The validity of student evaluation of teaching (SET) scores depends on the minimal influence of extraneous response processes or biases. A bias may increase or decrease scores and change the relationship with other variables. In contrast, SET literature defines bias as an irrelevant variable correlated with SET scores, and among many, a relevant biasing…
Descriptors: Student Evaluation of Teacher Performance, College Faculty, Student Attitudes, Gender Bias
Peer reviewed
Kim, Nana; Bolt, Daniel M. – Educational and Psychological Measurement, 2021
This paper presents a mixture item response tree (IRTree) model for extreme response style. Unlike traditional applications of single IRTree models, a mixture approach provides a way of representing the mixture of respondents following different underlying response processes (between individuals), as well as the uncertainty present at the…
Descriptors: Item Response Theory, Response Style (Tests), Models, Test Items
Peer reviewed
PDF on ERIC
Hedlefs-Aguilar, Maria Isolde; Morales-Martinez, Guadalupe Elizabeth; Villarreal-Lozano, Ricardo Jesus; Moreno-Rodriguez, Claudia; Gonzalez-Rodriguez, Erick Alejandro – European Journal of Educational Research, 2021
This study explored the cognitive mechanism behind information integration in the test anxiety judgments in 140 engineering students. An experiment was designed to test four factors combined (test goal orientation, test cognitive functioning level, test difficulty and test mode). The experimental task required participants to read 36 scenarios,…
Descriptors: Test Anxiety, Engineering Education, Algebra, College Students
Peer reviewed
Kolski, Tammi; Weible, Jennifer L. – Community College Journal of Research and Practice, 2019
eLearning instruction has become an accepted means of delivering a quality education to higher education students, with community college online learning enrollment rates rising annually. Consistent with the desires of eLearning students for convenience and flexibility, educators utilize virtual proctored exams to safeguard against academic…
Descriptors: Community Colleges, Two Year College Students, College Students, Student Behavior
Peer reviewed
von Davier, Matthias – Quality Assurance in Education: An International Perspective, 2018
Purpose: Surveys that include skill measures may suffer from additional sources of error compared to those containing questionnaires alone. Examples are distractions such as noise or interruptions of testing sessions, as well as fatigue or lack of motivation to succeed. This paper aims to provide a review of statistical tools based on latent…
Descriptors: Statistical Analysis, Surveys, International Assessment, Error Patterns
Peer reviewed
Guo, Hongwen; Ercikan, Kadriye – Educational Research and Evaluation, 2020
Rapid response behaviour, a type of test disengagement, cannot be interpreted as a true indicator of the targeted constructs and may compromise score accuracy as well as score validity for interpretation. Rapid responding may be due to multiple factors for diverse populations. In this study, using Programme for International Student Assessment…
Descriptors: Response Style (Tests), Foreign Countries, International Assessment, Achievement Tests
Peer reviewed
Harbaugh, Allen G.; Liu, Min – AERA Online Paper Repository, 2017
This research examines the effects of nonattending response pattern contamination and select response style patterns on measures of model fit (CFI) and internal reliability (Cronbach's [alpha]). A simulation study examines the effects resulting from percentage of contamination, number of manifest items measured and sample size. Initial results…
Descriptors: Factor Analysis, Response Style (Tests), Goodness of Fit, Test Reliability