Guo, Hongwen; Lu, Ru; Johnson, Matthew S.; McCaffrey, Dan F. – ETS Research Report Series, 2022
It is desirable for an educational assessment to be constructed from items that can distinguish the different performance levels of test takers, so it is important to accurately estimate the item discrimination parameters in either classical test theory or item response theory. It is particularly challenging to do so when the sample sizes are…
Descriptors: Test Items, Item Response Theory, Item Analysis, Educational Assessment
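The small-sample problem this abstract points to is easy to see in simulation: with few examinees, maximum-likelihood estimates of the 2PL discrimination parameter are highly variable. The sketch below is illustrative only, not the report's method, and makes the simplifying assumption that abilities (theta) are known; operational IRT calibration instead marginalizes over ability.

```python
# Minimal sketch (assumption: abilities known) of 2PL item calibration,
# showing how the discrimination estimate a_hat destabilizes at small n.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def simulate_item(theta, a=1.2, b=0.3):
    """Draw 0/1 responses from a 2PL item: P(correct) = sigmoid(a*(theta - b))."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return (rng.random(theta.size) < p).astype(float)

def fit_2pl(theta, x):
    """Maximum-likelihood (a, b) for one item, treating abilities as known."""
    def nll(params):
        a, b = params
        p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
        p = np.clip(p, 1e-9, 1 - 1e-9)  # guard against log(0)
        return -np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))
    return minimize(nll, x0=[1.0, 0.0], method="Nelder-Mead").x

for n in (100, 1000):
    a_hats = []
    for _ in range(200):                      # 200 replications per sample size
        theta = rng.standard_normal(n)
        x = simulate_item(theta)
        a_hats.append(fit_2pl(theta, x)[0])
    print(f"n={n}: mean a_hat={np.mean(a_hats):.2f}, SD={np.std(a_hats):.2f}")
```

Running it shows the spread of a_hat shrinking roughly as 1/sqrt(n), which is why small calibration samples are the hard case.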
Guo, Hongwen; Rios, Joseph A.; Ling, Guangming; Wang, Zhen; Gu, Lin; Yang, Zhitong; Liu, Lydia O. – ETS Research Report Series, 2022
Different variants of the selected-response (SR) item type have been developed for various reasons (e.g., simulating realistic situations, examining critical-thinking and/or problem-solving skills). Generally, variants of the SR item format are more complex than traditional multiple-choice (MC) items, which may be more challenging to test…
Descriptors: Test Format, Test Wiseness, Test Items, Item Response Theory
Guo, Hongwen; Ercikan, Kadriye – ETS Research Report Series, 2021
In this report, we demonstrate the use of differential response time (DRT) methodology, an extension of differential item functioning methodology, for examining differences in how students from different backgrounds engage with assessment tasks. We analyze response time data from a digitally delivered mathematics assessment to examine timing…
Descriptors: Test Wiseness, English Language Learners, Reaction Time, Mathematics Tests
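To make the DRT idea concrete, a minimal analogue of observed-score DIF applied to timing data is sketched below: match examinees on overall speed, then compare groups' item-level log response times within matched strata. The stratification scheme and statistic here are illustrative assumptions, not the report's exact procedure.

```python
# Hedged sketch of a DRT-style comparison: weighted focal-minus-reference
# difference in log response time, within strata of total testing time.
import numpy as np

def drt_statistic(log_rt, group, total_time, n_strata=5):
    """group: 0 = reference, 1 = focal; total_time is the matching variable."""
    edges = np.quantile(total_time, np.linspace(0, 1, n_strata + 1))
    strata = np.clip(np.searchsorted(edges, total_time, side="right") - 1,
                     0, n_strata - 1)
    diffs, weights = [], []
    for s in range(n_strata):
        in_s = strata == s
        foc = log_rt[in_s & (group == 1)]
        ref = log_rt[in_s & (group == 0)]
        if len(foc) and len(ref):
            diffs.append(foc.mean() - ref.mean())
            weights.append(in_s.sum())
    return np.average(diffs, weights=weights)

# Example: simulated data where the focal group is slower on this item.
rng = np.random.default_rng(1)
n = 2000
group = rng.integers(0, 2, n)
total_time = rng.normal(60, 10, n)
log_rt = 3.0 + 0.05 * (total_time - 60) + 0.2 * group + rng.normal(0, 0.5, n)
print(f"DRT estimate: {drt_statistic(log_rt, group, total_time):.2f}")  # ~0.2
```

A nonzero estimate after matching suggests the groups spend different amounts of time on the item even at comparable overall speed, the timing analogue of DIF.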
Lu, Ru; Guo, Hongwen; Dorans, Neil J. – ETS Research Report Series, 2021
Two families of analysis methods can be used for differential item functioning (DIF) analysis. One family is DIF analysis based on observed scores, such as the Mantel-Haenszel (MH) and standardized proportion-correct DIF procedures; the other is analysis based on latent ability, in which the statistic is a measure of departure from…
Descriptors: Robustness (Statistics), Weighted Scores, Test Items, Item Analysis
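Of the observed-score family named in this abstract, the Mantel-Haenszel procedure is the most standard, so a minimal sketch follows: examinees are matched on total score, a common odds ratio is pooled across score strata, and the result is mapped to the ETS delta scale (MH D-DIF = -2.35 ln alpha_MH). The simulated data and the use of raw total score as the matching variable are illustrative choices.

```python
# Minimal sketch of observed-score DIF via the Mantel-Haenszel common
# odds ratio, matching on total score. Negative MH D-DIF indicates DIF
# against the focal group under the usual ETS convention.
import numpy as np

def mh_dif(correct, group, total_score):
    """MH common odds ratio and D-DIF for one item (group: 0=ref, 1=focal)."""
    num = den = 0.0
    for s in np.unique(total_score):          # one 2x2 table per score stratum
        m = total_score == s
        A = np.sum(m & (group == 0) & (correct == 1))  # reference, correct
        B = np.sum(m & (group == 0) & (correct == 0))  # reference, incorrect
        C = np.sum(m & (group == 1) & (correct == 1))  # focal, correct
        D = np.sum(m & (group == 1) & (correct == 0))  # focal, incorrect
        n_s = A + B + C + D
        if n_s:
            num += A * D / n_s
            den += B * C / n_s
    alpha = num / den
    return alpha, -2.35 * np.log(alpha)

# Example: a small simulated disadvantage for the focal group should yield
# alpha_MH above 1 and hence a negative MH D-DIF.
rng = np.random.default_rng(2)
n = 5000
group = rng.integers(0, 2, n)
total_score = rng.integers(0, 10, n)
p = 0.3 + 0.05 * total_score - 0.05 * group
correct = (rng.random(n) < p).astype(int)
alpha, d_dif = mh_dif(correct, group, total_score)
print(f"alpha_MH={alpha:.2f}, MH D-DIF={d_dif:.2f}")
```

The latent-ability family the abstract contrasts with this would instead compare item response functions estimated under an IRT model, rather than pooling observed 2x2 tables.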