Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 2 |
Since 2016 (last 10 years) | 3 |
Since 2006 (last 20 years) | 6 |
Descriptor
Models | 6 |
Statistical Analysis | 6 |
Item Response Theory | 3 |
Classification | 2 |
Computation | 2 |
Goodness of Fit | 2 |
Maximum Likelihood Statistics | 2 |
Test Items | 2 |
Accuracy | 1 |
Bayesian Statistics | 1 |
Cognitive Measurement | 1 |
Source
Educational and Psychological Measurement | 2 |
Journal of Educational and Behavioral Statistics | 2 |
Grantee Submission | 1 |
International Journal of Testing | 1 |
Author
Wang, Chun | 6 |
Chang, Hua-Hua | 1 |
Cho, April E. | 1 |
Douglas, Jeffrey A. | 1 |
Fan, Zhewen | 1 |
Huebner, Alan | 1 |
Lu, Jing | 1 |
Su, Shiyang | 1 |
Tay, Louis | 1 |
Vermunt, Jeroen K. | 1 |
Weiss, David J. | 1 |
Publication Type
Reports - Research | 6 |
Journal Articles | 5 |
Wang, Chun; Lu, Jing – Journal of Educational and Behavioral Statistics, 2021
In cognitive diagnostic assessment, multiple fine-grained attributes are measured simultaneously. Attribute hierarchies are considered important structural features of cognitive diagnostic models (CDMs) that provide useful information about the nature of attributes. Templin and Bradshaw first introduced a hierarchical diagnostic classification…
Descriptors: Cognitive Measurement, Models, Vertical Organization, Classification
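The abstract's central idea, an attribute hierarchy restricting which attribute profiles are admissible, can be illustrated with a small sketch. The hierarchy and mastery profiles below are hypothetical and not taken from Templin and Bradshaw's model; the point is only how a prerequisite structure shrinks the profile space a CDM has to consider.

```python
# Minimal sketch (not the authors' model): enumerate the attribute profiles
# permitted by a linear hierarchy A1 -> A2 -> A3, where an attribute can only
# be mastered if its prerequisite is mastered.
from itertools import product

# prerequisites[a] lists attributes that must be mastered before attribute a
prerequisites = {0: [], 1: [0], 2: [1]}  # linear hierarchy: A1 -> A2 -> A3

def is_permissible(profile):
    """A profile (tuple of 0/1 mastery flags) respects the hierarchy if every
    mastered attribute has all of its prerequisites mastered as well."""
    return all(
        all(profile[p] == 1 for p in prerequisites[a])
        for a, mastered in enumerate(profile) if mastered
    )

all_profiles = list(product([0, 1], repeat=3))        # 2^3 = 8 without a hierarchy
permitted = [p for p in all_profiles if is_permissible(p)]
print(permitted)  # [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)] -> only 4 remain
```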
Su, Shiyang; Wang, Chun; Weiss, David J. – Educational and Psychological Measurement, 2021
S-X² is a popular item fit index that is available in commercial software packages such as flexMIRT. However, no research has systematically examined the performance of S-X² for detecting item misfit within the context of the multidimensional graded response model (MGRM). The primary goal of this study was…
Descriptors: Statistics, Goodness of Fit, Test Items, Models
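For readers unfamiliar with S-X², it is a score-group-based chi-square comparison of observed and model-expected item performance. The sketch below illustrates that general idea only; it is not the exact Orlando-Thissen computation (which uses model-implied summed-score distributions), and the data and grouping scheme are made up.

```python
# Simplified illustration of a score-group item-fit check in the spirit of
# S-X^2 (not the exact statistic): group examinees by rest score and compare
# observed item proportions with model-predicted proportions.
import numpy as np

def item_fit_chisq(responses, predicted, item, n_groups=5):
    """responses: (N, J) 0/1 matrix; predicted: (N,) model probability of a
    correct response to `item` for each examinee."""
    rest = responses.sum(axis=1) - responses[:, item]        # rest score
    edges = np.quantile(rest, np.linspace(0, 1, n_groups + 1))
    groups = np.clip(np.searchsorted(edges, rest, side="right") - 1, 0, n_groups - 1)
    stat = 0.0
    for g in range(n_groups):
        mask = groups == g
        n_g = mask.sum()
        if n_g == 0:
            continue
        obs = responses[mask, item].mean()                   # observed proportion correct
        exp = predicted[mask].mean()                         # model-expected proportion
        stat += n_g * (obs - exp) ** 2 / (exp * (1 - exp) + 1e-12)
    return stat

# toy data: 500 examinees, 10 items, predictions from a hypothetical fitted model
rng = np.random.default_rng(0)
true_p = rng.uniform(0.3, 0.9, size=(500, 10))
resp = rng.binomial(1, true_p)
print(item_fit_chisq(resp, true_p[:, 0], item=0))
```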
Cho, April E.; Wang, Chun; Zhang, Xue; Xu, Gongjun – Grantee Submission, 2020
Multidimensional Item Response Theory (MIRT) is widely used in assessment and evaluation of educational and psychological tests. It models individual response patterns by specifying the functional relationship between individuals' multiple latent traits and their responses to test items. One major challenge in parameter estimation in MIRT is that…
Descriptors: Item Response Theory, Mathematics, Statistical Inference, Maximum Likelihood Statistics
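The snippet's two points, that MIRT ties item responses to several latent traits through a functional form and that estimation requires integrating over those traits, can be made concrete with a small sketch. The multidimensional 2PL form and the quadrature grid below are standard illustrations, not the estimator proposed in this paper, and all item parameters are invented.

```python
# Minimal sketch: multidimensional 2PL item response function and the marginal
# likelihood of one response pattern, approximated by rectangular quadrature
# over a 2-dimensional standard normal prior -- the integral whose dimension
# makes MIRT estimation costly.
import numpy as np

def m2pl_prob(theta, a, b):
    """P(correct) under a multidimensional 2PL: logistic(a . theta - b)."""
    return 1.0 / (1.0 + np.exp(-(theta @ a - b)))

# hypothetical item parameters: 4 items, 2 latent dimensions
A = np.array([[1.2, 0.0], [0.8, 0.5], [0.0, 1.1], [0.6, 0.9]])
B = np.array([-0.3, 0.2, 0.5, 0.0])
pattern = np.array([1, 0, 1, 1])              # one examinee's responses

# quadrature grid over a standard normal prior on the two traits
nodes = np.linspace(-4, 4, 31)
X, Y = np.meshgrid(nodes, nodes)
grid = np.column_stack([X.ravel(), Y.ravel()])
prior = np.exp(-0.5 * (grid ** 2).sum(axis=1)) / (2 * np.pi)
weights = prior * (nodes[1] - nodes[0]) ** 2

# likelihood of the pattern at every grid point, then marginalize over traits
P = np.array([m2pl_prob(grid, A[j], B[j]) for j in range(len(B))]).T   # (grid, items)
lik = np.prod(np.where(pattern == 1, P, 1 - P), axis=1)
print("marginal likelihood of the pattern:", float(lik @ weights))
```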
Tay, Louis; Vermunt, Jeroen K.; Wang, Chun – International Journal of Testing, 2013
We evaluate the item response theory with covariates (IRT-C) procedure for assessing differential item functioning (DIF) without preknowledge of anchor items (Tay, Newman, & Vermunt, 2011). This procedure begins with a fully constrained baseline model, and candidate items are tested for uniform and/or nonuniform DIF using the Wald statistic.…
Descriptors: Item Response Theory, Test Bias, Models, Statistical Analysis
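The procedure described here tests candidate items against a constrained baseline model with a Wald statistic. The sketch below shows only the generic Wald-test step, not the IRT-C implementation: it assumes a fitted model has already produced an estimate of an item's group-difference (DIF) parameters and their covariance, and the numbers used are made up.

```python
# Generic Wald-test sketch (not the IRT-C implementation): given an estimated
# DIF parameter vector (e.g., group differences in an item's difficulty and
# slope) and its estimated covariance, compute the Wald statistic and a
# chi-square p-value.
import numpy as np
from scipy.stats import chi2

def wald_test(estimate, cov):
    """W = est' * inv(cov) * est, compared with chi-square on len(est) df."""
    estimate = np.atleast_1d(estimate)
    cov = np.atleast_2d(cov)
    W = float(estimate @ np.linalg.solve(cov, estimate))
    return W, chi2.sf(W, estimate.size)

# hypothetical output from a fitted model: uniform + nonuniform DIF effects
est = np.array([0.40, 0.15])                  # difficulty shift, slope shift
cov = np.array([[0.020, 0.004], [0.004, 0.010]])
W, p = wald_test(est, cov)
print(f"Wald statistic = {W:.2f}, p = {p:.4f}")
```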
Wang, Chun; Fan, Zhewen; Chang, Hua-Hua; Douglas, Jeffrey A. – Journal of Educational and Behavioral Statistics, 2013
The item response times (RTs) collected from computerized testing represent an underutilized type of information about items and examinees. In addition to knowing the examinees' responses to each item, we can investigate the amount of time examinees spend on each item. Current models for RTs mainly focus on parametric models, which have the…
Descriptors: Reaction Time, Computer Assisted Testing, Test Items, Accuracy
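As a point of reference for the parametric RT models the abstract mentions, the sketch below uses a lognormal response-time model (in the spirit of van der Linden's 2006 formulation) with person speed, item time intensity, and item time discrimination parameters. It illustrates that parametric baseline, not necessarily the model developed in this article, and all parameter values are simulated.

```python
# Hedged sketch of one common parametric response-time model: log RT for
# person j on item i is normal with mean beta_i - tau_j and precision alpha_i^2.
import numpy as np

rng = np.random.default_rng(1)
n_persons, n_items = 200, 5
tau = rng.normal(0.0, 0.3, n_persons)         # person speed
beta = rng.normal(1.0, 0.2, n_items)          # item time intensity (log-seconds)
alpha = rng.uniform(1.5, 2.5, n_items)        # item time discrimination

# simulate response times: log T_ij ~ N(beta_i - tau_j, 1/alpha_i^2)
mu = beta[None, :] - tau[:, None]             # (persons, items)
log_t = rng.normal(mu, 1.0 / alpha[None, :])
rt = np.exp(log_t)

def loglik(rt, tau, beta, alpha):
    """Log-likelihood of observed RTs under the lognormal RT model."""
    z = alpha[None, :] * (np.log(rt) - (beta[None, :] - tau[:, None]))
    return float(np.sum(np.log(alpha[None, :]) - np.log(rt)
                        - 0.5 * np.log(2 * np.pi) - 0.5 * z ** 2))

print("log-likelihood at the generating values:", round(loglik(rt, tau, beta, alpha), 1))
```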
Huebner, Alan; Wang, Chun – Educational and Psychological Measurement, 2011
Cognitive diagnosis models have received much attention in the recent psychometric literature because of their potential to provide examinees with information regarding multiple fine-grained discretely defined skills, or attributes. This article discusses the issue of methods of examinee classification for cognitive diagnosis models, which are…
Descriptors: Classification, Diagnostic Tests, Thinking Skills, Models
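One common way to classify examinees in a cognitive diagnosis model is maximum a posteriori (MAP) assignment of attribute profiles. The sketch below illustrates MAP classification under a DINA model with a made-up Q-matrix, slip, and guessing parameters; it shows one classification method as an example, not a summary of the comparisons this article makes.

```python
# Illustrative MAP classification under a DINA model: score each candidate
# attribute profile by prior x likelihood of the observed responses and
# report the best-scoring profile.
import numpy as np
from itertools import product

Q = np.array([[1, 0], [0, 1], [1, 1]])        # items x attributes Q-matrix
slip = np.array([0.10, 0.15, 0.20])           # P(incorrect | has all required attributes)
guess = np.array([0.20, 0.25, 0.15])          # P(correct | lacks a required attribute)
responses = np.array([1, 0, 1])               # one examinee's item scores

profiles = np.array(list(product([0, 1], repeat=Q.shape[1])))
prior = np.full(len(profiles), 1.0 / len(profiles))   # uniform prior over profiles

def likelihood(profile):
    eta = np.all(Q <= profile, axis=1)        # does the profile cover each item's requirements?
    p_correct = np.where(eta, 1 - slip, guess)
    return np.prod(np.where(responses == 1, p_correct, 1 - p_correct))

post = np.array([likelihood(a) for a in profiles]) * prior
post /= post.sum()
best = profiles[np.argmax(post)]
print("MAP attribute profile:", best, "posterior:", round(post.max(), 3))
```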