Showing all 11 results
Peer reviewed
Jiaying Xiao; Chun Wang; Gongjun Xu – Grantee Submission, 2024
Accurate item parameters and standard errors (SEs) are crucial for many multidimensional item response theory (MIRT) applications. A recent study proposed the Gaussian Variational Expectation Maximization (GVEM) algorithm to improve computational efficiency and estimation accuracy (Cho et al., 2021). However, the SE estimation procedure has yet to…
Descriptors: Error of Measurement, Models, Evaluation Methods, Item Analysis
Jing Ouyang; Gongjun Xu – Grantee Submission, 2022
Latent class models with covariates are widely used for psychological, social, and educational research. Yet the fundamental identifiability issue of these models has not been fully addressed. Among the previous research on the identifiability of latent class models with covariates, Huang and Bandeen-Roche (Psychometrika 69:5-32, 2004) studied the…
Descriptors: Item Response Theory, Models, Identification, Psychological Studies
Yuqi Gu; Elena A. Erosheva; Gongjun Xu; David B. Dunson – Grantee Submission, 2023
Mixed Membership Models (MMMs) are a popular family of latent structure models for complex multivariate data. Instead of forcing each subject to belong to a single cluster, MMMs incorporate a vector of subject-specific weights characterizing partial membership across clusters. With this flexibility come challenges in uniquely identifying,…
Descriptors: Multivariate Analysis, Item Response Theory, Bayesian Statistics, Models
Peer reviewed
Weicong Lyu; Chun Wang; Gongjun Xu – Grantee Submission, 2024
Data harmonization is an emerging approach to strategically combining data from multiple independent studies, enabling researchers to address new research questions that no single contributing study can answer on its own. A fundamental psychometric challenge for data harmonization is to create commensurate measures for the constructs of interest across…
Descriptors: Data Analysis, Test Items, Psychometrics, Item Response Theory
Chenchen Ma; Jing Ouyang; Gongjun Xu – Grantee Submission, 2023
Cognitive Diagnosis Models (CDMs) are a special family of discrete latent variable models that are widely used in educational and psychological measurement. A key component of CDMs is the Q-matrix characterizing the dependence structure between the items and the latent attributes. Additionally, researchers also assume in many applications certain…
Descriptors: Psychological Evaluation, Clinical Diagnosis, Item Analysis, Algorithms
Tianci Liu; Chun Wang; Gongjun Xu – Grantee Submission, 2022
Multidimensional Item Response Theory (MIRT) is widely used in educational and psychological assessment and evaluation. With the increasing size of modern assessment data, many existing estimation methods become computationally demanding and hence are not scalable to big data, especially for the multidimensional three-parameter and…
Descriptors: Item Response Theory, Computation, Monte Carlo Methods, Algorithms
Peer reviewed
Chenchen Ma; Jing Ouyang; Chun Wang; Gongjun Xu – Grantee Submission, 2024
Survey instruments and assessments are frequently used in many domains of social science. When the constructs that these assessments try to measure become multifaceted, multidimensional item response theory (MIRT) provides a unified framework and convenient statistical tool for item analysis, calibration, and scoring. However, the computational…
Descriptors: Algorithms, Item Response Theory, Scoring, Accuracy
April E. Cho; Jiaying Xiao; Chun Wang; Gongjun Xu – Grantee Submission, 2022
Item factor analysis (IFA), also known as Multidimensional Item Response Theory (MIRT), is a general framework for specifying the functional relationship between a respondent's multiple latent traits and their response to assessment items. The key element in MIRT is the relationship between the items and the latent traits, so-called item factor…
Descriptors: Factor Analysis, Item Response Theory, Mathematics, Computation
Chun Wang; Ruoyi Zhu; Gongjun Xu – Grantee Submission, 2022
Differential item functioning (DIF) analysis refers to procedures that evaluate whether an item's characteristics differ across groups of persons after controlling for overall differences in performance. DIF is routinely evaluated as a screening step to ensure that items behave the same across groups. Currently, the majority of DIF studies focus…
Descriptors: Models, Item Response Theory, Item Analysis, Comparative Analysis
Peer reviewed
Sainan Xu; Jing Lu; Jiwei Zhang; Chun Wang; Gongjun Xu – Grantee Submission, 2024
With the growing attention to large-scale educational testing and assessment, the ability to process substantial volumes of response data becomes crucial. Current estimation methods within item response theory (IRT), despite their high precision, often pose considerable computational burdens with large-scale data, leading to reduced computational…
Descriptors: Educational Assessment, Bayesian Statistics, Statistical Inference, Item Response Theory
Peer reviewed
Chengyu Cui; Chun Wang; Gongjun Xu – Grantee Submission, 2024
Multidimensional item response theory (MIRT) models have generated increasing interest in the psychometrics literature. Efficient approaches for estimating MIRT models with dichotomous responses have been developed, but constructing an equally efficient and robust algorithm for polytomous models has received limited attention. To address this gap,…
Descriptors: Item Response Theory, Accuracy, Simulation, Psychometrics