ERIC Number: ED637865
Record Type: Non-Journal
Publication Date: 2023
Pages: 133
Abstractor: As Provided
ISBN: 979-8-3801-5654-7
ISSN: N/A
EISSN: N/A
Using Item Response Models and Analysis to Address Practical Measurement Questions
Weicong Lyu
ProQuest LLC, Ph.D. Dissertation, The University of Wisconsin-Madison
Item response theory (IRT) is currently the dominant methodological paradigm in educational and psychological measurement. IRT models are based on assumptions about the relationship between latent traits and observed responses, so the accuracy of the methodology depends heavily on the reasonableness of these assumptions. This dissertation consists of three studies, each focusing on a scenario in which existing IRT models do not agree closely with reality and thus may provide misleading or insufficient characterizations of measurement phenomena. In the first study, I discuss anchoring, the tendency for respondents to select categories near the rating category used for the immediately preceding item in self-report rating scale assessments. I propose a psychometric model, based on a multidimensional nominal model for response style, that simultaneously accommodates a respondent-level anchoring tendency. Applying this model to a real dataset measuring extraversion, I find empirical support for attending to both anchoring and midpoint response styles as ways of assessing respondent engagement. In the second study, I examine the simultaneous relevance of content trait level and response styles as predictors of response time on noncognitive assessments, and the potential for omitted variable bias when either factor is ignored. Using response time data from several noncognitive assessments, I demonstrate how a multilevel model yields consistent findings that support the simultaneous relevance of both factors. The average effects of response styles consistently emerge as stronger than those of content traits, although they also show greater respondent-level variability. In the third study, I consider test items whose scores reflect sequential or IRTree modeling outcomes. For such items, I argue that item-specific factors, although not empirically measurable, are often present across stages of the same item. I propose a conceptual model that incorporates such factors and use it to demonstrate how they create ambiguity in the interpretation of item and person parameters beyond the first stage. Various empirical applications show patterns of violations of item parameter invariance across stages that are highly suggestive of item-specific factors. These studies apply recent advances in IRT modeling to practical measurement issues and will hopefully benefit both methodologists and practitioners. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone (800-521-0600). Web page: http://bibliotheek.ehb.be:2222/en-US/products/dissertations/individuals.shtml.]
Descriptors: Item Response Theory, Educational Assessment, Psychological Testing, Psychometrics, Response Style (Tests), Reaction Time, Evaluation Methods
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://bibliotheek.ehb.be:2222/en-US/products/dissertations/individuals.shtml
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: N/A
Audience: Researchers; Practitioners
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A