Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 1
Since 2006 (last 20 years): 5
Descriptor
Test Items: 7
Item Response Theory: 6
Foreign Countries: 4
Difficulty Level: 3
Statistical Analysis: 3
Elementary School Students: 2
Guessing (Tests): 2
Multiple Choice Tests: 2
Test Bias: 2
Test Theory: 2
Adaptive Testing: 1
Author
Andrich, David: 7
Hagquist, Curt: 2
Marais, Ida: 2
Humphry, Stephen: 1
Humphry, Stephen Mark: 1
Kreiner, Svend: 1
Styles, Irene: 1
Publication Type
Journal Articles: 6
Reports - Research: 5
Reports - Descriptive: 2
Speeches/Meeting Papers: 2
Education Level
Elementary Education: 1
Audience
Researchers: 1
Assessments and Surveys
Raven Advanced Progressive…: 1
Raven Progressive Matrices: 1
Andrich, David; Hagquist, Curt – Educational and Psychological Measurement, 2015
Differential item functioning (DIF) for an item between two groups is present if, for the same person location on a variable, persons from different groups have different expected values for their responses. In the popular Mantel-Haenszel (MH) method for detecting DIF, which applies only to dichotomously scored items, persons are classified by…
Descriptors: Test Bias, Test Items, Item Response Theory, Statistical Analysis
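To make the DIF definition above concrete, here is a minimal sketch (my own illustration, not material from the article): under a dichotomous Rasch model, DIF appears when the same person location yields different expected item scores depending on group membership, for example because the item's effective difficulty is assumed to shift for one group.

# Illustration only: DIF under a dichotomous Rasch model, where the item
# difficulty (delta) is assumed to differ between groups.
import math

def rasch_expected(theta, delta):
    """Expected score (success probability) for a dichotomous Rasch item."""
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

theta = 0.5                # same person location for both groups
delta_reference = 0.0      # difficulty as experienced by the reference group
delta_focal = 0.6          # hypothetical shift for the focal group => DIF

print(rasch_expected(theta, delta_reference))  # ~0.62
print(rasch_expected(theta, delta_focal))      # ~0.48: different expected value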
Andrich, David; Marais, Ida; Humphry, Stephen Mark – Educational and Psychological Measurement, 2016
Recent research has shown how the statistical bias in Rasch model difficulty estimates induced by guessing in multiple-choice items can be eliminated. Using vertical scaling of a high-profile national reading test, it is shown that the dominant effect of removing such bias is a nonlinear change in the unit of scale across the continuum. The…
Descriptors: Guessing (Tests), Statistical Bias, Item Response Theory, Multiple Choice Tests
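As a hedged illustration of why guessing biases Rasch difficulty estimates (a generic 3PL-style sketch, not the authors' correction procedure): a lower asymptote c lifts success probabilities above the Rasch curve, most strongly for low-proficiency persons, so a pure Rasch fit makes multiple-choice items appear easier than they are.

# Generic sketch: Rasch item response function versus a guessing-affected
# curve with lower asymptote c (e.g. c = 0.25 for four-option items).
import math

def rasch_p(theta, delta):
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

def p_with_guessing(theta, delta, c=0.25):
    return c + (1.0 - c) * rasch_p(theta, delta)

for theta in (-2.0, 0.0, 2.0):
    print(theta, round(rasch_p(theta, 1.0), 3), round(p_with_guessing(theta, 1.0), 3))
# The gap is largest at low theta, which is what distorts difficulty
# estimates when the Rasch model is fitted to such responses.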
Andrich, David; Hagquist, Curt – Journal of Educational and Behavioral Statistics, 2012
The literature in modern test theory on procedures for identifying items with differential item functioning (DIF) among two groups of persons includes the Mantel-Haenszel (MH) procedure. Generally, it is not recognized explicitly that if there is real DIF in some items which favor one group, then as an artifact of this procedure, artificial DIF…
Descriptors: Test Bias, Test Items, Item Response Theory, Statistical Analysis
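For readers unfamiliar with the MH procedure named above, a minimal sketch of the Mantel-Haenszel common odds ratio follows (toy counts and my own cell labelling, not code from the article): persons are stratified by total score, and a common odds ratio far from 1 flags DIF in the studied item.

# Mantel-Haenszel common odds ratio across score strata (toy data only).
# Each stratum is a 2x2 table (A, B, C, D) =
# (reference correct, reference incorrect, focal correct, focal incorrect).

def mh_odds_ratio(tables):
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

toy_tables = [
    (30, 10, 25, 15),   # stratum of low total scores
    (45, 5, 40, 10),    # stratum of high total scores
]
print(mh_odds_ratio(toy_tables))  # ~1.98: the studied item favours the reference group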
Andrich, David; Kreiner, Svend – Applied Psychological Measurement, 2010
Models of modern test theory imply statistical independence among responses, generally referred to as "local independence." One violation of local independence occurs when the response to one item governs the response to a subsequent item. Expanding on a formulation of this kind of violation as a process in the dichotomous Rasch model,…
Descriptors: Test Theory, Item Response Theory, Test Items, Correlation
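To illustrate the kind of violation described above (a purely hypothetical mechanism, not the authors' formulation): under local independence the responses to two items are independent given the person location theta; response dependence breaks this, for instance when answering item 1 correctly raises the probability of answering item 2 correctly.

# Hypothetical response-dependence mechanism (illustration only): item 2's
# success probability gets a boost of +b logits if item 1 was answered
# correctly, so the two responses are no longer independent given theta.
import math, random

def rasch_p(theta, delta):
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

def p2_given_x1_correct(theta, delta1=0.0, delta2=0.5, b=1.0, n=100_000, seed=1):
    rng = random.Random(seed)
    x1_correct = both_correct = 0
    for _ in range(n):
        x1 = rng.random() < rasch_p(theta, delta1)
        p2 = rasch_p(theta, delta2 - (b if x1 else 0.0))  # dependence enters here
        x2 = rng.random() < p2
        x1_correct += x1
        both_correct += x1 and x2
    return both_correct / x1_correct

# Estimated P(item 2 correct | item 1 correct) versus the locally independent value:
print(round(p2_given_x1_correct(theta=0.0), 3), round(rasch_p(0.0, 0.5), 3))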
Andrich, David; Marais, Ida; Humphry, Stephen – Journal of Educational and Behavioral Statistics, 2012
Andersen (1995, 2002) proves a theorem relating variances of parameter estimates from samples and subsamples and shows its use as an adjunct to standard statistical analyses. The authors show an application where the theorem is central to the hypothesis tested, namely, whether random guessing on multiple-choice items affects their estimates in the…
Descriptors: Test Items, Item Response Theory, Multiple Choice Tests, Guessing (Tests)
Andrich, David – 1984
Both the attenuation paradox of traditional test theory and the assumption of local independence in person-item response theory have caused problems in interpretation. This paper demonstrates that the two are related concepts and, through this demonstration, clarifies both. It is then shown that the breakdown of local independence leads to…
Descriptors: Latent Trait Theory, Test Interpretation, Test Items, Test Reliability

Styles, Irene; Andrich, David – Educational and Psychological Measurement, 1993
This paper describes the use of the Rasch model to help implement computerized administration of the standard and advanced forms of Raven's Progressive Matrices (RPM), to compare relative item difficulties, and to convert scores between the standard and advanced forms. The sample consisted of 95 girls and 95 boys in Australia. (SLD)
Descriptors: Adaptive Testing, Computer Assisted Testing, Difficulty Level, Elementary Education
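As a hedged sketch of the score-conversion idea behind this study (invented item difficulties, not the RPM calibrations): once both forms are calibrated on a common Rasch scale, a raw score on one form can be mapped to a person estimate theta, and theta can be mapped to an expected raw score on the other form via its test characteristic curve.

# Raw-score conversion between two forms on a common Rasch scale
# (made-up difficulties; illustration only).
import math

def p(theta, delta):
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

def expected_raw_score(theta, deltas):
    return sum(p(theta, d) for d in deltas)

def theta_for_raw_score(raw, deltas, lo=-6.0, hi=6.0):
    """Invert the (monotone) test characteristic curve by bisection."""
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if expected_raw_score(mid, deltas) < raw:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

standard_form = [-1.5, -1.0, -0.5, 0.0, 0.5]   # easier form (invented difficulties)
advanced_form = [0.0, 0.5, 1.0, 1.5, 2.0]      # harder form (invented difficulties)

theta = theta_for_raw_score(4.0, standard_form)            # 4/5 on the standard form
print(round(theta, 2), round(expected_raw_score(theta, advanced_form), 2))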