Showing all 6 results
Peer reviewed
Direct link
Pan, Yiqin; Wollack, James A. – Journal of Educational Measurement, 2021
As technology has improved, item preknowledge has become a common concern in test security. The present study proposes an unsupervised-learning-based approach to detect compromised items. The approach involves three steps: (1) classify responses of each examinee as either…
Descriptors: Test Items, Cheating, Artificial Intelligence, Identification
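The abstract above is truncated, so the authors' actual three-step procedure is not recoverable here. As a loose, hypothetical illustration of the general idea (detecting compromised items from response data without labeled cheaters), the sketch below flags items that low-scoring examinees answer correctly far more often than other items, one simple signal consistent with item preknowledge. All function names and the z-score threshold are assumptions for illustration, not the paper's method.

```python
import statistics

def flag_suspect_items(responses, total_scores, z_crit=2.0):
    """Hypothetical preknowledge screen (not the authors' algorithm).

    responses: list of examinee response vectors (1 = correct, 0 = incorrect)
    total_scores: crude ability proxy (here, raw total score)
    Returns indices of items whose correct rate among low scorers is an
    outlier (z-score above z_crit) relative to the other items.
    """
    n_items = len(responses[0])
    median = statistics.median(total_scores)
    # restrict to the low-scoring half, where preknowledge stands out most
    low = [r for r, s in zip(responses, total_scores) if s <= median]
    rates = [sum(r[j] for r in low) / len(low) for j in range(n_items)]
    mu = statistics.mean(rates)
    sd = statistics.stdev(rates)
    return [j for j, p in enumerate(rates)
            if sd > 0 and (p - mu) / sd > z_crit]
```

For example, if five low-scoring examinees all answer only item 0 correctly while missing every other item, item 0 is flagged as suspect.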
Peer reviewed
PDF on ERIC
Gorney, Kylie; Wollack, James A. – Practical Assessment, Research & Evaluation, 2022
Unlike the traditional multiple-choice (MC) format, the discrete-option multiple-choice (DOMC) format does not necessarily reveal all answer options to an examinee. The purpose of this study was to determine whether the reduced exposure of item content affects test security. We conducted an experiment in which participants were allowed to view…
Descriptors: Test Items, Test Format, Multiple Choice Tests, Item Analysis
Peer reviewed
Direct link
Wollack, James A.; Cohen, Allan S.; Eckerly, Carol A. – Educational and Psychological Measurement, 2015
Test tampering, especially on tests for educational accountability, is an unfortunate reality, necessitating that the state (or its testing vendor) perform data forensic analyses, such as erasure analyses, to look for signs of possible malfeasance. Few statistical approaches exist for detecting fraudulent erasures, and those that do largely do not…
Descriptors: Tests, Cheating, Item Response Theory, Accountability
Peer reviewed
Wollack, James A.; Cohen, Allan S.; Serlin, Ronald C. – Applied Psychological Measurement, 2001
Developed a familywise approach for evaluating the significance of copying indices, designed to hold the Type I error rate constant for each examinee. Examined the Type I error rate and power of two indices under a variety of copying situations. Results indicate the superiority of a familywise definition of the Type I error rate over a pairwise…
Descriptors: Cheating, Error of Measurement, Tests
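The distinction in the abstract above is between testing each suspected copier-source pair at level alpha and controlling the error rate across all comparisons involving one examinee. The paper's specific procedure is not given here; as one standard illustration of a familywise control, the sketch below applies a Bonferroni-style correction (an assumption, not necessarily the authors' method) and contrasts it with uncorrected pairwise testing.

```python
def pairwise_flags(p_values, alpha=0.05):
    """Test each copier-source comparison at level alpha (no correction)."""
    return [i for i, p in enumerate(p_values) if p < alpha]

def familywise_flags(p_values, alpha=0.05):
    """Bonferroni-style familywise control: one examinee compared against
    m potential sources, so each comparison is tested at alpha / m."""
    m = len(p_values)
    return [i for i, p in enumerate(p_values) if p < alpha / m]
```

With p-values [0.001, 0.02, 0.04, 0.2] for four potential sources, pairwise testing at alpha = 0.05 flags three sources, while the familywise threshold of 0.05/4 = 0.0125 flags only the first, keeping the examinee-level Type I error rate at alpha.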
Peer reviewed
Wollack, James A.; Cohen, Allan S. – Applied Psychological Measurement, 1998
Investigated empirical Type I error rates and the power of omega (index of answer copying developed by J. Wollack, 1997) when item and trait (theta) parameters were unknown and estimated from datasets of 100 and 500 examinees. Type I error was unaffected by estimating item parameters, with power slightly lower for the smaller sample. (SLD)
Descriptors: Cheating, Estimation (Mathematics), Plagiarism, Sample Size
Peer reviewed
Wollack, James A. – Applied Psychological Measurement, 1997
Introduces a new item response theory (IRT)-based statistic for detecting answer copying. Compares this omega statistic with the best classical test theory-based statistic under various conditions, and finds omega superior on both Type I error rate and power. (SLD)
Descriptors: Cheating, Identification, Item Response Theory, Power (Statistics)
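An index of this general shape compares the copier's observed number of matches with the source against the number expected under the copier's IRT model, standardized by the model-implied standard deviation. Wollack's omega is defined for multiple-choice responses under the nominal response model; the sketch below is a simplified dichotomous analogue using Rasch probabilities, an assumption made here so the example stays self-contained, not the paper's exact formulation.

```python
import math

def rasch_p(theta, b):
    """Rasch probability of a correct response given ability and difficulty."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def omega_like(copier_resp, source_resp, theta_c, b):
    """Standardized match count between copier and source (simplified sketch).

    copier_resp, source_resp: 0/1 response vectors
    theta_c: copier's ability estimate; b: item difficulties
    Returns (observed matches - expected matches) / SD, where the expected
    count comes from the copier's model-implied chance of giving the
    source's response independently on each item.
    """
    probs = []
    for y_s, b_j in zip(source_resp, b):
        p_correct = rasch_p(theta_c, b_j)
        probs.append(p_correct if y_s == 1 else 1.0 - p_correct)
    matches = sum(1 for y_c, y_s in zip(copier_resp, source_resp) if y_c == y_s)
    mu = sum(probs)
    var = sum(p * (1.0 - p) for p in probs)
    return (matches - mu) / math.sqrt(var)
```

Large positive values indicate more matches than the copier's own ability can explain; in practice the statistic is referred to a standard normal reference distribution.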