Pan, Yiqin; Wollack, James A. – Journal of Educational Measurement, 2021
As technology has improved, item preknowledge has become a common concern in test security. The present study proposes an unsupervised-learning-based approach to detecting compromised items. The approach contains three steps: (1) classify responses of each examinee as either…
Descriptors: Test Items, Cheating, Artificial Intelligence, Identification
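The abstract's three steps are truncated, so the sketch below is not the authors' procedure; it is a simplified illustration of the same idea, that items known in advance lose their relationship with ability, on fully simulated data. The Rasch-style data generation, the item-rest correlation screen, and the 0.15 cutoff are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Simulated data (illustrative, not from the paper) -----------------
# 300 examinees x 30 items under a Rasch-like model; items 25-29 are
# "compromised": 75 low-ability examinees with preknowledge answer them
# correctly regardless of ability.
n_people, n_items = 300, 30
leaked = np.arange(25, 30)
ability = rng.normal(0.0, 1.0, n_people)
ability[:75] = rng.normal(-2.0, 0.4, 75)       # examinees with preknowledge
difficulty = rng.uniform(-1.0, 1.0, n_items)
p = 1.0 / (1.0 + np.exp(-(ability[:, None] - difficulty[None, :])))
resp = (rng.random((n_people, n_items)) < p).astype(float)
resp[:75, leaked] = 1.0                        # preknowledge responses

# --- Detection sketch --------------------------------------------------
# For each item, correlate the item response with the examinee's rest
# score (total score excluding that item).  A secure item discriminates
# positively; a leaked item is answered correctly even by low-scoring
# examinees, so its item-rest correlation collapses toward zero.
total = resp.sum(axis=1)
flagged = []
for j in range(n_items):
    rest = total - resp[:, j]
    r = np.corrcoef(rest, resp[:, j])[0, 1]
    if r < 0.15:                               # illustrative cutoff
        flagged.append(j)

print("flagged items:", flagged)
```

With this seed the screen recovers the leaked block; a real application would replace the raw correlation with the model-based classification and clustering steps the article develops.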
Gorney, Kylie; Wollack, James A. – Practical Assessment, Research & Evaluation, 2022
Unlike the traditional multiple-choice (MC) format, the discrete-option multiple-choice (DOMC) format does not necessarily reveal all answer options to an examinee. The purpose of this study was to determine whether the reduced exposure of item content affects test security. We conducted an experiment in which participants were allowed to view…
Descriptors: Test Items, Test Format, Multiple Choice Tests, Item Analysis
Wollack, James A.; Cohen, Allan S.; Eckerly, Carol A. – Educational and Psychological Measurement, 2015
Test tampering, especially on tests for educational accountability, is an unfortunate reality, necessitating that the state (or its testing vendor) perform data forensic analyses, such as erasure analyses, to look for signs of possible malfeasance. Few statistical approaches exist for detecting fraudulent erasures, and those that do largely do not…
Descriptors: Tests, Cheating, Item Response Theory, Accountability
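The article develops an IRT-based erasure analysis; that model is not reproduced here. As a hedged illustration of the general screening idea, the sketch below uses a much cruder count-based check on simulated wrong-to-right (WTR) erasure counts: under a no-tampering null, a classroom's total WTR count should look roughly Poisson, so improbably large totals get flagged. All rates and the 3.5 cutoff are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(7)

# --- Simulated WTR erasure counts (illustrative) -----------------------
# 50 classrooms of 25 students; typical students average ~0.8 WTR
# erasures.  In classroom 0, answer sheets were tampered with, adding
# extra wrong-to-right changes for every student.
counts = rng.poisson(0.8, size=(50, 25)).astype(float)
counts[0] += rng.poisson(3.0, size=25)         # tampered classroom

# --- Screening sketch --------------------------------------------------
# Null model: classroom total ~ Poisson(n_students * baseline_rate).
# Standardize each total and flag large positive z values.
rate = counts.mean()                           # pooled baseline estimate
totals = counts.sum(axis=1)
mu = counts.shape[1] * rate
z = (totals - mu) / np.sqrt(mu)
flagged = np.where(z > 3.5)[0]                 # illustrative cutoff

print("flagged classrooms:", flagged.tolist())
```

A count-only screen like this ignores ability and item difficulty, which is exactly the gap the article's IRT-based approach is meant to close.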

Wollack, James A.; Cohen, Allan S.; Serlin, Ronald C. – Applied Psychological Measurement, 2001
Developed a familywise approach for evaluating the significance of copying indices, designed to hold the Type I error rate constant for each examinee. Examined the Type I error rate and power of two indices under a variety of copying situations. Results indicate the superiority of a familywise definition of Type I error rate over a pair-wise…
Descriptors: Cheating, Error of Measurement, Tests
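The familywise idea can be illustrated with a small null simulation (the simulation design and Bonferroni adjustment below are illustrative, not the article's exact procedure): an examinee in a room of 30 is tested against each of the 29 possible sources, so controlling the error rate per *pair* lets the per-examinee error rate balloon, while dividing alpha by the number of comparisons holds it near the nominal level.

```python
import numpy as np

rng = np.random.default_rng(1)

# 20,000 simulated innocent examinees, each compared with 29 potential
# sources; under the null of no copying, each comparison's p-value is
# uniform on (0, 1).
n_trials, n_sources, alpha = 20_000, 29, 0.05
pvals = rng.random((n_trials, n_sources))

# An examinee is falsely accused if ANY of the 29 comparisons rejects.
pairwise_fp = (pvals < alpha).any(axis=1).mean()
familywise_fp = (pvals < alpha / n_sources).any(axis=1).mean()

print(f"per-pair alpha .05:   examinee-level error rate = {pairwise_fp:.3f}")
print(f"Bonferroni-adjusted:  examinee-level error rate = {familywise_fp:.3f}")
```

Uncorrected testing accuses roughly three innocent examinees in four (1 − 0.95^29 ≈ 0.77); the adjusted version stays near .05, which is the sense in which a familywise definition protects each examinee.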

Wollack, James A.; Cohen, Allan S. – Applied Psychological Measurement, 1998
Investigated empirical Type I error rates and the power of omega (index of answer copying developed by J. Wollack, 1997) when item and trait (theta) parameters were unknown and estimated from datasets of 100 and 500 examinees. Type I error was unaffected by estimating item parameters, with power slightly lower for the smaller sample. (SLD)
Descriptors: Cheating, Estimation (Mathematics), Plagiarism, Sample Size

Wollack, James A. – Applied Psychological Measurement, 1997
Introduces a new item response theory (IRT)-based statistic for detecting answer copying. Compares this omega statistic with the best classical-test-theory-based statistic under various conditions, and finds omega superior on Type I error rate and power. (SLD)
Descriptors: Cheating, Identification, Item Response Theory, Power (Statistics)
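Omega proper is built on a nominal response model with estimated item and person parameters; the sketch below replaces that machinery with made-up per-option probabilities in order to show only the standardization step: count observed copier-source answer matches, subtract the model-expected number of matches given the copier's ability, and divide by the model standard deviation. Everything about the setup (item count, Dirichlet-generated option probabilities, the 60% copying rate) is an illustrative assumption.

```python
import math
import numpy as np

rng = np.random.default_rng(3)

# 40 four-option items.  option_probs[j] holds the model-implied
# probability of the suspected copier choosing each option of item j
# (in omega proper these come from the nominal response model and the
# copier's ability estimate; here they are simply simulated).
n_items, n_options = 40, 4
option_probs = rng.dirichlet(np.ones(n_options), size=n_items)
source = rng.integers(0, n_options, n_items)   # the source's answers

def omega(copier, source, option_probs):
    """Standardized excess of copier-source answer matches (a z statistic).

    Large positive values indicate more matches than the copier's own
    response model can explain, i.e. evidence of copying.
    """
    p_match = option_probs[np.arange(len(source)), source]
    expected = p_match.sum()                   # E[# matches | model]
    var = (p_match * (1.0 - p_match)).sum()    # Var[# matches | model]
    observed = (copier == source).sum()
    return (observed - expected) / math.sqrt(var)

# Honest examinee: answers drawn from the model itself.
honest = np.array([rng.choice(n_options, p=pj) for pj in option_probs])
# Copier: copies the source's answer on ~60% of items.
copier = honest.copy()
copy_idx = rng.random(n_items) < 0.6
copier[copy_idx] = source[copy_idx]

print(f"omega(honest) = {omega(honest, source, option_probs):.2f}")
print(f"omega(copier) = {omega(copier, source, option_probs):.2f}")
```

Comparing the statistic to a standard normal reference is what yields the Type I error and power properties the abstract reports for omega.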