Publication Date
  In 2025: 0
  Since 2024: 1
  Since 2021 (last 5 years): 6
  Since 2016 (last 10 years): 19
  Since 2006 (last 20 years): 26
Descriptor
  Multiple Choice Tests: 51
  Test Items: 51
  Test Wiseness: 51
  Test Construction: 21
  Higher Education: 19
  Test Format: 15
  Difficulty Level: 12
  Guessing (Tests): 12
  Scores: 12
  Item Analysis: 9
  Test Validity: 9
Author
  Katz, Irvin R.: 3
  Keehner, Madeleine: 3
  Moon, Jung Aa: 3
  Plake, Barbara S.: 2
  White, David M.: 2
  Albanese, Mark A.: 1
  Amir Hadifar: 1
  Ardoin, Scott P.: 1
  Armstrong, Anne-Marie: 1
  Binder, Katherine S.: 1
  Biran, Leonard A.: 1
Education Level
  Higher Education: 12
  Postsecondary Education: 10
  High Schools: 3
  Secondary Education: 3
  Early Childhood Education: 1
  Elementary Education: 1
  Grade 10: 1
  Grade 3: 1
  Grade 9: 1
  Primary Education: 1
Audience
  Students: 3
  Practitioners: 2
  Researchers: 1
  Teachers: 1
Location
  California: 2
  Canada: 1
  Canada (Ottawa): 1
  Czech Republic: 1
  Israel: 1
  Malaysia: 1
  New York: 1
  Nigeria: 1
  Sweden: 1
  Taiwan: 1
  United Kingdom: 1
Assessments and Surveys
  Test of English as a Foreign…: 3
  ACT Assessment: 1
  Comprehensive Tests of Basic…: 1
  Graduate Management Admission…: 1
  Law School Admission Test: 1
  Stanford Achievement Tests: 1
  Test of English for…: 1
Semere Kiros Bitew; Amir Hadifar; Lucas Sterckx; Johannes Deleu; Chris Develder; Thomas Demeester – IEEE Transactions on Learning Technologies, 2024
Multiple-choice questions (MCQs) are widely used in digital learning systems, as they allow for automating the assessment process. However, owing to the increased digital literacy of students and the advent of social media platforms, MCQ tests are widely shared online, and teachers are continuously challenged to create new questions, which is an…
Descriptors: Multiple Choice Tests, Computer Assisted Testing, Test Construction, Test Items

Thompson, Kathryn N. – ProQuest LLC, 2023
It is imperative to collect validity evidence prior to interpreting and using test scores. During the process of collecting validity evidence, test developers should consider whether test scores are contaminated by sources of extraneous information. This is referred to as construct-irrelevant variance, or the "degree to which test scores are…
Descriptors: Test Wiseness, Test Items, Item Response Theory, Scores

DeCarlo, Lawrence T. – Journal of Educational Measurement, 2023
A conceptualization of multiple-choice exams in terms of signal detection theory (SDT) leads to simple measures of item difficulty and item discrimination that are closely related to, but also distinct from, those used in classical item analysis (CIA). The theory defines a "true split," depending on whether or not examinees know an item,…
Descriptors: Multiple Choice Tests, Test Items, Item Analysis, Test Wiseness

Choi, Ikkyu; Zu, Jiyun – ETS Research Report Series, 2022
Synthetically generated speech (SGS) has become an integral part of our oral communication in a wide variety of contexts. It can be generated instantly at a low cost and allows precise control over multiple aspects of output, all of which can be highly appealing to second language (L2) assessment developers who have traditionally relied upon human…
Descriptors: Test Wiseness, Multiple Choice Tests, Test Items, Difficulty Level

Moon, Jung Aa; Sinharay, Sandip; Keehner, Madeleine; Katz, Irvin R. – International Journal of Testing, 2020
The current study examined the relationship between test-taker cognition and psychometric item properties in multiple-selection multiple-choice and grid items. In a study with content-equivalent mathematics items in alternative item formats, adult participants' tendency to respond to an item was affected by the presence of a grid and variations of…
Descriptors: Computer Assisted Testing, Multiple Choice Tests, Test Wiseness, Psychometrics

Jimoh, Mohammed Idris; Daramola, Dorcas Sola; Oladele, Jumoke Iyabode; Sheu, Adaramaja Lukman – Anatolian Journal of Education, 2020
The study investigated items that were prone to guessing in Senior School Certificate Examinations (SSCE) Economics multiple-choice tests among students in Kwara State, Nigeria. The 2016 West African Senior Secondary Certificate Examinations (WASSCE) and National Examinations Council (NECO) Economics multiple-choice test items were subjected to…
Descriptors: Foreign Countries, High School Students, Guessing (Tests), Test Items

Merry, Justin W.; Elenchin, Mary Kate; Surma, Renee N. – Advances in Physiology Education, 2021
Multiple-choice exams are ubiquitous, but advice on test-taking strategies varies and is not always well informed by research. This study evaluated whether students benefit or are harmed when they change their initial answers on multiple-choice questions in the context of physiology and biology courses. Previously marked…
Descriptors: Multiple Choice Tests, Physiology, Biology, Science Instruction

Moon, Jung Aa; Keehner, Madeleine; Katz, Irvin R. – Educational Assessment, 2020
We investigated how item formats influence test takers' response tendencies under uncertainty. Adult participants solved content-equivalent math items in three formats: multiple-selection multiple-choice, grid with forced-choice (true-false) options, and grid with non-forced-choice options. Participants showed a greater tendency to commit (rather…
Descriptors: College Students, Test Wiseness, Test Format, Test Items

Sadowski, Mary A.; Sorby, Sheryl A. – Engineering Design Graphics Journal, 2018
Grading is often a faculty member's least favorite chore, especially in engineering, where open-ended problems prevail. For this reason, multiple-choice test items can be a popular alternative for assessing learning and understanding. In addition, most learning management systems allow the instructor to create multiple-choice questions to be…
Descriptors: Test Items, Multiple Choice Tests, Questioning Techniques, Engineering Education

Moon, Jung Aa; Keehner, Madeleine; Katz, Irvin R. – Educational Measurement: Issues and Practice, 2019
The current study investigated how item formats and their inherent affordances influence test-takers' cognition under uncertainty. Adult participants solved content-equivalent math items in multiple-selection multiple-choice and four alternative grid formats. The results indicated that participants' affirmative response tendency (i.e., judge the…
Descriptors: Affordances, Test Items, Test Format, Test Wiseness

Bramley, Tom; Crisp, Victoria – Assessment in Education: Principles, Policy & Practice, 2019
For many years, question choice has been used in some UK public examinations, with students free to choose which questions they answer from a selection (within certain parameters). There has been little published research on choice of exam questions in recent years in the UK. In this article we distinguish different scenarios in which choice…
Descriptors: Test Items, Test Construction, Difficulty Level, Foreign Countries

Tremblay, Kathryn A.; Binder, Katherine S.; Ardoin, Scott P.; Talwar, Amani; Tighe, Elizabeth L. – Journal of Research in Reading, 2021
Background: Of the myriad reading comprehension (RC) assessments used in schools, multiple-choice (MC) questions continue to be one of the most prevalent formats used by educators and researchers. Outcomes from RC assessments dictate many critical factors encountered during a student's academic career, and it is crucial that we gain a deeper…
Descriptors: Grade 3, Elementary School Students, Reading Comprehension, Decoding (Reading)

Toker, Deniz – TESL-EJ, 2019
The central purpose of this paper is to examine validity problems arising from the multiple-choice items and technical passages in the Test of English as a Foreign Language Internet-based Test (TOEFL iBT) reading section, primarily concentrating on construct-irrelevant variance (Messick, 1989). My personal TOEFL iBT experience, along with my…
Descriptors: English (Second Language), Language Tests, Second Language Learning, Computer Assisted Testing

Fukuzawa, Sherry; deBraga, Michael – Journal of Curriculum and Teaching, 2019
Graded Response Method (GRM) is an alternative to multiple-choice testing in which students rank options according to their relevance to the question. GRM requires discrimination and inference between statements and is a cost-effective critical-thinking assessment in large courses where open-ended answers are not feasible. This study examined…
Descriptors: Alternative Assessment, Multiple Choice Tests, Test Items, Test Format

Lee, Jia-Ying – Taiwan Journal of TESOL, 2018
This article examines the test-taking strategies of high- and low-scoring Chinese-speaking participants when they answer English multiple-choice reading comprehension questions. Thirty-two participants took a TOEIC reading test, provided think-aloud protocols, and joined a post-task interview. The data come primarily from qualitative analysis and…
Descriptors: Foreign Countries, Test Wiseness, English (Second Language), Language Tests