Fagley, N. S. – Journal of Educational Psychology, 1987
This article investigates positional response bias, testwiseness, and guessing strategy as components of variance in test responses on multiple-choice tests. University students responded to two content exams, a testwiseness measure, and a guessing strategy measure. The proportion of variance in test scores accounted for by positional response…
Descriptors: Achievement Tests, Guessing (Tests), Higher Education, Multiple Choice Tests

Cohen, Andrew D. – Language Testing, 1984
Discusses methods for obtaining verbal report data on second language test-taking strategies. Reports on the findings obtained from unpublished studies dealing with how language learners take reading tests. Concludes that there should be a closer fit between how test constructors intend their tests to be taken and how respondents actually take…
Descriptors: Cloze Procedure, Language Tests, Multiple Choice Tests, Reading Tests

Plake, Barbara S.; Huntley, Renee M. – Educational and Psychological Measurement, 1984
Two studies examined the effect of making the correct answer of a multiple choice test item grammatically consistent with the item stem. American College Testing Assessment experimental items were constructed to investigate grammatical compliance for plural-singular and vowel-consonant agreement. Results suggest…
Descriptors: Grammar, Higher Education, Item Analysis, Multiple Choice Tests

Gross, Leon J. – Journal of Optometric Education, 1982
A critique of a variety of formats used in combined-response test items (those in which the respondent must choose the correct combination of options: a and b, all of the above, etc.) illustrates why this kind of testing is inherently flawed and should not be used in optometry examinations. (MSE)
Descriptors: Higher Education, Multiple Choice Tests, Optometry, Standardized Tests

Herman, William E. – Journal of Research and Development in Education, 1997
This study explored the relationship between time needed to complete multiple choice tests and college student test performance. Examination time and performance for midsemester and final exams were measured. Results indicated that the variables were relatively unrelated. Students tended to use a consistent test-taking tempo on relatively untimed…
Descriptors: Academic Achievement, College Students, Higher Education, Multiple Choice Tests

Allan, Alistair – Language Testing, 1992
The design of a valid and reliable test of test-wiseness is reported: a 33-item multiple-choice instrument with 4 subscales trialed with several groups of English-as-a-Second-Language students. Findings indicate differential skills in test-taking; some learner scores are influenced by skills that are not the focus of the test. (13 references)…
Descriptors: English (Second Language), Language Research, Language Tests, Multiple Choice Tests

Schmitt, Alicia P.; Crocker, Linda – 1981
The effectiveness of a strategy for improving performance on multiple choice items for examinees with different levels of test anxiety was assessed. Undergraduate measurement students responded to the Mandler-Sarason Test Anxiety Scale and to an objective test covering course content. Results indicated that, for most examinees, generation of an…
Descriptors: Error of Measurement, Higher Education, Multiple Choice Tests, Response Style (Tests)

Whitby, L. G. – Medical Education, 1977
Advantages and disadvantages of no-penalty and penalty marking systems are discussed. Ways in which examiners have attempted to correct for guessing by students are reviewed, along with the use of "don't know" options and confidence-weighting for attempting to assess the degree of certainty that candidates attach to their answers. (Author/LBH)
Descriptors: Grading, Guessing (Tests), Higher Education, Medical Education
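The abstract above does not reproduce the marking formulas it discusses, but the standard penalty ("formula scoring") correction such systems apply is S = R − W/(k − 1), where R is the number right, W the number wrong, and k the number of options per item. A minimal sketch of that correction — the function name and example values are illustrative, not taken from the article:

```python
def formula_score(right: int, wrong: int, options: int) -> float:
    """Correct a raw score for guessing: subtract wrong / (options - 1),
    the expected number of items answerable correctly by blind guessing."""
    return right - wrong / (options - 1)

# e.g. 30 right, 10 wrong on 5-option items: 30 - 10/4
print(formula_score(30, 10, 5))  # → 27.5
```

Under this rule an examinee who guesses blindly gains nothing in expectation, which is why "don't know" options and confidence weighting are offered as alternatives to outright penalties.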

Weiten, Wayne – Journal of Experimental Education, 1984
The effects of violating four item construction principles were examined to assess the validity of the principles and the importance of students' test wiseness. While flawed items were significantly less difficult than sound items, differences in item discrimination, test reliability, and concurrent validity were not observed. (Author/BW)
Descriptors: Difficulty Level, Higher Education, Item Analysis, Multiple Choice Tests

Bliss, Leonard B. – Journal of Educational Measurement, 1980
A mathematics achievement test with instructions to avoid guessing wildly was given to 168 elementary school pupils who were later asked to complete all the questions using a differently colored pencil. Results showed examinees, particularly the more able students, tend to omit too many items. (CTM)
Descriptors: Anxiety, Guessing (Tests), Intermediate Grades, Multiple Choice Tests

Feldt, Ronald C. – Contemporary Educational Psychology, 1990
The effect of test expectancy on preferred strategy use and test performance on factual and higher-level questions in learning from expository text was studied, using 42 undergraduates who reported their study strategies and completed a multiple-choice test. Test expectancy affected neither preferred strategy use nor test performance. (SLD)
Descriptors: Higher Education, Learning Strategies, Multiple Choice Tests, Performance Factors

Plake, Barbara S.; And Others – 1980
Number right and elimination scores were analyzed on a 48-item college level mathematics test that was assembled from pretest data in three forms by varying the item orderings: easy-hard, uniform, or random. Half of the forms contained information explaining the item arrangement and suggesting strategies for taking the test. Several anxiety…
Descriptors: Difficulty Level, Higher Education, Multiple Choice Tests, Quantitative Tests

Mercer, Maryann – 1977
In a 1977 review of the literature on test answer changing, Mueller and Wasser (EJ 163 236) cited 17 studies and concluded that students who changed answers on objective tests gained more points than they lost by doing so. Higher scoring students tended to gain more than lower scoring students did. Six additional studies not reported in the Mueller…
Descriptors: Guessing (Tests), Higher Education, Junior High Schools, Literature Reviews

Jacobs, Stanley S. – 1971
The study was an experimental investigation of the effects of item difficulty and subject ability on subjects' answer-changing behavior. Subjects were administered an achievement test composed of items at three levels of difficulty via slides, followed by a printed copy of the test. Analyses revealed no effects attributable to subject ability.…
Descriptors: Academic Ability, Achievement Tests, Educational Experiments, Graduate Students

D'Ydewalle, Gery; And Others – Contemporary Educational Psychology, 1983
Study time and test performance change as a function of expecting either open or multiple-choice questions on a history test. Subjects tested in either format were led to expect the same test format on a second test. Subjects expecting open questions studied more and performed better on both test formats. (Author/CM)
Descriptors: Essay Tests, Expectation, Foreign Countries, Higher Education