Weber, Margaret B. – 1977
The effects of different choice formats on the reliability of teacher-made tests were examined for high and low achievers. The first study examined the effect of three- and five-choice items on the reliability of dichotomously scored teacher-made tests. The second study examined the effect of three- and four-choice items on the reliability of similarly designed…
Descriptors: Academic Achievement, Achievement Tests, Guessing (Tests), High Achievement
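The Weber entry above deals with the reliability of dichotomously scored tests but does not name a reliability index; a common coefficient for that case (my assumption, not stated in the abstract) is KR-20:

    r_{KR20} = \frac{k}{k-1}\left(1 - \frac{\sum_{j=1}^{k} p_j(1 - p_j)}{\sigma_X^2}\right)

where k is the number of items, p_j the proportion answering item j correctly, and \sigma_X^2 the variance of total scores. Other things being equal, items with fewer options invite more successful guessing, which adds random variance and tends to depress this coefficient.
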
Jakwerth, Pamela R.; Stancavage, Frances B.; Reed, Ellen D. – National Center for Education Statistics, 2003
Over the past decade, developers of the National Assessment of Educational Progress (NAEP) have substantially changed the mix of item types on the NAEP assessments by decreasing the number of multiple-choice questions and increasing the number of questions requiring short or extended constructed responses. These changes have been motivated…
Descriptors: National Competency Tests, Response Style (Tests), Test Validity, Qualitative Research

Friel, S.; Johnstone, A. H. – Education in Chemistry, 1979
Presents the results of an investigation into whether the position of a distractor in a multiple-choice question influences the difficulty of an item. The data support the hypothesis that placing the distractor immediately before the key significantly alters the item's difficulty. (Authors/SA)
Descriptors: Educational Research, Item Analysis, Multiple Choice Tests, Research
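As background for studies like Friel and Johnstone's, classical item difficulty is simply the proportion of examinees who answer the item correctly; the Python sketch below, with entirely hypothetical response data, compares that index across two placements of the distractor:

    # Hypothetical data: 0/1 scores for the same item administered in two forms.
    responses_form_a = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # distractor immediately before the key
    responses_form_b = [1, 1, 1, 1, 0, 1, 1, 1, 1, 1]  # distractor placed elsewhere

    def difficulty(responses):
        """Classical difficulty index: proportion of correct (1) responses."""
        return sum(responses) / len(responses)

    print(f"Form A p = {difficulty(responses_form_a):.2f}")  # 0.70
    print(f"Form B p = {difficulty(responses_form_b):.2f}")  # 0.90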

Budescu, David V.; Nevo, Baruch – Journal of Educational Measurement, 1985
The proportionality model assumes that total testing time is proportional to the number of test items and the number of options per multiple-choice item. This assumption was examined using test items with two to five options. The model was not supported. (Author/GDC)
Descriptors: College Entrance Examinations, Foreign Countries, Higher Education, Item Analysis
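Read literally, the proportionality assumption described above (the formula is my reading of it, not given in the abstract) says total testing time T scales with both the number of items n and the number of options per item m:

    T = c \cdot n \cdot m

With a hypothetical rate of c = 9 seconds per option, a 40-item, 5-option test would be allotted 40 x 5 x 9 = 1800 seconds, while the same 40 items with 3 options would need 40 x 3 x 9 = 1080 seconds; the abstract reports that observed testing times did not behave this way.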

Samejima, Fumiko – Applied Psychological Measurement, 1977
The accuracy of latent ability estimation maintained by tailoring the order of item presentation and the border of item dichotomization to each testee was compared with the information provided by the original graded test items. (RC)
Descriptors: Ability, Adaptive Testing, Branching, Computer Assisted Testing
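The graded items in the Samejima entry are conventionally described by her graded response model; as standard background (the abstract itself does not spell the model out), the probability that an examinee with ability \theta responds in category k or higher of item i is

    P^{*}_{ik}(\theta) = \frac{1}{1 + \exp[-a_i(\theta - b_{ik})]}

where a_i is the item discrimination and b_{ik} the category boundary. The probability of exactly category k is P^{*}_{ik}(\theta) - P^{*}_{i,k+1}(\theta), and dichotomizing an item at border k amounts to treating P^{*}_{ik}(\theta) as the probability of a "correct" response.
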
Meld, Andrea – 1990
Surveys used for program and institutional evaluation, such as self-studies conducted for accreditation review, are discussed. Frequently, these evaluations take the form of faculty surveys and student surveys. This paper explores the following general considerations associated with mail surveys and other surveys: avoidance of response bias;…
Descriptors: Accreditation (Institutions), Comparative Analysis, Higher Education, Mail Surveys
Bejar, Isaac I.; Yocom, Peter – 1986
This report explores an approach to item development and psychometric modeling that explicitly incorporates knowledge about the mental models examinees use in solving items, both into a psychometric model that characterizes performance on a test and into the item development process. The paper focuses on…
Descriptors: Artificial Intelligence, Computer Assisted Testing, Computer Science, Construct Validity
Campbell, Noma Jo; Grissom, Stephen – 1979
To investigate the effects of wording in attitude test items, a five-point Likert-type rating scale was administered to 173 undergraduate education majors. The test measured attitudes toward college and self, and contained 38 positively-worded items. Thirty-eight negatively-worded items were also written to parallel the positive statements.…
Descriptors: Affective Measures, Attitude Measures, Higher Education, Rating Scales
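Designs that pair positively and negatively worded Likert items, as in the Campbell and Grissom entry, normally reverse-score the negative items before comparing or summing them; a minimal Python sketch with hypothetical responses, assuming the five-point scale mentioned in the abstract:

    # Reverse-scoring negatively worded items on a 1-5 Likert scale:
    # with min = 1 and max = 5, the reversed value is (1 + 5) - raw = 6 - raw.
    SCALE_MIN, SCALE_MAX = 1, 5

    def reverse_score(raw):
        """Map 1->5, 2->4, 3->3, 4->2, 5->1."""
        return (SCALE_MIN + SCALE_MAX) - raw

    negatively_worded = [2, 5, 4, 1, 3]                   # hypothetical responses
    print([reverse_score(x) for x in negatively_worded])  # [4, 1, 2, 5, 3]
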
Huntley, Renee M.; Plake, Barbara S. – 1980
Guidelines for writing test items have traditionally recommended making the correct answer of a multiple-choice item grammatically consistent with its stem. To investigate the effects of adhering to this practice, certain item formats were designed to determine whether the practice of providing relevant grammatical clues, in itself, created cue…
Descriptors: College Entrance Examinations, Cues, Difficulty Level, Grammar
Leitner, Dennis W.; And Others – 1979
To discover factors that contribute to a high response rate for questionnaire surveys, the preferences of 150 college teachers and teaching assistants were studied. Four different questionnaire formats using 34 common items were sent to the subjects: open-ended; Likert-type (five points, from "strong influence to return," to…
Descriptors: Check Lists, College Faculty, Comparative Testing, Higher Education
Schoen, Harold L.; And Others – 1987
The estimation processes used by fifth through eighth grade students as they responded to computational estimation test items were examined. Interview-based process descriptions were cross-validated using large group test data from an open-ended test and a multiple choice test. Five question formats were used to test different estimation…
Descriptors: Age Differences, Cognitive Processes, Cognitive Structures, Cognitive Style
Douglass, James B. – 1981
Relationships between item bias, item difficulty invariance, Rasch tests of item fit, and item position in a speeded 72-item Michigan State University Vocabulary Placement Test were investigated using 143 black males, 289 black females, 2,953 white males and 3,271 white females. Measures of item bias and item difficulty invariance were determined…
Descriptors: Black Students, Computer Programs, Correlation, Difficulty Level
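For context on the Rasch fit statistics mentioned in the Douglass entry, the Rasch model (standard form, quoted here only as background) gives the probability that a person with ability \theta answers item i of difficulty b_i correctly as

    P(X_i = 1 \mid \theta, b_i) = \frac{\exp(\theta - b_i)}{1 + \exp(\theta - b_i)}

Item difficulty invariance, one of the measures examined, asks whether the estimated b_i remain the same (up to a common shift of scale) across groups such as the race-by-sex subsamples used in the study.
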
Plake, Barbara S.; Wise, Steven L. – 1986
One question regarding the utility of adaptive testing is the effect of individualized item arrangements on examinee test scores. The purpose of this study was to analyze examinees' choices of item difficulty as a function of their performance on previous items. The examination was a 25-item test of basic algebra skills given to 36 students in an…
Descriptors: Adaptive Testing, Algebra, College Students, Computer Assisted Testing
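As an illustration of the kind of dependence the Plake and Wise study analyzes, the Python sketch below (entirely hypothetical, not the procedure reported in the study) records a simple "step up after a correct answer, step down after an incorrect one" pattern of difficulty choices:

    import random

    def next_difficulty(current, was_correct, lowest=1, highest=5):
        """One plausible examinee strategy: raise difficulty after success, lower it after failure."""
        return min(current + 1, highest) if was_correct else max(current - 1, lowest)

    difficulty = 3                               # start in the middle of the range
    for item in range(1, 6):
        was_correct = random.random() < 0.6      # stand-in for the examinee's response
        print(f"item {item}: chose difficulty {difficulty}, correct={was_correct}")
        difficulty = next_difficulty(difficulty, was_correct)
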
Brown, Alan S.; Itzig, Jerry M. – 1976
The effects of humorous test questions on the test performance of high- and low-anxious college students were investigated. It was hypothesized that humor should reduce the anxiety level of high-anxious subjects, and thus improve their performance, while having little effect on low-anxious subjects. Students were assigned to a low- or high-anxious group…
Descriptors: Academic Achievement, Anxiety, Arousal Patterns, Higher Education
McKee, Barbara G.; Blake, Rowland S. – 1979
A questionnaire concerning attitudes toward the importance of communication skills and toward different modes of communication was administered to 290 incoming freshmen at the National Technical Institute for the Deaf. Students responded to one of two versions of a 38-item questionnaire: multiple choice or Likert-type (strongly agree...strongly…
Descriptors: Attitude Measures, Communication Skills, Communication (Thought Transfer), Deafness