ERIC Number: ED279709
Record Type: Non-Journal
Publication Date: 1986-Jun
Pages: 42
Abstractor: N/A
ISBN: N/A
ISSN: N/A
EISSN: N/A
Keylist Items for the Measurement of Verbal Aptitude. Research Report.
Ward, William C.; And Others
The keylist format (rather than the conventional multiple-choice format) for item presentation provides a machine-scorable surrogate for a truly free-response test. In this format, the examinee is required to think of an answer, look it up in a long ordered list, and enter its number on an answer sheet. The introduction of keylist items into standardized tests could potentially offer several important benefits, among them the construction of items requiring production rather than simply recognition of correct answers, ease of item development, and resistance to coaching. A number of questions had to be answered before the keylist format could be considered for use in operational tests. This study addressed several of the most important of these in an examination of two item types employed in verbal aptitude tests--Antonyms and Analogies--and administered in both keylist and multiple-choice formats. These item types were selected for two reasons: (1) there is evidence that multiple-choice forms of these item types are susceptible to coaching; and (2) prior work has shown the feasibility of developing keylist versions of them. Relations among tests employing different response formats were analyzed and their correlations with other measures of aptitude and achievement were compared. The analyses indicated that the format has little or no systematic effect on the construct validity of tests employing item types used in standardized tests of verbal aptitude. There was, in addition, little agreement among experienced test developers on the set of keys that should be supplied for each keylist item. Appendices include instructions and sample items for experimental tests and additional aptitude tests. (JAZ)
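The machine-scorable scoring step described above can be sketched in a few lines. This is an illustrative reconstruction, not code from the report: the word list, item stem, and function name are hypothetical, and the keyed entries reflect the abstract's note that developers may disagree on which list entries to key as correct.

```python
# Illustrative sketch of keylist scoring (hypothetical data, not from the report).
# The examinee thinks of a free response, finds it in a long ordered list,
# and records its number; scoring reduces to checking that number against
# the set of keyed entries for the item.

def score_keylist_response(entered_number, keyed_numbers):
    """Return 1 if the entered list number matches any keyed answer, else 0."""
    return 1 if entered_number in keyed_numbers else 0

# Hypothetical alphabetized list for an Antonym item with stem "scarce":
word_list = ["abundant", "meager", "plentiful", "rare", "sparse"]

# Developers might key more than one entry as acceptable,
# e.g. both "abundant" (entry 1) and "plentiful" (entry 3):
keys = {1, 3}

print(score_keylist_response(1, keys))  # examinee chose "abundant"
print(score_keylist_response(4, keys))  # "rare" is not an antonym of "scarce"
```

Because the examinee must produce the answer before locating it, the format approximates free response while keeping the number-matching step trivial to score by machine.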
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: Educational Testing Service, Princeton, NJ.
Grant or Contract Numbers: N/A