Showing all 10 results
Peer reviewed
Ludewig, Ulrich; Schwerter, Jakob; McElvany, Nele – Journal of Psychoeducational Assessment, 2023
A better understanding of how distractor features influence the plausibility of distractors is essential for efficient multiple-choice (MC) item construction in educational assessment. The plausibility of distractors has a major influence on the psychometric characteristics of MC items. Our analysis utilizes the nominal categories model to…
Descriptors: Vocabulary, Language Tests, German, Grade 4
Peer reviewed
Russell, Michael; Szendey, Olivia; Kaplan, Larry – Educational Assessment, 2021
Differential Item Functioning (DIF) analysis is commonly employed to examine potential bias produced by a test item. Since its introduction, DIF analyses have focused on potential bias related to broad categories of oppression, including gender, racial stratification, economic class, and ableness. More recently, efforts to examine the effects of…
Descriptors: Test Bias, Achievement Tests, Individual Characteristics, Disadvantaged
Peer reviewed
PDF on ERIC
Shanmugam, S. Kanageswari Suppiah; Wong, Vincent; Rajoo, Murugan – Malaysian Journal of Learning and Instruction, 2020
Purpose: This study examined the quality of English test items using psychometric and linguistic characteristics among Grade Six pupils. Method: Contrary to the conventional approach of relying only on statistics when investigating item quality, this study adopted a mixed-method approach by employing psychometric analysis and cognitive interviews.…
Descriptors: English (Second Language), Second Language Instruction, Language Tests, Psychometrics
Li, Sylvia; Meyer, Patrick – NWEA, 2019
This simulation study examines the measurement precision, item exposure rates, and the depth of the MAP® Growth™ item pools under various grade-level restrictions. Unlike most summative assessments, MAP Growth allows examinees to see items from any grade level, regardless of the examinee's actual grade level. It does not limit the test to items…
Descriptors: Achievement Tests, Item Banks, Test Items, Instructional Program Divisions
Liu, Junhui; Brown, Terran; Chen, Jianshen; Ali, Usama; Hou, Likun; Costanzo, Kate – Partnership for Assessment of Readiness for College and Careers, 2016
The Partnership for Assessment of Readiness for College and Careers (PARCC) is a state-led consortium working to develop next-generation assessments that more accurately, compared to previous assessments, measure student progress toward college and career readiness. The PARCC assessments include both English Language Arts/Literacy (ELA/L) and…
Descriptors: Testing, Achievement Tests, Test Items, Test Bias
Peer reviewed
Haag, Nicole; Heppt, Birgit; Roppelt, Alexander; Stanat, Petra – European Journal of Psychology of Education, 2015
In large-scale assessment studies, language minority students typically obtain lower test scores in mathematics than native speakers. Although this performance difference was related to the linguistic complexity of test items in some studies, other studies did not find linguistically demanding math items to be disproportionally more difficult for…
Descriptors: Foreign Countries, Language Minorities, Native Speakers, Monolingualism
Alonzo, Julie; Anderson, Daniel; Park, Bitnara Jasmine; Tindal, Gerald – Behavioral Research and Teaching, 2012
In this technical report, we describe the development and piloting of a series of vocabulary assessments intended for use with students in grades two through eight. These measures, available as part of easyCBM[TM], an online progress monitoring and benchmark/screening assessment system, were developed in 2010 and administered to approximately 1200…
Descriptors: Curriculum Based Assessment, Vocabulary, Language Tests, Test Construction
Alonzo, Julie; Anderson, Daniel; Park, Bitnara Jasmine; Tindal, Gerald – Behavioral Research and Teaching, 2012
In this technical report, we describe the development and piloting of a series of vocabulary assessments intended for use with students in grades two through eight. These measures, available as part of easyCBM[TM], an online progress monitoring and benchmark/screening assessment system, were developed in 2010 and administered to approximately 1200…
Descriptors: Curriculum Based Assessment, Vocabulary, Language Tests, Test Construction
Alonzo, Julie; Anderson, Daniel; Park, Bitnara Jasmine; Tindal, Gerald – Behavioral Research and Teaching, 2012
In this technical report, we describe the development and piloting of a series of vocabulary assessments intended for use with students in grades two through eight. These measures, available as part of easyCBM[TM], an online progress monitoring and benchmark/screening assessment system, were developed in 2010 and administered to approximately 1200…
Descriptors: Curriculum Based Assessment, Vocabulary, Language Tests, Test Construction
Peer reviewed
Finch, Holmes; Barton, Karen; Meyer, Patrick – Educational Assessment, 2009
The No Child Left Behind Act resulted in an increased reliance on large-scale standardized tests to assess the progress of individual students as well as schools. In addition, emphasis was placed on including all students in testing programs, including those with disabilities. As a result, the role of testing accommodations has become more…
Descriptors: Test Bias, Testing Accommodations, Standardized Tests, Mathematics Tests