Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 1
Since 2016 (last 10 years): 2
Since 2006 (last 20 years): 6
Descriptor
Evaluation Methods: 6
Item Analysis: 6
Test Content: 6
Test Items: 5
Difficulty Level: 3
Student Evaluation: 3
Test Construction: 3
Test Validity: 3
Diagnostic Tests: 2
Evaluation Research: 2
Evidence: 2
Source
Carnegie Corporation of New York: 1
Educational Research and Evaluation: 1
International Journal of Language Testing: 1
Journal of Chemical Education: 1
Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1): 1
Online Submission: 1
Author
Alghazali, Tawfeeq: 1
Camilli, Gregory: 1
Dawood, Abdul Kareem Shareef: 1
Donovan, Jenny: 1
Hutton, Penny: 1
Kadhim, Qasim Khlaif: 1
Kieffer, Michael: 1
Lennon, Melissa: 1
Mohammed, Aisha: 1
Morsy, Leila: 1
Parry, James R.: 1
Publication Type
Journal Articles: 3
Reports - Evaluative: 2
Reports - Research: 2
Guides - Non-Classroom: 1
Numerical/Quantitative Data: 1
Reports - Descriptive: 1
Education Level
Elementary Secondary Education: 3
Elementary Education: 1
Grade 6: 1
Higher Education: 1
Postsecondary Education: 1
Audience
Teachers: 2
Administrators: 1
Policymakers: 1
Location
Australia: 1
Assessments and Surveys
International English…: 1
Program for International…: 1
Test of English as a Foreign…: 1
Parry, James R. – Online Submission, 2020
This paper presents research and provides a method to ensure that parallel assessments generated from a large test-item database maintain equitable difficulty and content coverage each time the assessment is presented. To maintain fairness and validity, it is important that all instances of an assessment that is intended to test the…
Descriptors: Culture Fair Tests, Difficulty Level, Test Items, Test Validity
Mohammed, Aisha; Dawood, Abdul Kareem Shareef; Alghazali, Tawfeeq; Kadhim, Qasim Khlaif; Sabti, Ahmed Abdulateef; Sabit, Shaker Holh – International Journal of Language Testing, 2023
Cognitive diagnostic models (CDMs) have received much interest within the field of language testing over the last decade due to their great potential to provide diagnostic feedback to all stakeholders and ultimately improve language teaching and learning. A large number of studies have demonstrated the application of CDMs to advanced large-scale…
Descriptors: Reading Comprehension, Reading Tests, Language Tests, English (Second Language)
Towns, Marcy H. – Journal of Chemical Education, 2014
Chemistry faculty members are highly skilled in obtaining, analyzing, and interpreting physical measurements, but they are often less skilled in measuring student learning. This work provides guidance for chemistry faculty, drawn from the research literature, on multiple-choice item development in chemistry. Areas covered include content, stem, and…
Descriptors: Multiple Choice Tests, Test Construction, Psychometrics, Test Items
Camilli, Gregory – Educational Research and Evaluation, 2013
In the attempt to identify or prevent unfair tests, both quantitative analyses and logical evaluation are often used. For the most part, fairness evaluation is a pragmatic attempt at determining whether procedural or substantive due process has been accorded to either a group of test takers or an individual. In both the individual and comparative…
Descriptors: Alternative Assessment, Test Bias, Test Content, Test Format
Morsy, Leila; Kieffer, Michael; Snow, Catherine – Carnegie Corporation of New York, 2010
Although millions of dollars and weeks of instructional time are spent nationally on testing students, educators often have little information on how to choose appropriate assessments of adolescent reading for informing instruction. This guide is designed to meet that need by drawing together evidence about nine of the most commonly used,…
Descriptors: Reading Comprehension, Reading Tests, Evaluation Methods, Adolescents
Wu, Margaret; Donovan, Jenny; Hutton, Penny; Lennon, Melissa – Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2008
In July 2001, the Ministerial Council on Education, Employment, Training and Youth Affairs (MCEETYA) agreed to the development of assessment instruments and key performance measures for reporting on student skills, knowledge and understandings in primary science. It directed the newly established Performance Measurement and Reporting Taskforce…
Descriptors: Foreign Countries, Scientific Literacy, Science Achievement, Comparative Analysis