Publication Date
In 2025 | 0 |
Since 2024 | 0 |
Since 2021 (last 5 years) | 1 |
Since 2016 (last 10 years) | 2 |
Since 2006 (last 20 years) | 13 |
Descriptor
Grade 6 | 13 |
Test Items | 13 |
Foreign Countries | 7 |
Grade 7 | 6 |
Item Response Theory | 6 |
Difficulty Level | 5 |
Scaling | 5 |
Student Evaluation | 5 |
Academic Standards | 4 |
Benchmarking | 4 |
Data Analysis | 4 |
Author
Donovan, Jenny | 3 |
Lennon, Melissa | 3 |
Hutton, Penny | 2 |
Morrissey, Noni | 2 |
O'Connor, Gayl | 2 |
Alonzo, Julie | 1 |
Cesnik, Hermann S. | 1 |
Chen, Deng-Jyi | 1 |
Chen, Shu-Ling | 1 |
Cho, Hyun-Jeong | 1 |
Cidade, Melissa | 1 |
Publication Type
Journal Articles | 8 |
Reports - Research | 7 |
Reports - Evaluative | 6 |
Numerical/Quantitative Data | 3 |
Tests/Questionnaires | 2 |
Education Level
Elementary Secondary Education | 13 |
Grade 6 | 13 |
Elementary Education | 12 |
Grade 7 | 6 |
Junior High Schools | 5 |
Middle Schools | 5 |
Secondary Education | 4 |
Grade 5 | 3 |
Grade 8 | 3 |
Intermediate Grades | 3 |
Grade 10 | 2 |
Laws, Policies, & Programs
Individuals with Disabilities… | 1 |
Assessments and Surveys
Program for International… | 3 |
Özkan, Yesim Özer; Güvendir, Meltem Acar – Journal of Pedagogical Research, 2021
Large-scale assessment is conducted at different class levels for various purposes, such as identifying student success in education, observing the impacts of educational reforms on student achievement, and assessment, selection, and placement. It is expected that the tests and items used in education do not display different traits with…
Descriptors: Foreign Countries, Test Bias, Student Evaluation, Test Items
Lessne, Deborah; Cidade, Melissa – National Center for Education Statistics, 2016
This report outlines the development, methodology, and results of the split-half administration of the 2015 School Crime Supplement (SCS) to the National Crime Victimization Survey (NCVS). The NCVS is sponsored by the U.S. Department of Justice, Bureau of Justice Statistics (BJS). The National Center for…
Descriptors: National Surveys, Victims of Crime, Bullying, Schools
Ye, Meng; Xin, Tao – Educational and Psychological Measurement, 2014
The authors explored the effects of drifting common items on vertical scaling within the higher order framework of item parameter drift (IPD). The results showed that if IPD occurred between a pair of test levels, the scaling performance started to deviate from the ideal state, as indicated by bias of scaling. When there were two items drifting…
Descriptors: Scaling, Test Items, Equated Scores, Achievement Gains
Pibal, Florian; Cesnik, Hermann S. – Practical Assessment, Research & Evaluation, 2011
When administering tests across grades, vertical scaling is often employed to place scores from different tests on a common overall scale so that test-takers' progress can be tracked. In order to link the results across grades, however, common items that appear in both test forms are needed. In the literature there seems to be no…
Descriptors: Scaling, Test Items, Equated Scores, Reading Tests
Liu, Ou Lydia; Lee, Hee-Sun; Linn, Marcia C. – Educational Assessment, 2011
Both multiple-choice and constructed-response items have known advantages and disadvantages in measuring scientific inquiry. In this article we explore the function of explanation multiple-choice (EMC) items and examine how EMC items differ from traditional multiple-choice and constructed-response items in measuring scientific reasoning. A group…
Descriptors: Science Tests, Multiple Choice Tests, Responses, Test Items
Cho, Hyun-Jeong; Lee, Jaehoon; Kingston, Neal – Applied Measurement in Education, 2012
This study examined the validity of test accommodations for third- through eighth-graders using differential item functioning (DIF) and mixture IRT models. Two data sets were used for these analyses. With the first data set (N = 51,591) we examined whether item type (i.e., story, explanation, straightforward) or item features were associated with item…
Descriptors: Testing Accommodations, Test Bias, Item Response Theory, Validity
Yang, Chih-Wei; Kuo, Bor-Chen; Liao, Chen-Huei – Turkish Online Journal of Educational Technology - TOJET, 2011
The aim of the present study was to develop an online assessment system with constructed-response items in the context of the elementary mathematics curriculum. The system recorded the problem-solving process for constructed-response items and transferred the process to response codes for further analyses. An inference mechanism based on artificial…
Descriptors: Foreign Countries, Mathematics Curriculum, Test Items, Problem Solving
Hickey, Daniel T.; Ingram-Goble, Adam A.; Jameson, Ellen M. – Journal of Science Education and Technology, 2009
This study used innovative assessment practices to obtain and document broad learning outcomes for a 15-hour game-based curriculum in Quest Atlantis, a multi-user virtual environment that supports school-based participation in socioscientific inquiry in the ecological sciences. Design-based methods were used to refine and align the enactment of…
Descriptors: Feedback (Response), Test Items, Student Evaluation, Achievement Tests
Lai, Ah-Fur; Chen, Deng-Jyi; Chen, Shu-Ling – Journal of Educational Multimedia and Hypermedia, 2008
Item response theory (IRT) has been studied and applied in computer-based testing for decades. However, almost all of these existing studies focus merely on test questions presented exclusively in a text-based (or static text/graphic) form. In this paper, we present our study on test questions using both…
Descriptors: Elementary School Students, Semantics, Difficulty Level, Item Response Theory
Lai, Cheng Fei; Alonzo, Julie; Tindal, Gerald – Behavioral Research and Teaching, 2008
In this technical report, we describe the development and piloting of a series of mathematics progress monitoring measures intended for use with students in kindergarten through eighth grade. These measures, available as part of easyCBM[TM], an online progress monitoring assessment system, were developed in 2007 and 2008 and administered to…
Descriptors: Grade 6, General Education, Response to Intervention, Access to Education
Donovan, Jenny; Hutton, Penny; Lennon, Melissa; O'Connor, Gayl; Morrissey, Noni – Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2008
In July 2001, the Ministerial Council on Education, Employment, Training and Youth Affairs (MCEETYA) agreed to the development of assessment instruments and key performance measures for reporting on student skills, knowledge and understandings in primary science. It directed the newly established Performance Measurement and Reporting Taskforce…
Descriptors: Foreign Countries, Scientific Literacy, Media Literacy, Scientific Concepts
Donovan, Jenny; Lennon, Melissa; O'Connor, Gayl; Morrissey, Noni – Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2008
In 2003 the first nationally comparable science assessment was designed, developed, and carried out under the auspices of the national council of education ministers, the Ministerial Council on Education, Employment, Training and Youth Affairs (MCEETYA). In 2006 a second science assessment was conducted and, for the first time nationally, the…
Descriptors: Foreign Countries, Scientific Literacy, Science Achievement, Comparative Analysis
Wu, Margaret; Donovan, Jenny; Hutton, Penny; Lennon, Melissa – Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2008
In July 2001, the Ministerial Council on Education, Employment, Training and Youth Affairs (MCEETYA) agreed to the development of assessment instruments and key performance measures for reporting on student skills, knowledge and understandings in primary science. It directed the newly established Performance Measurement and Reporting Taskforce…
Descriptors: Foreign Countries, Scientific Literacy, Science Achievement, Comparative Analysis