Publication Date
In 2025: 0
Since 2024: 0
Since 2021 (last 5 years): 0
Since 2016 (last 10 years): 2
Since 2006 (last 20 years): 4
Author
DeBoer, George E.: 4
Herrmann-Abell, Cari F.: 3
Buckley, Barbara C.: 1
Davenport, Jodi L.: 1
Huang, Chun-Wei: 1
Jordan, Kevin A.: 1
Quellmalz, Edys S.: 1
Timms, Michael J.: 1
Publication Type
Reports - Research: 4
Journal Articles: 2
Speeches/Meeting Papers: 2
Education Level
Middle Schools: 4
Junior High Schools: 3
Secondary Education: 3
Elementary Education: 2
High Schools: 2
Elementary Secondary Education: 1
Higher Education: 1
Location
United States: 1
Laws, Policies, & Programs
Assessments and Surveys
What Works Clearinghouse Rating
Herrmann-Abell, Cari F.; DeBoer, George E. – Grantee Submission, 2016
Understanding students' misconceptions and how they change is an essential part of supporting students in their science learning. This paper presents results from distractor-driven multiple-choice assessments that target students' misconceptions about energy. Over 20,000 elementary, middle, and high school students from across the U.S. participated…
Descriptors: Item Response Theory, Probability, Elementary School Students, Middle School Students
Herrmann-Abell, Cari F.; DeBoer, George E. – Grantee Submission, 2016
Energy is a core concept in the teaching of science. Therefore, it is important to know how students' thinking about energy develops so that elementary, middle, and high school students can be appropriately supported in their understanding of energy. This study tests the validity of a proposed theoretical model of students' growth of understanding…
Descriptors: Item Response Theory, Science Tests, Scientific Concepts, Energy
Quellmalz, Edys S.; Davenport, Jodi L.; Timms, Michael J.; DeBoer, George E.; Jordan, Kevin A.; Huang, Chun-Wei; Buckley, Barbara C. – Journal of Educational Psychology, 2013
How can assessments measure complex science learning? Although traditional multiple-choice items can effectively measure declarative knowledge such as scientific facts or definitions, they are considered less well suited for providing evidence of science inquiry practices such as making observations or designing and conducting investigations.…
Descriptors: Science Education, Educational Assessment, Psychometrics, Science Tests
Herrmann-Abell, Cari F.; DeBoer, George E. – Chemistry Education Research and Practice, 2011
Distractor-driven multiple-choice assessment items and Rasch modeling were used as diagnostic tools to investigate students' understanding of middle school chemistry ideas. Ninety-one items were developed according to a procedure that ensured content alignment to the targeted standards and construct validity. The items were administered to 13360…
Descriptors: Construct Validity, Chemistry, Misconceptions, Multiple Choice Tests