Publication Date
In 2025 | 0
Since 2024 | 0
Since 2021 (last 5 years) | 3
Since 2016 (last 10 years) | 11
Since 2006 (last 20 years) | 18
Descriptor
Foreign Countries | 24
Test Items | 24
Item Response Theory | 21
Difficulty Level | 7
Achievement Tests | 6
Mathematics Tests | 5
Scores | 5
Student Evaluation | 5
Test Bias | 5
Test Validity | 5
International Assessment | 4
Author
Barss, Joseph | 1
Bertrand, Richard | 1
Birol, Gülnur | 1
Blömeke, Sigrid | 1
Braeken, Johan | 1
Bristow, M. | 1
Budgell, Glen R. | 1
Bulut, Okan | 1
Chalmers, R. Philip | 1
Chen, Ching-I | 1
Chew, Alex L. | 1
Publication Type
Journal Articles | 21
Reports - Research | 21
Reports - Evaluative | 3
Speeches/Meeting Papers | 2
Tests/Questionnaires | 1
Education Level
Higher Education | 8
Postsecondary Education | 8
Elementary Secondary Education | 3
Secondary Education | 3
Elementary Education | 2
Grade 4 | 2
Grade 8 | 2
Grade 9 | 2
High Schools | 2
Intermediate Grades | 2
Junior High Schools | 2
Audience
Practitioners | 1
Teachers | 1
Location
Canada | 24
United States | 5
Australia | 2
Chile | 2
Finland | 2
Germany | 2
Hong Kong | 2
Japan | 2
Philippines | 2
Russia | 2
South Korea | 2
Assessments and Surveys
Trends in International… | 2
Program for International… | 1
Progress in International… | 1
Guo, Hongwen; Rios, Joseph A.; Ling, Guangming; Wang, Zhen; Gu, Lin; Yang, Zhitong; Liu, Lydia O. – ETS Research Report Series, 2022
Different variants of the selected-response (SR) item type have been developed for various reasons (e.g., simulating realistic situations, examining critical-thinking and/or problem-solving skills). Generally, variants of the SR item format are more complex than traditional multiple-choice (MC) items, which may be more challenging to test…
Descriptors: Test Format, Test Wiseness, Test Items, Item Response Theory
Saatcioglu, Fatima Munevver; Sen, Sedat – International Journal of Testing, 2023
In this study, we illustrated an application of the confirmatory mixture IRT model for multidimensional tests. We aimed to examine the differences in student performance by domains with a confirmatory mixture IRT modeling approach. A three-dimensional and three-class model was analyzed by assuming content domains as dimensions and cognitive…
Descriptors: Item Response Theory, Foreign Countries, Elementary Secondary Education, Achievement Tests
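The Saatcioglu and Sen entry above centers on a confirmatory mixture IRT model with latent classes. As a loose illustration of the mixture idea only, not the authors' model or code, the following Python sketch computes posterior class probabilities for one response pattern under a simple two-class Rasch mixture with invented, known parameters:

```python
import numpy as np

def class_posteriors(u, theta, class_b, class_weights):
    """Posterior latent-class probabilities for one response pattern under a
    simple two-class Rasch mixture with known parameters (illustration only).

    u             : 0/1 response vector
    theta         : ability value used for the examinee (held fixed here for simplicity)
    class_b       : per-class item difficulty vectors
    class_weights : prior class proportions
    """
    posts = []
    for b, w in zip(class_b, class_weights):
        p = 1.0 / (1.0 + np.exp(-(theta - np.asarray(b))))   # Rasch item probabilities
        lik = np.prod(np.where(np.asarray(u) == 1, p, 1 - p))  # likelihood of the pattern
        posts.append(w * lik)
    posts = np.array(posts)
    return posts / posts.sum()

# Hypothetical parameters for a two-class example (invented values).
u = np.array([1, 1, 0, 1])
print(class_posteriors(u, theta=0.0,
                       class_b=[[-0.5, 0.0, 0.5, 1.0], [0.5, 1.0, -0.5, 0.0]],
                       class_weights=[0.6, 0.4]))
```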
Whitaker, Douglas; Barss, Joseph; Drew, Bailey – Online Submission, 2022
Challenges to measuring students' attitudes toward statistics remain despite decades of focused research. Measuring the expectancy-value theory (EVT) Cost construct has been especially challenging owing in part to the historical lack of research about it. To measure the EVT Cost construct better, this study asked university students to respond to…
Descriptors: Statistics Education, College Students, Student Attitudes, Likert Scales
Fukuzawa, Sherry; deBraga, Michael – Journal of Curriculum and Teaching, 2019
The Graded Response Method (GRM) is an alternative to multiple-choice testing in which students rank options according to their relevance to the question. GRM requires discrimination and inference between statements and is a cost-effective critical-thinking assessment in large courses where open-ended answers are not feasible. This study examined…
Descriptors: Alternative Assessment, Multiple Choice Tests, Test Items, Test Format
Shin, Jinnie; Bulut, Okan; Gierl, Mark J. – Journal of Experimental Education, 2020
The arrangement of response options in multiple-choice (MC) items, especially the location of the most attractive distractor, is considered critical in constructing high-quality MC items. In the current study, a sample of 496 undergraduate students taking an educational assessment course was given three test forms consisting of the same items but…
Descriptors: Foreign Countries, Undergraduate Students, Multiple Choice Tests, Item Response Theory
Scribner, Emily D.; Harris, Sara E. – Journal of Geoscience Education, 2020
The Mineralogy Concept Inventory (MCI) is a statistically validated 18-question assessment that can be used to measure learning gains in introductory mineralogy courses. Development of the MCI was an iterative process involving expert consultation, student interviews, assessment deployment, and statistical analysis. Experts at the two universities…
Descriptors: Undergraduate Students, Mineralogy, Introductory Courses, Science Tests
McIntosh, James – Scandinavian Journal of Educational Research, 2019
This article examines whether the way that PISA models item outcomes in mathematics affects the validity of its country rankings. As an alternative to the PISA methodology, a two-parameter model is applied to PISA mathematics item data from Canada and Finland for the year 2012. In the estimation procedure, item difficulty and dispersion parameters are…
Descriptors: Foreign Countries, Achievement Tests, Secondary School Students, International Assessment
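The McIntosh entry above applies a two-parameter IRT model with item difficulty and dispersion parameters to PISA mathematics items. As a minimal sketch of the underlying two-parameter logistic (2PL) item response function, with invented parameter values and not the article's estimation procedure:

```python
import numpy as np

def p_correct_2pl(theta, a, b):
    """Two-parameter logistic (2PL) probability of a correct response.

    theta : examinee ability
    a     : item discrimination (dispersion is often reported as its reciprocal)
    b     : item difficulty
    """
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical item parameters, chosen only for illustration.
abilities = np.array([-1.0, 0.0, 1.0])
print(p_correct_2pl(abilities, a=1.2, b=0.5))
```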
Chalmers, R. Philip; Counsell, Alyssa; Flora, David B. – Educational and Psychological Measurement, 2016
Differential test functioning, or DTF, occurs when one or more items in a test demonstrate differential item functioning (DIF) and the aggregate of these effects is observed at the test level. In many applications, DTF can be more important than DIF when the overall effects of DIF at the test level can be quantified. However, optimal statistical…
Descriptors: Test Bias, Sampling, Test Items, Statistical Analysis
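The Chalmers, Counsell, and Flora entry above concerns quantifying the aggregate effect of DIF at the test level. A rough way to illustrate that idea, not necessarily the statistics developed in the article, is to compare expected test scores for two groups with group-specific 2PL item parameters; all parameters below are invented for the example:

```python
import numpy as np

def expected_score(theta, a, b):
    """Expected number-correct test score under a 2PL model.

    theta : array of ability values
    a, b  : arrays of item discrimination and difficulty parameters
    """
    z = np.outer(theta, a) - a * b                    # a_j * (theta_i - b_j)
    return (1.0 / (1.0 + np.exp(-z))).sum(axis=1)     # sum item probabilities per examinee

# Hypothetical group-specific item parameters (invented for illustration).
theta_grid = np.linspace(-3, 3, 61)
a_ref, b_ref = np.array([1.0, 1.2, 0.8]), np.array([-0.5, 0.0, 0.7])
a_foc, b_foc = np.array([1.0, 1.0, 0.8]), np.array([-0.5, 0.3, 0.7])

diff = expected_score(theta_grid, a_ref, b_ref) - expected_score(theta_grid, a_foc, b_foc)
signed_dtf = diff.mean()            # average signed expected-score difference over the grid
unsigned_dtf = np.abs(diff).mean()  # average absolute difference
print(signed_dtf, unsigned_dtf)
```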
Goldhammer, Frank; Martens, Thomas; Lüdtke, Oliver – Large-scale Assessments in Education, 2017
Background: A potential problem of low-stakes large-scale assessments such as the Programme for the International Assessment of Adult Competencies (PIAAC) is low test-taking engagement. The present study pursued two goals in order to better understand conditioning factors of test-taking disengagement: First, a model-based approach was used to…
Descriptors: Student Evaluation, International Assessment, Adults, Competence
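The Goldhammer, Martens, and Lüdtke entry above uses a model-based approach to test-taking disengagement. A much simpler, purely illustrative heuristic, flagging responses faster than an arbitrary time threshold as rapid guesses, looks like this (all values invented; this is not the study's method):

```python
# Flag a response as a possible rapid guess if its response time falls below a
# fixed threshold. The threshold and the records are invented for illustration.
responses = [
    {"item": "A1", "rt_seconds": 3.2, "correct": 0},
    {"item": "A2", "rt_seconds": 41.0, "correct": 1},
    {"item": "A3", "rt_seconds": 4.1, "correct": 0},
]

THRESHOLD = 5.0  # arbitrary cutoff in seconds
for r in responses:
    r["rapid_guess"] = r["rt_seconds"] < THRESHOLD

engagement_rate = 1 - sum(r["rapid_guess"] for r in responses) / len(responses)
print(engagement_rate)
```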
Bristow, M.; Erkorkmaz, K.; Huissoon, J. P.; Jeon, Soo; Owen, W. S.; Waslander, S. L.; Stubley, G. D. – IEEE Transactions on Education, 2012
Any meaningful initiative to improve the teaching and learning in introductory control systems courses needs a clear test of student conceptual understanding to determine the effectiveness of proposed methods and activities. The authors propose a control systems concept inventory. Development of the inventory was collaborative and iterative. The…
Descriptors: Diagnostic Tests, Concept Formation, Undergraduate Students, Engineering Education
Squires, Jane K.; Waddell, Misti L.; Clifford, Jantina R.; Funk, Kristin; Hoselton, Robert M.; Chen, Ching-I – Topics in Early Childhood Special Education, 2013
Psychometric and utility studies on the Social Emotional Assessment Measure (SEAM), an innovative tool for assessing and monitoring social-emotional and behavioral development in infants and toddlers with disabilities, were conducted. The Infant and Toddler SEAM intervals were the study focus, using mixed methods, including item response theory…
Descriptors: Psychometrics, Evaluation Methods, Social Development, Emotional Development
Cui, Ying; Mousavi, Amin – International Journal of Testing, 2015
The current study applied the person-fit statistic, l_z, to data from a Canadian provincial achievement test to explore the usefulness of conducting person-fit analysis on large-scale assessments. Item parameter estimates were compared before and after the misfitting student responses, as identified by l_z, were removed. The…
Descriptors: Measurement, Achievement Tests, Comparative Analysis, Test Items
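The Cui and Mousavi entry above relies on the person-fit statistic l_z. A minimal sketch of the standardized log-likelihood statistic under a 2PL model with known item parameters, using invented values rather than the study's data, is:

```python
import numpy as np

def lz_person_fit(u, theta, a, b):
    """Standardized log-likelihood person-fit statistic l_z for one examinee
    under a 2PL model with known item parameters (a sketch, not the study's code).

    u     : 0/1 response vector
    theta : the examinee's ability estimate
    a, b  : item discrimination and difficulty parameters
    """
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    l0 = np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))       # observed log-likelihood
    mean = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))     # its expectation
    var = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)       # its variance
    return (l0 - mean) / np.sqrt(var)

# Hypothetical values for illustration; large negative l_z suggests misfit.
u = np.array([1, 0, 1, 1, 0, 1])
a = np.array([1.0, 1.2, 0.8, 1.5, 0.9, 1.1])
b = np.array([-1.0, 0.0, 0.5, 1.0, -0.5, 0.2])
print(lz_person_fit(u, theta=0.3, a=a, b=b))
```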
Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur – CBE - Life Sciences Education, 2016
We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being…
Descriptors: Foreign Countries, Measures (Individuals), Test Construction, Statistics
Reckase, Mark D.; Xu, Jing-Ru – Educational and Psychological Measurement, 2015
How to compute and report subscores for a test that was originally designed for reporting scores on a unidimensional scale has been a topic of interest in recent years. In the research reported here, we describe an application of multidimensional item response theory to identify a subscore structure in a test designed for reporting results using a…
Descriptors: English, Language Skills, English Language Learners, Scores
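The Reckase and Xu entry above applies multidimensional item response theory to identify a subscore structure. For orientation only, here is a sketch of the compensatory multidimensional 2PL item response function that such analyses typically build on, with invented parameters:

```python
import numpy as np

def p_correct_m2pl(theta, a, d):
    """Compensatory multidimensional 2PL probability of a correct response.

    theta : ability vector (one value per dimension)
    a     : item discrimination vector (same length as theta)
    d     : scalar item intercept (easiness)
    """
    return 1.0 / (1.0 + np.exp(-(np.dot(a, theta) + d)))

# Hypothetical two-dimensional example (e.g., two subscore dimensions).
theta = np.array([0.5, -0.2])
print(p_correct_m2pl(theta, a=np.array([1.1, 0.4]), d=-0.3))
```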
Gattamorta, Karina A.; Penfield, Randall D.; Myers, Nicholas D. – International Journal of Testing, 2012
Measurement invariance is a common consideration in the evaluation of the validity and fairness of test scores when the tested population contains distinct groups of examinees, such as examinees receiving different forms of a translated test. Measurement invariance in polytomous items has traditionally been evaluated at the item level,…
Descriptors: Foreign Countries, Psychometrics, Test Bias, Test Items
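The Gattamorta, Penfield, and Myers entry above distinguishes item-level from step-level invariance in polytomous items. As a sketch of where step parameters enter, here are category probabilities under a generalized partial credit model with invented parameters (not the article's analysis):

```python
import numpy as np

def gpcm_probs(theta, a, step_b):
    """Category probabilities under the generalized partial credit model (GPCM).

    theta  : examinee ability
    a      : item discrimination
    step_b : step (threshold) parameters b_1..b_K; categories are 0..K
    Step-level invariance analyses focus on these step parameters.
    """
    steps = a * (theta - np.asarray(step_b))
    cum = np.concatenate(([0.0], np.cumsum(steps)))  # numerator exponents for k = 0..K
    expcum = np.exp(cum - cum.max())                 # subtract the max for numerical stability
    return expcum / expcum.sum()

# Hypothetical 4-category item (3 steps), parameters invented for illustration.
print(gpcm_probs(theta=0.2, a=1.0, step_b=[-1.0, 0.1, 1.2]))
```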