Rahman, Taslima; Mislevy, Robert J. – ETS Research Report Series, 2017
To demonstrate how methodologies for assessing reading comprehension can grow out of views of the construct suggested in the reading research literature, we constructed tasks and carried out psychometric analyses that were framed in accordance with 2 leading reading models. In estimating item difficulty and subsequently, examinee proficiency, an…
Descriptors: Reading Tests, Reading Comprehension, Psychometrics, Test Items
Dardick, William R.; Mislevy, Robert J. – Educational and Psychological Measurement, 2016
A new variant of the iterative "data = fit + residual" data-analytical approach described by Mosteller and Tukey is proposed and implemented in the context of item response theory psychometric models. Posterior probabilities from a Bayesian mixture model of a Rasch item response theory model and an unscalable latent class are expressed…
Descriptors: Bayesian Statistics, Probability, Data Analysis, Item Response Theory
Mislevy, Robert J. – Educational Measurement: Issues and Practice, 2012
This article presents the author's observations on Neil Dorans's NCME Career Award Address: "The Contestant Perspective on Taking Tests: Emanations from the Statue within." He calls attention to some points that Dr. Dorans made in his address, and offers his thoughts in response.
Descriptors: Testing, Test Reliability, Psychometrics, Scores
Mislevy, Robert J.; Zwick, Rebecca – Journal of Educational Measurement, 2012
A new entry in the testing lexicon is through-course summative assessment, a system consisting of components administered periodically during the academic year. As defined in the Race to the Top program, these assessments are intended to yield a yearly summative score for accountability purposes. They must provide for both individual and group…
Descriptors: National Competency Tests, Inferences, Item Response Theory, Summative Evaluation
Mislevy, Robert J. – Teachers College Record, 2014
Background/Context: This article explains the idea of a neopragmatic postmodernist test theory and offers some thoughts about what changing notions concerning the nature of and meanings assigned to knowledge imply for educational assessment, present and future. Purpose: Advances in the learning sciences--particularly situative and sociocognitive…
Descriptors: Test Theory, Postmodernism, Educational Assessment, Educational Trends
Levy, Roy; Mislevy, Robert J.; Sinharay, Sandip – Applied Psychological Measurement, 2009
If data exhibit multidimensionality, key conditional independence assumptions of unidimensional models do not hold. The current work pursues posterior predictive model checking, a flexible family of model-checking procedures, as a tool for criticizing models due to unaccounted for dimensions in the context of item response theory. Factors…
Descriptors: Item Response Theory, Models, Methods, Simulation
Mislevy, Robert J.; Haertel, Geneva; Cheng, Britte H.; Ructtinger, Liliana; DeBarger, Angela; Murray, Elizabeth; Rose, David; Gravel, Jenna; Colker, Alexis M.; Rutstein, Daisy; Vendlinski, Terry – Educational Research and Evaluation, 2013
Standardizing aspects of assessments has long been recognized as a tactic to help make evaluations of examinees fair. It reduces variation in irrelevant aspects of testing procedures that could advantage some examinees and disadvantage others. However, recent attention to making assessment accessible to a more diverse population of students…
Descriptors: Testing Accommodations, Access to Education, Testing, Psychometrics
Rupp, Andre A.; Levy, Roy; Dicerbo, Kristen E.; Sweet, Shauna J.; Crawford, Aaron V.; Calico, Tiago; Benson, Martin; Fay, Derek; Kunze, Katie L.; Mislevy, Robert J.; Behrens, John T. – Journal of Educational Data Mining, 2012
In this paper we describe the development and refinement of "evidence rules" and "measurement models" within the "evidence model" of the "evidence-centered design" (ECD) framework in the context of the "Packet Tracer" digital learning environment of the "Cisco Networking Academy." Using…
Descriptors: Computer Networks, Evidence Based Practice, Design, Instructional Design
Mislevy, Robert J.; Behrens, John T.; Dicerbo, Kristen E.; Levy, Roy – Journal of Educational Data Mining, 2012
"Evidence-centered design" (ECD) is a comprehensive framework for describing the conceptual, computational and inferential elements of educational assessment. It emphasizes the importance of articulating inferences one wants to make and the evidence needed to support those inferences. At first blush, ECD and "educational data…
Descriptors: Educational Assessment, Psychometrics, Evidence, Computer Games
Mislevy, Robert J. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2009
From a contemporary perspective on cognition, the between-persons variables in trait-based arguments in educational assessment are absurd over-simplifications. Yet, for a wide range of applications, they work. Rather than seeing such variables as independently-existing characteristics of people, we can view them as summaries of patterns in…
Descriptors: Test Validity, Educational Assessment, Item Response Theory, Logical Thinking
Mislevy, Robert J. – Research Papers in Education, 2010
An educational assessment embodies an argument from a handful of observations of what students say, do or make in a handful of particular circumstances, to what they know or can do in what kinds of situations more broadly. This article discusses ways in which research into the nature and development of expertise can help assessment designers…
Descriptors: Educational Assessment, Test Construction, Expertise, Research
Mislevy, Robert J. – 1993
Relationships between Bayesian ability estimates and the parameters of a normal population distribution are derived in the context of classical test theory. Analogies are provided for use as approximations in work with item response theory (IRT). The following issues are addressed: (1) the relationship between the distribution of the latent…
Descriptors: Ability, Bayesian Statistics, Computer Software, Estimation (Mathematics)
Mislevy, Robert J. – 1988
Large-scale educational assessments differ from familiar educational measurements by attempting to provide information about the levels and natures of skills in populations rather than in individuals. That the distinct purposes of assessment require different methodologies than individual measurement was recognized by the development of…
Descriptors: Educational Assessment, Evaluation Methods, Item Analysis, Latent Trait Theory
Mislevy, Robert J. – Educational and Psychological Measurement, 1993
Relationships between Bayesian ability estimates and the parameters of a normal population distribution are derived in the context of classical test theory. Formulas are presented for practical work with Bayesian ability estimates, and a numerical illustration is provided. (SLD)
Descriptors: Ability, Bayesian Statistics, Equations (Mathematics), Estimation (Mathematics)
Mislevy, Robert J.; Behrens, John T.; Bennett, Randy E.; Demark, Sarah F.; Frezzo, Dennis C.; Levy, Roy; Robinson, Daniel H.; Rutstein, Daisy Wise; Shute, Valerie J.; Stanley, Ken; Winters, Fielding I. – National Center for Research on Evaluation, Standards, and Student Testing (CRESST), 2007
People use external knowledge representations (EKRs) to identify, depict, transform, store, share, and archive information. Learning how to work with EKRs is central to becoming proficient in virtually every discipline. As such, EKRs play central roles in curriculum, instruction, and assessment. Five key roles of EKRs in educational assessment are…
Descriptors: Educational Assessment, Computer Networks, Test Construction, Computer Assisted Testing