Publication Date
In 2025 | 1 |
Since 2024 | 1 |
Since 2021 (last 5 years) | 1 |
Since 2016 (last 10 years) | 1 |
Since 2006 (last 20 years) | 5 |
Descriptor
Scores | 6 |
Program Evaluation | 3 |
Evaluation Methods | 2 |
Logical Thinking | 2 |
Measurement Techniques | 2 |
Problem Solving | 2 |
Validity | 2 |
Writing Skills | 2 |
Adolescents | 1 |
African Americans | 1 |
Age Differences | 1 |
Source
Evaluation Review | 6 |
Author
Bolus, Roger | 2 |
Klein, Stephen | 2 |
Shavelson, Richard | 2 |
Benjamin, Roger | 1 |
Bickman, Leonard | 1 |
Daigneault, Pierre-Marc | 1 |
Flay, Brian R. | 1 |
Foster, E. Michael | 1 |
Freedman, David | 1 |
Gregory Chernov | 1 |
Jacob, Steve | 1 |
Publication Type
Journal Articles | 6 |
Reports - Research | 4 |
Reports - Descriptive | 2 |
Education Level
Higher Education | 2 |
Gregory Chernov – Evaluation Review, 2025
Most existing solutions to the current replication crisis in science address only factors stemming from specific poor research practices. We introduce a novel mechanism that leverages experts' predictive abilities to analyze the root causes of replication failures. It is backed by the principle that the most accurate predictor is the most…
Descriptors: Replication (Evaluation), Prediction, Scientific Research, Failure
Daigneault, Pierre-Marc; Jacob, Steve; Tremblay, Joel – Evaluation Review, 2012
Background: Stakeholder participation is an important trend in the field of program evaluation. Although a few measurement instruments have been proposed, they either have not been empirically validated or do not cover the full content of the concept. Objectives: This study presents a first empirical validation of a measurement instrument that…
Descriptors: Stakeholders, Participation, Program Evaluation, Measurement Techniques
Klein, Stephen; Freedman, David; Shavelson, Richard; Bolus, Roger – Evaluation Review, 2008
The Collegiate Learning Assessment (CLA) program measures value added in colleges and universities by testing the ability of freshmen and seniors to think logically and write clearly. The program is popular enough that it has attracted critics. In this paper, we outline the methods used by the CLA to determine value added. We summarize the…
Descriptors: School Effectiveness, Logical Thinking, College Freshmen, College Seniors
Foster, E. Michael; Wiley-Exley, Elizabeth; Bickman, Leonard – Evaluation Review, 2009
Findings from an evaluation of a model system for delivering mental health services to youth were reassessed to determine the robustness of the key results to methodologies unavailable to the original analysts. These analyses address a key concern about the earlier findings: that the quasi-experimental design involved the comparison of two…
Descriptors: Mental Health Programs, Health Services, Youth, Delivery Systems
Segawa, Eisuke; Ngwe, Job E.; Li, Yanhong; Flay, Brian R. – Evaluation Review, 2005
This study employs growth mixture modeling techniques to evaluate the preventive effects of the Aban Aya Youth Project in reducing the rate of growth of violence among African American adolescent males (N = 552). Results suggest three distinct classes of participants: high risk (34%), medium risk (54%), and low risk (12%) based on both the…
Descriptors: Males, Adolescents, Violence, African Americans
Klein, Stephen; Benjamin, Roger; Shavelson, Richard; Bolus, Roger – Evaluation Review, 2007
The Collegiate Learning Assessment (CLA) is a computer-administered, open-ended (as opposed to multiple-choice) test of analytic reasoning, critical thinking, problem solving, and written communication skills. Because the CLA has been endorsed by several national higher education commissions, it has come under intense scrutiny by faculty members,…
Descriptors: Higher Education, Educational Assessment, Performance Tests, Logical Thinking