Showing 1 to 15 of 18 results
Peer reviewed
Grigg, Kaine; Manderson, Lenore – Australian Educational and Developmental Psychologist, 2015
Existing Australian measures of racist attitudes focus on single groups or have not been validated across the lifespan. To redress this, the present research aimed to develop and validate a measure of racial, ethnic, cultural and religious acceptance--the Australian Racism, Acceptance, and Cultural-Ethnocentrism Scale (RACES)--for use with…
Descriptors: Racial Bias, Racial Attitudes, Foreign Countries, Ethnocentrism
Peer reviewed
Kabiri, Masoud; Ghazi-Tabatabaei, Mahmood; Bazargan, Abbas; Shokoohi-Yekta, Mohsen; Kharrazi, Kamal – Applied Measurement in Education, 2017
Numerous diagnostic studies have been conducted on large-scale assessments to illustrate students' mastery profiles in the areas of math and reading; however, for science, only a limited number of investigations have been reported. This study investigated Iranian eighth graders' competency mastery of science and examined the utility of the General…
Descriptors: Elementary Secondary Education, Achievement Tests, International Assessment, Foreign Countries
Peer reviewed
Deygers, Bart; Van Gorp, Koen – Language Testing, 2015
Considering scoring validity as encompassing both reliable rating scale use and valid descriptor interpretation, this study reports on the validation of a CEFR-based scale that was co-constructed and used by novice raters. The research questions this paper wishes to answer are (a) whether it is possible to construct a CEFR-based rating scale with…
Descriptors: Rating Scales, Scoring, Validity, Interrater Reliability
Peer reviewed
Xie, Qin; Zhong, Xiaoling; Wang, Wen-Chung; Lim, Cher Ping – Higher Education Research and Development, 2014
This paper describes the development and validation of an item bank designed for students to assess their own achievements across an undergraduate-degree programme in seven generic competences (i.e., problem-solving skills, critical-thinking skills, creative-thinking skills, ethical decision-making skills, effective communication skills, social…
Descriptors: Item Banks, Competence, Undergraduate Students, Self Evaluation (Individuals)
Peer reviewed
Atar, Burcu; Kamata, Akihito – Hacettepe University Journal of Education, 2011
The Type I error rates and the power of IRT likelihood ratio test and cumulative logit ordinal logistic regression procedures in detecting differential item functioning (DIF) for polytomously scored items were investigated in this Monte Carlo simulation study. For this purpose, 54 simulation conditions (combinations of 3 sample sizes, 2 sample…
Descriptors: Test Bias, Sample Size, Monte Carlo Methods, Item Response Theory
Peer reviewed
Prabawa-Sear, Kelsie; Baudains, Catherine – Australian Journal of Environmental Education, 2011
This study investigated student views on the relationship between their environmental attitudes and behaviours and their thoughts about barriers and motivators to environmentally responsible behaviours. The environmental attitudes and behaviours of students participating in a classroom-based environmental education program were measured using two…
Descriptors: Measures (Individuals), Correlation, Focus Groups, Questionnaires
Peer reviewed
Pilkonis, Paul A.; Choi, Seung W.; Reise, Steven P.; Stover, Angela M.; Riley, William T.; Cella, David – Assessment, 2011
The authors report on the development and calibration of item banks for depression, anxiety, and anger as part of the Patient-Reported Outcomes Measurement Information System (PROMIS[R]). Comprehensive literature searches yielded an initial bank of 1,404 items from 305 instruments. After qualitative item analysis (including focus groups and…
Descriptors: Item Banks, Focus Groups, Information Systems, Interviews
Peer reviewed
Glynn, Shawn M. – Journal of Research in Science Teaching, 2012
The Trends in International Mathematics and Science Study (TIMSS) is a comparative assessment of the achievement of students in many countries. In the present study, a rigorous independent evaluation was conducted of a representative sample of TIMSS science test items because item quality influences the validity of the scores used to inform…
Descriptors: Foreign Countries, Item Response Theory, Psychometrics, Science Tests
Peer reviewed
Mercer, Sterett H.; Zeigler-Hill, Virgil; Wallace, Marion; Hayes, DeMarquis M. – Journal of Counseling Psychology, 2011
The present article describes the development and initial validation of the Inventory of Microaggressions Against Black Individuals (IMABI) using a sample of 385 undergraduates who self-identified as Black or African American. The IMABI is a 14-item, unidimensional measure of racial microaggressions that captures both microinsults and…
Descriptors: Race, Social Desirability, Construct Validity, Psychology
Peer reviewed
Stubbe, Tobias C. – Educational Research and Evaluation, 2011
The challenge inherent in cross-national research of providing instruments in different languages measuring the same construct is well known. But even instruments in a single language may be biased towards certain countries or regions due to local linguistic specificities. Consequently, it may be appropriate to use different versions of an…
Descriptors: Test Items, International Studies, Foreign Countries, German
Peer reviewed
Randall, Jennifer; Engelhard, George, Jr. – Journal of Educational Measurement, 2009
In this study, we present an approach to questionnaire design within educational research based on Guttman's mapping sentences and Many-Facet Rasch Measurement Theory. We designed a 54-item questionnaire using Guttman's mapping sentences to examine the grading practices of teachers. Each item in the questionnaire represented a unique student…
Descriptors: Student Evaluation, Educational Research, Grades (Scholastic), Public School Teachers
Peer reviewed
Finch, W. Holmes; French, Brian F. – Educational and Psychological Measurement, 2007
Differential item functioning (DIF) continues to receive attention both in applied and methodological studies. Because DIF can be an indicator of irrelevant variance that can influence test scores, continuing to evaluate and improve the accuracy of detection methods is an essential step in gathering score validity evidence. Methods for detecting…
Descriptors: Item Response Theory, Factor Analysis, Test Bias, Comparative Analysis
Mills, Christine M. – 2003
This study addressed the construction and application of an instrument to measure students' perceptions of their resident assistant's ability to complete specified job skills. Survey items were written from identified training objectives necessary for effective, intentional interactions with students. The instrument has been constructed following…
Descriptors: College Students, Focus Groups, Higher Education, Item Response Theory
Peer reviewed
Feinstein, Zachary S. – Applied Psychological Measurement, 1995
The closed-interval signed area (CSA) and closed-interval unsigned area (CUA) statistics were studied by Monte Carlo simulation to detect differential item functioning (DIF) when the reference and focal groups had different parameter distributions. Different behaviors of the CSA and CUA as functions of the parameters are discussed. (SLD)
Descriptors: Focus Groups, Item Bias, Item Response Theory, Models
Peer reviewed
Ouimet, Judith A.; Bunnage, JoAnne C.; Carini, Robert M.; Kuh, George D.; Kennedy, John – Research in Higher Education, 2004
This study focused on how the design of a national student survey instrument was informed and improved through the combined use of student focus groups, cognitive interviews, and expert survey design advice. We were specifically interested in determining (a) how students interpret the items and response options, (b) the frequency of behaviors or…
Descriptors: Focus Groups, Interviews, College Students, Student Surveys
Pages: 1 | 2