ERIC Number: ED641082
Record Type: Non-Journal
Publication Date: 2023
Pages: 165
Abstractor: As Provided
ISBN: 979-8-3811-7338-3
ISSN: N/A
EISSN: N/A
Improving Interrater Reliability of the edTPA with Local Rater Training
Ellie Renae Bowen
ProQuest LLC, Ph.D. Dissertation, Indiana University
The educative Teacher Performance Assessment (edTPA) has been adopted by many state legislatures and teacher preparation programs (TPPs). These states require teacher candidates to pass the edTPA with a state-specific passing score to be recommended for licensure. In the 19 states where passing the edTPA is not required as a condition of licensure (SCALE, 2023), teacher candidates may still be required by their TPP to complete and pass the edTPA. Indiana does not require the edTPA, but Indiana University's School of Education requires candidates to pass a revised version of the edTPA before they will be recommended for licensure. At IU, local raters score candidates' submissions. Given the potentially high stakes, it is important that local scoring of the edTPA be demonstrably reliable. Previous examinations of the reliability of the edTPA at IU indicated that the interrater reliability of local scores is inadequate (Brannan & Bowen, 2020; 2021). This study extended that research by applying a more rigorous measure of reliability to existing data and by testing whether targeted training could improve the interrater reliability of newly collected data. In addition, a focus group was conducted with rater-participants to gather their perspectives on the edTPA and rater training. This study found that interrater reliability for several edTPA rubrics is inadequate and that the training used in this study was not sufficient to meaningfully improve rater agreement. Participants regarded small-group discussion, feedback on scoring decisions from colleagues in the same area of expertise, and introduction to the Understanding Rubric Level Progression resource as most constructive.
Findings from this study suggest that improving interrater reliability of local scores is a challenging task that requires a commitment to extensive training, rigorous assessment of raters' abilities to score reliably, ongoing assessment of the reliability of scores after training, and booster trainings to minimize rater drift. Rater training and reliability may be improved with a collaborative learning approach that provides training specific to each edTPA module. Reliability may also be improved by extending training time for new raters and providing annual trainings for veteran raters. Once reliability is established, research is necessary to investigate the validity of locally scored edTPAs. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone (800-521-0600). Web page: http://bibliotheek.ehb.be:2222/en-US/products/dissertations/individuals.shtml.]
Descriptors: Interrater Reliability, Teacher Evaluation, Rating Scales, Performance Based Assessment, Preservice Teachers, Training, Evaluation Utilization, Evaluation Needs, Program Improvement
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://bibliotheek.ehb.be:2222/en-US/products/dissertations/individuals.shtml
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: Indiana
Identifiers - Assessments and Surveys: edTPA (Teacher Performance Assessment)
Grant or Contract Numbers: N/A