Peer reviewed
Luu, Kimberly; Sidhu, Ravi; Chadha, Neil K.; Eva, Kevin W. – Advances in Health Sciences Education, 2023
Clinical supervisors are known to assess trainee performance idiosyncratically, causing concern about the validity of their ratings. The literature on this issue relies heavily on retrospective collection of decisions, resulting in the risk of inaccurate information regarding what actually drives raters' perceptions. Capturing in-the-moment…
Descriptors: Clinical Experience, Practicum Supervision, Student Evaluation, Evaluation Methods
Peer reviewed
Murphy, Douglas J.; Bruce, David A.; Mercer, Stewart W.; Eva, Kevin W. – Advances in Health Sciences Education, 2009
To investigate the reliability and feasibility of six potential workplace-based assessment methods in general practice training: criterion audit, multi-source feedback from clinical and non-clinical colleagues, patient feedback (the CARE Measure), referral letters, significant event analysis, and video analysis of consultations. Performance of GP…
Descriptors: Reliability, Graduate Medical Education, Family Practice (Medicine), Vocational Evaluation
Peer reviewed
Eva, Kevin W.; Solomon, Patty; Neville, Alan J.; Ladouceur, Michael; Kaufman, Karyn; Walsh, Allyn; Norman, Geoffrey R. – Advances in Health Sciences Education, 2007
Introduction: Tutorial-based assessment, despite providing a good match with the philosophy adopted by educational programmes that emphasize small group learning, remains one of the greatest challenges for educators working in this context. The current study was performed in an attempt to assess the psychometric characteristics of tutorial-based…
Descriptors: Construct Validity, Sampling, Psychometrics, Evaluation Methods
Peer reviewed
Reiter, Harold I.; Rosenfeld, Jack; Nandagopal, Kiruthiga; Eva, Kevin W. – Advances in Health Sciences Education, 2004
Context: Various research studies have examined the question of whether expert or non-expert raters, faculty or students, evaluators or standardized patients, give more reliable and valid summative assessments of performance on Objective Structured Clinical Examinations (OSCEs). Less studied has been the question of whether or not non-faculty…
Descriptors: Evidence, Video Technology, Feedback (Response), Evaluators