The Measured Effect of Delay in Completing Operative Performance Ratings on Clarity and Detail of Ratings Assigned.

Williams RG, Chen XP, Sanfey H, Markwell SJ, Mellinger JD, Dunnington GL. J Surg Educ. 2014 Nov-Dec;71(6):e132-8.

Abstract

PURPOSE:

Operative performance ratings (OPRs) need adequate clarity and detail to support self-directed learning and valid progress decisions. This study was designed to determine (1) the elapsed time between observing operative performances and completing performance ratings under field conditions and (2) the effect of increased elapsed time on rating clarity and detail.

METHODS:

Overall, 895 OPRs by 19 faculty members for 37 general surgery residents were the focus of this study. The elapsed time between observing the performance and completing the evaluation was recorded. No-delay comparison data included 45 additional ratings of 8 performances collected under controlled conditions immediately following the performance by 17 surgeons whose sole responsibility was to observe and rate the performances. Item-to-item OPR variation and the presence and nature of comments were indicators of evaluation clarity, detail, and quality.

RESULTS:

Elapsed time between observing and evaluating performances under field conditions was as follows: 1 day or less, 116 performances (13%); 2 to 3 days, 178 performances (20%); 4 to 14 days, 377 performances (42%); and more than 14 days, 224 performances (25%). Overall, 87% of performances rated more than 14 days after observation had no item-to-item rating variation compared with 62% rated with a delay of 4 to 14 days, 41% rated with a delay of 2 to 3 days, 42% rated within 1 day, and 2% rated immediately. In addition, 70% of ratings completed more than 14 days after observation had no written comments, compared with 49% for those completed with a delay of 4 to 14 days, 45% for those completed in 2 to 3 days, and 46% for those completed within 1 day. Moreover, 47% of comments submitted after more than 14 days were exclusively global comments (less instructionally useful) compared with 7% for those completed with a delay of 4 to 14 days and 5% for those completed in 1 to 3 days.

CONCLUSIONS:

The elapsed time between observation and rating of operative performances should be recorded. Immediate ratings should be encouraged. Ratings completed more than 3 days after observation should be discouraged and discounted, as they lack clarity and detail about the performance.

PubMed ID 25088368