Ability of Ophthalmology Residents to Self-Assess Their Performance Through Established Milestones

Research output: Contribution to journal › Article

Abstract

Objectives: Accurate self-assessment is an important aspect of practice-based learning and improvement and a critical skill for resident growth. The Accreditation Council for Graduate Medical Education mandates semiannual milestones assessments by a clinical competency committee (CCC) for all ophthalmology residents. There are six core competencies: patient care (PC), medical knowledge, systems-based practice, practice-based learning and improvement, professionalism, and interpersonal communication skills. These competencies are assessed by the milestones rubric, which has detailed behavioral anchors and is also used for trainee self-assessments. This study compares resident self-assessed (SA) and faculty CCC milestones scores. Design: Residents completed milestones self-assessments prior to receiving individual score reports from the CCC. Correlation coefficients were calculated comparing the SA and CCC scores. In addition, statistical models were used to determine predictors of disparities and differences between the SA and CCC scores. Setting: Wilmer Eye Institute, Johns Hopkins Hospital. Participants: Twenty-one residents in the Wilmer Ophthalmology Residency program from July 2014 to June 2016. Results: Fifty-seven self-assessments were available for the analysis. For each resident's first assessment, SA and CCC scores were strongly correlated (r ≥ 0.6 and p < 0.05) for four milestones, and not correlated for the remaining 20 milestones. In multivariable models, the SA and CCC scores were less disparate for the medical knowledge and systems-based practice competencies than for practice-based learning and improvement. Higher year of training, the PC competency, and the professionalism competency were predictive of statistically significant resident overestimation of scores relative to the CCC. In addition, higher CCC scores predicted significantly lower SA-CCC disparities and differences.
SA-CCC differences did not decrease significantly with repeated assessments or with modification of the end-of-rotation evaluation forms. Conclusions: Self-assessments by ophthalmology residents are not well correlated with faculty assessments, emphasizing the need for improved, frequent, and timely feedback. Residents have the greatest difficulty self-assessing the professionalism and PC competencies. In general, senior residents and underperforming residents have more inaccurate self-assessments.
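The core of the correlation analysis described above — comparing paired self-assessed and committee scores milestone by milestone — can be sketched as follows. The data here are entirely hypothetical (the study's actual scores are not published in this record), and `pearson_r` is an illustrative helper, not the authors' code; a real analysis would typically use `scipy.stats.pearsonr` to obtain the p-value as well.

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired score lists."""
    if len(x) != len(y) or len(x) < 2:
        raise ValueError("need paired samples of equal length >= 2")
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical scores for one milestone (ACGME milestones use a 1-5 scale):
sa_scores = [2.0, 2.5, 3.0, 3.5, 2.5, 4.0]   # resident self-assessed (SA)
ccc_scores = [1.5, 2.5, 2.5, 3.0, 2.0, 3.5]  # clinical competency committee (CCC)

r = pearson_r(sa_scores, ccc_scores)
print(f"r = {r:.2f}")  # r >= 0.6 would count as strongly correlated here
```

The study applied this kind of comparison across all 24 milestones and additionally modeled SA-CCC differences with multivariable regression to identify predictors such as training year and competency type.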

Original language: English (US)
Journal: Journal of Surgical Education
DOI: 10.1016/j.jsurg.2018.12.004
State: Published - Jan 1 2019


Keywords

  • Clinical competency committee
  • Core competencies
  • Evaluation
  • Medical Knowledge
  • Milestones
  • Ophthalmology residency
  • Patient Care
  • Practice-Based Learning and Improvement
  • Professionalism
  • Resident self-assessment
  • Systems-Based Practice

ASJC Scopus subject areas

  • Surgery
  • Education

Cite this

@article{194cfa8725414df28a788859895ca3ba,
title = "Ability of Ophthalmology Residents to Self-Assess Their Performance Through Established Milestones",
author = "Divya Srikumaran and Jing Tian and Pradeep Ramulu and Michael Boland and Woreta, {Fasika A} and Wang, {Kendrick M.} and Nicholas Mahoney",
year = "2019",
month = "1",
day = "1",
doi = "10.1016/j.jsurg.2018.12.004",
language = "English (US)",
journal = "Journal of Surgical Education",
issn = "1931-7204",
publisher = "Elsevier Inc.",

}
