Assessing medical students' and residents' perceptions of the learning environment: Exploring validity evidence for the interpretation of scores from existing tools

Jorie Colbert-Getz, Sooyoun Kim, Victoria H. Goode, Robert B. Shochet, Scott Wright

Research output: Contribution to journal › Article

Abstract

Purpose: Although most agree that supportive learning environments (LEs) are essential for effective medical education, an accurate assessment of LE quality has been challenging for educators and administrators. Two previous reviews assessed LE tools used in the health professions; however, both have shortcomings. The primary goal of this systematic review was to explore the validity evidence for the interpretation of scores from LE tools.
Method: The authors searched ERIC, PsycINFO, and PubMed for peer-reviewed studies, published through 2012 in the United States and internationally, that provided quantitative data on medical students' and/or residents' perceptions of the LE. They also searched SCOPUS and the reference lists of included studies for subsequent publications that assessed the LE tools. From each study, the authors extracted descriptive, sample, and validity evidence (content, response process, internal structure, relationship to other variables) information. They calculated a total validity evidence score for each tool.
Results: The authors identified 15 tools that assessed the LE in medical school and 13 that did so in residency. The majority of studies (17; 61%) provided some form of content validity evidence. Studies were less likely to provide evidence of internal structure, response process, and relationship to other variables.
Conclusions: Given the limited validity evidence for scores from existing LE tools, new tools may be needed to assess medical students' and residents' perceptions of the LE. Any new tools would need robust validity evidence testing and sampling across multiple institutions with trainees at multiple levels to establish their utility.
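The abstract states that a total validity evidence score was calculated for each tool across the four evidence categories (content, response process, internal structure, relationship to other variables) but does not describe the scoring scheme. The sketch below is a hypothetical illustration only, assuming one point per category for which evidence is reported; the function name, tool names, and point values are invented and are not the authors' actual method.

# Hypothetical illustration only: tally a "total validity evidence score" per LE
# tool by counting which of the four evidence categories named in the abstract
# are reported. The one-point-per-category scheme is an assumption, not the
# authors' published scoring method.
from typing import Dict, Set

EVIDENCE_CATEGORIES = {
    "content",
    "response process",
    "internal structure",
    "relationship to other variables",
}

def total_validity_evidence_score(reported_evidence: Set[str]) -> int:
    """Count how many of the four recognized categories are reported for a tool."""
    return len(EVIDENCE_CATEGORIES & {c.lower() for c in reported_evidence})

# Example with made-up data (tool names and evidence sets are illustrative only).
tools: Dict[str, Set[str]] = {
    "Hypothetical LE Tool A": {"content", "internal structure"},
    "Hypothetical LE Tool B": {"content"},
}
for name, evidence in tools.items():
    print(name, total_validity_evidence_score(evidence))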

Original language: English (US)
Pages (from-to): 1687-1693
Number of pages: 7
Journal: Academic Medicine
Volume: 89
Issue number: 12
DOIs: 10.1097/ACM.0000000000000433
State: Published - Dec 11, 2014

Fingerprint

Medical Students; Learning Environment; Residents; Internship and Residency; Medical Education; Medical Schools; Health Occupations; Administrative Personnel; PubMed; Trainees; Publications; Interpretation; Evidence; Professions; Educators; Health; Schools

ASJC Scopus subject areas

  • Medicine (all)
  • Education

Cite this

@article{0c2a2c4a07ae472facb541f13b6e6981,
title = "Assessing medical students' and residents' perceptions of the learning environment: Exploring validity evidence for the interpretation of scores from existing tools",
abstract = "Purpose Although most agree that supportive learning environments (LEs) are essential for effective medical education, an accurate assessment of LE quality has been challenging for educators and administrators. Two previous reviews assessed LE tools used in the health professions; however, both have shortcomings. The primary goal of this systematic review was to explore the validity evidence for the interpretation of scores from LE tools. Method The authors searched ERIC, PsycINFO, and PubMed for peer-reviewed studies that provided quantitative data on medical students' and/or residents' perceptions of the LE published through 2012 in the United States and internationally. They also searched SCOPUS and the reference lists of included studies for subsequent publications that assessed the LE tools. From each study, the authors extracted descriptive, sample, and validity evidence (content, response process, internal structure, relationship to other variables) information. They calculated a total validity evidence score for each tool. Results The authors identified 15 tools that assessed the LE in medical school and 13 that did so in residency. The majority of studies (17; 61{\%}) provided some form of content validity evidence. Studies were less likely to provide evidence of internal structure, response process, and relationship to other variables. Conclusions Given the limited validity evidence for scores from existing LE tools, new tools may be needed to assess medical students' and residents' perceptions of the LE. Any new tools would need robust validity evidence testing and sampling across multiple institutions with trainees at multiple levels to establish their utility.",
author = "Jorie Colbert-Getz and Sooyoun Kim and Goode, {Victoria H.} and Shochet, {Robert B.} and Scott Wright",
year = "2014",
month = "12",
day = "11",
doi = "10.1097/ACM.0000000000000433",
language = "English (US)",
volume = "89",
pages = "1687--1693",
journal = "Academic Medicine",
issn = "1040-2446",
publisher = "Lippincott Williams and Wilkins",
number = "12",

}
