Deep-Learning-Based Semantic Labeling for 2D Mammography and Comparison of Complexity for Machine Learning Tasks

Paul H. Yi, Abigail Lin, Jinchi Wei, Alice C. Yu, Haris Sair, Ferdinand Hui, Gregory Hager, Susan Harvey

Research output: Contribution to journal › Article

Abstract

Machine learning has several potential uses in medical imaging, including semantic labeling of images to improve radiologist workflow and to triage studies for review. The purpose of this study was to (1) develop deep convolutional neural networks (DCNNs) for automated classification of 2D mammography views, determination of breast laterality, and assessment of breast tissue density; and (2) compare the performance of the DCNNs across these tasks of varying complexity. We obtained 3034 2D mammographic images from the Digital Database for Screening Mammography, annotated with mammographic view, image laterality, and breast tissue density. These images were used to train a DCNN for each of the three classification tasks. The DCNN trained to classify mammographic view achieved a receiver-operating-characteristic (ROC) area under the curve (AUC) of 1. The DCNN trained to classify breast image laterality initially misclassified right and left breasts (AUC 0.75); however, after horizontal flips were discontinued during data augmentation, the AUC improved to 0.93 (p < 0.0001). Breast density classification proved more difficult, with the DCNN achieving 68% accuracy. Automated semantic labeling of 2D mammography is feasible using DCNNs and can be performed with small datasets. However, automated classification of differences in breast density is more difficult, likely requiring larger datasets.
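The laterality result highlights why augmentation choices must respect the label semantics of each task: a horizontal flip turns a left-breast image into one that looks like a right breast, so flip augmentation corrupts laterality labels while remaining harmless for view and density classification. A minimal sketch of that idea is shown below, assuming a PyTorch/torchvision preprocessing pipeline and a 224 x 224 input size (neither is stated in the abstract, and this is an illustration, not the authors' implementation):

```python
from torchvision import transforms

# Augmentation for the view and density tasks: horizontal flips are safe here,
# because mirroring a CC or MLO image does not change its view or density label.
view_and_density_transforms = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.RandomHorizontalFlip(p=0.5),
    transforms.ToTensor(),
])

# Augmentation for the laterality task: horizontal flips are deliberately omitted,
# since mirroring swaps left and right breasts and corrupts the training labels;
# this is the failure mode the abstract reports before flips were discontinued
# (AUC 0.75 before, 0.93 after).
laterality_transforms = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
```

The same reasoning generalizes to other geometric augmentations: any transform that alters the attribute being predicted (here, left-right orientation) has to be dropped from that task's pipeline, even if it benefits the others.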

Original language: English (US)
Journal: Journal of Digital Imaging
DOI: 10.1007/s10278-019-00244-w
State: Published - Jan 1 2019

Keywords

  • Artificial intelligence
  • Breast tissue density
  • Deep learning
  • Mammography
  • Semantic labeling

ASJC Scopus subject areas

  • Radiological and Ultrasound Technology
  • Radiology, Nuclear Medicine and Imaging
  • Computer Science Applications

Cite this

Deep-Learning-Based Semantic Labeling for 2D Mammography and Comparison of Complexity for Machine Learning Tasks. / Yi, Paul H.; Lin, Abigail; Wei, Jinchi; Yu, Alice C.; Sair, Haris; Hui, Ferdinand; Hager, Gregory; Harvey, Susan.

In: Journal of Digital Imaging, 01.01.2019.

Research output: Contribution to journal › Article

@article{7cc3987cc5d446b5a7fc06d52bb6b377,
title = "Deep-Learning-Based Semantic Labeling for 2D Mammography and Comparison of Complexity for Machine Learning Tasks",
abstract = "Machine learning has several potential uses in medical imaging, including semantic labeling of images to improve radiologist workflow and to triage studies for review. The purpose of this study was to (1) develop deep convolutional neural networks (DCNNs) for automated classification of 2D mammography views, determination of breast laterality, and assessment of breast tissue density; and (2) compare the performance of the DCNNs across these tasks of varying complexity. We obtained 3034 2D mammographic images from the Digital Database for Screening Mammography, annotated with mammographic view, image laterality, and breast tissue density. These images were used to train a DCNN for each of the three classification tasks. The DCNN trained to classify mammographic view achieved a receiver-operating-characteristic (ROC) area under the curve (AUC) of 1. The DCNN trained to classify breast image laterality initially misclassified right and left breasts (AUC 0.75); however, after horizontal flips were discontinued during data augmentation, the AUC improved to 0.93 (p < 0.0001). Breast density classification proved more difficult, with the DCNN achieving 68{\%} accuracy. Automated semantic labeling of 2D mammography is feasible using DCNNs and can be performed with small datasets. However, automated classification of differences in breast density is more difficult, likely requiring larger datasets.",
keywords = "Artificial intelligence, Breast tissue density, Deep learning, Mammography, Semantic labeling",
author = "Yi, {Paul H.} and Abigail Lin and Jinchi Wei and Yu, {Alice C.} and Haris Sair and Ferdinand Hui and Gregory Hager and Susan Harvey",
year = "2019",
month = "1",
day = "1",
doi = "10.1007/s10278-019-00244-w",
language = "English (US)",
journal = "Journal of Digital Imaging",
issn = "0897-1889",
publisher = "Springer New York",

}
