A dual-mode deep transfer learning (D2TL) system for breast cancer detection using contrast enhanced digital mammograms

Kun Wang, Bhavika Patel, Lujia Wang, Teresa Wu, Bin Zheng, Jing Li

Research output: Contribution to journal › Article

Abstract

Full-field digital mammography (FFDM) and magnetic resonance imaging (MRI) are gold-standard techniques for breast cancer detection. The newer contrast-enhanced digital mammography (CEDM) technique integrates the complementary strengths of FFDM and MRI, and is being incorporated into the practice of leading institutions. Current clinical practice using CEDM is sub-optimal because it relies primarily on clinicians' trained eyes. Automated diagnostic systems under the conventional machine learning paradigm suffer from drawbacks such as the requirement for precise segmentation, the extraction of shallow features that do not suffice for diagnostic images, and the adoption of a sequential design without a global objective. We propose a deep learning (DL)-empowered diagnostic system using CEDM, the core of which is a novel dual-mode deep transfer learning (D2TL) model. The proposed system is innovative in several aspects, including (1) a dual-mode deep architecture design; (2) use of transfer learning to facilitate robust model estimation under small sample sizes; (3) development of visualization techniques to help interpret the model results and facilitate inter- and intra-tumor malignancy quantification; and (4) minimization of human bias. We apply D2TL to classify benign vs. malignant tumors using CEDM data collected at the Mayo Clinic in Arizona. D2TL outperforms competing models and approaches.
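The dual-mode transfer-learning idea described above — a frozen pretrained feature extractor per image mode, fused and fed to a small trainable classification head — can be sketched as follows. This is a minimal illustrative sketch, not the authors' D2TL model: the frozen random projections stand in for pretrained deep backbones, and the synthetic arrays stand in for the two CEDM image modes (e.g. low-energy and recombined images, an assumption on our part).

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for "pretrained" feature extractors for the two CEDM modes.
# In the paper these would be deep networks; here each is a frozen
# random projection, purely for illustration.
W_mode1 = rng.normal(size=(256, 32))   # frozen: never updated below
W_mode2 = rng.normal(size=(256, 32))

def extract(x1, x2):
    """Concatenate frozen ReLU features from both image modes."""
    f1 = np.maximum(x1 @ W_mode1, 0.0)
    f2 = np.maximum(x2 @ W_mode2, 0.0)
    return np.concatenate([f1, f2], axis=1)

def sigmoid(z):
    # Clip for numerical stability.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -60.0, 60.0)))

# Tiny synthetic cohort: 40 "patients", two image modes each.
n = 40
x1 = rng.normal(size=(n, 256))
x2 = rng.normal(size=(n, 256))
y = (rng.random(n) > 0.5).astype(float)  # benign (0) vs. malignant (1)

feats = extract(x1, x2)
w = np.zeros(feats.shape[1])  # only this small head is trained

# Logistic-regression head fit by gradient descent. With the backbones
# frozen, only a handful of parameters are estimated — the reason
# transfer learning helps under a small sample size.
for _ in range(200):
    p = sigmoid(feats @ w)
    w -= 0.01 * feats.T @ (p - y) / n

pred = (sigmoid(extract(x1, x2) @ w) > 0.5).astype(float)
train_acc = (pred == y).mean()
```

Freezing the mode-specific extractors and training only the fusion head is the generic transfer-learning pattern the abstract alludes to; the paper's actual architecture, losses, and visualization components are not reproduced here.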

Original language: English (US)
Journal: IISE Transactions on Healthcare Systems Engineering
DOI: 10.1080/24725579.2019.1628133
State: Published - Jan 1 2019


Keywords

  • breast cancer
  • Deep learning
  • imaging-based diagnosis
  • transfer learning

ASJC Scopus subject areas

  • Safety, Risk, Reliability and Quality
  • Safety Research
  • Public Health, Environmental and Occupational Health

Cite this

A dual-mode deep transfer learning (D2TL) system for breast cancer detection using contrast enhanced digital mammograms. / Wang, Kun; Patel, Bhavika; Wang, Lujia; Wu, Teresa; Zheng, Bin; Li, Jing.

In: IISE Transactions on Healthcare Systems Engineering, 01.01.2019.

Research output: Contribution to journal › Article

@article{7f01c146b097421f97c0e5ef0a5223b4,
title = "A dual-mode deep transfer learning (D2TL) system for breast cancer detection using contrast enhanced digital mammograms",
abstract = "Full-field digital mammography (FFDM) and magnetic resonance imaging (MRI) are gold-standard techniques for breast cancer detection. The newly contrast-enhanced digital mammography (CEDM) integrates the complementary strengths of FFDM and MRI, and is being incorporated into the practice of leading institutions. The current clinical practice using CEDM is sub-optimal because it is primarily based on clinicians' trained eyes. Automated diagnostic systems under the conventional machine learning paradigm suffer from drawbacks, such as the requirement for precise segmentation, extraction of shallow features that do not suffice for diagnostic images, and adoption of a sequential design without a global objective. We propose a deep learning (DL)-empowered diagnostic system using CEDM, the core of which is a novel dual-mode deep transfer learning (D2TL) model. The proposed system is innovative in several aspects, including (1) a dual-mode deep architecture design; (2) use of transfer learning to facilitate robust model estimation under small sample-size; (3) development of visualization techniques to help interpret the model results and facilitate inter- and intra-tumor malignancy quantification; and (4) minimization of human bias. We apply D2TL to classify benign vs. malignant tumors using the CEDM data collected from the Mayo Clinic in Arizona. D2TL outperforms competing models and approaches.",
keywords = "breast cancer, Deep learning, imaging-based diagnosis, transfer learning",
author = "Kun Wang and Bhavika Patel and Lujia Wang and Teresa Wu and Bin Zheng and Jing Li",
year = "2019",
month = "1",
day = "1",
doi = "10.1080/24725579.2019.1628133",
language = "English (US)",
journal = "IISE Transactions on Healthcare Systems Engineering",
issn = "2472-5579",
publisher = "Taylor and Francis Ltd.",

}

TY - JOUR

T1 - A dual-mode deep transfer learning (D2TL) system for breast cancer detection using contrast enhanced digital mammograms

AU - Wang, Kun

AU - Patel, Bhavika

AU - Wang, Lujia

AU - Wu, Teresa

AU - Zheng, Bin

AU - Li, Jing

PY - 2019/1/1

Y1 - 2019/1/1

N2 - Full-field digital mammography (FFDM) and magnetic resonance imaging (MRI) are gold-standard techniques for breast cancer detection. The newly contrast-enhanced digital mammography (CEDM) integrates the complementary strengths of FFDM and MRI, and is being incorporated into the practice of leading institutions. The current clinical practice using CEDM is sub-optimal because it is primarily based on clinicians' trained eyes. Automated diagnostic systems under the conventional machine learning paradigm suffer from drawbacks, such as the requirement for precise segmentation, extraction of shallow features that do not suffice for diagnostic images, and adoption of a sequential design without a global objective. We propose a deep learning (DL)-empowered diagnostic system using CEDM, the core of which is a novel dual-mode deep transfer learning (D2TL) model. The proposed system is innovative in several aspects, including (1) a dual-mode deep architecture design; (2) use of transfer learning to facilitate robust model estimation under small sample-size; (3) development of visualization techniques to help interpret the model results and facilitate inter- and intra-tumor malignancy quantification; and (4) minimization of human bias. We apply D2TL to classify benign vs. malignant tumors using the CEDM data collected from the Mayo Clinic in Arizona. D2TL outperforms competing models and approaches.

AB - Full-field digital mammography (FFDM) and magnetic resonance imaging (MRI) are gold-standard techniques for breast cancer detection. The newly contrast-enhanced digital mammography (CEDM) integrates the complementary strengths of FFDM and MRI, and is being incorporated into the practice of leading institutions. The current clinical practice using CEDM is sub-optimal because it is primarily based on clinicians' trained eyes. Automated diagnostic systems under the conventional machine learning paradigm suffer from drawbacks, such as the requirement for precise segmentation, extraction of shallow features that do not suffice for diagnostic images, and adoption of a sequential design without a global objective. We propose a deep learning (DL)-empowered diagnostic system using CEDM, the core of which is a novel dual-mode deep transfer learning (D2TL) model. The proposed system is innovative in several aspects, including (1) a dual-mode deep architecture design; (2) use of transfer learning to facilitate robust model estimation under small sample-size; (3) development of visualization techniques to help interpret the model results and facilitate inter- and intra-tumor malignancy quantification; and (4) minimization of human bias. We apply D2TL to classify benign vs. malignant tumors using the CEDM data collected from the Mayo Clinic in Arizona. D2TL outperforms competing models and approaches.

KW - breast cancer

KW - Deep learning

KW - imaging-based diagnosis

KW - transfer learning

UR - http://www.scopus.com/inward/record.url?scp=85068237750&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85068237750&partnerID=8YFLogxK

U2 - 10.1080/24725579.2019.1628133

DO - 10.1080/24725579.2019.1628133

M3 - Article

AN - SCOPUS:85068237750

JO - IISE Transactions on Healthcare Systems Engineering

JF - IISE Transactions on Healthcare Systems Engineering

SN - 2472-5579

ER -