HealthyGAN: Learning from Unannotated Medical Images to Detect Anomalies Associated with Human Disease

Md Mahfuzur Rahman Siddiquee, Jay Shah, Teresa Wu, Catherine Chong, Todd Schwedt, Baoxin Li

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Automated anomaly detection from medical images, such as MRIs and X-rays, can significantly reduce human effort in disease diagnosis. Owing to the complexity of modeling anomalies and the high cost of manual annotation by domain experts (e.g., radiologists), the typical approach in the current medical imaging literature derives diagnostic models from healthy subjects only, assuming the model will flag images from patients as outliers. However, in many real-world scenarios, unannotated datasets containing a mix of both healthy and diseased individuals are abundant. Therefore, this paper poses the research question of how to improve unsupervised anomaly detection by utilizing (1) an unannotated set of mixed images, in addition to (2) the set of healthy images used in the literature. To answer this question, we propose HealthyGAN, a novel one-directional image-to-image translation method that learns to translate images from the mixed dataset into only healthy images. Being one-directional, HealthyGAN relaxes the cycle-consistency requirement of existing unpaired image-to-image translation methods, which is unattainable with mixed unannotated data. Once the translation is learned, we generate a difference map for any given image by subtracting its translated output from the input; regions of significant response in the difference map correspond to potential anomalies (if any). HealthyGAN outperforms conventional state-of-the-art methods by significant margins on two publicly available datasets, COVID-19 and NIH ChestX-ray14, and on one institutional dataset collected from Mayo Clinic. The implementation is publicly available at https://github.com/mahfuzmohammad/HealthyGAN.
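
The difference-map step described in the abstract can be illustrated with a minimal sketch. The code below is hypothetical and not taken from the authors' repository: it assumes a trained one-directional generator (here called `generator`) that maps any input image to its "healthy" counterpart, and shows how a per-pixel difference map and an image-level anomaly score could be derived from it.

    # Hypothetical sketch of the difference-map anomaly scoring step,
    # assuming a trained healthy-translation generator (not shown here).
    import torch

    def anomaly_map(image: torch.Tensor, generator: torch.nn.Module) -> torch.Tensor:
        """Per-pixel anomaly map for one image tensor of shape (C, H, W)."""
        generator.eval()
        with torch.no_grad():
            # Translate the input to its "healthy" version.
            healthy = generator(image.unsqueeze(0)).squeeze(0)
        # Regions where the input deviates from its healthy translation
        # are candidate anomalies.
        return (image - healthy).abs().mean(dim=0)  # average over channels -> (H, W)

    def anomaly_score(image: torch.Tensor, generator: torch.nn.Module) -> float:
        """Reduce the difference map to a single image-level score (its mean)."""
        return anomaly_map(image, generator).mean().item()

In practice, the image-level score would be thresholded (or ranked) to decide whether a given scan is flagged as anomalous; the exact scoring and thresholding used by HealthyGAN is documented in the linked repository.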

Original language: English (US)
Title of host publication: Simulation and Synthesis in Medical Imaging - 7th International Workshop, SASHIMI 2022, Held in Conjunction with MICCAI 2022, Proceedings
Editors: Can Zhao, David Svoboda, Jelmer M. Wolterink, Maria Escobar
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 43-54
Number of pages: 12
ISBN (Print): 9783031169793
State: Published - 2022
Event: 7th International Workshop on Simulation and Synthesis in Medical Imaging, SASHIMI 2022, held in conjunction with 25th International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2022 - Singapore, Singapore
Duration: Sep 18, 2022 – Sep 18, 2022

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 13570 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 7th International Workshop on Simulation and Synthesis in Medical Imaging, SASHIMI 2022, held in conjunction with 25th International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2022
Country/Territory: Singapore
City: Singapore
Period: 9/18/22 – 9/18/22

Keywords

  • Anomaly detection
  • COVID-19 detection
  • Image-to-Image translation
  • Migraine detection
  • Thoracic disease detection

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
