Class-aware image search for interpretable cancer identification

Arash Ebrahimian, Hossein Mohammadi, Morteza Babaie, Nima Maftoon, H. R. Tizhoosh

Research output: Contribution to journal › Article › peer-review

Abstract

In recent times, the performance of computer-aided diagnosis systems in the classification of malignancies has improved significantly. Search and retrieval methods are particularly important because they assist physicians in reaching the right diagnosis in medical imaging, owing to their ability to obtain similar cases for a query image. Supervised classification algorithms are generally more accurate than unsupervised search-based classification; however, the latter may more easily provide insight into the decision-making process by returning a group of similar cases and their corresponding metadata (i.e., diagnostic reports), rather than simply a class probability. In this study, we propose a class-aware search operating on deep image embeddings to increase the accuracy of content-based search. We validate our methodology using two publicly available datasets, one containing endometrial cancer images and the other containing colorectal cancer images. The proposed class-aware scenarios enhance the accuracy of the search-based classifier, making it more feasible in practice. Because search results provide access to the metadata of retrieved cases (i.e., pathology reports of evidently diagnosed cases), such a combination has clear benefits for assisting experts with explainable results.
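The search-based classification described above can be sketched generically as nearest-neighbor voting over deep image embeddings. The snippet below is a minimal illustration, not the authors' exact pipeline: the function names, the use of cosine similarity, and the "class-aware" scoring rule (ranking each candidate class by the mean similarity of the query to that class's top-k gallery members) are all assumptions made for clarity.

```python
import numpy as np

def knn_search_classify(query_emb, gallery_embs, gallery_labels, k=5):
    """Generic search-based classifier: majority vote over the k nearest
    gallery embeddings under cosine similarity (illustrative sketch)."""
    q = query_emb / np.linalg.norm(query_emb)
    g = gallery_embs / np.linalg.norm(gallery_embs, axis=1, keepdims=True)
    sims = g @ q                          # cosine similarity to every gallery case
    top = np.argsort(-sims)[:k]           # indices of the k most similar cases
    classes, counts = np.unique(gallery_labels[top], return_counts=True)
    return classes[np.argmax(counts)], top  # predicted class + retrieved cases

def class_aware_classify(query_emb, gallery_embs, gallery_labels, k=5):
    """Hypothetical 'class-aware' variant: score each candidate class by the
    mean similarity of the query to its k nearest members of that class,
    then pick the best-scoring class."""
    q = query_emb / np.linalg.norm(query_emb)
    g = gallery_embs / np.linalg.norm(gallery_embs, axis=1, keepdims=True)
    sims = g @ q
    best_class, best_score = None, -np.inf
    for c in np.unique(gallery_labels):
        class_sims = np.sort(sims[gallery_labels == c])[::-1][:k]
        score = class_sims.mean()
        if score > best_score:
            best_class, best_score = c, score
    return best_class
```

The retrieved indices (`top`) are what make such a classifier interpretable: they point back to concrete gallery cases whose metadata (e.g., pathology reports) can be shown alongside the prediction.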

Original language: English (US)
Pages (from-to): 197352-197362
Number of pages: 11
Journal: IEEE Access
Volume: 8
DOIs
State: Published - 2020

Keywords

  • Deep learning
  • Medical image classification
  • Medical image search
  • Pathology whole-slide images

ASJC Scopus subject areas

  • Computer Science (all)
  • Materials Science (all)
  • Engineering (all)
