An adversarial machine learning based approach and biomechanically-guided validation for improving deformable image registration accuracy between a planning CT and cone-beam CT for adaptive prostate radiotherapy applications

Anand P. Santhanam, Michael Lauria, Brad Stiehl, Daniel Elliott, Saty Seshan, Scott Hsieh, Minsong Cao, Daniel Low

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Adaptive radiotherapy is an effective procedure for the treatment of cancer, in which the daily anatomical changes in the patient are quantified and the dose delivered to the tumor is adapted accordingly. Deformable image registration (DIR) inaccuracies, together with delays in retrieving on-board cone-beam CT (CBCT) image datasets from the treatment system and registering them with the planning kilovoltage CT (kVCT), have restricted the adaptive workflow to a limited number of patients. In this paper, we present a machine-learning approach for improving DIR accuracy, coupled with biomechanically guided validation. For a given set of 11 planning prostate kVCT datasets and their segmented contours, we first assembled a biomechanical model to generate synthetic abdominal motions, bladder volume changes, and physiological regression. For each of the synthetic CT datasets, we then injected noise and artifacts into the images using a novel procedure in order to closely mimic CBCT datasets. We then used the simulated CBCT images to train neural networks that predicted the noise- and artifact-removed CT images. For this purpose, we employed a constrained generative adversarial network (cGAN), which consisted of two deep neural networks: a generator and a discriminator. The generator produced the artifact-removed CT images, while the discriminator evaluated their accuracy. The DIR results were finally validated using the model-generated landmarks. Results showed that the artifact-removed CT closely matched the planning CT. Comparisons were performed using image similarity metrics, and a normalized cross-correlation of >0.95 was obtained from the cGAN-based image enhancement. In addition, when DIR was performed, the landmarks matched within 1.1 +/- 0.5 mm. This demonstrates that adversarial DNN-based CBCT enhancement improves DIR accuracy and bolsters the adaptive radiotherapy workflow.
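The abstract reports image similarity via normalized cross-correlation (NCC > 0.95) between the artifact-removed CT and the planning CT. As a minimal illustration of that metric (not the authors' implementation, and assuming the common zero-normalized variant), a sketch in NumPy:

```python
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Zero-normalized cross-correlation between two same-shaped images.

    Returns 1.0 for images that differ only by a linear intensity change,
    and values near 0 for uncorrelated images.
    """
    a = a.astype(np.float64).ravel()
    b = b.astype(np.float64).ravel()
    a -= a.mean()  # remove mean intensity so offsets do not matter
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:  # at least one image is constant
        return 0.0
    return float(np.dot(a, b) / denom)

# Toy check: a synthetic "CT slice" compared with an intensity-rescaled
# copy of itself (simulating a pure brightness/contrast change).
rng = np.random.default_rng(0)
ct = rng.normal(size=(64, 64))
rescaled = 2.0 * ct + 100.0
score = ncc(ct, rescaled)  # structures identical, so NCC is 1.0
```

In this setting, an NCC above 0.95 between the enhanced CBCT and the planning kVCT indicates that anatomical structures align closely even if absolute intensities differ.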

Original language: English (US)
Title of host publication: Medical Imaging 2020
Subtitle of host publication: Image Processing
Editors: Ivana Isgum, Bennett A. Landman
Publisher: SPIE
ISBN (Electronic): 9781510633933
DOIs
State: Published - 2020
Event: Medical Imaging 2020: Image Processing - Houston, United States
Duration: Feb 17, 2020 - Feb 20, 2020

Publication series

Name: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
Volume: 11313
ISSN (Print): 1605-7422

Conference

Conference: Medical Imaging 2020: Image Processing
Country/Territory: United States
City: Houston
Period: 2/17/20 - 2/20/20

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Biomaterials
  • Atomic and Molecular Physics, and Optics
  • Radiology, Nuclear Medicine and Imaging

