Deep convolutional neural network for prostate MR segmentation

Zhiqiang Tian, Lizhi Liu, Baowei Fei

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

18 Scopus citations


Automatic segmentation of the prostate in magnetic resonance imaging (MRI) has many applications in prostate cancer diagnosis and therapy. We propose a deep fully convolutional neural network (CNN) to segment the prostate automatically. Our deep CNN model is trained end-to-end in a single learning stage on prostate MR images and the corresponding ground truths, and learns to infer a pixel-wise segmentation. Experiments were performed on our in-house data set, which contains prostate MR images of 20 patients. The proposed CNN model obtained a mean Dice similarity coefficient of 85.3%±3.2% as compared to the manual segmentation. Experimental results show that our deep CNN model could yield satisfactory segmentation of the prostate.
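The Dice similarity coefficient reported in the abstract measures the overlap between a predicted mask and the manual segmentation, DSC = 2|A∩B| / (|A|+|B|). A minimal sketch of this metric (an illustration only, not the authors' evaluation code; the `dice` helper name and toy mask shapes are assumptions):

```python
import numpy as np

def dice(pred, truth):
    """Dice similarity coefficient between two binary masks:
    DSC = 2 * |A intersect B| / (|A| + |B|), in [0, 1]."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / total

# Toy example: two 4x4 masks overlapping on 4 pixels
a = np.zeros((4, 4), dtype=np.uint8)
b = np.zeros((4, 4), dtype=np.uint8)
a[1:3, 1:3] = 1  # 4 foreground pixels
b[1:3, 1:4] = 1  # 6 foreground pixels
print(dice(a, b))  # 2*4 / (4+6) = 0.8
```

A reported mean DSC of 85.3% corresponds to `dice` averaging roughly 0.853 across the 20 patients' segmentations.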

Original language: English (US)
Title of host publication: Medical Imaging 2017
Subtitle of host publication: Image-Guided Procedures, Robotic Interventions, and Modeling
Editors: Robert J. Webster, Baowei Fei
ISBN (Electronic): 9781510607156
State: Published - 2017
Externally published: Yes
Event: Medical Imaging 2017: Image-Guided Procedures, Robotic Interventions, and Modeling - Orlando, United States
Duration: Feb 14 2017 - Feb 16 2017

Publication series

Name: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
ISSN (Print): 1605-7422


Other: Medical Imaging 2017: Image-Guided Procedures, Robotic Interventions, and Modeling
Country/Territory: United States


Keywords

  • Convolutional neural network
  • Deep learning
  • Magnetic resonance imaging (MRI)
  • Prostate segmentation

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Biomaterials
  • Atomic and Molecular Physics, and Optics
  • Radiology, Nuclear Medicine and Imaging


