TY - JOUR
T1 - An Automatic Deep Learning–Based Workflow for Glioblastoma Survival Prediction Using Preoperative Multimodal MR Images
T2 - A Feasibility Study
AU - Fu, Jie
AU - Singhrao, Kamal
AU - Zhong, Xinran
AU - Gao, Yu
AU - Qi, Sharon X.
AU - Yang, Yingli
AU - Ruan, Dan
AU - Lewis, John H.
N1 - Funding Information:
The authors would like to acknowledge the BraTS 2018 challenge organizing committee. Sources of support: This research was partially funded by a Varian Master Research Grant. Disclosures: Y.Y. reports personal fees from ViewRay, Inc, outside the submitted work; D.R. reports grants from Varian Medical Systems, Inc, during the conduct of the study; J.H.L. reports grants from Varian Medical Systems, Inc, during the conduct of the study.
Publisher Copyright:
© 2021 The Authors
PY - 2021/9
Y1 - 2021/9
N2 - Purpose: Most radiomic studies use features extracted from manually drawn tumor contours for classification or survival prediction. However, large interobserver segmentation variations lead to inconsistent features, making it more challenging to construct robust prediction models. Here, we proposed an automatic workflow for glioblastoma (GBM) survival prediction based on multimodal magnetic resonance (MR) images. Methods and Materials: Two hundred eighty-five patients with glioma (210 GBM, 75 low-grade glioma) were included. One hundred sixty-three of the patients with GBM had overall survival data. Every patient had 4 preoperative MR images and manually drawn tumor contours. A 3-dimensional convolutional neural network, VGG-Seg, was trained and validated using 122 patients with glioma for automatic GBM segmentation. The trained VGG-Seg was applied to the remaining 163 patients with GBM to generate their autosegmented tumor contours. The handcrafted and deep learning (DL)–based radiomic features were extracted from the autosegmented contours using explicitly designed algorithms and a pretrained convolutional neural network, respectively. The 163 patients with GBM were randomly split into training (n = 122) and testing (n = 41) sets for survival analysis. Cox regression models were trained to construct the handcrafted and DL-based signatures. The prognostic powers of the 2 signatures were evaluated and compared. Results: The VGG-Seg achieved a mean Dice coefficient of 0.86 for GBM segmentation across the 163 patients with GBM. The handcrafted signature achieved a C-index of 0.64 (95% confidence interval, 0.55-0.73), whereas the DL-based signature achieved a C-index of 0.67 (95% confidence interval, 0.57-0.77). Unlike the handcrafted signature, the DL-based signature successfully stratified the testing patients into 2 prognostically distinct groups. Conclusions: The VGG-Seg generated accurate GBM contours from the 4 MR images. The DL-based signature achieved a numerically higher C-index than the handcrafted signature and significantly stratified patients into prognostic groups. The proposed automatic workflow demonstrated the potential to improve patient stratification and survival prediction in patients with GBM.
UR - http://www.scopus.com/inward/record.url?scp=85112785150&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85112785150&partnerID=8YFLogxK
U2 - 10.1016/j.adro.2021.100746
DO - 10.1016/j.adro.2021.100746
M3 - Article
C2 - 34458648
AN - SCOPUS:85112785150
SN - 2452-1094
VL - 6
JO - Advances in Radiation Oncology
JF - Advances in Radiation Oncology
IS - 5
M1 - 100746
ER -