Post-Radiotherapy PET Image Outcome Prediction by Deep Learning Under Biological Model Guidance: A Feasibility Study of Oropharyngeal Cancer Application.
Ji, Hangjie; Lafata, Kyle; Mowery, Yvonne; Brizel, David; Bertozzi, Andrea L; Yin, Fang-Fang; Wang, Chunhao.
Affiliation
  • Ji H; Department of Mathematics, North Carolina State University, Raleigh, NC, United States.
  • Lafata K; Department of Radiation Oncology, Duke University Medical Center, Durham, NC, United States.
  • Mowery Y; Department of Radiation Oncology, Duke University Medical Center, Durham, NC, United States.
  • Brizel D; Department of Radiation Oncology, Duke University Medical Center, Durham, NC, United States.
  • Bertozzi AL; Department of Mathematics, University of California, Los Angeles, Los Angeles, CA, United States.
  • Yin FF; Department of Radiation Oncology, Duke University Medical Center, Durham, NC, United States.
  • Wang C; Department of Radiation Oncology, Duke University Medical Center, Durham, NC, United States.
Front Oncol ; 12: 895544, 2022.
Article in English | MEDLINE | ID: mdl-35646643
Purpose: To develop a method of biologically guided deep learning for post-radiation 18FDG-PET image outcome prediction based on pre-radiation images and radiotherapy dose information.

Methods: Based on the classic reaction-diffusion mechanism, a novel biological model was proposed using a partial differential equation that incorporates the spatial radiation dose distribution as a patient-specific treatment variable. A 7-layer encoder-decoder convolutional neural network (CNN) was designed and trained to learn the proposed biological model. As such, the model could generate post-radiation 18FDG-PET image outcome predictions with a breakdown of biological components for enhanced explainability. The proposed method was developed using 64 oropharyngeal cancer patients with paired 18FDG-PET studies acquired before and after delivery of 20 Gy (2 Gy daily fractions) by intensity-modulated radiotherapy (IMRT). In a two-branch deep learning execution, the CNN learns specific terms of the biological model from the paired 18FDG-PET images and the spatial dose distribution in one branch, and the biological model generates the post-20-Gy 18FDG-PET image prediction in the other branch. In the 2D implementation, 718/233/230 axial slices from 38/13/13 patients were used for training/validation/independent testing. Predicted images for the test cases were compared quantitatively with the ground-truth results.

Results: The proposed method successfully generated post-20-Gy 18FDG-PET image outcome predictions with breakdown illustrations of the biological model components. Mean standardized uptake values (SUV) in 18FDG high-uptake regions of the predicted images (2.45 ± 0.25) were similar to the ground-truth results (2.51 ± 0.33). In 2D Gamma analysis, the median/mean Gamma index (<1) passing rate of the test images was 96.5%/92.8% with the 5%/5 mm criterion; this improved to 99.9%/99.6% with the 10%/10 mm criterion.

Conclusion: The developed biologically guided deep learning method achieved post-20-Gy 18FDG-PET image outcome predictions in good agreement with ground-truth results. With its breakdown of biological modeling components, the predicted outcome images could inform adaptive radiotherapy decision-making to optimize personalized treatment plans for the best outcome.
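The exact PDE terms, coefficients, and network architecture are not given in this record, so the following is only a minimal Python sketch of a generic reaction-diffusion model with a dose-dependent response term of the kind the Methods describe. The function names (laplacian_2d, evolve_uptake), the logistic proliferation term, and the parameter values D, rho, and alpha are illustrative assumptions, not the paper's model.

import numpy as np

def laplacian_2d(u, h=1.0):
    # Five-point finite-difference Laplacian with zero-flux (replicated-edge) boundaries.
    up = np.pad(u, 1, mode="edge")
    return (up[:-2, 1:-1] + up[2:, 1:-1] + up[1:-1, :-2] + up[1:-1, 2:] - 4.0 * u) / h**2

def evolve_uptake(u0, dose, D=0.05, rho=0.02, alpha=0.01, dt=0.1, steps=200):
    # Forward-Euler integration of an assumed reaction-diffusion model:
    #   du/dt = D * Lap(u) + rho * u * (1 - u) - alpha * dose * u
    # u    : normalized FDG uptake map (proxy for metabolically active tumor burden)
    # dose : spatial radiation dose distribution (Gy), the patient-specific treatment term
    u = u0.copy()
    for _ in range(steps):
        diffusion = D * laplacian_2d(u)
        proliferation = rho * u * (1.0 - u)
        radiation_response = alpha * dose * u
        u = np.clip(u + dt * (diffusion + proliferation - radiation_response), 0.0, None)
    return u

if __name__ == "__main__":
    # Toy usage: a synthetic "pre-RT" uptake blob and a uniform 20-Gy dose over the target.
    y, x = np.mgrid[0:64, 0:64]
    pre_rt = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 50.0)
    dose = np.where(pre_rt > 0.1, 20.0, 0.0)
    post_rt = evolve_uptake(pre_rt, dose)
    print("mean uptake before:", pre_rt.mean(), "after:", post_rt.mean())

In the paper's two-branch setup, the CNN would estimate spatially varying model terms from the pre-radiation PET image and dose map rather than using the fixed scalar coefficients shown here.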
Full text: 1 Collection: 01-international Database: MEDLINE Study type: Guideline / Prognostic_studies / Risk_factors_studies Language: English Journal: Front Oncol Year: 2022 Document type: Article Country of affiliation: United States Country of publication: Switzerland