Generalizable attention U-Net for segmentation of fibroglandular tissue and background parenchymal enhancement in breast DCE-MRI.
Nowakowska, Sylwia; Borkowski, Karol; Ruppert, Carlotta M; Landsmann, Anna; Marcon, Magda; Berger, Nicole; Boss, Andreas; Ciritsis, Alexander; Rossi, Cristina.
Affiliation
  • Nowakowska S; Diagnostic and Interventional Radiology, University Hospital Zurich, University of Zurich, Rämistrasse 100, 8091, Zurich, Switzerland. sylwia.nowakowska@usz.ch.
  • Borkowski K; b-rayZ AG, Wagistrasse 21, 8952, Schlieren, Switzerland.
  • Ruppert CM; Diagnostic and Interventional Radiology, University Hospital Zurich, University of Zurich, Rämistrasse 100, 8091, Zurich, Switzerland.
  • Landsmann A; Diagnostic and Interventional Radiology, University Hospital Zurich, University of Zurich, Rämistrasse 100, 8091, Zurich, Switzerland.
  • Marcon M; Diagnostic and Interventional Radiology, University Hospital Zurich, University of Zurich, Rämistrasse 100, 8091, Zurich, Switzerland.
  • Berger N; Diagnostic and Interventional Radiology, University Hospital Zurich, University of Zurich, Rämistrasse 100, 8091, Zurich, Switzerland.
  • Boss A; Present address: Institut Radiologie, Spital Lachen, Oberdorfstrasse 41, 8853, Lachen, Switzerland.
  • Ciritsis A; Diagnostic and Interventional Radiology, University Hospital Zurich, University of Zurich, Rämistrasse 100, 8091, Zurich, Switzerland.
  • Rossi C; Present address: GZO AG Spital Wetzikon, Spitalstrasse 66, 8620, Wetzikon, Switzerland.
Insights Imaging ; 14(1): 185, 2023 Nov 06.
Article in English | MEDLINE | ID: mdl-37932462
OBJECTIVES: To develop automated segmentation models enabling standardized volumetric quantification of fibroglandular tissue (FGT) from native volumes and of background parenchymal enhancement (BPE) from subtraction volumes of dynamic contrast-enhanced (DCE) breast MRI, and to assess the developed models in the context of Breast Imaging Reporting and Data System (BI-RADS)-compliant FGT and BPE classification.

METHODS: Attention U-Net models were trained and validated on data from a single 3.0-T scanner. For testing, additional data from a 1.5-T scanner and data acquired at a different institution with a 3.0-T scanner were used. The developed models were applied to quantify the amount of FGT and BPE in 80 DCE-MRI examinations, and the correlation between these volumetric measures and the classes assigned by radiologists was assessed.

RESULTS: To assess model performance with application-relevant metrics, the volumes of breast, FGT, and BPE calculated from ground-truth masks were correlated with those calculated from predicted masks, yielding Pearson correlation coefficients ranging from 0.963 ± 0.004 to 0.999 ± 0.001. The Spearman correlation coefficient between the quantitative assessment and the qualitative assessment (classification by radiologists) was 0.70 (p < 0.0001) for FGT and 0.37 (p = 0.0006) for BPE.

CONCLUSIONS: Generalizable algorithms for FGT and BPE segmentation were developed and tested. Our results suggest that volumetric measures alone are sufficient for assessing FGT. For the evaluation of BPE, however, additional models that consider voxel intensity distribution and morphology are required.

CRITICAL RELEVANCE STATEMENT: A standardized assessment of FGT density can rely on volumetric measures alone, whereas for BPE the volumetric measures constitute, along with voxel intensity distribution and morphology, only one important factor.
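The evaluation described above correlates tissue volumes derived from ground-truth masks with those derived from predicted masks. A minimal sketch of that step, assuming binary 3-D masks and a known voxel spacing (the function name, array shapes, spacing values, and synthetic masks below are all illustrative, not from the paper):

```python
import numpy as np

def mask_volume_ml(mask, voxel_spacing_mm):
    """Volume of a binary segmentation mask in millilitres.

    mask: boolean/0-1 array of any 3-D shape.
    voxel_spacing_mm: per-axis voxel size (dz, dy, dx) in millimetres.
    """
    voxel_ml = np.prod(voxel_spacing_mm) / 1000.0  # mm^3 -> mL
    return float(mask.sum() * voxel_ml)

# Synthetic stand-ins for ground-truth and predicted masks of 4 examinations
rng = np.random.default_rng(0)
gt = rng.random((4, 8, 8, 8)) > 0.5
pred = gt.copy()
pred[:, 0] ^= rng.random((4, 8, 8)) > 0.9      # perturb predictions slightly

spacing = (1.0, 0.7, 0.7)                      # assumed voxel size in mm
v_gt = [mask_volume_ml(m, spacing) for m in gt]
v_pred = [mask_volume_ml(m, spacing) for m in pred]

# Pearson correlation between ground-truth and predicted volumes,
# the application-relevant metric reported in the abstract
r = np.corrcoef(v_gt, v_pred)[0, 1]
```

In practice the masks would come from the attention U-Net and the radiologist readings, and the Spearman coefficient between volumes and BI-RADS classes would be computed analogously on ranks.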
KEY POINTS: • Our work contributes to the standardization of FGT and BPE assessment. • Attention U-Net can reliably segment intricately shaped FGT and BPE structures. • The developed models were robust to domain shift.
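The attention U-Net referenced in the key points augments skip connections with additive attention gates that suppress irrelevant regions before feature concatenation. A simplified numpy sketch of one such gate (all shapes, weight matrices, and inputs below are hypothetical; a real model would learn these weights within a deep-learning framework):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def attention_gate(x, g, W_x, W_g, psi):
    """Additive attention gate over per-position feature vectors.

    x: skip-connection features, shape (n, f).
    g: gating features from the coarser decoder level, shape (n, f).
    W_x, W_g: (f, k) linear projections; psi: (k,) scoring vector.
    Returns x scaled by an attention coefficient in (0, 1) per position.
    """
    q = np.maximum(x @ W_x + g @ W_g, 0.0)   # ReLU of summed projections
    alpha = sigmoid(q @ psi)                 # attention coefficient per position
    return x * alpha[:, None]                # attended skip features

rng = np.random.default_rng(1)
x = rng.standard_normal((5, 4))              # 5 positions, 4 channels
g = rng.standard_normal((5, 4))
out = attention_gate(x, g, rng.standard_normal((4, 3)),
                     rng.standard_normal((4, 3)), rng.standard_normal(3))
```

Because the coefficients lie in (0, 1), the gate can only attenuate skip features, which is what lets the network focus on the intricately shaped FGT and BPE structures.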
Full text: 1 Collection: 01-international Database: MEDLINE Language: English Journal: Insights Imaging Year: 2023 Document type: Article Country of affiliation: Switzerland Country of publication: Germany