CholecTriplet2021: A benchmark challenge for surgical action triplet recognition.
Nwoye, Chinedu Innocent; Alapatt, Deepak; Yu, Tong; Vardazaryan, Armine; Xia, Fangfang; Zhao, Zixuan; Xia, Tong; Jia, Fucang; Yang, Yuxuan; Wang, Hao; Yu, Derong; Zheng, Guoyan; Duan, Xiaotian; Getty, Neil; Sanchez-Matilla, Ricardo; Robu, Maria; Zhang, Li; Chen, Huabin; Wang, Jiacheng; Wang, Liansheng; Zhang, Bokai; Gerats, Beerend; Raviteja, Sista; Sathish, Rachana; Tao, Rong; Kondo, Satoshi; Pang, Winnie; Ren, Hongliang; Abbing, Julian Ronald; Sarhan, Mohammad Hasan; Bodenstedt, Sebastian; Bhasker, Nithya; Oliveira, Bruno; Torres, Helena R; Ling, Li; Gaida, Finn; Czempiel, Tobias; Vilaça, João L; Morais, Pedro; Fonseca, Jaime; Egging, Ruby Mae; Wijma, Inge Nicole; Qian, Chen; Bian, Guibin; Li, Zhen; Balasubramanian, Velmurugan; Sheet, Debdoot; Luengo, Imanol; Zhu, Yuanbo; Ding, Shuai.
Affiliations
  • Nwoye CI; ICube, University of Strasbourg, CNRS, France. Electronic address: nwoye@unistra.fr.
  • Alapatt D; ICube, University of Strasbourg, CNRS, France.
  • Yu T; ICube, University of Strasbourg, CNRS, France.
  • Vardazaryan A; IHU Strasbourg, France.
  • Xia F; Department of Computer Science, University of Chicago, United States.
  • Zhao Z; Department of Computer Science, University of Chicago, United States.
  • Xia T; Lab for Medical Imaging and Digital Surgery, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, China.
  • Jia F; Lab for Medical Imaging and Digital Surgery, Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, China.
  • Yang Y; School of Management, Hefei University of Technology, Hefei, China.
  • Wang H; School of Management, Hefei University of Technology, Hefei, China.
  • Yu D; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, China.
  • Zheng G; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, China.
  • Duan X; Argonne National Laboratory, 9700 S Cass Ave, Lemont, IL 60439, United States.
  • Getty N; Argonne National Laboratory, 9700 S Cass Ave, Lemont, IL 60439, United States.
  • Sanchez-Matilla R; Digital Surgery, a Medtronic Company, London, UK.
  • Robu M; Digital Surgery, a Medtronic Company, London, UK.
  • Zhang L; Institute of Automation, Chinese Academy of Sciences, China.
  • Chen H; Institute of Automation, Chinese Academy of Sciences, China.
  • Wang J; Department of Computer Science at School of Informatics, Xiamen University, Xiamen, China.
  • Wang L; Department of Computer Science at School of Informatics, Xiamen University, Xiamen, China.
  • Zhang B; Johnson & Johnson.
  • Gerats B; Meander Medical Centre, The Netherlands.
  • Raviteja S; Indian Institute of Technology Kharagpur, India.
  • Sathish R; Indian Institute of Technology Kharagpur, India.
  • Tao R; Institute of Medical Robotics, Shanghai Jiao Tong University, Shanghai, China.
  • Kondo S; Muroran Institute of Technology, Japan.
  • Pang W; Department of Biomedical Engineering, National University of Singapore, Singapore.
  • Ren H; Department of Electronic Engineering, The Chinese University of Hong Kong, Hong Kong.
  • Abbing JR; Meander Medical Centre, The Netherlands.
  • Sarhan MH; Johnson & Johnson.
  • Bodenstedt S; Department for Translational Surgical Oncology, National Center for Tumor Diseases Partner Site Dresden, Germany.
  • Bhasker N; Department for Translational Surgical Oncology, National Center for Tumor Diseases Partner Site Dresden, Germany.
  • Oliveira B; 2Ai School of Technology, IPCA, Barcelos, Portugal; Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal; Algoritmi Center, School of Engineering, University of Minho, Guimarães, Portugal.
  • Torres HR; 2Ai School of Technology, IPCA, Barcelos, Portugal; Life and Health Sciences Research Institute (ICVS), School of Medicine, University of Minho, Braga, Portugal; Algoritmi Center, School of Engineering, University of Minho, Guimarães, Portugal.
  • Ling L; School of Management, Hefei University of Technology, Hefei, China.
  • Gaida F; Technical University of Munich, Germany.
  • Czempiel T; Technical University of Munich, Germany.
  • Vilaça JL; 2Ai School of Technology, IPCA, Barcelos, Portugal.
  • Morais P; 2Ai School of Technology, IPCA, Barcelos, Portugal.
  • Fonseca J; Algoritmi Center, School of Engineering, University of Minho, Guimarães, Portugal.
  • Egging RM; Meander Medical Centre, The Netherlands.
  • Wijma IN; Meander Medical Centre, The Netherlands.
  • Qian C; Institute of Automation, Chinese Academy of Sciences, China.
  • Bian G; Institute of Automation, Chinese Academy of Sciences, China.
  • Li Z; Institute of Automation, Chinese Academy of Sciences, China.
  • Balasubramanian V; Indian Institute of Technology Kharagpur, India.
  • Sheet D; Indian Institute of Technology Kharagpur, India.
  • Luengo I; Digital Surgery, a Medtronic Company, London, UK.
  • Zhu Y; School of Management, Hefei University of Technology, Hefei, China.
  • Ding S; School of Management, Hefei University of Technology, Hefei, China.
Med Image Anal; 86: 102803, 2023 May.
Article in English | MEDLINE | ID: mdl-37004378
Context-aware decision support in the operating room can foster surgical safety and efficiency by leveraging real-time feedback from surgical workflow analysis. Most existing works recognize surgical activities at a coarse-grained level, such as phases, steps, or events, leaving out the fine-grained interaction details of the surgical activity; yet those details are needed for more helpful AI assistance in the operating room. Recognizing surgical actions as ⟨instrument, verb, target⟩ triplets delivers more comprehensive details about the activities taking place in surgical videos. This paper presents CholecTriplet2021: an endoscopic vision challenge organized at MICCAI 2021 for the recognition of surgical action triplets in laparoscopic videos. The challenge granted private access to the large-scale CholecT50 dataset, which is annotated with action triplet information. In this paper, we present the challenge setup and the assessment of the state-of-the-art deep learning methods proposed by the participants during the challenge. A total of 4 baseline methods from the challenge organizers and 19 new deep learning algorithms from the competing teams are presented to recognize surgical action triplets directly from surgical videos, achieving mean average precision (mAP) ranging from 4.2% to 38.1%. This study also analyzes the significance of the results obtained by the presented approaches, performs a thorough methodological comparison and in-depth result analysis, and proposes a novel ensemble method for enhanced recognition. Our analysis shows that surgical workflow analysis is not yet solved, and it highlights interesting directions for future research on fine-grained surgical activity recognition, which is of utmost importance for the development of AI in surgery.
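The mean average precision reported above is computed per triplet class and then averaged. As a minimal sketch of that idea, the following uses a simplified, non-interpolated AP over a ranked list of predictions; the function names are illustrative, and this is not the challenge's official evaluation code (which follows its own detection-style protocol):

```python
def average_precision(scores, labels):
    """AP for one triplet class: precision averaged over the ranks
    at which true positives appear in the score-sorted prediction list."""
    ranked = sorted(zip(scores, labels), key=lambda pair: -pair[0])
    true_positives = 0
    precisions = []
    for rank, (_, is_positive) in enumerate(ranked, start=1):
        if is_positive:
            true_positives += 1
            precisions.append(true_positives / rank)
    # A class with no positive samples contributes an AP of 0.
    return sum(precisions) / max(true_positives, 1)


def mean_average_precision(per_class_scores, per_class_labels):
    """mAP: unweighted mean of per-class APs over all triplet classes."""
    aps = [average_precision(s, l)
           for s, l in zip(per_class_scores, per_class_labels)]
    return sum(aps) / len(aps)
```

For example, a class whose two positives are ranked second and third out of three predictions gets AP = (1/2 + 2/3) / 2 ≈ 0.583, and the mAP is simply the mean of such values across the ~100 triplet classes.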
Full text: 1 | Collection: 01-international | Database: MEDLINE | Main subject: Laparoscopy / Benchmarking | Study type: Prognostic study | Limits: Humans | Language: English | Journal: Med Image Anal | Journal subject: Diagnostic Imaging | Year: 2023 | Document type: Article | Country of publication: Netherlands