Automated detection of gastric cancer by retrospective endoscopic image dataset using U-Net R-CNN

Atsushi Teramoto, Tomoyuki Shibata, Hyuga Yamada, Yoshiki Hirooka, Kuniaki Saito, Hiroshi Fujita

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)

Abstract

Upper gastrointestinal endoscopy is widely performed to detect early gastric cancers. As an automated method for detecting early gastric cancer in endoscopic images, an approach based on an object detection model, a deep learning technique, was previously proposed; however, reducing false positives in the detection results remained a challenge. In this study, we proposed a novel object detection model, U-Net R-CNN, based on a semantic segmentation technique that extracts target objects by performing a local analysis of the images. U-Net was introduced as the semantic segmentation method to detect candidates for early gastric cancer. These candidates were then classified as gastric cancer or false positives by box classification using a convolutional neural network. In the experiments, detection performance was evaluated via five-fold cross-validation using 1208 images of healthy subjects and 533 images of gastric cancer patients. When DenseNet169 was used as the convolutional neural network for box classification, the detection sensitivity and the number of false positives evaluated on a lesion basis were 98% and 0.01 per image, respectively, an improvement over the previous method. These results indicate that the proposed method will be useful for the automated detection of early gastric cancer in endoscopic images.
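The abstract describes a two-stage pipeline: U-Net segmentation proposes lesion candidates, and a DenseNet169 box classifier rejects false positives. The following is a minimal PyTorch sketch of that structure, not the authors' implementation: the class name `UNetRCNN`, the 0.5 thresholds, the 224-pixel crop size, and the use of connected-component analysis to turn the segmentation mask into candidate boxes are all illustrative assumptions; only the U-Net-then-box-classification design comes from the abstract.

```python
import torch
import torch.nn.functional as F
import torchvision
from scipy import ndimage


def detect_candidates(mask: torch.Tensor, threshold: float = 0.5):
    """Extract bounding boxes of connected regions from a probability mask."""
    binary = mask.squeeze().cpu().numpy() > threshold
    labeled, _ = ndimage.label(binary)
    boxes = []
    for ys, xs in ndimage.find_objects(labeled):
        boxes.append((xs.start, ys.start, xs.stop, ys.stop))  # (x1, y1, x2, y2)
    return boxes


class UNetRCNN(torch.nn.Module):
    """Two-stage detector sketch: U-Net proposes candidates,
    DenseNet169 classifies each box as cancer vs. false positive."""

    def __init__(self, unet: torch.nn.Module, crop_size: int = 224):
        super().__init__()
        self.unet = unet  # any module returning a 1-channel logit map
        self.crop_size = crop_size
        self.classifier = torchvision.models.densenet169(weights=None)
        # Replace the 1000-class head with a 2-class head (cancer / false positive).
        self.classifier.classifier = torch.nn.Linear(
            self.classifier.classifier.in_features, 2)

    @torch.no_grad()
    def forward(self, image: torch.Tensor):
        # Stage 1: semantic segmentation yields candidate regions.
        mask = torch.sigmoid(self.unet(image))
        detections = []
        for (x1, y1, x2, y2) in detect_candidates(mask):
            crop = image[..., y1:y2, x1:x2]
            crop = F.interpolate(crop, size=(self.crop_size,) * 2,
                                 mode="bilinear", align_corners=False)
            # Stage 2: box classification rejects false positives.
            prob = torch.softmax(self.classifier(crop), dim=1)[0, 1]
            if prob > 0.5:
                detections.append(((x1, y1, x2, y2), prob.item()))
        return detections


# Shape-check example with a stand-in "U-Net" (a single conv layer):
toy_unet = torch.nn.Conv2d(3, 1, kernel_size=3, padding=1)
model = UNetRCNN(toy_unet).eval()
print(model(torch.rand(1, 3, 512, 512)))
```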

Original language: English
Article number: 11275
Journal: Applied Sciences (Switzerland)
Volume: 11
Issue number: 23
DOI
Publication status: Published - 01-12-2021

All Science Journal Classification (ASJC) codes

  • General Materials Science
  • Instrumentation
  • General Engineering
  • Process Chemistry and Technology
  • Computer Science Applications
  • Fluid Flow and Transfer Processes
