MULTI-SCALE ATTENTION BASED U-NET MODEL FOR LIVER TUMOR SEGMENTATION

Authors

  • J. Jenisha, Communication Systems, Bethlahem Institute of Engineering, Anna University.
  • A. Joel Dickson, Electronics and Communication Engineering, Bethlahem Institute of Engineering, Anna University, Chennai.

Keywords:

Liver tumor segmentation, Attention mechanism, Deep learning.

Abstract

Automatically evaluating the position and size of liver tumours is essential for radiologists, diagnosis, and the clinical workflow. Many U-Net-based variants have been proposed in recent years to improve medical image segmentation, but they fail to describe the global spatial and channel relationships among lesion regions. To overcome this issue, we propose a novel network called Multi-scale Attention U-Net (MA-UNet), which incorporates a self-attention mechanism to adaptively combine local features with their global dependencies. This attention mechanism allows MA-UNet to capture complex contextual dependencies. We develop two blocks: a Position-wise Attention Block and a Multi-scale Fusion Attention Block. The Position-wise Attention Block models feature interdependencies in the spatial dimensions, representing the dependencies between pixels from a global view. The Multi-scale Fusion Attention Block captures the channel dependencies between any feature maps through multi-scale semantic feature fusion. We evaluate our method on the MICCAI 2017 LiTS Challenge dataset, where it outperforms other state-of-the-art methods. The Dice and VOE values for liver tumour segmentation are 0.749 ± 0.08 and 0.21 ± 0.06, respectively.
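The position-wise attention described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the 1×1 projection matrices (here drawn at random rather than learned), the scaling factor, and the residual connection are all assumptions, included only to show how a spatial self-attention block relates every pixel to every other pixel in a global view.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def position_attention(feat, seed=0):
    """Sketch of a position-wise (spatial) self-attention block.

    feat: feature map of shape (C, H, W).
    The query/key/value projections stand in for learned 1x1
    convolutions (an assumption; the paper's weights are trained).
    """
    C, H, W = feat.shape
    N = H * W
    x = feat.reshape(C, N)                      # flatten spatial positions
    rng = np.random.default_rng(seed)
    Wq, Wk, Wv = (rng.standard_normal((C, C)) / np.sqrt(C) for _ in range(3))
    q, k, v = Wq @ x, Wk @ x, Wv @ x            # each (C, N)
    attn = softmax(q.T @ k / np.sqrt(C))        # (N, N) pixel-to-pixel affinities
    out = v @ attn.T                            # aggregate values over all positions
    return out.reshape(C, H, W) + feat          # residual connection
```

Each output pixel is a weighted sum over all spatial positions, which is what lets the block model global spatial dependencies that plain convolutions, with their limited receptive field, cannot.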

How to Cite

J. Jenisha, & A. Joel Dickson. (2023). MULTI-SCALE ATTENTION BASED U-NET MODEL FOR LIVER TUMOR SEGMENTATION. EPRA International Journal of Research and Development (IJRD), 8(3), 31–36. Retrieved from http://eprajournals.net/index.php/IJRD/article/view/1614