MRI Segmentation Using Inception-based U-Net Architecture and Up Skip Connections

Document Type: Original Article

Authors

1 Department of Computer Engineering, Technical and Vocational University (TVU), Tehran, Iran

2 Technical and Vocational University (TVU)

3 Department of Computer Engineering, Shahriar Institute of Higher Education, Astara, Iran

DOI: 10.48301/kssa.2023.394044.2530

Abstract

Medical imaging is a non-invasive technique that has substantially advanced the diagnosis and identification of human diseases. Among medical imaging techniques, magnetic resonance imaging (MRI) is especially popular: it poses no harm to human health and can image the details of the human brain with high quality. Accurate segmentation of brain tumors in MR images is therefore very important. Traditional methods for segmenting medical images are time-consuming and require considerable expertise. Deep learning methods for brain tumor segmentation from MR images usually rely on ordinary convolution layers and consequently cannot distinguish small-scale from large-scale structures. In this research, a new deep learning method for brain tumor segmentation of MR images is presented. The proposed method generalizes the well-known U-Net architecture, with the difference that Inception modules are used in place of ordinary convolution layers. Because it applies convolution kernels of different sizes in parallel, the Inception module can extract both small-scale and large-scale features from the image. The proposed method was evaluated on the BraTS 2022 dataset and achieved a Dice similarity coefficient of 0.91, indicating improved detection accuracy. The evaluation results confirm both the stated hypothesis, that up skip connections improve the flow of information and learning, and that the Inception module significantly improves the model's evaluation metrics.
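The paper itself provides no code; the following is a minimal PyTorch sketch of the two ideas the abstract names: an Inception-style block with parallel convolution kernels replacing a U-Net's plain convolution stage, and a decoder stage whose upsampling path concatenates an encoder skip connection. All module names, channel splits, and kernel sizes here are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class InceptionBlock(nn.Module):
    """Parallel 1x1 / 3x3 / 5x5 convolutions plus a pooling branch,
    concatenated along the channel axis, so the block can capture both
    small-scale and large-scale structures in the same stage."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        # Illustrative even split; assumes out_ch is divisible by 4.
        branch_ch = out_ch // 4
        self.b1 = nn.Sequential(
            nn.Conv2d(in_ch, branch_ch, kernel_size=1),
            nn.BatchNorm2d(branch_ch), nn.ReLU(inplace=True))
        self.b3 = nn.Sequential(
            nn.Conv2d(in_ch, branch_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(branch_ch), nn.ReLU(inplace=True))
        self.b5 = nn.Sequential(
            nn.Conv2d(in_ch, branch_ch, kernel_size=5, padding=2),
            nn.BatchNorm2d(branch_ch), nn.ReLU(inplace=True))
        self.bp = nn.Sequential(
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
            nn.Conv2d(in_ch, branch_ch, kernel_size=1),
            nn.BatchNorm2d(branch_ch), nn.ReLU(inplace=True))

    def forward(self, x):
        # Every branch preserves spatial size, so outputs concatenate cleanly.
        return torch.cat([self.b1(x), self.b3(x), self.b5(x), self.bp(x)], dim=1)

class UpSkip(nn.Module):
    """Decoder stage: upsample, concatenate the encoder feature map
    (the skip connection), then refine with an Inception block."""
    def __init__(self, in_ch, skip_ch, out_ch):
        super().__init__()
        self.up = nn.ConvTranspose2d(in_ch, in_ch // 2, kernel_size=2, stride=2)
        self.block = InceptionBlock(in_ch // 2 + skip_ch, out_ch)

    def forward(self, x, skip):
        x = self.up(x)
        return self.block(torch.cat([x, skip], dim=1))

# Hypothetical shapes: bottleneck of 256 channels at 32x32,
# encoder skip of 128 channels at 64x64.
x = torch.randn(1, 256, 32, 32)
skip = torch.randn(1, 128, 64, 64)
out = UpSkip(256, 128, 128)(x, skip)  # -> (1, 128, 64, 64)
```

A full network would stack several such encoder and decoder stages; the actual depth, channel counts, and handling of the multi-modal BraTS inputs depend on the paper's configuration.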

Articles in Press, Accepted Manuscript
Available Online from 28 August 2023
  • Received: 27 April 2023
  • Revised: 22 June 2023
  • Accepted: 13 August 2023