This paper presents an automatic Couinaud segmentation method based on deep learning-based key point detection. Assuming that the liver mask has already been extracted, the proposed method automatically divides the liver into eight anatomical segments according to Couinaud's definition. First, an attentive residual hourglass-based cascaded network (ARH-CNet) is proposed to identify six key bifurcation points of the hepatic vascular system. The detected points are then used to derive the planes that divide the liver into its functional units, and the caudate lobe is segmented slice by slice based on circles defined by the detected points. We comprehensively evaluate our method on a public dataset from MICCAI 2018. Experiments first demonstrate the effectiveness of our landmark detection network ARH-CNet, which outperforms two baseline methods and is robust to noisy data. The average error distance over all predicted key points is 4.68 ± 3.17 mm, and the average detection accuracy over all points is 90% within an error distance of 7 mm. We also verify that summing the corresponding heat-maps improves the accuracy of point localization. Furthermore, the overlap-based accuracy and the Dice score of our landmark-derived Couinaud segmentation are 91% and 84%, respectively, which are better than the performance of a direct segmentation approach and a traditional plane-based method; our method can therefore be regarded as a good alternative for automatic Couinaud segmentation.
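To make the described pipeline more concrete, below is a minimal sketch of two of its steps, assuming hypothetical array layouts and function names that are not taken from the paper: key points are read out as the argmax of heat-maps summed over the cascade stages, and one landmark-derived plane is used to split a binary liver mask into two half-spaces.

```python
import numpy as np

def keypoints_from_heatmaps(heatmap_stack, voxel_spacing=(1.0, 1.0, 1.0)):
    """Locate key points as the argmax of heat-maps summed across cascade stages.

    heatmap_stack: array of shape (stages, K, D, H, W), one heat-map per
    key point and per cascade stage (hypothetical layout).
    Returns an array of K physical coordinates in millimetres.
    """
    summed = heatmap_stack.sum(axis=0)          # fuse corresponding heat-maps
    coords = []
    for k in range(summed.shape[0]):
        idx = np.unravel_index(np.argmax(summed[k]), summed[k].shape)
        coords.append(np.array(idx) * np.array(voxel_spacing))
    return np.stack(coords)

def split_mask_by_plane(liver_mask, p0, p1, p2, voxel_spacing=(1.0, 1.0, 1.0)):
    """Divide a binary liver mask into the two half-spaces of the plane
    spanned by three detected key points (one illustrative dividing plane)."""
    normal = np.cross(p1 - p0, p2 - p0)          # plane normal from the landmarks
    zz, yy, xx = np.nonzero(liver_mask)
    pts = np.stack([zz, yy, xx], axis=1) * np.array(voxel_spacing)
    side = (pts - p0) @ normal > 0               # signed side of the plane
    labels = np.zeros_like(liver_mask, dtype=np.uint8)
    labels[zz, yy, xx] = np.where(side, 1, 2)
    return labels
```

In the full method, several such landmark-derived planes, together with the slice-wise circles used for the caudate lobe, are combined to obtain the eight Couinaud segments.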

Source: http://dx.doi.org/10.1016/j.compbiomed.2022.105363
