Abstract: Objective To develop and validate a deep learning model for automatic segmentation of the axillary nerve, enabling real-time automatic identification of its anatomy. Methods Axillary nerve ultrasound images from 100 patients (54 males, 46 females) were retrospectively analyzed and manually labeled with ITK-SNAP software; the resulting dataset was divided into training and test sets. A deep learning model for automatic axillary nerve segmentation was built on the U-Mamba framework. Mean Intersection over Union (mIoU), mean Dice similarity coefficient (mDice), and accuracy were used to evaluate model performance. Results A total of 831 ultrasound images were included in the dataset: 683 in the training set and 148 in the test set. On the training set, the overall mIoU was 0.980 and mDice was 0.990. On the test set, the overall mIoU was 0.672, mDice was 0.776, and segmentation accuracy was 99.3%. The median (interquartile range) IoU under 5-fold cross-validation was 0.981 (0.978, 0.983). Conclusion The U-Mamba-based deep learning model achieves good results in automatic identification of axillary nerve anatomy and has high clinical application value.
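The evaluation metrics reported above (mIoU and mDice) can be sketched for a single pair of binary masks as follows; this is a minimal illustration, and the helper name `iou_and_dice` is hypothetical, not taken from the paper's code.

```python
import numpy as np

def iou_and_dice(pred: np.ndarray, gt: np.ndarray):
    """Compute Intersection over Union and Dice coefficient for binary masks."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    total = pred.sum() + gt.sum()
    # Convention: two empty masks count as a perfect match
    iou = inter / union if union else 1.0
    dice = 2 * inter / total if total else 1.0
    return float(iou), float(dice)

# Toy example: predicted vs. ground-truth segmentation masks
pred = np.array([[1, 1, 0],
                 [0, 1, 0]])
gt   = np.array([[1, 0, 0],
                 [0, 1, 1]])
iou, dice = iou_and_dice(pred, gt)  # intersection = 2, union = 4
```

Averaging these per-image scores over the test set gives the mIoU and mDice figures quoted in the Results.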
Article information:
DOI: 10.13885/j.issn.2097-681X.2025.07.004
CLC numbers: R614; TP391.41; TP18
Citation:
[1] 程偲, 张明, 黄生辉, et al. Deep learning applied to ultrasound image recognition of the axillary nerve[J]. Journal of Lanzhou University (Medical Sciences), 2025, 51(07): 24-30. DOI: 10.13885/j.issn.2097-681X.2025.07.004.
Funding:
Natural Science Foundation of Gansu Province (22JR5RA954); Lanzhou Science and Technology Plan Project (2024-3-37)