Ultrasound-guided regional anaesthesia (UGRA) involves the targeted deposition of local anaesthetic to inhibit the function of peripheral nerves. Ultrasound allows visualisation of nerves and the surrounding structures, guiding needle insertion to a perineural or fascial plane end point for injection. However, it is challenging to develop the skills needed to acquire and interpret optimal ultrasound images. Sound anatomical knowledge is required, and human image analysis is fallible: it is limited by heuristic behaviours and fatigue, and its subjectivity leads to varied interpretation even amongst experts. Therefore, to maximise the potential benefit of ultrasound guidance, innovation in sono-anatomical identification is required.

Artificial intelligence (AI) is rapidly infiltrating many aspects of everyday life. Advances related to medicine have been slower, in part because the regulatory approval process must thoroughly evaluate the risk-benefit ratio of new devices. One area of AI to show significant promise is computer vision (the branch of AI concerned with how computers interpret the visual world), which is particularly relevant to medical image interpretation. AI includes the subfields of machine learning and deep learning, techniques used to interpret or label images. Deep learning systems may hold potential to support ultrasound image interpretation in UGRA but must be trained and validated on data prior to clinical use.

This review of the current UGRA literature compares the success and generalisability of deep learning and non-deep learning approaches to image segmentation, and explains how computers are able to track structures such as nerves through image frames. We conclude the review with a case study from industry (ScanNav Anatomy Peripheral Nerve Block; Intelligent Ultrasound Limited), including a more detailed discussion of the AI approach used in this system and a review of the current evidence of its performance. The authors discuss how this technology may best be used to assist anaesthetists and what effects this may have on the future of learning and practice of UGRA. Finally, we discuss possible avenues for AI within UGRA and the associated implications.
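To make the idea of deep-learning image segmentation concrete, the sketch below shows a minimal U-Net-style encoder-decoder that produces a per-pixel mask for a target structure (such as a nerve) in a greyscale ultrasound frame, and applies it frame by frame to a short clip. It is an illustrative assumption, not the architecture used by ScanNav or any system reviewed in the article; the class and variable names (TinyUNet, conv_block) are hypothetical.

```python
# Illustrative sketch only: a minimal U-Net-style network for binary segmentation
# of a structure (e.g. a nerve) in a greyscale ultrasound frame. Hypothetical
# architecture; it does not describe ScanNav or any specific reviewed system.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU: the basic building block of U-Net-like models.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )


class TinyUNet(nn.Module):
    """Minimal encoder-decoder with one downsampling stage and a skip connection."""

    def __init__(self):
        super().__init__()
        self.enc = conv_block(1, 16)                  # single-channel ultrasound input
        self.pool = nn.MaxPool2d(2)
        self.mid = conv_block(16, 32)
        self.up = nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2)
        self.dec = conv_block(32, 16)                 # 32 = upsampled 16 + skip 16
        self.head = nn.Conv2d(16, 1, kernel_size=1)   # per-pixel logit

    def forward(self, x):
        e = self.enc(x)
        m = self.mid(self.pool(e))
        u = self.up(m)
        d = self.dec(torch.cat([u, e], dim=1))        # skip connection from encoder
        return self.head(d)                            # raw logits; sigmoid gives a mask


# Running the same model on consecutive frames is the simplest way a structure can
# be "followed" through a clip: each frame is segmented independently.
model = TinyUNet()
clip = torch.randn(8, 1, 128, 128)                    # 8 synthetic ultrasound frames
with torch.no_grad():
    masks = torch.sigmoid(model(clip)) > 0.5          # boolean structure mask per frame
print(masks.shape)                                     # torch.Size([8, 1, 128, 128])
```

In practice such a network would be trained on expert-labelled ultrasound images (e.g. with a Dice or cross-entropy loss), and temporal consistency across frames can be improved with dedicated tracking or recurrent components rather than purely per-frame inference.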

Original publication

DOI

10.1007/978-3-030-87779-8_6

Type

Journal article

Journal

Adv Exp Med Biol

Publication Date

2022

Volume

1356

Pages

117 - 140

Keywords

Anatomy, Artificial intelligence, Blocks, Computer vision, Convolutional neural network, Machine learning, Regional anaesthesia, Sono-anatomy, Ultrasound, Anesthesia, Conduction, Artificial Intelligence, Humans, Peripheral Nerves, Ultrasonography, Ultrasonography, Interventional