Review  |  Open Access  |  30 Jul 2024

Advancements in needle visualization enhancement and localization methods in ultrasound: a literature review

Art Int Surg 2024;4:149-69.
10.20517/ais.2024.20 |  © The Author(s) 2024.

Abstract

Ultrasound guidance plays a central role in numerous minimally invasive procedures involving percutaneous needle insertion, ensuring safe and accurate needle placement. However, it encounters two primary challenges: (1) aligning the needle with the ultrasound beam and (2) visualizing the needle even when correctly aligned. In this review, we offer a concise overview of the physics underlying these challenges and explore the approaches proposed to address them, distinguishing between hardware-based and software-based solutions and placing a stronger emphasis on the latter. The incorporation of artificial intelligence into these methods to enhance needle visualization and localization is briefly discussed. We identify state-of-the-art needle detection methods, showcasing submillimeter precision in tip localization and orientation. Additionally, we provide insights into potential future directions, aiming to facilitate the translation of these advanced methods into the clinic. This article serves as a comprehensive guide to the challenges, evolving solutions, and prospective research directions in this area.

Keywords

Needle enhancement, segmentation, localization, ultrasound, biopsy, machine learning, deep learning

INTRODUCTION

Ultrasound guidance lies at the heart of many minimally invasive procedures that require percutaneous needle insertion. Applications can include minimally invasive localized anesthesia, tissue biopsy, central venous cannulation, percutaneous drainage, and therapeutic delivery[1-3]. Guidance ensures that the needle reaches its intended target without unnecessary pricking of tissue along the way, improving procedure efficacy and minimizing postoperative complications. Ultrasound is the gold standard for needle visualization vis-a-vis patient anatomy owing to its safety (it does not use ionizing radiation), real-time nature and often low cost[2,4,5].

However, ultrasound needle guidance demands expertise and is challenging, especially for less-trained users, such as in resource-constrained settings including rural and remote communities[6]. The two major challenges with ultrasound needle guidance are: (1) aligning the needle with the ultrasound beam, and (2) visualizing and localizing the needle within the ultrasound image even when properly aligned[6]. Traditionally, clinicians employ various handheld techniques to enhance needle visibility, including scanning the ultrasound probe over the patient's skin by sliding, rotating, or tilting the transducer to align the needle with the ultrasound beam. The visualization of the needle and its trajectory can also be improved by moving the entire needle in a short in-and-out or side-to-side motion, or by using hydrolocalization techniques in which a fluid or an ultrasound contrast agent is injected through the needle to create contrast on the ultrasound image[6-8]. However, movements of the entire needle could result in unintentional structural damage to tissue if the needle is not visualized, and microbubbles formed during hydrolocalization can lead to acoustic shadowing and consequently obscure the image of target structures[6].

Various approaches have been proposed to address aligning and localizing the needle within the ultrasound plane, but a review of the broad literature available is necessary to advance research on this topic. Scholten et al. provided an extensive review on needle tip visualization and localization but focused mainly on hardware-based methods[9]. In contrast, Yang et al. gave an extensive review on medical instrument detection, including needle detection, but only focused on software-based methods with a general outlook on medical instrument detection, including catheters that have different acoustic properties from needles[10]. Beigi et al. reviewed both hardware-based and software-based methods but did not discuss the underlying challenges with ultrasound needle guidance nor provide a thorough synthesis of the existing approaches[11].

Given the broad range of literature on the topic of needle visualization and localization in ultrasound[9-13], it would be helpful to have a compact review of the development of different methods, the motivation behind them, and the state-of-the-art performance. The main contributions of this review are: (1) we provide a physics foundation of the exact challenges of needle detection in ultrasound; (2) we provide a concise review of the existing methods with a focus on learning-based methods; (3) we compare and contrast the performance of such methods over time; and (4) we highlight promising directions for future research on needle detection in 2D ultrasound. This review could serve as a reference for further research in a more focused direction.

NEEDLE APPEARANCE IN ULTRASOUND

An ultrasound transducer consists of an array of piezoelectric elements that generate high-frequency sound waves, which are then transmitted into the body. For ultrasound transducers used in needle interventions, these sound waves range from 3 to 15 MHz[6]. Upon hitting tissue boundaries along the line of transmission, the sound waves are reflected as echoes, which are received by the piezoelectric elements. The time delay between transmission of the sound wave and reception of the echo is used to determine the depth of the tissue boundary from the transducer. The ultrasound device then forms a 2D image where the brightness at any point in the image is proportional to the intensity of the echo from a tissue boundary at the corresponding distance from the probe. The echo intensity depends on the acoustic impedance of the tissues at the boundary: the greater the difference in acoustic impedance between the tissues, the stronger the echo[6,14].
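As a simple illustration of this time-of-flight principle, the sketch below converts an echo's round-trip delay into reflector depth, assuming the conventional average soft-tissue sound speed of 1540 m/s; the function name and example delay are illustrative, not taken from any cited system.

```python
# Minimal sketch of pulse-echo depth estimation (illustrative values only).
SPEED_OF_SOUND_M_S = 1540.0  # conventional average speed of sound in soft tissue

def echo_depth_m(round_trip_time_s: float) -> float:
    """Depth of a reflector given the transmit-to-receive time delay.

    The pulse travels to the tissue boundary and back, so the one-way
    depth is half of the total distance travelled.
    """
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0

# Example: an echo received 65 microseconds after transmission
print(f"{echo_depth_m(65e-6) * 100:.1f} cm")  # ~5.0 cm deep
```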

A needle appears as a bright signal in the ultrasound image due to its high acoustic impedance compared to biological tissue[6]. However, needle visibility in the ultrasound image depends on the insertion technique, insertion angle, and insertion depth. Two primary techniques are commonly employed: the in-plane and out-of-plane techniques [Figure 1A][15,16]. With the in-plane technique, the ultrasound probe is aligned parallel to the needle's trajectory, allowing for continuous visualization of the entire needle length within a single plane on the ultrasound screen. This technique is particularly beneficial when precise needle placement and avoidance of adjacent structures are paramount. The in-plane approach provides a clear, longitudinal view of the needle, aiding in maintaining accuracy throughout the procedure[17]. The challenge with in-plane insertion is aligning the needle, which typically has a diameter of 1 mm, with the ultrasound beam, which is about 1 mm wide[18]. The out-of-plane technique involves positioning the ultrasound probe perpendicular to the needle's path. The advancing needle is visualized as a dot on the ultrasound display, making this technique suitable for procedures where visualizing the entire length of the needle is not critical. It is commonly applied when targeting larger structures or when a shallower angle of insertion is required[17]. Finally, needle visibility depends on the depth of the insertion, with deeper insertions having poorer needle visibility due to attenuation of the ultrasound beam [Figure 1B].


Figure 1. Needle visibility depends on the insertion technique used and the needle insertion angle. (A) For in-plane insertion (top), both the shaft and tip are visible; for out-of-plane insertion (bottom), only the tip is visible[15]; (B) For shallow insertion angles (first column), specular reflection is high and the needle is visible; for steeper insertion angles (second and third columns), needle visibility is lost.

LITERATURE SEARCH METHODOLOGY

We performed a thorough search of the literature using PubMed, Scopus, IEEE Xplore, Google Scholar, and Semantic Scholar databases. We searched for articles on needle visualization enhancement and localization in ultrasound. We used the keywords: needle, enhancement, detection, localization, visualization, and ultrasound, with the search query “ultrasound AND needle AND (detection OR localization OR visualization)”. We reviewed the titles and abstracts of the search hits to ensure they were proposing methods for enhancing needle detection in ultrasound with the major focus placed on software-based approaches. Various commercial products have been developed for needle enhancement and localization in ultrasound, but these are out of the scope of this review.

REVIEW RESULTS

We categorized all the literature based on the challenges being solved and whether they are hardware-based or software-based [Figure 2]. In this section, we briefly describe the different approaches, providing their motivation, strengths, limitations, and how they have evolved.


Figure 2. Existing literature on improving needle visibility in ultrasound can be categorized into two: those improving needle alignment with the ultrasound beam (top row), and those improving needle visualization and localization (bottom row). Each of these two categories can be further classified into hardware-based methods (left column) and software-based methods (right column). Note that there are currently no software-based approaches for improving needle alignment.

Needle alignment

A main challenge with ultrasound needle-guided procedures is aligning the needle with the ultrasound beam. Most solutions are hardware-based methods and can be broken down into two categories: (1) guided insertion of the needle in the ultrasound plane; and (2) increasing the field of view of the ultrasound probe.

Hardware-based methods

Guided insertion
The simplest guided insertion solution is to use needle guides that attach to the transducer to keep the needle within the ultrasound plane[19], or to attach a laser to the ultrasound probe which projects a line on the skin to indicate the ultrasound plane[20]. In central venous catheterization, needle guides have been shown to increase the success rate under ultrasound[21]. However, the needle can still deviate from the ultrasound plane once inside the skin. In addition, a fixed system limits the degrees of freedom of the needle[9]. Robotic systems have been used clinically to perform ultrasound-guided needle insertion either autonomously or semi-autonomously under the control of the surgeon[22]. The first semi-autonomous robotic ultrasound-guided nerve block in humans was conducted in 2013 with a 100% success rate and a decrease in procedure time[23]. Autonomous systems have been utilized in phantom models, successfully maneuvering the ultrasound probe and needle simultaneously using Kalman filter and Random Sample Consensus (RANSAC) algorithms with 1 mm accuracy[24]. Recent handheld robotic devices can guide non-expert ultrasound users in probe localization and needle insertion, enabling them to achieve procedure times and success rates similar to those of experts[25]. However, the cost of implementing robotic technology is high, and autonomous systems are still at the proof-of-concept phase.

Probe choice
Another approach to align the needle with the ultrasound beam is to maximize the field of view for a given procedure, for example by using 3D imaging. A 3D image can be constructed by sweeping a 2D transducer over a region of interest to capture numerous images. These 2D images are then processed with segmentation algorithms to reconstruct the 3D view. 3D rendered images have been useful in analyzing anatomical structures and guiding catheters for spreading local anesthetics after peripheral nerve blocks in patients[26]. However, 3D imaging and needle tracking in real time are not simultaneously possible using a 2D transducer. In contrast, an electronic beam steering matrix array transducer can emit and receive sound waves in 3D, allowing for real-time or 4D imaging. In a mid-line epidural lumbar spine needle placement study, the distance between the puncture site identified by the 3D probe and standard palpation was within 3 mm[27]. Therefore, 3D transducers show promise in aligning the needle within the ultrasound image. However, their resolution and refresh rate are inferior to those of 2D transducers, with lower refresh rates causing visible pauses in the image feed[28].

Needle visualization and localization

Even when the needle is aligned with the ultrasound beam, it can often be challenging to visualize it and then localize it. This section briefly highlights the various approaches that have been proposed to enhance needle visualization and localization.

Hardware-based methods

Needle tip visualization and localization under ultrasound procedures is critical for needle navigation. Existing solutions can be clustered into two categories: (1) needle modification; and (2) needle tracking.

Needle modification
The first example of needle texture modification is the echogenic needle, designed to enhance visibility under ultrasound imaging. This is done by creating polymeric coatings or rough etches on the surface of the needle to create special reflective properties that enable it to produce strong echoes when struck by ultrasound waves[29,30]. Echogenic needles are most commonly used for biopsies, injection aspirations, and nerve blocks[30,31]. In biopsy procedures, the visibility benefit of echogenic needles is greatest for finer gauge needles and for difficult procedures where the needle angle relative to the transducer is suboptimal[29]. However, echogenic needles tend to be more expensive than conventional needles due to the specialized materials and manufacturing process. Furthermore, the entrapment of micro air bubbles in surface-modified needles can create acoustic shadowing and artifacts[29,32].

Another needle modification approach is to integrate a piezoelectric buzzer and mechanically vibrate the needle tip. Needle oscillations have been detected with Doppler imaging in a cadaveric study[33] and, more recently, in a clinical feasibility trial[34]. Tracking needle movement within a structure can be done with duplex ultrasound, which overlays brightness-mode (B-mode) with Doppler mode. Briefly, B-mode imaging converts the magnitude of the reflected sound waves to shapes that can be displayed in 2D, while Doppler mode analyzes the shift in sound frequency between the moving needle and the stationary transducer to determine relative motion[35,36]. In an interstitial prostate brachytherapy study, the Doppler signal produced at the needle tip was imaged within 1 mm accuracy but was dependent on tissue stiffness and composition[34]. Therefore, there remains a need for a better system for needle tip localization that does not involve the excess costs associated with modifying all new needles.

Needle tracking
Needle tip tracking can be accomplished optically by tracking markers on the needle shaft. Briefly, an optical camera is affixed to the ultrasound transducer and tracks the markers as the needle penetrates the skin and deeper tissue. Simultaneously, a tracking algorithm predicts the trajectory of the needle tip by calculating the relative position of the needle to the transducer[37,38]. In a pediatric central venous catheter placement feasibility study, two stereo cameras fixed to the transducer predicted the catheter trajectory and led to a 90% successful central vein cannulation on the first pass. However, no statistical results on whether this method improved catheter placement were assessed in the study[39]. The main drawback of an optical tracking approach is that the system assumes that the needle does not bend during surgery. Acoustic sensors, such as fiber-optic piezoelectric hydrophones, have been developed to help physicians localize the needle within the ultrasound image[40-42]. These sensors are integrated into an intraoperative needle to measure the time difference between sound emission by the ultrasound transducer and reception at the needle tip[40]. In bovine phantoms, the accuracy of fiber optic hydrophones was found to be within 1 mm between tracked and labeled positions, and tracking performance was independent of needle insertion angle[41]. In a porcine phantom, anesthesiologists found that using acoustic sensors reduced the procedure time and the number of hand movements in out-of-plane peripheral nerve block procedures[42]. A similar approach to hydrophones is to use a photoacoustic emitter at the needle tip. A pulsed laser is transmitted through an optical fiber onto a photoacoustic material which transforms light into acoustic sound waves at the needle tip. The produced sound waves are then detected by the ultrasound transducer and the needle tip can be localized in real time[43]. In a phantom simulation study, anesthesiologists, residents, and medical students watched videos of needle tip placements: two successful and one failed, with and without the photoacoustic emissions. Failure of needle placement was identified with 100% accuracy across all participants for both in-plane and out-of-plane approaches[44]. However, the efficacy of photoacoustics in clinical practice remains to be determined.

Another approach that can help localize the needle in an ultrasound image is to use a filament sensor embedded in the needle. Briefly, the sensor induces a small current which can be detected by an electromagnetic measuring device[45,46]. The needle tip position is then triangulated and projected onto the ultrasound image. In a clinical study, electromagnetic needle tracking was shown to reduce the number of needle reinsertions and shorten block performance time in out-of-plane approaches[47]. Moreover, in percutaneous liver biopsies, the spatial accuracy of the needle tip displayed electronically was within 2 mm of the real position seen on the ultrasound image[48]. However, the magnetic interference caused by surrounding magnetic tools and the required proximity of the field generator to the procedure are limitations of such a system. Although prototypes have been developed to integrate magnetic sensors into the transducer to reduce the additional equipment needed[49,50], such solutions still require magnetizing the needles instead of using standard operative needles.

Software-based methods

Software-based methods for needle visualization and localization can be categorized into three: (1) image acquisition methods; (2) classical image processing methods; and (3) learning-based methods. In this section, we briefly describe these methods.

Image acquisition
Currently, the main image acquisition method used to improve needle visualization is beam steering, in which the ultrasound beam is steered to be perpendicular to the needle to maximize specular reflection[6]. The steered images are then spatially compounded to form a single image in which the needle is more conspicuous. Cheung and Rohling developed an algorithm that automatically adjusts the steering angle in linear 2D ultrasound arrays[51]. The challenge with beam steering is that it does not localize the needle and also has a limited steering angle that makes needle enhancement challenging for steeper and deeper insertions[52].
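As a minimal sketch of the compounding step, the snippet below simply averages a set of beam-steered frames, assuming they have already been scan-converted onto a common pixel grid; commercial implementations additionally register and weight the frames, which is omitted here.

```python
import numpy as np

def compound_steered_frames(frames: list[np.ndarray]) -> np.ndarray:
    """Average co-registered, beam-steered B-mode frames into one image.

    A specular reflector such as a needle appears bright in at least one of
    the steered frames, so the compounded image is more conspicuous than any
    single unsteered frame. Assumes all frames share the same pixel grid.
    """
    stack = np.stack(frames, axis=0).astype(np.float32)
    return stack.mean(axis=0)
```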

Classical image processing methods
The challenges faced by image acquisition methods inspired exploration into image processing methods that are independent of beam steering. Classical image processing methods follow a workflow involving: image preprocessing, feature extraction, needle detection from the features, and postprocessing to localize the needle [Figure 3]. In the preprocessing step, the input ultrasound images are transformed using a preprocessor P with corresponding parameters s. Early preprocessing approaches involved modeling ultrasound signal transmission to estimate signal loss due to attenuation[53]. However, this approach only works for in-plane insertion. This challenge can be overcome using digital subtraction of consecutive frames[54] or optical flow methods[55] to capture subtle motion changes even when the needle is imperceptible. Preprocessing for 3D ultrasound volumes could involve transforming the volume into appropriate views to normalize deformed object representations inherent in 3D ultrasound transducers[56].
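The digital-subtraction idea can be sketched in a few lines: subtracting the previous B-mode frame from the current one suppresses static background and highlights the moving needle. The threshold below is an arbitrary illustrative choice, not a value from the cited works.

```python
import numpy as np

def frame_difference(current: np.ndarray, previous: np.ndarray,
                     threshold: float = 10.0) -> np.ndarray:
    """Highlight moving structures (e.g., an advancing needle) by
    subtracting the previous B-mode frame from the current one.

    Intensity changes below `threshold` are zeroed out, which removes
    most of the static background speckle.
    """
    diff = np.abs(current.astype(np.float32) - previous.astype(np.float32))
    diff[diff < threshold] = 0.0
    return diff
```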


Figure 3. Software-based needle localization methods can be categorized into three: (A) classical image processing methods that use a handcrafted feature extractor and decoder; (B) machine learning-based methods with a trainable decoder; (C and D) and deep learning-based methods with a trainable feature extractor and decoder. Generally, software-based methods take as input either single ultrasound images or a video stream of ultrasound images.

After preprocessing, features are extracted using a handcrafted feature extractor E parameterized in λ1. Classical feature extractors can be categorized into two: intensity-based and phase-based feature extractors. Intensity-based feature extractors use edge-detection methods to localize the needle; for instance, Ayvali and Desai used a circular Hough Transform to directly localize the tip of a hollow needle in 2D ultrasound[57]. The challenge with intensity-based features is that they depend on the visibility of the needle and assume the needle to be the brightest object within the image[57]. Such features quickly fail in the presence of other high-intensity artifacts such as tissue close to bone. This led to the adoption of scale- and rotation-invariant feature extractors such as Gabor filters, log-Gabor filters, and Histogram of Oriented Gradients (HOG)[56,58-60].
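As an illustrative sketch of orientation-sensitive feature extraction, the snippet below stacks Gabor magnitude responses over several orientations using scikit-image; the filter frequency and number of orientations are assumptions chosen for illustration rather than values from the cited papers.

```python
import numpy as np
from skimage.filters import gabor

def gabor_orientation_features(image: np.ndarray,
                               frequency: float = 0.15,
                               n_orientations: int = 8) -> np.ndarray:
    """Stack Gabor magnitude responses over several orientations.

    A near-linear structure such as a needle shaft responds strongly at the
    orientation matching its insertion angle, so the per-pixel responses can
    serve as simple needle-likeness features.
    """
    responses = []
    for theta in np.linspace(0.0, np.pi, n_orientations, endpoint=False):
        real, imag = gabor(image.astype(np.float32), frequency=frequency, theta=theta)
        responses.append(np.hypot(real, imag))  # magnitude of the complex response
    return np.stack(responses, axis=-1)  # shape: (H, W, n_orientations)
```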

Based on the extracted features, the input image is segmented by the handcrafted decoder D parameterized in λ2 to obtain a binary map indicating which pixels are likely to be the needle. To localize the needle in the segmentation map, postprocessing approaches such as the RANSAC[61], Kalman filter[61], Hough Transform[52-57], and Radon Transform[53] are used to fit a line through the segmentation map to obtain the needle trajectory. The needle tip is then determined as the pixel with the highest intensity at the distal end of the estimated trajectory. However, this naive approach is not robust to high-intensity artifacts along the estimated trajectory, and methods such as Maximum Likelihood Estimation Sample Consensus (MLESAC) can be used to determine the most likely pixel corresponding to the needle tip[62].
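The sketch below illustrates this postprocessing stage with a basic RANSAC loop that fits a straight line to the needle pixels of a binary segmentation map and then takes the most distal inlier as the tip. It assumes a straight needle and a known insertion side (so that "distal" is well defined); the iteration count and inlier tolerance are illustrative.

```python
import numpy as np

def ransac_needle_axis(mask: np.ndarray, n_iters: int = 200,
                       inlier_tol: float = 2.0, seed: int = 0):
    """Fit a line to needle pixels with a basic RANSAC loop.

    Returns (point_on_line, unit_direction, tip_yx). The tip is taken as the
    inlier pixel farthest along the fitted direction, which assumes the
    direction has been oriented from the insertion side toward the tip.
    """
    ys, xs = np.nonzero(mask)
    pts = np.stack([ys, xs], axis=1).astype(np.float32)
    rng = np.random.default_rng(seed)
    best_inliers, best_p, best_d = None, None, None
    for _ in range(n_iters):
        p1, p2 = pts[rng.choice(len(pts), size=2, replace=False)]
        d = p2 - p1
        if np.linalg.norm(d) < 1e-6:
            continue
        d = d / np.linalg.norm(d)
        # Perpendicular distance of every candidate pixel to the sampled line
        dist = np.abs((pts[:, 0] - p1[0]) * d[1] - (pts[:, 1] - p1[1]) * d[0])
        inliers = dist < inlier_tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_p, best_d = inliers, p1, d
    inlier_pts = pts[best_inliers]
    proj = (inlier_pts - best_p) @ best_d   # position along the needle axis
    tip = inlier_pts[np.argmax(proj)]       # most distal inlier = tip estimate
    return best_p, best_d, tip
```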

Generally, image processing methods can take as input either a single ultrasound image[53,63], two consecutive images[54,61,64] or a video stream of ultrasound images[52,57,59]. Algorithms that rely only on a single image to make a prediction are generally more suitable for needle tip verification, such as in needle ablation procedures. Nevertheless, they can be successively applied on a stream of ultrasound images to facilitate needle guidance toward a target in real time[65]. However, algorithms that rely on a stream of images before making a prediction are more suited for needle guidance as they benefit from the temporal information encoded within the image stream. These algorithms have the potential to model needle motion during insertion, making them good candidates for needle guidance[55,66].

Classical image processing methods face two major challenges: (1) they require carefully engineered feature extractors E with arbitrarily or heuristically selected parameters λ1 which are generally not robust to intensity, scale and orientation changes, or are computationally expensive[63]; (2) designing a decoding algorithm for high dimensional features is not tractable. These challenges led to the adoption of methods to automatically learn features necessary for needle detection and also learn robust classifiers for high-dimensional features.

Learning-based methods
Learning-based methods can be broadly categorized into two: (1) machine learning-based methods in which only the parameters λ2 of the decoder D are learned, and (2) deep learning-based methods in which the parameters λ1 and λ2 of both the feature extractor E and decoder D, respectively, are learned [Figure 3B and C]. Initial attempts at learning a classifier involved combining various threshold responses of the image to the feature extractors using the Adaboost statistical algorithm[67]. The intuition behind this approach is that, independently, the threshold responses are weak classifiers, but when combined, a strong classifier can be obtained. However, this approach is wasteful as it only utilizes a subset of all the extracted features. Other common classification algorithms used include the Bayesian classifier[68], linear discriminant analysis (LDA)[56], support vector machine (SVM)[60], and its variant linear support vector machine (LSVM)[56].

The most commonly used decoder in machine learning methods is the SVM algorithm, which can learn to classify high-dimensional features from data. For needle detection, the SVM is fed with all the extracted features and it outputs a binary segmentation mask where white pixels represent the needle and black pixels the background[55,60,69]. The challenge with SVMs is that they do not directly output probability estimates of their predictions which may be required for uncertainty estimation and postprocessing. While machine learning approaches solve the decoder issues faced in classical image processing methods, they still rely on handcrafted features that exhibit the challenges mentioned in Section “Classical image processing methods”.
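A minimal sketch of this pipeline is shown below, using scikit-learn's SVC to classify per-pixel feature vectors into needle versus background; the feature dimensionality and the randomly generated training data are placeholders, not data from the reviewed studies.

```python
import numpy as np
from sklearn.svm import SVC

# Placeholder training set: one row of handcrafted features per pixel
# (e.g., Gabor magnitudes, intensity, gradients); 1 = needle, 0 = background.
rng = np.random.default_rng(0)
X_train = rng.random((5000, 16))
y_train = (rng.random(5000) > 0.9).astype(int)

# Needle pixels are rare, so re-weight the classes during training.
clf = SVC(kernel="rbf", class_weight="balanced").fit(X_train, y_train)

def segment(features_hwc: np.ndarray) -> np.ndarray:
    """Classify every pixel's feature vector and reshape into a binary mask."""
    h, w, c = features_hwc.shape
    labels = clf.predict(features_hwc.reshape(-1, c))
    return labels.reshape(h, w).astype(np.uint8)
```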

Deep learning-based methods, on the other hand, aim to leverage data to learn both the feature extractor E and the decoder D. Early approaches used multi-layer perceptrons (MLPs) to learn relevant needle features and classify them accordingly[70]. In the approach proposed by Geraldes and Rocha, the MLP took as input a region of interest (ROI) selected from the input ultrasound image and output a probability estimate of each pixel in the ROI being a needle, and a threshold was applied to the output to localize the needle[70]. This approach, however, yielded tip localization errors greater than 5 mm.

Most deep learning methods use convolutional neural networks (CNNs) with multiple layers, whereby the first layers learn local features from the image and the deeper layers combine the local features to learn more global features. CNN-based approaches can be categorized into four: (1) classification; (2) regression; (3) segmentation; and (4) object detection. Using CNNs for classification is common in methods working with 3D ultrasound. For instance, in the approach by Pourtaherian et al., a CNN is used to classify voxels extracted from 3D ultrasound volumes as either needle or background yielding a 3D voxel-wise segmentation map of the needle[71,72]. A cylindrical model is then fitted to this map using RANSAC to estimate the needle axis which is used to determine the 2D plane containing the entire needle. Another approach would be to classify each scan plane in the 3D volume as either containing a needle or not, and then similarly combine and visualize the 2D plane that contains the entire needle[54]. While these approaches enhance the visualization of the needle in 3D ultrasound, they do not localize the needle tip.

With regression, the features extracted by the CNN are used to directly regress the needle tip coordinates (x, y)[65], or their proxy by regressing four values representing the two opposite vertices of a tight bounding box centered around the needle tip[66] [Figure 3D]. These approaches are suitable for needle localization in both in-plane and out-of-plane insertion as they do not heavily rely on shaft information. The only downside of these approaches is that they do not enhance visualization of the entire needle during in-plane insertions.

For segmentation, the high-level features extracted by the CNN are used in reverse to generate a probability map with pixel-wise probabilities of the existence of a needle[72-78]. This probability map can then be post-processed, usually by thresholding, to generate a binary segmentation map. CNNs with segmentation are the most commonly used deep learning approach for needle detection because they can detect the entire needle, including the shaft, while producing probabilities for their outputs, which leaves room for a variety of postprocessing approaches [Figure 3C]. A special case of segmentation can be found in high dose rate (HDR) prostate brachytherapy applications where multiple needles are segmented simultaneously[79-82]. In these applications, transverse 2D slices obtained from the 3D ultrasound volume are passed as input to the CNN trained to output the corresponding multi-needle segmentations for each slice. Unlike shaft segmentations for in-plane needle insertion, segmentations in HDR prostate brachytherapy slices are circular and centered around each needle in a given slice. These segmentations are then combined and the centers of the circles are considered to be the needle shaft with the most distal bright intensity considered as the needle tip.
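The sketch below illustrates the general encoder-decoder pattern behind these segmentation networks with a deliberately tiny PyTorch model that outputs a per-pixel needle probability map, which is then thresholded into a binary mask. It is a stand-in for the U-Net-style architectures used in the cited works, not a reproduction of any of them.

```python
import torch
import torch.nn as nn

class TinyNeedleSegNet(nn.Module):
    """Minimal encoder-decoder producing a per-pixel needle probability map."""

    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # downsample to gather context
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),                  # per-pixel "needle" logit
        )

    def forward(self, x):
        return torch.sigmoid(self.decoder(self.encoder(x)))

# Usage: threshold the probability map to obtain a binary segmentation.
model = TinyNeedleSegNet()
b_mode = torch.rand(1, 1, 256, 256)   # placeholder B-mode frame
needle_mask = (model(b_mode) > 0.5).float()
```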

Object detection methods are similar to segmentation methods but output bounding boxes encasing the detected needle. For instance, Mwikirize et al. used a CNN to automatically generate potential bounding box regions containing the needle and fed them to a region-based CNN (R-CNN) to classify which regions contained the needle[83]. On the other hand, Wang et al. used the Yolox-nano detector that outputs bounding box predictions for each pixel. These predictions are then combined using non-max suppression to obtain a single bounding box indicating the predicted needle[84]. Rubin et al. combined a 3D CNN, to extract temporal features from an ultrasound video stream, with a 2D Yolov3-tiny object detector to efficiently detect the needle[85]. Object detection can also be performed directly for the needle tip instead of the entire needle[86].
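As an illustration of the suppression step used to merge dense box predictions, the snippet below applies torchvision's non-maximum suppression and keeps the highest-scoring surviving box; the box format and threshold are assumptions, and the snippet is independent of any particular detector.

```python
import torch
from torchvision.ops import nms

def select_needle_box(boxes_xyxy: torch.Tensor, scores: torch.Tensor,
                      iou_threshold: float = 0.5) -> torch.Tensor:
    """Reduce many overlapping box predictions to a single needle box.

    `boxes_xyxy` has shape (N, 4) in (x1, y1, x2, y2) format and `scores`
    holds the corresponding confidences. NMS discards boxes that overlap a
    higher-scoring box; the first surviving index is the top-scoring box.
    """
    keep = nms(boxes_xyxy, scores, iou_threshold)
    return boxes_xyxy[keep[0]]
```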

DISCUSSION

Various methods and approaches have been developed to enhance needle alignment, visualization, and localization in ultrasound. Currently, all methods aimed at improving needle alignment are hardware-based [Figure 2], and require additional hardware which increases the cost of the ultrasound systems and also disrupts the normal workflow. The same applies to the hardware-based needle visualization methods. This can be a big challenge, especially in resource-constrained communities such as rural and remote settings, and low- and middle-income countries that cannot afford additional hardware.

On the other hand, software-based methods do not require additional hardware and could be a potential alternative in such scenarios. Classical image-based methods for needle visualization and localization rely heavily on carefully engineered feature extractors and classifiers which are often not robust to various image acquisition settings and image quality. Learning-based methods address this challenge by automatically learning the feature extractor and/or classifier from existing data. Deep learning-based methods exhibit superior performance compared to classical methods, and thus, this discussion will mainly focus on the most recent deep learning-based methods for needle visualization and localization, summarized in Table 1 and detailed in Table 2.

Table 1

A summary of the reviewed deep learning-based methods in descending order by date of publication, and their corresponding results

Author | Mode | Dataset | Target organ | Tip error (mm) | Orientation error (degrees) | Processing time (fps)
Che et al.[87] | 2D | Phantom (bovine, carrageenan) | Liver | 1.17 | – | –
Wang et al.[84] | 2D+t | Ex vivo (porcine) | Liver, heart, kidney, abdomen | 0.92 | – | 10
Zade et al.[88] | 2D+t | Phantom (lumbar epidural simulator, femoral Vascular Access Ezono) | Epidural space, vascular structures | 2.12 | 2.08 | 14
Chen et al.[78] | 2D | Ex vivo (bovine, porcine) | – | 0.45 | 0.42 | 14
Wijata et al.[77] | 2D | In vivo (core needle biopsies) | – | – | – | 46
Mwikirize et al.[66] | 2D+t | Ex vivo (bovine, porcine, chicken) | Epidural space | 0.52 | – | 16
Gao et al.[76] | 2D | Ex vivo (bovine, porcine, chicken) | – | 0.29 | 0.29 | 67
Zhang et al.[79] | 2D | In vivo (HDR prostate brachytherapy) | Prostate | 0.33 | – | –
Andersén et al.[81] | 2D | In vivo (HDR prostate brachytherapy) | Prostate | 1 | – | –
Gillies et al.[89] | 2D | Phantom (tissue); In vivo (prostate, gynecologic, liver, kidney) | Tissue, prostate, gynecologic, liver, kidney | 4.4 | 1.4 | –
Zhang et al.[79] | 3D | In vivo (HDR prostate brachytherapy) | Prostate | 0.44 | – | –
Zhang et al.[75] | 3D | In vivo (HDR prostate brachytherapy) | Prostate | 0.44 | – | 2
Lee et al.[74] | 2D | In vivo (biopsy) | Kidney | 13.3 | – | –
Mwikirize et al.[65] | 2D+t | Ex vivo (bovine, porcine, chicken) | Epidural space | 1 | – | –
Pourtaherian et al.[90] | 3D | Ex vivo (porcine leg) | – | – | – | –
Arif et al.[73] | 3D+t | Phantom (abdominal, liver); In vivo (liver biopsy) | Liver, abdomen | – | – | –
Pourtaherian[72] | 3D | Ex vivo (chicken breast, porcine leg) | – | 0.7 | – | –
Mwikirize et al.[83] | 2D | Ex vivo (porcine, bovine) | Epidural space | 0.23 | 0.82 | –
Mwikirize et al.[60] | 3D | Ex vivo (bovine, porcine) | Epidural space | 0.44 | – | –
Pourtaherian et al.[71] | 3D | Ex vivo (chicken breast) | – | 0.52 | – | –
Beigi et al.[69] | 2D+t | In vivo (pig) | Biceps femoris muscle | 0.82 | 1.68 | –
Beigi et al.[55] | 2D+t | In vivo (pig) | Biceps femoris muscle | 1.69 | 2.12 | –
Geraldes and Rocha[70] | 2D | Phantom (ballistic gelatin) | – | 5 | – | –
Table 2

A summary of the reviewed deep learning-based methods, listed in descending order of publication date

Date | Author | Mode | Approach | Strengths | Limitations
2024-01-12 | Che et al.[87] | 2D
- Calibrate ultrasound probe and needle to optical tracker
- Capture needle tip using tracker
- Represent tip in the ultrasound image coordinate system
- Combines both hardware and software-based approaches
- Requires extra hardware: optical tracker and localizers
- Calibration may be erratic if the tip strays from the imaging plane
- Involves a two-person workflow and separate algorithms for needle tracking and detection
2023-05-17 | Yan et al.[91] | 2D+t
Consists of 3 modules
(1) Visual tracking module
- Pass sampled patches from consecutive frames through the 2-branch Siamese network
- One branch dynamically extracts features from template patches and the other extracts features from current patch
(2) Motion prediction module
- Needle velocity is estimated using displacement of the tip between consecutive frames
(3) Data fusion
- Use MLP to independently fuse data from visual tracking module and motion prediction module
- Incorporates temporal information
- Thoroughly evaluated for multiple users and insertion motions
- Fails if there is sustained disappearance of the needle tip
- Tracking fails when the needle tip is too small and indistinguishable from background right at insertion
2023-05-09 | Wang et al.[84] | 2D+t
(1) Enhanced ultrasound image by subtracting current frame from previous frame and fusing the difference with the current frame
(2) Extract spatial constraint (bounding box around shaft) if present
(3) Extract temporal constraint (bounding box around tip) if present
(4) Combine the constraints to localize needle tip
- Accounts for motion
- Continuous needle detection even when static between frames
- Real time
- Evaluated on human data
- Evaluated on CPU
- Independent of detector
- Only in-plane insertion
- Detects only the needle tip; does not visualize the shaft
- Claims to be detector-independent, but requires detectors that output bounding boxes
2023-02-06 | Zade et al.[88] | 2D+t
- Extract motion field's amplitude and phase from successive frames
- Extract spatiotemporal features from the motion fields
- Pass features through a detector that outputs a vector of values including pixelwise line parameters (t_x, t_y, θ) and probabilities for the needle shaft and tip
- Evaluated approach on 2 categories of images (needle aligned correctly, and needle imperceptible)
- No ablation study to show relevance of the different components of the proposed assisted excitation module
- No comparison with SOTA to show how they fail to model speckle dynamics
2021-10-22 | Chen et al.[78] | 2D
- 2 frameworks
(1) Predict segmentation from two adjacent images
(2) 2 models - one segments shaft, the other segments needle tip from ROIs extracted from the predictions of (1)
- Use the segmentations to localize the needle
- Segments shaft, localizes needle
- Doesn’t require prior knowledge of insertion side/orientation
- Fully automatic
- Does not justify how approach accounts for time (inputs are two consecutive images but with no time information encoded)
- Did not compare with state of the art methods
2021-06-28 | Wijata et al.[77] | 2D
- Pass image through a U-Net-like architecture
- Architecture uses large kernels (11 × 11, 9 × 9, 7 × 7) to detect large objects
- Apply Radon transform to determine needle trajectory
- Back-transform the trajectory to a binary image
- Evaluated on in vivo data
- Not robust to high intensity artifacts
2021-05-11 | Rubin et al.[85] | 3D
- Pass k previous frames to 3D CNN (including current frame)
- Repeatedly apply 3D convolution to obtain a single feature map for temporal information
- Pass the feature map through a 2D detector
- 2D detector outputs bounding box of needle in current frame
- Efficient (real-time, can run on low-cost computing hardware)
- Evaluated on challenging cases
- Only detects needle with a bounding box and does not provide metrics on needle tip localization
2021-04-11 | Mwikirize et al.[66] | 2D+t
- Enhance tip in 5 consecutive images
- Pass images through CNN+LSTM network to predict needle tip location in reference frame
- Does not rely on needle shaft visibility, hence, works for both in plane and out of plane detection
- Models temporal dynamics associated with needle tip motion
- More accurate
- Does not require a priori information about needle trajectory
- Performs well in presence of high intensity artifacts
- Not sensitive to type and size of needle used
- Straight needles
- Does not localize the shaft
- Limited evaluations performed for needle sizes, motion and high intensity artifacts, insertion angles, tissue types, domain invariance across imaging systems
2021-03-31 | Gao et al.[76] | 2D
- Use a U-Net architecture to segment needle, classify frame, and predict needle tip in one shot
- Use the outputs to precisely locate needle tip and enhance needle shaft and tip visualization
- Real-time
- Localization and segmentation in one shot
- Approach extensively evaluated using various metrics
- Requires beam steering
- Needle after beam steering is visible enough (questions need for algorithm)
- Has a classification network that is not necessary
- Can only work for in-plane needles
2020-10-07 | Zhang et al.[79] | 2D
- Pass transverse TRUS image through CNN to extract features
- Pass features through region proposal network to obtain potential regions containing the needle tip
- Classify proposals, pixels in the proposals, and also generate bounding box coordinates from the proposal
- Use DBSCAN to model the needles, and localize the shaft and needle tips
- Works for multiple needles
- Works for 3D and 2D needle localization
- Inference time evaluated on GPU which may not reflect performance on low-compute devices
- Only in-plane needle insertion
2020-10-04 | Andersén et al.[81] | 3D
- Pass entire 3D ultrasound volume through 3D CNN (U-Net)
- Combine predictions to obtain a 3D localization of each needle
- Works for multiple needles
- Doesn’t require any preprocessing
- Does not localize the needle tip
- Only applicable to multiple needles and 3D ultrasound
2020-08-06 | Gillies et al.[89] | 2D
- Pass 2D image through U-Net architecture to get segmentation map
- Take the largest island as the needle
- Use RANSAC to estimate trajectory
- Tip is the deepest point along the estimated trajectory
- Evaluated on various target organs
- Localization requires knowledge of needle entry direction
- Not optimized for real-time performance
2020-03-16 | Zhang et al.[92] | 3D
- Obtain ultrasound volumes and their corresponding CT images
- Threshold the CT images to highlight voxels containing the needles
- Train a model to extract features from the ultrasound images using the preprocessed CT images as the weak labels
- Use the trained model on new ultrasound images to extract needle segmentations
- Apply RANSAC to model the needle from the segmentations
- Doesn't need manual labels as CT images are used as weak labels
- Requires CT image to train
- Limited to brachytherapy application
2020-03-10 | Zhang et al.[75] | 3D
- Extract 3D patch from transverse view of the 3D ultrasound volume
- Pass patch through Unet model with attention gates to obtain circular pixelwise segmentations
- Merge segmented patches to show the needle localization in the 3D volume
- Centers of detected circular segmentations correspond to the needle shaft
- Most distal bright intensity corresponds to the needle tip
- Can detect multiple needles at once
- Achieves both segmentation and localization of the needles simultaneously
- Fast as compared to manual segmentation
- Can only work for 3D ultrasound
2020-01-20 | Lee et al.[74] | 2D
- Pass image through model to get segmentation
- Apply a max contour algorithm to find most contiguous segment
- Visualize by drawing a bounding box from the top-right-most pixel to the bottom-left-most pixel and straightening the segmentation as the diagonal of the bounding box
- Method evaluated on human data (8 patients)
- Evaluation metrics reported in # of pixels, rather than mm
- Only compared general segmentation architectures, but no earlier needle detection methods
2019-10-10 | Mwikirize et al.[65] | 2D+t
- Enhance needle tip in consecutive ultrasound images
- Classify enhanced images and localize tip in enhanced images that have needle
- Real time (67 fps)
- Both in plane and out of plane detection
- Robust to intensity variations
- Resilient to high intensity artifacts in the image
- Incorporates temporal information
- Not robust to motion artifacts such as breathing, or pulsating
- Cannot detect stationary needle tip as it depends on motion
2019-02-24 | Pourtaherian et al.[90] | 3D
- Slice 3D ultrasound volume into 2D slices
- Select 3 consecutive slices (with the middle one being the reference slice) to incorporate some 3D information
- Pass slices as 3 channel input to fully connected network (autoencoder style)
- Obtain pixelwise classification of the slices
- Conceptually simple architecture
- Computationally expensive
2019-02-11 | Arif et al.[73] | 3D+t
- Segment 3D ultrasound volume using a CNN
- Extract needle candidates from segmentation using connected component labelling and PCA
- Combine needle candidates with those from previous time step to obtain real needle by detecting motion between the time steps
- Visualize needle in two planes; 1 perpendicular and the other parallel to the transducer
- Incorporates temporal information
- Ablation studies performed on architecture
- Evaluated on multiple datasets (3 datasets) - performance doesn’t vary much except for the in vivo data
- Not robust to motion artifacts as it assumes only needle moves between two consecutive frames
- Assumes linear needle motion
- Not robust to transducer motion (translation or rotation)
- Large standard deviation on in vivo data as compared to phantom data (not easily generalizable)
- Computational speed measured on GPU
- Doesn’t localize needle tip - just the plane and segmentation of the needle (perhaps visible shaft)
2018-05-31 | Pourtaherian[72] | 3D
- Extract voxels from 3D ultrasound volume and classify each voxel as needle or background
- Obtain 2D cross section slices from the 3D ultrasound volume and segment the needle in each slice (various slices perpendicular to the lateral and elevation plane)
- Map segmentation output onto its corresponding position in 3D
- Estimate needle axis by fitting a model of the needle to the segmented voxels (model is a straight cylinder having a fixed diameter)
- Visualize the 2D cross section plane that contains the entire needle
- Can detect very short needles (5 mm and 10 mm)
- Robust to transducer and patient movements (as it performs repeated detection in 3D volume)
- Method evaluated on data from 2 transducers (of varying resolution) and 2 tissue types, 2 needle types (of different gauge)
- Method does not explicitly detect the needle tip (only the plane where the needle and tip are maximally visible)
- Can't detect needle in the first 2 mm
- Computationally expensive
- Only in-plane
2018-03-06 | Mwikirize et al.[83] | 2D
(1) Detect needle using a bounding box
(2) Use bounding box to automatically determine needle trajectory and tip
- Relies on intensity invariant features. Robust to low intensity needle features and presence of high intensity artifacts
- Faster than Pourtaherian (2017)
- Method is fully automatic (doesn’t require feature engineering as methods that use Gabor filters, etc.)
- Doesn’t require prior knowledge of needle insertion side and orientation
- Robust in cases where the needle is slightly off-plane
- Real-time detection rate (25 fps)
- Inference time evaluated on GPU (most ultrasound devices run on CPU)
- Only in-plane needle insertion. Depends on availability of needle data in image
- Can’t detect whole needle if there is shaft discontinuity
- Detection algorithm very sensitive to data preprocessing
- Only works for non-bending needles
- Reliance on an expert to determine ground-truth tip (not possible when tip information is completely invisible)
- Only in-plane
2017-09-08 | Mwikirize et al.[60] | 3D
(1) Detect slices with needle
(2) Enhance needle and localize it in all slices (multi-plane)
- Robust to high intensity artifacts
- Low execution time
- Accurate tip localization for moderately steep insertion angles
- Robust even when shaft is discontinuous
- Detection accuracy is independent of tissue type
- Assumes needle insertion side is known a priori
- Assumes needle has minimal bending
- Large gauge needle used (17G), may not work for thinner needles
- Overall computation time of 3.5 s is high for real-time application
- Only in-plane needle insertion
2017-09-04 | Pourtaherian et al.[71] | 3D
- Extract voxels from 3D ultrasound volume
- Classify each voxel as needle or background
- Estimate needle axis by fitting a model of the needle to the detected voxels (model is a straight cylinder having a fixed diameter)
- Visualize the 2D cross section plane that contains the entire needle
- Fully automated needle localization
- High precision, low false negative rate
- Can detect short needles
- Intuitive visualization of the needle
- Evaluated for both thin and large needles (17G and 22G)
- Single network used
- Only determines plane containing the needle but needle tip is not localized
- High computational complexity (2.19 s for a coarse-fine classification on GPU)
- Only in-plane needles
2017-06-24 | Beigi et al.[69] | 2D+t
(1) Capture micro-motion of needle by extracting spatio-temporal features of cuboids from optical and differential flow maps at multiple scales and orientations using a steerable complex pyramid
(2) Pass features to incremental SVM to classify pixels as needle or background in current frame
(3) Apply Hough Transform on segmentation to determine needle trajectory
- Can detect micro-motion of needle
- Doesn’t require prior knowledge of needle insertion side and angle
- Mitigates other sources of motion such as tremor
- Uses engineered features
- In plane
- 17G needle (may not work for thinner needles)
- Requires motion, can’t detect static needle
- Only tested on curved array transducers
- Only in-plane needles
2017-03-06 | Beigi et al.[55] | 2D+t
(1) Segmentation (using a learning-based framework to classify pixels based on temporal analysis of phase variations to obtain a probability map)
(2) Localization (use a Hough Transform on the probability map to localize the needle)
- Superior parameter tuning (after model is trained)
- More robust to imaging parameter variations
- Has potential to adapt to different operators in various applications
- Takes into account temporal information
- Can distinguish needle from tissue in contact
- Only tested on needles with minimal bending
- Used only 17G needles, may not work for thinner needles
- Only in-plane needles
2014-10-02 | Geraldes and Rocha[70] | 2D
(1) ROI selection
(2) MLP prediction
(3) Kalman filter to improve robustness
- Approach is evaluated over an entire video with tip error provided for each frame
- High tip error
- No other quantitative evaluation provided
- Evaluated on just 2 videos
- Requires initial ROI selection
- Paper mentions that it focuses on flexible needles but used straight needle
2014-07-06 | Hatt et al.[67] | 2D
2 stages
(1) Segment needle (classify each pixel in image as needle or background)
(2) Localize needle from segmentation (using Radon transform)
- Segmentation method robust in presence of other high intensity artifacts
- Segmentation method could be used to enhance needle visualization even without localization
- Method robust with weak assumptions about needle orientation
- Does not work for curved needles as the Radon transform considers straight paths through the image
- Requires beam steering, assumes needle orientation is known a priori and is aligned with the selected beam steer angle
- Evaluated on data from 18-G needles. May not work for thinner needles with non-distinctive appearance
- Only in-plane needle insertion

The challenge with learning-based methods is that they require a lot of data to be trained. This data can be collected from tissue-mimicking phantoms, freshly excised animal cadavers, or in vivo during clinical procedures. Most methods use data collected by performing needle insertions in vitro with phantoms, and ex vivo with porcine, bovine, and chicken tissue while mimicking clinical scenarios [Table 1]. Only methods developed for HDR prostate brachytherapy consistently use human in vivo data for evaluation. Future methods can find motivation from Gillies et al., who evaluated their approach on in vivo datasets from multiple organs and scenarios on top of the phantom datasets[89].

The data are typically annotated by an expert sonographer who performed the needle insertion experiments to obtain the ground truth labels. In some scenarios, a hardware-based tracking system is used to obtain a more accurate needle tip location, especially for cases where the needle is imperceptible to the human eye[54,66,88]. In all the proposed approaches, local datasets were collected and this is not ideal for comparing the proposed methods as noise and biases can easily be introduced into the data. To date, there is no benchmark dataset on which developed methods can be evaluated, which has significantly stifled progress[11].

The typical evaluation metric for learning-based methods is needle tip error, as the ultimate goal of needle localization is to avoid puncturing critical tissue such as veins along the needle trajectory. For segmentation methods that also detect the needle shaft, needle trajectory/orientation error is an important additional metric, as it ensures that a large portion of the needle shaft is accurately detected and reflects model performance for needle guidance during insertion. In terms of needle tip localization error, performance progressed steadily over the years up until 2022, after which progress appears to have stalled [Figure 4]. Among all the proposed methods, deep learning-based methods that report both tip localization and orientation error achieve state-of-the-art performance [Figure 4B][76,78,83]. Another key metric for software-based methods is inference time on a central processing unit (CPU), given that, when deployed, these algorithms should achieve real-time performance, which is considered to be any processing speed greater than 16 fps[14].
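For concreteness, the sketch below shows one common way such tip and orientation errors can be computed from a predicted tip position and needle direction, assuming a known pixel spacing; individual studies may define these metrics slightly differently.

```python
import numpy as np

def tip_error_mm(pred_tip_px, gt_tip_px, mm_per_pixel: float) -> float:
    """Euclidean distance between predicted and ground-truth tip positions."""
    delta = np.asarray(pred_tip_px, float) - np.asarray(gt_tip_px, float)
    return float(np.linalg.norm(delta) * mm_per_pixel)

def orientation_error_deg(pred_dir, gt_dir) -> float:
    """Angle between the predicted and ground-truth needle axes."""
    a = np.asarray(pred_dir, float) / np.linalg.norm(pred_dir)
    b = np.asarray(gt_dir, float) / np.linalg.norm(gt_dir)
    cos_angle = np.clip(abs(a @ b), 0.0, 1.0)   # axes are undirected: use |cos|
    return float(np.degrees(np.arccos(cos_angle)))
```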


Figure 4. Various approaches have been developed for needle tip localization (A), but only 7 report both needle tip localization and needle orientation error (B). Until 2022, progress was being made in reducing the needle tip localization error; however, most recent approaches seem to have reversed this trend (A). Current state-of-the-art methods achieve sub-millimeter needle tip localization and needle orientation errors (B). However, caution should be taken as these results are based on the self-reported errors on local datasets.

One observation from many of the proposed learning-based methods performing needle segmentation is the use of metrics, such as the Dice coefficient and intersection over union (IOU), that do not reflect clinical needs for needle localization. For instance, both Dice and IOU only measure the overlap between the prediction and the ground truth; they are inherently biased to focus more on the needle shaft, which may not be as informative as the needle trajectory error (degrees) or tip localization error (mm). The use of such metrics has increased mainly with machine learning and deep learning-based methods and could be a result of adopting metrics commonly used in other domains where they directly reflect the domain needs. Caution should thus be taken when selecting metrics to optimize and evaluate proposed software-based methods[93,94]. Maier-Hein et al. developed a comprehensive framework to act as a guideline in selecting problem-aware metrics for assessing medical image processing machine learning algorithms[93].
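The following sketch makes the contrast concrete by computing Dice and IOU from binary masks; both scores are dominated by the long shaft, so they can remain high even when the tip estimate, which drives clinical risk, is several millimeters off.

```python
import numpy as np

def dice_and_iou(pred_mask: np.ndarray, gt_mask: np.ndarray) -> tuple[float, float]:
    """Overlap metrics between predicted and ground-truth needle masks."""
    pred, gt = pred_mask.astype(bool), gt_mask.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    dice = 2.0 * intersection / (pred.sum() + gt.sum() + 1e-8)
    iou = intersection / (union + 1e-8)
    return float(dice), float(iou)
```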

While a lot of progress has been made in needle visualization and localization approaches, little progress has been made in developing effective needle alignment approaches, and this has hindered clinical adoption. The current software-based methods alone cannot solve both challenges, as they rely heavily on the assumption that the needle is already correctly aligned with the probe, even when it is imperceptible to the naked eye.

FUTURE DIRECTIONS

A benchmark dataset containing data from different organs, needle types, and ultrasound scanners from multiple centers, with clearly defined evaluation metrics, should be collected and made publicly available for fair comparison of any proposed software-based methods. An ideal needle detection method would then be one that can maintain its performance across multiple organs, needle types, and ultrasound devices used, as it is not feasible to always develop a new method for each scenario. In terms of needle enhancement, future investigations could focus on directly detecting needles from raw radio frequency (RF) signals before image reconstruction. The hypothesis posits that the raw RF signal, being rich in information, may contain distinct patterns indicative of needles, allowing for their detection and isolation amidst background noise. Another exciting avenue for future research on software-based image acquisition techniques is specular beamforming. Instead of using the conventional delay-and-sum beamforming that assumes scattered reflection, specular beamforming takes into account the physics of specular reflection from acoustically hard objects such as needles to improve their visibility in ultrasound[95].

Innovative strategies are needed to enhance needle alignment with ultrasound probes, particularly for inexperienced clinicians, while maintaining procedural efficiency. These strategies could potentially combine hardware-based needle alignment approaches with software-based methods for needle visualization enhancement and localization. Future research could also explore utilizing the entirety of video information, mirroring the approach taken by clinicians when adjusting probes and needles. Software-based methods that track probe and needle motion can be developed to improve needle alignment with the ultrasound beam. These can leverage inertial measurement unit (IMU) data, which are available in some ultrasound devices[96]. Additionally, it is crucial to investigate integrating needle detection and tracking methods, as relying solely on needle localization methods may not fully address the challenge of needle alignment[87].

CONCLUSION

Ultrasound-guided needle insertion is crucial during many minimally invasive procedures but faces two major challenges: (1) aligning the needle with the ultrasound probe; and (2) visualizing the needle under ultrasound. In this paper, we reviewed the various methods proposed to improve needle visibility in ultrasound with a major focus on software-based methods, but also briefly described relevant hardware-based approaches. Various approaches have been proposed, with AI methods achieving state-of-the-art performance with submillimeter tip localization and orientation errors. However, these approaches have not yet seen clinical application as they address only needle visualization, not needle alignment. We believe that, when combined with approaches that help with needle alignment, the proposed needle visualization approaches have significant potential to improve ultrasound-guided needle procedures.

DECLARATIONS

Authors’ contributions

Made substantial contributions to the conception and design of the study: Hacihaliloglu I, Liu D

Reviewed literature on needle appearance in ultrasound: Tadayon P

Reviewed literature on the hardware-based methods: Arora I, Pieters A

Reviewed literature on the software-based methods: Gulam S, Kimbowa A

Made substantial contributions to the writing of the initial draft: Pieters A, Tadayon P, Arora I, Gulam S, Kimbowa A

Made substantial contributions to the writing of the final manuscript: Pieters A, Pinos A, Kimbowa A, Hacihaliloglu I

Availability of data and materials

Not applicable.

Financial support and sponsorship

This work was supported by the Canada Foundation for Innovation (CFI) John R. Evans Leaders Fund (JELF) program, grant number 42816, and by the Natural Sciences and Engineering Research Council of Canada (NSERC) Discovery Grant RGPIN-2023-03575. Financial and material support was also provided by the Minimally Invasive Image Guided Procedure Lab (MIIPs) and the Engineers in Scrubs program at the University of British Columbia School of Biomedical Engineering, with support from the Vancouver Coastal Health Research Institute and the Centre for Aging SMART.

Conflicts of interest

All authors declared that there are no conflicts of interest.

Ethical approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Copyright

© The Author(s) 2024.

REFERENCES

1. Barr L, Hatch N, Roque PJ, Wu TS. Basic ultrasound-guided procedures. Crit Care Clin 2014;30:275-304.

2. Tirado A, Nagdev A, Henningsen C, Breckon P, Chiles K. Ultrasound-guided procedures in the emergency department-needle guidance and localization. Emerg Med Clin North Am 2013;31:87-115.

3. Campbell AS. Ultrasound-Guided Procedures. In: Kuzmiak CM, editor. Interventional breast procedures. Cham: Springer; 2019. pp. 21-33.

4. Huntoon MA. Imaging in Interventional Pain Management. In: Narouze SN, editor. Atlas of ultrasound-guided procedures in interventional pain management. New York: Springer; 2011. pp. 3-12.

5. Albrecht E, Chin KJ. Advances in regional anaesthesia and acute pain management: a narrative review. Anaesthesia 2020;75:e101-10.

6. Souza D, Lerman I, Halaszynski TM. Ultrasound technical aspects: how to improve needle visibility. In: Narouze SN, editor. Atlas of ultrasound-guided procedures in interventional pain management. New York: Springer; 2018. pp. 27-55.

7. Marhofer P, Chan VWS. Ultrasound-guided regional anesthesia: current concepts and future trends. Anesth Analg 2007;104:1265-9.

8. Chapman GA, Johnson D, Bodenham AR. Visualisation of needle position using ultrasonography. Anaesthesia 2006;61:148-58.

9. Scholten HJ, Pourtaherian A, Mihajlovic N, Korsten HHM, A Bouwman R. Improving needle tip identification during ultrasound-guided procedures in anaesthetic practice. Anaesthesia 2017;72:889-904.

10. Yang H, Shan C, Kolen AF, de With PHN. Medical instrument detection in ultrasound: a review. Artif Intell Rev 2023;56:4363-402.

11. Beigi P, Salcudean SE, Ng GC, Rohling R. Enhancement of needle visualization and localization in ultrasound. Int J Comput Assist Radiol Surg 2021;16:169-78.

12. Chin KJ, Perlas A, Chan VWS, Brull R. Needle visualization in ultrasound-guided regional anesthesia: challenges and solutions. Reg Anesth Pain Med 2008;33:532-44.

13. Peng C, Cai Q, Chen M, Jiang X. Recent Advances in tracking devices for biomedical ultrasound imaging applications. Micromachines 2022;13:1855.

14. Aldrich JE. Basic physics of ultrasound imaging. Crit Care Med 2007;35:S131-7.

15. Obstetric Anesthesia. Needling techniques. Available from: https://pie.med.utoronto.ca/OBAnesthesia/OBAnesthesia_content/OBA_needling_module.html. [Last accessed on 25 Jul 2024].

16. Cao L, Tan YT, Wei T, Li H. Comparison between the long-axis in-plane and short-axis out-of-plane approaches for ultrasound-guided arterial cannulation: a meta-analysis and systematic review. BMC Anesthesiol 2023;23:120.

17. Miller AG, Bardin AJ. Review of ultrasound-guided radial artery catheter placement. Respir Care 2016;61:383-8.

18. Chan V, Perlas A. Basics of ultrasound: pitfalls and limitations. In: Narouze SN, editor. Atlas of ultrasound-guided procedures in interventional pain management. New York: Springer; 2018. pp. 11-5.

19. Elsharkawy H, Babazade R, Kolli S, Kalagara H, Soliman ML. The Infiniti Plus ultrasound needle guidance system improves needle visualization during the placement of spinal anesthesia. Korean J Anesthesiol 2016;69:417-9.

20. Collins GB, Fanou EM, Young J, Bhogal P. A comparison of free-hand vs laser-guided long-axis ultrasound techniques in novice users. Br J Radiol 2013;86:20130026.

21. Maecken T, Heite L, Wolf B, Zahn PK, Litz RJ. Ultrasound-guided catheterisation of the subclavian vein: freehand vs needle-guided technique. Anaesthesia 2015;70:1242-9.

22. Antico M, Sasazawa F, Wu L, et al. Ultrasound guidance in minimally invasive robotic procedures. Med Image Anal 2019;54:149-67.

23. Hemmerling TM, Taddei R, Wehbe M, Cyr S, Zaouter C, Morse J. First robotic ultrasound-guided nerve blocks in humans using the magellan system. Anesth Analg 2013;116:491-4.

24. Kojcev R, Fuerst B, Zettinig O, et al. Dual-robot ultrasound-guided needle placement: closing the planning-imaging-action loop. Int J Comput Assist Radiol Surg 2016;11:1173-81.

25. Brattain LJ, Pierce TT, Gjesteby LA, et al. AI-enabled, ultrasound-guided handheld robotic device for femoral vascular access. Biosensors 2021;11:522.

26. Choquet O, Capdevila X. Case report: Three-dimensional high-resolution ultrasound-guided nerve blocks: a new panoramic vision of local anesthetic spread and perineural catheter tip location. Anesth Analg 2013;116:1176-81.

27. Beigi P, Malenfant P, Rasoulian A, Rohling R, Dube A, Gunka V. Three-dimensional ultrasound-guided real-time midline epidural needle placement with epiguide: a prospective feasibility study. Ultrasound Med Biol 2017;43:375-9.

28. Gebhard RE, Eubanks TN, Meeks R. Three-dimensional ultrasound imaging. Curr Opin Anesthesiol 2015;28:583-7.

29. Hopkins RE, Bradley M. In-vitro visualization of biopsy needles with ultrasound: a comparative study of standard and echogenic needles using an ultrasound phantom. Clin Radiol 2001;56:499-502.

30. van de Berg NJ, Sánchez-Margallo JA, van Dijke AP, Langø T, van den Dobbelsteen JJ. A methodical quantification of needle visibility and echogenicity in ultrasound images. Ultrasound Med Biol 2019;45:998-1009.

31. Brookes J, Sondekoppam R, Armstrong K, et al. Comparative evaluation of the visibility and block characteristics of a stimulating needle and catheter vs an echogenic needle and catheter for sciatic nerve block with a low-frequency ultrasound probe. Br J Anaesth 2015;115:912-9.

32. Hovgesen CH, Wilhjelm JE, Vilmann P, Kalaitzakis E. Echogenic surface enhancements for improving needle visualization in ultrasound: a PRISMA systematic review. J Ultrasound Med 2022;41:311-25.

33. Klein SM, Fronheiser MP, Reach J, Nielsen KC, Smith SW. Piezoelectric vibrating needle and catheter for enhancing ultrasound-guided peripheral nerve blocks. Anesth Analg 2007;105:1858-60.

34. Orlando N, Snir J, Barker K, et al. A power Doppler ultrasound method for improving intraoperative tip localization for visually obstructed needles in interstitial prostate brachytherapy. Med Phys 2023;50:2649-61.

35. Daoud MI, Shtaiyat A, Zayadeen AR, Alazrai R. Accurate needle localization using two-dimensional power Doppler and B-mode ultrasound image analyses: a feasibility study. Sensors 2018;18:3475.

36. Harmat A, Rohling RN, Salcudean SE. Needle tip localization using stylet vibration. Ultrasound Med Biol 2006;32:1339-48.

37. Stolka PJ, Foroughi P, Rendina M, Weiss CR, Hager GD, Boctor EM. Needle guidance using handheld stereo vision and projection for ultrasound-based interventions. Med Image Comput Comput Assist Interv 2014;17:684-91.

38. Najafi M, Abolmaesumi P, Rohling R. Single-camera closed-form real-time needle tracking for ultrasound-guided needle insertion. Ultrasound Med Biol 2015;41:2663-76.

39. Gallo C, Foroughi P, Meagher E, et al. Computer-assisted needle navigation for pediatric internal jugular central venous cannulation: a feasibility study. J Vasc Access 2020;21:931-7.

40. Xia W, Mari JM, West SJ, et al. In-plane ultrasonic needle tracking using a fiber-optic hydrophone. Med Phys 2015;42:5983-91.

41. Baker C, Xochicale M, Lin FY, et al. Intraoperative needle tip tracking with an integrated fibre-optic ultrasound sensor. Sensors 2022;22:9035.

42. Kåsine T, Romundstad L, Rosseland LA, et al. Needle tip tracking for ultrasound-guided peripheral nerve block procedures - an observer blinded, randomised, controlled, crossover study on a phantom model. Acta Anaesthesiol Scand 2019;63:1055-62.

43. Watanabe K, Tokumine J, Lefor AK, et al. Photoacoustic needle improves needle tip visibility during deep peripheral nerve block. Sci Rep 2021;11:8432.

44. Nakazawa H, Tokumine J, Lefor AK, et al. Use of a photoacoustic needle improves needle tip recognition in a video recording of simulated ultrasound-guided vascular access: a pilot study. J Vasc Access 2024;25:922-7.

45. Tielens LKP, Damen RBCC, Lerou JGC, Scheffer GJ, Bruhn J. Ultrasound-guided needle handling using a guidance positioning system in a phantom. Anaesthesia 2014;69:24-31.

46. Fevre MC, Vincent C, Picard J, et al. Reduced variability and execution time to reach a target with a needle GPS system: comparison between physicians, residents and nurse anaesthetists. Anaesth Crit Care Pain Med 2018;37:55-60.

47. Niazi AU, Chin KJ, Jin R, Chan VW. Real-time ultrasound-guided spinal anesthesia using the SonixGPS ultrasound guidance system: a feasibility study. Acta Anaesthesiol Scand 2014;58:875-81.

48. Hakime A, Deschamps F, De Carvalho EG, Barah A, Auperin A, De Baere T. Electromagnetic-tracked biopsy under ultrasound guidance: preliminary results. Cardiovasc Intervent Radiol 2012;35:898-905.

49. Swenson JD, Klingler KR, Pace NL, Davis JJ, Loose EC. Evaluation of a new needle guidance system for ultrasound: results of a prospective, randomized, blinded study. Reg Anesth Pain Med 2016;41:356-61.

50. Johnson AN, Peiffer JS, Halmann N, Delaney L, Owen CA, Hersh J. Ultrasound-guided needle technique accuracy: prospective comparison of passive magnetic tracking versus unassisted echogenic needle localization. Reg Anesth Pain Med 2017;42:223-32.

51. Cheung S, Rohling R. Enhancement of needle visibility in ultrasound-guided percutaneous procedures. Ultrasound Med Biol 2004;30:617-24.

52. Beigi P, Salcudean SE, Rohling R, Ng GC. Automatic detection of a hand-held needle in ultrasound via phased-based analysis of the tremor motion. Proc SPIE 2016;9786:166-71.

53. Mwikirize C, Nosher JL, Hacihaliloglu I. Enhancement of needle tip and shaft from 2D ultrasound using signal transmission maps. In: Ourselin S, Joskowicz L, Sabuncu MR, Unal G, Wells W, editors. Medical image computing and computer-assisted intervention - MICCAI 2016. Cham: Springer; 2016. pp. 362-9.

54. Mwikirize C, Nosher JL, Hacihaliloglu I. Learning needle tip localization from digital subtraction in 2D ultrasound. Int J Comput Assist Radiol Surg 2019;14:1017-26.

55. Beigi P, Rohling R, Salcudean SE, Ng GC. CASPER: computer-aided segmentation of imperceptible motion-a learning-based tracking of an invisible needle in ultrasound. Int J Comput Assist Radiol Surg 2017;12:1857-66.

56. Pourtaherian A, Scholten HJ, Kusters L, et al. Medical instrument detection in 3-dimensional ultrasound data volumes. IEEE Trans Med Imaging 2017;36:1664-75.

57. Ayvali E, Desai JP. Optical flow-based tracking of needles and needle-tip localization using circular Hough transform in ultrasound images. Ann Biomed Eng 2015;43:1828-40.

58. Pourtaherian A, Zinger S, de With PHN, Korsten HHM, Mihajlovic N. Benchmarking of state-of-the-art needle detection algorithms in 3D ultrasound data volumes. Proc SPIE 2015;9415:577-84.

59. Beigi P, Rohling R, Salcudean SE, Ng GC. Spectral analysis of the tremor motion for needle detection in curvilinear ultrasound via spatiotemporal linear sampling. Int J Comput Assist Radiol Surg 2016;11:1183-92.

60. Mwikirize C, Nosher JL, Hacihaliloglu I. Local phase-based learning for needle detection and localization in 3D ultrasound. In: Cardoso MJ, et al., editors. Computer assisted and robotic endoscopy and clinical image-based procedures. Cham: Springer; 2017. pp. 108-15.

61. Zhao Y, Cachard C, Liebgott H. Automatic needle detection and tracking in 3D ultrasound using an ROI-based RANSAC and Kalman method. Ultrason Imaging 2013;35:283-306.

62. Tang C, Xie G, Omisore OM, Xiong J, Xia Z. A real-time needle tracking algorithm with first-frame linear structure removing in 2D ultrasound-guided prostate therapy. In: 2019 IEEE International Conference on Robotics and Biomimetics (ROBIO); 2019 Dec 6-8; Dali, China. IEEE; 2020. pp. 1240-5.

63. Hacihaliloglu I, Beigi P, Ng G, Rohling RN, Salcudean S, Abolmaesumi P. Projection-based phase features for localization of a needle tip in 2D curvilinear ultrasound. In: Navab N, Hornegger J, Wells WM, Frangi A, editors. Medical image computing and computer-assisted intervention - MICCAI 2015. Cham: Springer; 2015. pp. 347-54.

64. Agarwal N, Yadav AK, Gupta A, Orlando MF. Real-time needle tip localization in 2D ultrasound images using Kalman filter. In: 2019 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM); 2019 Jul 8-12; Hong Kong, China. IEEE; 2019. pp. 1008-12.

65. Mwikirize C, Nosher JL, Hacihaliloglu I. Single shot needle tip localization in 2D ultrasound. In: Shen D, et al., editors. Medical image computing and computer assisted intervention - MICCAI 2019. Cham: Springer; 2019. pp. 637-45.

66. Mwikirize C, Kimbowa AB, Imanirakiza S, Katumba A, Nosher JL, Hacihaliloglu I. Time-aware deep neural networks for needle tip localization in 2D ultrasound. Int J Comput Assist Radiol Surg 2021;16:819-27.

67. Hatt CR, Ng G, Parthasarathy V. Enhanced needle localization in ultrasound using beam steering and learning-based segmentation. Comput Med Imaging Graph 2015;41:46-54.

68. Younes H, Voros S, Troccaz J. Automatic needle localization in 3D ultrasound images for brachytherapy. In: 2018 IEEE 15th International Symposium on Biomedical Imaging (ISBI 2018); 2018 Apr 4-7; Washington, DC, USA. IEEE; 2018. pp. 1203-7.

69. Beigi P, Rohling R, Salcudean T, Lessoway VA, Ng GC. Detection of an invisible needle in ultrasound using a probabilistic SVM and time-domain features. Ultrasonics 2017;78:18-22.

70. Geraldes AA, Rocha TS. A neural network approach for flexible needle tracking in ultrasound images using Kalman filter. In: 5th IEEE RAS/EMBS International Conference on Biomedical Robotics and Biomechatronics; 2014 Aug 12-15; Sao Paulo, Brazil. IEEE; 2014. pp. 70-5.

71. Pourtaherian A, Ghazvinian Zanjani F, Zinger S, et al. Improving needle detection in 3D ultrasound using orthogonal-plane convolutional networks. In: Descoteaux M, et al., editors. Medical image computing and computer-assisted intervention - MICCAI 2017. Cham: Springer; 2017. pp. 610-8.

72. Pourtaherian A. Robust needle detection and visualization for 3D ultrasound image-guided interventions. Available from: https://research.tue.nl/en/publications/robust-needle-detection-and-visualization-for-3d-ultrasound-image. [Last accessed on 25 Jul 2024].

73. Arif M, Moelker A, van Walsum T. Automatic needle detection and real-time Bi-planar needle visualization during 3D ultrasound scanning of the liver. Med Image Anal 2019;53:104-10.

74. Lee JY, Islam M, Woh JR, et al. Ultrasound needle segmentation and trajectory prediction using excitation network. Int J Comput Assist Radiol Surg 2020;15:437-43.

75. Zhang Y, He X, Tian Z, et al. Multi-needle detection in 3D ultrasound images using unsupervised order-graph regularized sparse dictionary learning. IEEE Trans Med Imaging 2020;39:2302-15.

76. Gao J, Liu P, Liu GD, Zhang L. Robust needle localization and enhancement algorithm for ultrasound by deep learning and beam steering methods. J Comput Sci Technol 2021;36:334-46.

77. Wijata A, Andrzejewski J, Pyciński B. An automatic biopsy needle detection and segmentation on ultrasound images using a convolutional neural network. Ultrason Imaging 2021;43:262-72.

78. Chen S, Lin Y, Li Z, Wang F, Cao Q. Automatic and accurate needle detection in 2D ultrasound during robot-assisted needle insertion process. Int J Comput Assist Radiol Surg 2022;17:295-303.

79. Zhang Y, Tian Z, Lei Y, et al. Automatic multi-needle localization in ultrasound images using large margin mask RCNN for ultrasound-guided prostate brachytherapy. Phys Med Biol 2020;65:205003.

80. Zhang Y, Tian Z, Lei Y, et al. Multi-needle detection in ultrasound image using max-margin mask R-CNN. Proc SPIE 2021;11602:264-69.

81. Andersén C, Rydén T, Thunberg P, Lagerlöf JH. Deep learning-based digitization of prostate brachytherapy needles in ultrasound images. Med Phys 2020;47:6414-20.

82. Zhang Y, Lei Y, He X, et al. Ultrasound multi-needle detection using deep attention U-Net with TV regularizations. Proc SPIE 2021;11598:599-604.

83. Mwikirize C, Nosher JL, Hacihaliloglu I. Convolution neural networks for real-time needle detection and localization in 2D ultrasound. Int J Comput Assist Radiol Surg 2018;13:647-57.

84. Wang R, Tan G, Liu X. Robust tip localization under continuous spatial and temporal constraints during 2D ultrasound-guided needle puncture. Int J Comput Assist Radiol Surg 2023;18:2233-42.

85. Rubin J, Chen A, Thodiyil AO, et al. Efficient video-based deep learning for ultrasound guided needle insertion. 2021. Available from: https://openreview.net/forum?id=dVUHL5QhDhL. [Last accessed on 25 Jul 2024].

86. Wang R, Tan G, Liu X. TipDet: A multi-keyframe motion-aware framework for tip detection during ultrasound-guided interventions. Comput Methods Programs Biomed 2024;247:108109.

87. Che H, Qin J, Chen Y, et al. Improving needle tip tracking and detection in ultrasound-based navigation system using deep learning-enabled approach. IEEE J Biomed Health Inform 2024;28:2930-42.

88. Zade AAT, Aziz MJ, Majedi H, Mirbagheri A, Ahmadian A. Spatiotemporal analysis of speckle dynamics to track invisible needle in ultrasound sequences using convolutional neural networks: a phantom study. Int J Comput Assist Radiol Surg 2023;18:1373-82.

89. Gillies DJ, Rodgers JR, Gyacskov I, et al. Deep learning segmentation of general interventional tools in two-dimensional ultrasound images. Med Phys 2020;47:4956-70.

90. Pourtaherian A, Mihajlovic N, Ghazvinian Zanjani F, et al. Localization of partially visible needles in 3D ultrasound using dilated CNNs. In: 2018 IEEE International Ultrasonics Symposium (IUS); 2018 Oct 22-25; Kobe, Japan. IEEE; 2019. pp. 1-4.

91. Yan W, Ding Q, Chen J, Yan K, Tang RSY, Cheng SS. Learning-based needle tip tracking in 2D ultrasound by fusing visual tracking and motion prediction. Med Image Anal 2023;88:102847.

92. Zhang Y, Harms J, Lei Y, et al. Weakly supervised multi-needle detection in 3D ultrasound images with bidirectional convolutional sparse coding. Proc SPIE 2020;11319:229-36.

93. Maier-Hein L, Reinke A, Godau P, et al. Metrics reloaded: recommendations for image analysis validation. Nat Methods 2024;21:195-212.

94. Reinke A, Tizabi MD, Baumgartner M, et al. Understanding metric-related pitfalls in image analysis validation. Nat Methods 2024;21:182-94.

95. Rodriguez-Molares A, Fatemi A, Lovstakken L, Torp H. Specular beamforming. IEEE Trans Ultrason Ferroelectr Freq Control 2017;64:1285-97.

96. Cai Q, Hu J, Chen M, et al. Inertial measurement unit-assisted ultrasonic tracking system for ultrasound probe localization. IEEE Trans Ultrason Ferroelectr Freq Control 2023;70:920-29.
