The future of robotic radical prostatectomy driven by artificial intelligence
Now that we have entered the era of precision prostate cancer surgery[1], the robotic approach represents the preferred choice of patients[2]. Focusing on robot-assisted radical prostatectomy (RARP), several technical[3-5] and technological[6] innovations have been introduced with the aim of maximizing both functional and oncological outcomes.
The advent of three-dimensional (3D) technology[7] meets the preferences of both patients[8] and surgeons[8,9], allowing the anatomy to be visualized in three dimensions and enhancing the perception of the disease’s location and characteristics, such as its relationship with the prostatic capsule.
A step further in this direction is represented by the possibility of overlaying the 3D virtual images onto the real anatomy during the in vivo robotic procedure, thus performing augmented reality surgery[10,11]. As reported in our previous experiences, 3D prostatic models can be obtained from 2D MRI images and subsequently used during RARP, allowing the surgeon to focus on the tumor’s characteristics, with particular attention to the potential presence of extracapsular extension. Thanks to specifically developed software, the virtual models can be displayed on the da Vinci surgical console (Intuitive Surgical Inc.) and automatically anchored to the in vivo live images during surgery[12].
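To illustrate the basic mechanism of such an overlay, the minimal sketch below blends a pre-rendered view of the virtual prostate model onto a single endoscopic frame. The file names, the RGBA rendering, and the blending weight are hypothetical assumptions for illustration only; the console integration described above relies on dedicated, purpose-built software.

```python
# Minimal illustration of an augmented-reality overlay: alpha-blending a
# pre-rendered view of a 3D prostate model onto an endoscopic frame.
# File names and the rendering are hypothetical placeholders; both images
# are assumed to share the same resolution.
import cv2
import numpy as np

def overlay_model(frame: np.ndarray, rendered_model: np.ndarray,
                  alpha: float = 0.4) -> np.ndarray:
    """Blend a rendered 3D-model view onto a live endoscopic frame.

    `rendered_model` is assumed to be an RGBA image of the virtual model
    rendered at the estimated pose; fully transparent pixels leave the
    live image untouched.
    """
    model_rgb = rendered_model[..., :3].astype(np.float32)
    # Per-pixel opacity: the model's alpha channel scaled by a global alpha.
    mask = (rendered_model[..., 3:4].astype(np.float32) / 255.0) * alpha
    blended = frame.astype(np.float32) * (1.0 - mask) + model_rgb * mask
    return blended.astype(np.uint8)

frame = cv2.imread("endoscopic_frame.png")                        # live 2D image
model = cv2.imread("prostate_render.png", cv2.IMREAD_UNCHANGED)   # RGBA render
cv2.imwrite("augmented_frame.png", overlay_model(frame, model))
```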
Notwithstanding the initial encouraging findings, this approach proved not to be accurate enough, owing to the high heterogeneity of colors in the endoscopic view and the absence of clear intraoperative landmarks providing precise spatial orientation along the three main axes. This year, we began to explore the potential applications of artificial intelligence (AI) in urologic in vivo surgery. Our new approach consists of a two-step automatic system that aligns a 3D virtual model of the patient’s prostate with the endoscopic 2D images at the end of the extirpative phase of RARP. For each of the two steps, a specific convolutional neural network (CNN) was developed. Briefly, the first CNN outputs the catheter location and the rotation about the z axis by identifying the anchor point, while the second CNN returns the antero-posterior rotation about the x axis. Their combined outputs allow the actual overlay to be performed. Our findings, presented at the 2021 virtual EAU congress, are promising, showing that the introduction of CNNs enables 3D virtual images to be correctly overlaid in a completely automatic manner. The correct identification of extracapsular extension at the level of the resection bed can potentially reduce positive surgical margin rates, with a subsequent oncological advantage for patients with locally advanced disease[13].
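To make the two-step pipeline concrete, the sketch below shows one way such a system could be organized. It is a minimal illustration under our own assumptions (toy network depths, module names such as AnchorNet and TiltNet, and the pose parameterization), not the published implementation.

```python
# Illustrative sketch (not the published implementation) of a two-step
# CNN pipeline for automatic 3D-model-to-endoscopy registration:
# step 1 locates the catheter anchor point and the rotation about the
# z axis; step 2 regresses the antero-posterior rotation about x.
import torch
import torch.nn as nn

def small_backbone(out_dim: int) -> nn.Sequential:
    """A toy convolutional feature extractor with a regression head."""
    return nn.Sequential(
        nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
        nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(32, out_dim),
    )

class AnchorNet(nn.Module):
    """Step 1: catheter (x, y) location and rotation about the z axis."""
    def __init__(self):
        super().__init__()
        self.net = small_backbone(out_dim=3)  # (x, y, theta_z)

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        return self.net(frame)

class TiltNet(nn.Module):
    """Step 2: antero-posterior rotation about the x axis."""
    def __init__(self):
        super().__init__()
        self.net = small_backbone(out_dim=1)  # theta_x

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        return self.net(frame)

frame = torch.rand(1, 3, 256, 256)             # one endoscopic frame
x, y, theta_z = AnchorNet()(frame).squeeze(0)  # anchor point + z rotation
theta_x = TiltNet()(frame).squeeze(0)          # antero-posterior rotation
# The combined pose (x, y, theta_z, theta_x) drives the automatic overlay
# of the 3D virtual model onto the live image.
```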
As shown in the recent literature, the application of AI in uro-oncology has gained wide diffusion[14]; however, its use during live surgery is still limited to anecdotal experiences[15]. The intraoperative support of machine learning (ML) for autonomous camera positioning has been promisingly explored by analyzing data obtained from instrument kinematics, laparoscopic video, and surgeon eye-tracking[15]. In contrast, the application of ML to more complex tasks (e.g., suturing, knot-tying, and tissue dissection) is more difficult to achieve. As recently summarized by Ma et al.[16], a robot must be able to perform three different actions to complete these surgical tasks: it must “see” (vision recognition), “think” (task planning), and “act” (task execution).
Therefore, even if this field of research seems the most appealing, we need to consider the potential of AI-driven surgery within a wider horizon[16,17].
Starting from the preoperative setting, as shown by Auffenberg et al.[18], specifically developed ML algorithms can help the surgeon select candidates for the different treatments (e.g., active surveillance, radical prostatectomy, radiation therapy, and androgen-deprivation therapy) by analyzing data from electronic health records.
Furthermore, this technology may also be applied to improve surgical training: by extracting data from the da Vinci console, dedicated ML algorithms can be developed to automatically analyze trainees’ movements, providing a personalized evaluation that highlights their strongest and weakest technical abilities[19]. Likewise, the application of ML-based analysis to automate the segmentation of anatomical landmarks across 12 different surgical steps of RARP showed that the ML-based model annotated boundaries better than human segmentation[20].
Looking to the future, further development of robotic technology towards automation will enhance surgical outcomes by improving workflow and minimizing repetitive or mundane tasks[21].
However, the most challenging aspect of this technology is the ability to reproduce the sophistication of human movements and therefore to achieve complete autonomy.
Theoretically, to reach this quality of information, multiple tertiary centers should provide data collected according to uniform standards. Deep learning models developed from these data may be able to predict unexpected complications, offering the surgeon a chance to adjust the intraoperative planning. The robotic system would also be able to recognize the operator and adapt its feedback accordingly, instantly providing tailored information to support the best surgical decision making[16]. Moreover, by exploiting available cloud services and high-speed internet connections (e.g., 5G), information could be rapidly exchanged between machines.
Even if this scenario sounds appealing, the need to ensure data security and the lack of precise legislation represent obstacles that still need to be overcome.
In conclusion, particularly in the intraoperative setting, the advent of AI is hampered by the lack of live data collection and by the complexity of privacy and data sharing legislation.
For all these reasons, current research should focus on the ability of AI to provide the operator with important additional information (e.g., augmented reality images) during the surgical procedure, rather than on trying to replace the surgeon.
DECLARATIONS
Acknowledgement
We would like to thank Dr. Paolo Verri for his support in this study.

Authors’ contributions
Study concept and manuscript writing: Checcucci E, Porpiglia F

Availability of data and materials
Not applicable.

Financial support and sponsorship
None.

Conflicts of interest
Both authors declared that there are no conflicts of interest.

Ethical approval and consent to participate
Not applicable.

Consent for publication
Not applicable.

Copyright
© The Author(s) 2021.
REFERENCES
1. Checcucci E, Amparore D, De Luca S, Autorino R, Fiori C, Porpiglia F. Precision prostate cancer surgery: an overview of new technologies and techniques. Minerva Urol Nefrol 2019;71:487-501.
2. Cacciamani GE, Sebben M, Tafuri A, et al. Consulting 'Dr. Google' for minimally invasive urological oncological surgeries: A contemporary web-based trend analysis. Int J Med Robot 2021;17:e2250.
3. Checcucci E, Pecoraro A, De Cillis S, et al. San Luigi Study Group. The importance of anatomical reconstruction for continence recovery after robot assisted radical prostatectomy: a systematic review and pooled analysis from referral centers. Minerva Urol Nephrol 2021;73:165-77.
4. Campobasso D, Fiori C, Amparore D, et al. Total anatomical reconstruction during robot-assisted radical prostatectomy in patients with previous prostate surgery. Minerva Urol Nefrol 2019;71:605-11.
5. Manfredi M, Fiori C, Amparore D, Checcucci E, Porpiglia F. Technical details to achieve perfect early continence after radical prostatectomy. Minerva Chir 2019;74:63-77.
6. Porpiglia F, Amparore D, Checcucci E, et al. ESUT Research Group. Current use of three-dimensional model technology in urology: a road map for personalised surgical planning. Eur Urol Focus 2018;4:652-6.
7. Checcucci E, Amparore D, Fiori C, et al. 3D imaging applications for robotic urologic surgery: an ESUT YAUWP review. World J Urol 2020;38:869-81.
8. Porpiglia F, Bertolo R, Checcucci E, et al. ESUT Research Group. Development and validation of 3D printed virtual models for robot-assisted radical prostatectomy and partial nephrectomy: urologists' and patients' perception. World J Urol 2018;36:201-7.
9. Amparore D, Pecoraro A, Checcucci E, et al. 3D imaging technologies in minimally-invasive kidney and prostate cancer surgery: which is the urologists' perception? Minerva Urol Nephrol 2021; doi: 10.23736/S2724-6051.21.04131-X.
10. Porpiglia F, Checcucci E, Amparore D, et al. Augmented-reality robot-assisted radical prostatectomy using hyper-accuracy three-dimensional reconstruction (HA3D™) technology: a radiological and pathological study. BJU Int 2019;123:834-45.
11. Porpiglia F, Checcucci E, Amparore D, et al. Three-dimensional elastic augmented-reality robot-assisted radical prostatectomy using hyperaccuracy three-dimensional reconstruction technology: a step further in the identification of capsular involvement. Eur Urol 2019;76:505-14.
12. Porpiglia F, Checcucci E, Amparore D, et al. Extracapsular extension on neurovascular bundles during robot-assisted radical prostatectomy precisely localized by 3D automatic augmented-reality rendering. J Urol 2020;203:e1297.
13. Porpiglia F, Checcucci E, Amparore D, et al. Artificial intelligence guided 3D automatic augmented-reality images allow to identify the extracapsular extension on neurovascular bundles during robotic prostatectomy. Eur Urol 2021;79:S1560.
14. Checcucci E, Autorino R, Cacciamani GE, et al. Uro-technology and SoMe Working Group of the Young Academic Urologists Working Party of the European Association of Urology. Artificial intelligence and neural networks in urology: current clinical applications. Minerva Urol Nefrol 2020;72:49-57.
15. Checcucci E, De Cillis S, Granato S, Chang P, Afyouni AS, Okhunov Z. Uro-technology and SoMe Working Group of the Young Academic Urologists Working Party of the European Association of Urology. Applications of neural networks in urology: a systematic review. Curr Opin Urol 2020;30:788-807.
16. Ma R, Vanstrum EB, Lee R, Chen J, Hung AJ. Machine learning in the optimization of robotics in the operative field. Curr Opin Urol 2020;30:808-16.
17. Bhandari M, Zeffiro T, Reddiboina M. Artificial intelligence and robotic surgery: current perspective and future directions. Curr Opin Urol 2020;30:48-54.
18. Auffenberg GB, Ghani KR, Ramani S, et al. Michigan Urological Surgery Improvement Collaborative. askMUSIC: leveraging a Clinical Registry to develop a new machine learning model to inform patients of prostate cancer treatments chosen by similar men. Eur Urol 2019;75:901-7.
19. Ershad M, Rege R, Majewicz Fey A. Automatic and near real-time stylistic behavior assessment in robotic surgery. Int J Comput Assist Radiol Surg 2019;14:635-43.
20. Zia A, Guo L, Zhou L, et al. Novel evaluation of surgical activity recognition models using task-based efficiency metrics. Int J Comput Assist Radiol Surg 2019;14:2155-63.