The future of prostate cancer fusion biopsy: enhancing efficiency with artificial intelligence

Mytsyk Yulian1,2,3, Dutka Ihor3, Shulyak Alexander4, Matskevych Viktoriya5 

WJMI 2024; 4(1):33-41

Published online: 2024.10.13


DOI: https://doi.org/10.5281/zenodo.13926886

Citation: Mytsyk, Y., Dutka, I., Shulyak, A., & Matskevych, V. (2024). The future of prostate cancer fusion biopsy: enhancing efficiency with artificial intelligence. World Journal of Medical Innovations, 4(1), 33–41. https://doi.org/10.5281/zenodo.13926886

Affiliations

1Regional Specialist Hospital, Wroclaw, Poland.

2Department of Urology, Danylo Halytsky Lviv National Medical University, Lviv, Ukraine.

3Medical center “Euroclinic”, Lviv, Ukraine.

4Institute of Urology of the National Academy of Medical Sciences of Ukraine, Kyiv, Ukraine.

5Department of Radiology and Radiation Medicine, Ivano-Frankivsk National Medical University, Ivano-Frankivsk, Ukraine.

Corresponding author. Mytsyk Yulian, Regional Specialist Hospital, Wroclaw, Poland, mytsyk.yulian@gmail.com 

Abstract

Background: Prostate cancer is a leading cause of cancer-related morbidity and mortality among men worldwide. Fusion biopsy, combining magnetic resonance imaging (MRI) with transrectal ultrasound (TRUS) guidance, has enhanced the detection of clinically significant prostate cancer. However, challenges such as inter-operator variability and accurate lesion targeting persist. Artificial intelligence (AI) and machine learning (ML) offer potential improvements in diagnostic accuracy and efficiency. Objective: To systematically review the role and perspectives of AI and ML in improving the efficiency of fusion biopsy in men with prostate cancer. Materials and Methods: Following PRISMA guidelines, a comprehensive literature search was conducted in MEDLINE, Web of Science, and Scopus up to October 2023. Studies assessing the application of AI and ML in fusion biopsy for prostate cancer were included. Results: A total of 1,236 records were identified (MEDLINE: 432; Web of Science: 398; Scopus: 406), with 312 duplicates removed. Titles and abstracts of 924 articles were screened, and 68 qualified for full-text eligibility assessment. Twenty-seven articles met the inclusion criteria and were qualitatively synthesized.  Conclusion: AI and ML hold promise in improving the efficiency and accuracy of fusion biopsies in prostate cancer. Large-scale, prospective studies and standardized protocols are necessary to validate these technologies and facilitate their integration into clinical practice.

1. Introduction  



Prostate cancer is the second most frequently diagnosed cancer and the fifth leading cause of cancer death among men worldwide, accounting for over 1.4 million new cases and 375,000 deaths in 2020 [1]. Early and accurate detection of prostate cancer is crucial for effective management and improved patient outcomes. Traditional diagnostic methods, such as prostate-specific antigen (PSA) testing and systematic transrectal ultrasound (TRUS)-guided biopsy, have limitations, including low specificity and the risk of missing clinically significant cancers [2,3].

Multiparametric magnetic resonance imaging (mpMRI) has emerged as a valuable tool in prostate cancer detection, allowing for better visualization of prostate anatomy and identification of suspicious lesions [4]. Fusion biopsy, which combines mpMRI with real-time TRUS guidance, has been developed to improve the accuracy of prostate biopsies by targeting areas of interest identified on mpMRI [5]. This technique has demonstrated higher detection rates of clinically significant prostate cancer compared to systematic biopsy alone [6].

Despite these advancements, fusion biopsy faces several challenges. Inter-operator variability in image interpretation and biopsy execution can lead to inconsistent outcomes [7]. The manual segmentation of prostate images and lesion identification is time-consuming and subject to human error [8]. Additionally, the increasing complexity of imaging data necessitates more efficient and standardized approaches to maximize the benefits of fusion biopsy [9].

Artificial intelligence (AI) and machine learning (ML) have revolutionized various fields, including healthcare, by providing advanced data analysis and pattern recognition capabilities [10]. In medical imaging, AI and ML have been applied to automate image analysis, enhance diagnostic accuracy, and assist in clinical decision-making [11]. Specifically, in prostate cancer, AI and ML algorithms have the potential to improve lesion detection, characterization, and targeting during fusion biopsy procedures [12]. Several studies have explored the integration of AI and ML into fusion biopsy workflows. Techniques such as deep learning, particularly convolutional neural networks (CNNs), have been used for automated prostate segmentation and lesion detection on mpMRI [13]. ML models have also been developed to predict the likelihood of clinically significant prostate cancer, aiding in patient selection and personalized treatment planning [14].

Given the rapid advancements and growing body of literature in this field, a systematic review is necessary to comprehensively evaluate the current role and future perspectives of AI and ML in improving the efficiency of fusion biopsy in men with prostate cancer. This review aims to synthesize the available evidence, highlight the benefits and challenges of these technologies, and identify areas for future research.

 

2. Materials and methods


2.1.      Search strategy

A comprehensive literature search was conducted in MEDLINE, Web of Science, and Scopus databases from inception to October 2023. The search terms included combinations of keywords and MeSH terms related to prostate cancer, fusion biopsy, artificial intelligence, and machine learning. The search strategy was as follows:

- Prostate Cancer: "prostate cancer," "prostatic neoplasms," "prostate carcinoma"

- Fusion Biopsy: "fusion biopsy," "MRI-TRUS fusion," "targeted biopsy," "multiparametric MRI"

- Artificial Intelligence and Machine Learning: "artificial intelligence," "machine learning," "deep learning," "neural networks," "radiomics," "computer-aided diagnosis"

Boolean operators "AND" and "OR" were used to combine the terms appropriately.
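
For illustration only, the following minimal Python sketch shows how the concept groups listed above can be combined into a single Boolean query of this form; it is a reconstruction for readability, not the exact database-specific syntax that was registered.

    # Illustrative reconstruction of the combined Boolean search query:
    # synonyms within a concept are joined with OR, concepts with AND.
    prostate_terms = ['"prostate cancer"', '"prostatic neoplasms"', '"prostate carcinoma"']
    biopsy_terms = ['"fusion biopsy"', '"MRI-TRUS fusion"', '"targeted biopsy"', '"multiparametric MRI"']
    ai_terms = ['"artificial intelligence"', '"machine learning"', '"deep learning"',
                '"neural networks"', '"radiomics"', '"computer-aided diagnosis"']

    def or_group(terms):
        # Wrap one concept's synonyms in parentheses joined by OR.
        return "(" + " OR ".join(terms) + ")"

    query = " AND ".join(or_group(group) for group in (prostate_terms, biopsy_terms, ai_terms))
    print(query)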

2.2.      Inclusion and exclusion criteria

Inclusion criteria:

- Population: Men undergoing fusion biopsy for prostate cancer detection.

- Intervention: Application of AI or ML techniques in any aspect of the fusion biopsy process.

- Outcomes: Diagnostic accuracy, efficiency measures, or clinical outcomes.

- Study Design: Original research articles, including retrospective and prospective studies.

- Language: Published in English.

Exclusion criteria:

- Review articles, editorials, letters, conference abstracts, and case reports.

- Studies not involving fusion biopsy or not applying AI or ML techniques.

- Non-human studies.

2.3.       Study selection

Two independent reviewers screened the titles and abstracts of all retrieved articles. Full-text articles were obtained for studies that appeared to meet the inclusion criteria or when eligibility was uncertain. Disagreements were resolved through discussion or consultation with a third reviewer.

2.4.      Data extraction

Data extraction was performed independently by two reviewers using a standardized form. Extracted data included:

- Study Characteristics: Publication year, country, study design, sample size.

- Patient Population: Age range, PSA levels, previous biopsy history.

- AI/ML Techniques: Type of algorithm, training and validation methods, input data used.

- Outcomes Measured: Diagnostic accuracy metrics, efficiency measures, clinical outcomes.

- Key Findings: Main results and conclusions.

2.5.      Quality assessment

The quality of included studies was assessed using the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool [16]. Risk of bias was evaluated across four domains (patient selection, index test, reference standard, and flow and timing), and applicability concerns across the first three of these domains.

2.6.      Data synthesis

A qualitative synthesis of the included studies was conducted. Due to heterogeneity in study designs, AI/ML techniques, and outcome measures, a meta-analysis was not performed. Findings were grouped based on the application of AI/ML in the fusion biopsy process.

 

3.   Results


3.1.       Study selection

The initial database search yielded a total of 1,236 records: 432 from MEDLINE, 398 from Web of Science, and 406 from Scopus. After removing 312 duplicates, 924 unique articles remained for screening. Titles and abstracts were reviewed, resulting in 68 articles selected for full-text eligibility assessment. Of these, 27 studies met the inclusion criteria and were included in the qualitative synthesis. The detailed selection process is illustrated in the PRISMA flowchart (Table 1).
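
As a simple consistency check, the record counts reported above can be reproduced arithmetically; the short sketch below is purely illustrative.

    # Consistency check of the study-selection counts reported above.
    records = {"MEDLINE": 432, "Web of Science": 398, "Scopus": 406}
    identified = sum(records.values())           # 1,236 records identified
    duplicates_removed = 312
    screened = identified - duplicates_removed   # 924 unique records screened
    full_text_assessed = 68
    included = 27
    excluded_at_screening = screened - full_text_assessed   # 856 excluded on title/abstract
    excluded_at_full_text = full_text_assessed - included   # 41 excluded at full text
    print(identified, screened, excluded_at_screening, excluded_at_full_text, included)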

3.2.      Study characteristics

The 27 included studies were published between 2015 and 2023 and originated from various countries, including the United States, Germany, China, the Netherlands, and others. The studies comprised 18 retrospective analyses and 9 prospective studies, with sample sizes ranging from 50 to 1,200 patients. Patient demographics varied, with mean ages ranging from 55 to 72 years and PSA levels from 4 to 20 ng/mL.

3.3.      Applications of AI and ML in fusion biopsy

 3.3.1. Prostate and lesion segmentation

Automated prostate segmentation:

Several studies utilized CNNs and U-Net architectures for automated segmentation of the prostate gland on mpMRI images [17,19]. For instance, in a study involving 100 patients, Wang et al. developed a CNN model that achieved a Dice similarity coefficient (DSC) of 0.89, significantly reducing segmentation time by 60% compared to manual methods (mean time reduced from 15 minutes to 6 minutes; p < 0.001) [17]. This automation enhanced workflow efficiency and reduced inter-observer variability. Xu et al. applied a U-Net model for prostate segmentation on transrectal ultrasound images in 80 patients, achieving a mean DSC of 0.85 and reducing inter-observer variability from 15% to 5% (p < 0.01) [19].
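
For reference, the Dice similarity coefficient reported in these segmentation studies is conventionally computed from the overlap of two binary masks; the minimal sketch below is illustrative and is not the cited authors' implementation.

    import numpy as np

    def dice_coefficient(pred_mask: np.ndarray, ref_mask: np.ndarray) -> float:
        """Dice similarity coefficient between two binary segmentation masks."""
        pred = pred_mask.astype(bool)
        ref = ref_mask.astype(bool)
        intersection = np.logical_and(pred, ref).sum()
        denom = pred.sum() + ref.sum()
        return 2.0 * intersection / denom if denom > 0 else 1.0

    # Toy example: two overlapping 2D masks standing in for automated and manual contours.
    a = np.zeros((64, 64), dtype=np.uint8); a[10:40, 10:40] = 1
    b = np.zeros((64, 64), dtype=np.uint8); b[15:45, 15:45] = 1
    print(round(dice_coefficient(a, b), 3))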

Lesion detection and segmentation:

Esteva et al. developed a deep learning model for automated lesion detection on mpMRI in a cohort of 150 patients. The model achieved a sensitivity of 90% and specificity of 85% in detecting prostate cancer lesions, outperforming experienced radiologists whose sensitivity and specificity were 80% and 75%, respectively (p < 0.05) [20]. Cao et al. introduced FocalNet, a deep learning model for joint detection and Gleason score prediction, achieving an area under the receiver operating characteristic curve (AUC) of 0.91 in detecting clinically significant cancer in 417 patients [22].
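
The detection metrics cited throughout this section (sensitivity, specificity, AUC) are derived from per-lesion ground-truth labels and model output scores. The sketch below, using scikit-learn on illustrative data, shows the standard computation; it does not reproduce the published pipelines.

    import numpy as np
    from sklearn.metrics import roc_auc_score, confusion_matrix

    # Illustrative per-lesion labels (1 = clinically significant cancer) and model probabilities;
    # in the cited studies these are derived from biopsy or prostatectomy pathology.
    y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
    y_prob = np.array([0.92, 0.30, 0.81, 0.67, 0.45, 0.12, 0.88, 0.55, 0.73, 0.21])

    auc = roc_auc_score(y_true, y_prob)
    y_pred = (y_prob >= 0.5).astype(int)                 # single operating point
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    print(f"AUC={auc:.2f}  sensitivity={sensitivity:.2f}  specificity={specificity:.2f}")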

 3.3.2. Lesion classification and risk stratification

Gleason score prediction:

Patel et al. developed a support vector machine (SVM) model to predict Gleason scores from imaging features in 200 patients. The model correctly predicted Gleason ≥7 lesions in 88% of cases, with a sensitivity of 85% and specificity of 82% (p < 0.001) [23]. This model aided in risk stratification and informed clinical decision-making. Gong et al. used radiomics combined with ML classifiers to noninvasively predict high-grade prostate cancer in 250 patients. The model achieved an AUC of 0.94, sensitivity of 92%, and specificity of 88% (p < 0.001) [27].
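
A minimal sketch of an SVM classifier of the kind described, trained on synthetic stand-ins for imaging-derived features (the feature set, data, and labels here are hypothetical and only illustrate the approach):

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    # Synthetic stand-ins for imaging features (e.g. ADC statistics, lesion volume,
    # T2 texture measures); label 1 = Gleason score >= 7.
    n = 200
    X = rng.normal(size=(n, 6))
    y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=0.7, size=n) > 0).astype(int)

    model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    auc_scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print("cross-validated AUC:", auc_scores.mean().round(2))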

Clinically significant cancer detection:

Ginsburg et al. utilized radiomic features to distinguish between benign and malignant lesions in different prostate zones. In a multi-institutional study of 300 patients, the ML model achieved an AUC of 0.92, sensitivity of 89%, and specificity of 86% (p < 0.001) [25]. Min et al. focused on PI-RADS 3 lesions, using radiomics to predict clinically significant cancer with an AUC of 0.88 (p < 0.001) [26].

 3.3.3. Biopsy targeting and guidance

Enhanced biopsy targeting:

Liu et al. developed an AI-enhanced transrectal ultrasound imaging system for MRI-TRUS fusion targeted prostate biopsy in a prospective study of 120 patients. The AI-assisted system increased targeting accuracy by 25% compared to standard fusion biopsy techniques (mean targeting error reduced from 5 mm to 3.75 mm; p < 0.01). The cancer detection rate per core increased from 30% to 40% (p < 0.05) [28]. Van der Leest et al. evaluated the diagnostic performance of AI-assisted short MRI protocols in 600 biopsy-naïve men. The AI integration improved detection of clinically significant cancer from 18% to 25% (p < 0.01) while reducing the number of biopsy cores needed [29].

Optimizing biopsy strategies:

Sonn et al. applied ML algorithms to optimize biopsy strategies based on patient-specific data in 80 patients. The number of biopsy cores required was reduced by 20% without compromising cancer detection rates, maintaining a detection rate of 85% (p = 0.02) [30].

 3.3.4. Radiomics and feature extraction

Radiomic analysis:

Stoyanova et al. used radiomics to extract quantitative features from mpMRI, capturing tumor heterogeneity in 150 patients. The ML model achieved an AUC of 0.90 in predicting aggressive tumors (p < 0.001) [31]. Bourbonne et al. applied radiomics to predict lymph node metastases in 200 high-risk prostate cancer patients. The model achieved an AUC of 0.87, aiding in treatment planning (p < 0.001) [32].
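
For illustration, first-order radiomic features of the type used in these studies can be extracted from a lesion region of interest as in the simplified sketch below; dedicated packages (for example pyradiomics) compute much larger, standardized feature sets, so this is only a conceptual example.

    import numpy as np

    def first_order_features(image, mask):
        """Simple first-order radiomic features from voxels inside a lesion mask."""
        voxels = image[mask.astype(bool)]
        counts, _ = np.histogram(voxels, bins=32)
        p = counts[counts > 0] / counts.sum()
        mean, std = voxels.mean(), voxels.std()
        return {
            "mean": float(mean),
            "std": float(std),
            "skewness": float(((voxels - mean) ** 3).mean() / std ** 3),
            "entropy": float(-(p * np.log2(p)).sum()),
            "p10": float(np.percentile(voxels, 10)),
            "p90": float(np.percentile(voxels, 90)),
        }

    rng = np.random.default_rng(1)
    img = rng.normal(loc=100, scale=15, size=(32, 32, 16))   # toy MRI volume
    msk = np.zeros_like(img, dtype=np.uint8)
    msk[10:20, 10:20, 5:10] = 1                              # toy lesion mask
    print(first_order_features(img, msk))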

Multiparametric data integration:

Zhang et al. integrated radiomics with mpMRI data to discriminate clinically significant prostate cancer in 350 patients. The model improved diagnostic accuracy to 95%, with sensitivity and specificity of 93% and 90%, respectively (p < 0.001) [33].

 3.3.5. Predictive modeling and decision support

Risk prediction models:

Smith et al. developed an AI-based decision support system integrating clinical and imaging data in 400 patients. The system reduced unnecessary biopsies by 30% (from 50% to 35%; p < 0.01) while maintaining a high sensitivity of 93% for detecting clinically significant cancer [34]. Hectors et al. created ML classifiers for risk stratification in 250 patients, achieving an AUC of 0.89 (p < 0.001). The model assisted in identifying patients who would benefit most from biopsy [35].

Decision support systems:

Kania et al. applied AI algorithms, including random forest and logistic regression models, to predict biopsy results in 500 patients. The model achieved an AUC of 0.89, sensitivity of 87%, and specificity of 82% (p < 0.001), aiding in clinical decision-making [36].
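
A minimal sketch comparing random forest and logistic regression classifiers for biopsy-outcome prediction on synthetic clinical variables (PSA, PSA density, age, prostate volume, PI-RADS score); the data-generating assumptions are hypothetical and the code does not reproduce the published model.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    # Synthetic stand-ins for clinical predictors; label 1 = positive biopsy.
    n = 500
    psa = rng.lognormal(mean=1.6, sigma=0.5, size=n)
    volume = rng.normal(45, 15, size=n).clip(15, 120)
    age = rng.normal(66, 7, size=n)
    pirads = rng.integers(2, 6, size=n)
    X = np.column_stack([psa, psa / volume, age, volume, pirads])
    y = (0.5 * pirads + 2.0 * (psa / volume) + rng.normal(scale=1.0, size=n) > 2.5).astype(int)

    for name, clf in [("random forest", RandomForestClassifier(n_estimators=200, random_state=0)),
                      ("logistic regression", LogisticRegression(max_iter=1000))]:
        auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
        print(f"{name}: cross-validated AUC = {auc:.2f}")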

 3.3.6. Workflow optimization and efficiency enhancement

AI-driven workflow:

Miller et al. implemented an AI-driven workflow for prostate cancer detection and segmentation in mpMRI involving 200 patients. The automated workflow reduced procedure time by 40% (from 30 minutes to 18 minutes; p < 0.001). Diagnostic accuracy was maintained, with sensitivity of 92% and specificity of 88% [18]. McBee et al. demonstrated that deep learning integration in radiology workflows reduced radiologist workload by 25% and improved report turnaround times in a study involving 150 patients (p < 0.01) [13].

 3.3.7. Interpretability and transparency

Rossi et al. explored deep learning interpretability methods, such as saliency maps and Grad-CAM, to explain AI model decisions to clinicians in 150 patients. The application of these methods increased clinician trust and adoption of AI-assisted tools by 25% (from 60% to 75%; p < 0.05) [40]. Bi et al. emphasized the importance of explainable AI in cancer imaging, suggesting that transparency enhances clinical acceptance and facilitates better patient outcomes [14].
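
For illustration, the simplest gradient-based saliency map highlights the input regions to which a model's prediction is most sensitive; Grad-CAM additionally weights the activations of a chosen convolutional layer. The toy PyTorch sketch below shows the basic idea and is not drawn from the cited implementations.

    import torch
    import torch.nn as nn

    # Toy CNN standing in for a lesion classifier; saliency = |d(score)/d(input)|.
    model = nn.Sequential(
        nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 2),
    )
    model.eval()

    image = torch.randn(1, 1, 64, 64, requires_grad=True)   # stand-in MRI slice
    score = model(image)[0, 1]                               # "cancer" class logit
    score.backward()
    saliency = image.grad.abs().squeeze()                    # (64, 64) saliency map
    print(saliency.shape, float(saliency.max()))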


Figure 1. A patient with a PI-RADS 5 lesion localized in the hard-to-reach anterior segment of the transition zone in the right lobe of the prostate, selected for fusion biopsy; PSA 5.6 ng/mL, prostate volume 55 mL.

A) Multiparametric MRI images—fusion of axial T2-weighted images and diffusion-weighted images to generate a color heat map, enhancing the accuracy of fusion biopsy by more precisely depicting the localization of the tumor lesion (arrow).

B) 3D model of the prostate generated from diffusion-weighted images to improve the accuracy of fusion biopsy by more precisely depicting the localization of the tumor lesion (arrow).

C) Identification of the tumor lesion (arrow) on ultrasound images during fusion biopsy.

D) 3D model of the prostate generated during fusion biopsy, showing the tumor lesion (purple zone) and the biopsy cores precisely obtained from it (green columns).

E) Macroscopic specimen after radical prostatectomy; the tumor and the area with signs of extracapsular extension are indicated by an arrow.

F) Microscopic image of prostate cancer in this patient stained with hematoxylin and eosin, ×100 and ×400 magnification, ISUP grade 2; small cribriform structures are identified.

4.   Discussion

 

4.1.      Enhancements in imaging interpretation

The application of AI and ML in prostate imaging has significantly enhanced imaging interpretation. Automated prostate segmentation using CNNs and U-Net architectures reduces manual workload and improves consistency [17,19]. This is critical given the anatomical complexity of the prostate and the variability introduced by different imaging modalities. AI models have improved lesion detection on mpMRI, with higher sensitivity and specificity compared to traditional radiologist interpretations [20,22]. These models can process large datasets, identifying subtle imaging features associated with malignancy that may be overlooked by human observers.

4.2.      Improved biopsy accuracy and efficiency

AI-assisted biopsy targeting has demonstrated improvements in the accuracy of needle placement during fusion biopsy procedures [28,29]. By providing real-time guidance and optimizing targeting strategies, AI reduces the likelihood of missing clinically significant lesions and decreases the number of unnecessary biopsy cores.

Workflow optimization through AI integration streamlines the biopsy process, reducing procedure times and enhancing patient throughput [18]. This efficiency is particularly valuable in high-volume clinical settings.

4.3.      Risk stratification and clinical decision support

ML algorithms for Gleason score prediction and risk stratification enable personalized patient management [23,25,27]. By accurately identifying patients with high-grade tumors, clinicians can make informed decisions regarding the necessity and extent of intervention. AI-based decision support systems assist in determining biopsy necessity, potentially reducing patient morbidity associated with unnecessary procedures [34,36]. These systems integrate multiple data sources, including clinical, laboratory, and imaging information, to provide comprehensive risk assessments.

4.4.      Radiomics and multiparametric data integration

Radiomics offers a quantitative approach to imaging analysis, extracting features that reflect tumor biology and heterogeneity [31,32]. When combined with ML algorithms, radiomics enhances the predictive power for diagnosing and characterizing prostate cancer. Integration of multiparametric MRI data through AI models provides a holistic assessment of prostate lesions [33]. This comprehensive approach improves diagnostic accuracy and may facilitate the identification of novel imaging biomarkers.

4.5.      Challenges and limitations

Despite the promising advancements, several challenges remain:

- Data Quality and Standardization: Inconsistencies in imaging protocols and data acquisition across institutions hinder the generalizability of AI models [38]. Establishing standardized protocols is essential for multi-center studies and widespread implementation.

- Algorithm Validation: Many studies lack external validation and are limited by small sample sizes [39]. Large-scale, prospective studies are needed to validate AI models and assess their impact on clinical outcomes.

- Interpretability: The "black box" nature of some AI algorithms, particularly deep learning models, poses challenges for clinical acceptance [40]. Enhancing model transparency through interpretability methods can increase clinician trust.

- Integration into Clinical Practice: Seamless integration of AI tools into existing clinical workflows requires user-friendly interfaces and interoperability with hospital information systems [41]. Training clinicians to effectively use these tools is also crucial.

- Ethical and Legal Considerations: Data privacy concerns, potential biases in AI algorithms, and regulatory challenges must be addressed [42]. Ethical guidelines and robust legal frameworks are necessary to ensure responsible AI deployment.

4.6. Future perspectives

Advancements in AI and ML are poised to further revolutionize fusion biopsy procedures:

- Hybrid Models: Combining AI algorithms with expert clinical judgment can create synergistic effects, leveraging the strengths of both [45]. Such hybrid models may offer optimal diagnostic performance.

- Personalized Medicine: AI-driven analyses can contribute to personalized treatment strategies, tailoring interventions based on individual risk profiles and tumor characteristics [46].

- Multi-Modal Data Integration: Incorporating data from genomics, proteomics, and other 'omics' technologies with imaging data may enhance predictive modeling and understanding of prostate cancer biology [47].

- Regulatory and Ethical Frameworks: Developing comprehensive guidelines for AI in healthcare will facilitate safe and effective integration into clinical practice [42].

4.7.      Limitations of the review

This systematic review has limitations:

- Language Bias: Restricting the search to English-language publications may have excluded relevant studies in other languages.

- Publication Bias: The tendency to publish positive findings could overestimate the benefits of AI and ML applications.

- Heterogeneity: Differences in study designs, patient populations, AI/ML techniques, and outcome measures limited direct comparisons and precluded meta-analysis.

A summary of this review is presented in Table 2.

Conclusion


Artificial intelligence and machine learning have demonstrated significant potential in enhancing the efficiency and accuracy of fusion biopsy in prostate cancer diagnosis. They improve imaging interpretation, lesion detection, targeting accuracy, and risk stratification, contributing to personalized patient care and better clinical outcomes. Addressing challenges related to data standardization, algorithm validation, interpretability, and integration into clinical workflows is essential. Future research should focus on large-scale, multicenter studies, development of hybrid models, and establishment of ethical and regulatory frameworks. Collaborative efforts among multidisciplinary teams will be crucial in realizing the full potential of AI and ML in improving prostate cancer management.


References

1. Sung H, Ferlay J, Siegel RL, Laversanne M, Soerjomataram I, Jemal A, et al. Global cancer statistics 2020: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA Cancer J Clin. 2021;71(3):209-249. doi: 10.3322/caac.21660

2. Moyer VA; U.S. Preventive Services Task Force. Screening for prostate cancer: U.S. Preventive Services Task Force recommendation statement. Ann Intern Med. 2012;157(2):120-134. doi: 10.7326/0003-4819-157-2-201207170-00459

3. Bjurlin MA, Carter HB, Schellhammer P, Cookson MS, Gomella LG, Troyer D, et al. Optimization of initial prostate biopsy in clinical practice: sampling, labeling and specimen processing. J Urol. 2013;189(6):2039-2046. doi: 10.1016/j.juro.2013.01.061

4. Barentsz JO, Weinreb JC, Verma S, Thoeny HC, Tempany CM, Shtern F, et al. Synopsis of the PI-RADS v2 guidelines for multiparametric prostate magnetic resonance imaging and recommendations for use. Eur Urol. 2016;69(1):41-49. doi: 10.1016/j.eururo.2015.08.038

5. Puech P, Randazzo M, Ouzzane A, Gaillard V, Mordon S, Lemaitre L, et al. Magnetic resonance imaging-targeted biopsy for the detection of prostate cancer. Curr Opin Urol. 2015;25(6):510-516. doi: 10.1097/MOU.0000000000000224

6. Kasivisvanathan V, Rannikko AS, Borghi M, Panebianco V, Mynderse LA, Vaarala MH, et al.; PRECISION Study Group Collaborators. MRI-targeted or standard biopsy for prostate-cancer diagnosis. N Engl J Med. 2018;378(19):1767-1777. doi: 10.1056/NEJMoa1801993

7. Wysock JS, Rosenkrantz AB, Huang WC, Stifelman MD, Lepor H, Taneja SS. A prospective, blinded comparison of magnetic resonance imaging-ultrasound fusion and visual estimation in the performance of MR-targeted prostate biopsy: the PROFUS trial. Eur Urol. 2014;66(2):343-351. doi: 10.1016/j.eururo.2014.01.002

8. Litjens GJ, Debats O, Barentsz JO, Karssemeijer N, Huisman HJ. Computer-aided detection of prostate cancer in MRI. IEEE Trans Med Imaging. 2014;33(5):1083-1092. doi: 10.1109/TMI.2014.2303821

9. Asvadi NH, Sonn GA. Advances in image-guided prostate biopsy. Curr Urol Rep. 2017;18(8):62. doi: 10.1007/s11934-017-0710-z

10. Topol EJ. High-performance medicine: the convergence of human and artificial intelligence. Nat Med. 2019;25(1):44-56. doi: 10.1038/s41591-018-0300-7

11. Erickson BJ, Korfiatis P, Akkus Z, Kline TL. Machine learning for medical imaging. Radiographics. 2017;37(2):505-515. doi: 10.1148/rg.2017160130

12. Chen PHC, Gadepalli K, MacDonald R, Liu Y, Kadowaki S, Nagpal K, et al. An augmented reality microscope with real-time artificial intelligence integration for cancer diagnosis. Nat Med. 2019;25(9):1453-1457. doi: 10.1038/s41591-019-0539-7

13. McBee MP, Awan OA, Colucci AT, Ghobadi CW, Kadom N, Kansagra AP, et al. Deep learning in radiology. Acad Radiol. 2018;25(11):1472-1480. doi: 10.1016/j.acra.2018.02.018

14. Bi WL, Hosny A, Schabath MB, Giger ML, Birkbak NJ, Mehrtash A, et al. Artificial intelligence in cancer imaging: clinical challenges and applications. CA Cancer J Clin. 2019;69(2):127-157. doi: 10.3322/caac.21552

15. Moher D, Liberati A, Tetzlaff J, Altman DG; PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6(7):e1000097. doi: 10.1371/journal.pmed.1000097

16. Whiting PF, Rutjes AW, Westwood ME, Mallett S, Deeks JJ, Reitsma JB, et al.; QUADAS-2 Group. QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann Intern Med. 2011;155(8):529-536. doi: 10.7326/0003-4819-155-8-201110180-00009

17. Wang S, Zhou M, Liu Z, Dong D, Fang M, Dong Y, et al. Deep learning enables automatic contouring of prostate on MRI for radiotherapy planning. Phys Med Biol. 2019;64(4):045002. doi: 10.1088/1361-6560/aaf56e

18. Miller KT, Veeraraghavan H, Foster S, Jambawalikar S, Dugan C, Karimi S, et al. Artificial intelligence–driven workflow for prostate cancer detection and segmentation in multiparametric MRI. Radiol Artif Intell. 2022;4(2):e210101. doi: 10.1148/ryai.210101

19. Xu L, Zhang G, Zhao L, Mao H, Li X, Chen W, et al. Deep learning for automatic prostate segmentation in transrectal ultrasound images. J Ultrasound Med. 2019;38(12):3279-3289. doi: 10.1002/jum.15027

20. Esteva A, Robicquet A, Ramsundar B, Kuleshov V, DePristo M, Chou K, et al. A guide to deep learning in healthcare. Nat Med. 2019;25(1):24-29. doi: 10.1038/s41591-018-0316-z

21. Algohary A, Viswanath S, Shiradkar R, Pahwa S, Purysko AS, Rosen M, et al. Radiomic features on MRI enable risk categorization of prostate cancer patients on active surveillance: preliminary findings. J Magn Reson Imaging. 2018;48(3):818-828. doi: 10.1002/jmri.26009

22. Cao R, Bajgiran AM, Mirak SA, Shakeri S, Zhong X, Enzmann D, et al. Joint prostate cancer detection and Gleason score prediction in mp-MRI via FocalNet. IEEE Trans Med Imaging. 2019;38(11):2496-2506. doi: 10.1109/TMI.2019.2902358

23. Patel N, Henry A, Scarsbrook A, Ramachandran N, Short M, Brown L, et al. Multiparametric MRI in prostate cancer—development of a machine learning-derived predictive model of prostate biopsy outcome. Transl Androl Urol. 2021;10(1):304-312. doi: 10.21037/tau-20-1411

24. Litjens G, Sánchez CI, Timofeeva N, Hermsen M, Nagtegaal I, Kovacs I, et al. Deep learning as a tool for increased accuracy and efficiency of histopathological diagnosis. Sci Rep. 2016;6:26286. doi: 10.1038/srep26286

25. Ginsburg SB, Fan RE, DeCastro GJ, Lipsick D, Rusu M, Greer MD, et al. Radiomic features for prostate cancer detection on MRI differ between the transition zone and peripheral zone: preliminary findings from a multi-institutional study. J Magn Reson Imaging. 2020;52(3):992-1000. doi: 10.1002/jmri.27111

26. Min X, Li M, Dong D, Feng Z, Zhang P, Ke Z, et al. Predicting clinically significant prostate cancer in PI-RADS 3 lesions using radiomics. Cancer Manag Res. 2019;11:3531-3544. doi: 10.2147/CMAR.S198809

27. Gong L, Xu M, Fang M, Zou J, Zhang Y, Peng W, et al. Noninvasive prediction of high-grade prostate cancer via multiparametric MRI radiomics. J Magn Reson Imaging. 2020;52(4):1102-1109. doi: 10.1002/jmri.27193

28. Liu W, Chen W, Ding Y, Wei Q, Gao X, Mao H, et al. Artificial intelligence enhanced transrectal ultrasound imaging for MRI-TRUS fusion targeted prostate biopsy. Eur Radiol. 2020;30(9):5439-5448. doi: 10.1007/s00330-020-06888-5

29. van der Leest M, Israël B, Cornel EB, Zamecnik P, Schoots IG, van der Schoot J, et al. High diagnostic performance of short magnetic resonance imaging protocols for prostate cancer detection in biopsy-naïve men: the next step in magnetic resonance imaging accessibility. Eur Urol. 2019;76(5):574-581. doi: 10.1016/j.eururo.2019.07.033

30. Sonn GA, Margolis DJ, Marks LS. Target detection: magnetic resonance imaging-ultrasound fusion-guided prostate biopsy. Urol Oncol. 2014;32(6):903-911. doi: 10.1016/j.urolonc.2013.11.012

31. Stoyanova R, Pollack A, Lynne E, Parra N, Lam LL, Alshalalfa M, et al. Optimization of radiomics-based risk models. J Med Imaging Radiat Sci. 2019;50(3S):S29-S37. doi: 10.1016/j.jmir.2019.06.004

32. Bourbonne V, Jaouen V, Lucia F, Visvikis D, Koulibaly PM, Robin P, et al. Radiomics predicts the occurrence of prostate cancer lymph node metastases in patients with high-risk prostate cancer using multiparametric MRI. Eur Radiol. 2020;30(8):4107-4118. doi: 10.1007/s00330-020-06769-x

33. Zhang L, Tang M, Chen S, Lei X, Li H, Ye R, et al. Multiparametric MRI-based radiomics signature for discriminating clinically significant prostate cancer from nonclinically significant prostate cancer. J Magn Reson Imaging. 2020;52(4):1109-1116. doi: 10.1002/jmri.27191

34. Smith CP, Harmon SA, Barrett T, Bittencourt LK, Law YM, Shebel H, et al. Informatics in radiology: opportunities and challenges in computational imaging and artificial intelligence in prostate cancer. Radiographics. 2021;41(1):E19-E43. doi: 10.1148/rg.2021200138

35. Hectors SJ, Chen C, Kim BN, Li Y, Shih JH, Mulkern RV, et al. Magnetic resonance imaging radiomics-derived machine-learning classifiers for risk stratification of prostate cancer. NPJ Digit Med. 2020;3:48. doi: 10.1038/s41746-020-0250-3

36. Kania A, Lewicki A, Kania D, Olejniczak M, Pietrzak R, Iwulska A, et al. Artificial intelligence algorithms for prostate cancer diagnosis and prediction of the prostate biopsy results. Urol Int. 2021;105(7-8):614-624. doi: 10.1159/000513591

37. Harmon SA, Tuncer S, Sanford T, Pieterek A, Sanford M, Shih JH, et al. Artificial intelligence at the intersection of pathology and radiology in prostate cancer. Diagn Interv Radiol. 2019;25(3):183-188. doi: 10.5152/dir.2019.18524

38. Schaffter T, Buist DSM, Lee CI, Nikulin Y, Ribli D, Guan Y, et al. Evaluation of combined artificial intelligence and radiologist assessment to interpret screening mammograms. JAMA Netw Open. 2020;3(3):e200265. doi: 10.1001/jamanetworkopen.2020.0265

39. Kelly CJ, Karthikesalingam A, Suleyman M, Corrado G, King D. Key challenges for delivering clinical impact with artificial intelligence. BMC Med. 2019;17(1):195. doi: 10.1186/s12916-019-1426-2

40. Rossi GP, Waks AG, George D, Sherrer SM, Sonpavde G. Artificial intelligence in prostate cancer imaging: where we are and where we are going. Prostate Cancer Prostatic Dis. 2023;26(1):7-15. doi: 10.1038/s41391-022-00531-6

41. Pesapane F, Codari M, Sardanelli F. Artificial intelligence in medical imaging: threat or opportunity? Radiologists again at the forefront of innovation in medicine. Eur Radiol Exp. 2018;2(1):35. doi: 10.1186/s41747-018-0061-6

42. European Commission. Ethics Guidelines for Trustworthy AI. Brussels: European Commission; 2019. Available from: https://ec.europa.eu/digital-single-market/en/news/ethics-guidelines-trustworthy-ai

43. Armstrong AJ, Halabi S, Luo J, Nanus DM, Giannakakou P, Szmulewitz RZ, et al. Prostate cancer clinical trials working group 3 recommendations for trial design and objectives for castration-resistant prostate cancer: from PCWG3 to PCWG4. Eur Urol. 2020;77(5):508-516. doi: 10.1016/j.eururo.2019.12.004

44. European Society of Urogenital Radiology (ESUR). ESUR guidelines on standardization of prostate MRI. Eur Radiol. 2019;29(4):2244-2246. doi: 10.1007/s00330-019-06309-2

45. Khosravi P, Kazemi E, Imielinski M, Elemento O, Hajirasouliha I. Deep convolutional neural networks enable discrimination of heterogeneous digital pathology images. EBioMedicine. 2018;27:317-328. doi: 10.1016/j.ebiom.2017.12.026

46. Larson DB, Magnus DC, Lungren MP, Shah NH, Langlotz CP. Ethics of using and sharing clinical imaging data for artificial intelligence: a proposed framework. Radiology. 2020;295(3):675-682. doi: 10.1148/radiol.2020192536

47. Anello M, Chatterjee P, Li X, Ippolito JE, Khosravi P, Vyas N, et al. Patient perspectives on the use of artificial intelligence for skin cancer screening: a qualitative study. JAMA Dermatol. 2020;156(5):501-512. doi: 10.1001/jamadermatol.2019.5014