AutoRADS: Artificial Intelligence for Risk Stratification of Liver and Ovarian Lesions (LI-RADS and O-RADS MRI) from Reports
Redefining Radiology through AI
The Problem: Barriers to Using Current Radiology Reporting and Data Systems
Assessing the probability that a lesion detected on medical imaging is malignant is a crucial step in cancer diagnosis. Accurate communication of malignancy risk by radiologists to oncologists and surgeons is key for guiding cancer management and patient care. Traditional radiology reports can be lengthy and at times fail to clearly convey cancer risk levels.
The American College of Radiology (ACR) Reporting and Data Systems (RADS) provide a standardized way to assess and communicate risk. However, these standardized tools are often underutilized or applied incorrectly due to increasing complexity, reducing their effectiveness.
The Solution: Leveraging AI to better direct management and improve outcomes
This Grand Challenge project aims to optimize artificial intelligence (AI) large language models (LLMs) to automatically, accurately, and consistently apply RADS scores to radiologist descriptions of liver (LI-RADS) and ovarian (O-RADS) lesions via the AutoRADS application.
Project Goal
The goal of this project is to create a minimum viable product (MVP) tool, called AutoRADS, to facilitate the categorization of liver and ovarian lesion scores powered by AI-hybrid methods. The tool aims to achieve a classification accuracy of at least 95%, encourage adoption and usability to build trust in the system, and provide end users with clear, interpretable outputs for optimal explainability and transparency.
The tool will work by extracting text from the current Radiology Information System (Coral) used at UHN, then processing the data through an AI-hybrid method: an LLM classifies the lesion features described in the report, and deterministic logic calculates the corresponding RADS category. The resulting cancer classification score can then be included in the radiologist's report to the patient care team.
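To illustrate the "deterministic logic" half of this hybrid approach, the sketch below implements a simplified version of the ACR LI-RADS v2018 major-feature table for an untreated liver observation (assuming no LR-M or tumor-in-vein features). The feature inputs (arterial phase hyperenhancement, size, washout, capsule, threshold growth) are assumed to have already been extracted from the report text by the LLM stage; the function names and structure are illustrative, not the project's actual implementation.

```python
def lirads_category(nonrim_aphe: bool, size_mm: float,
                    washout: bool = False, capsule: bool = False,
                    threshold_growth: bool = False) -> str:
    """Simplified LI-RADS v2018 category from major features.

    Illustrative only: the real AutoRADS pipeline would apply the full
    ACR algorithm, including LR-M, LR-TIV, and ancillary features.
    """
    # Count the additional major features beyond APHE and size.
    n_features = sum([washout, capsule, threshold_growth])

    if not nonrim_aphe:
        if n_features == 0:
            return "LR-3"
        if n_features == 1:
            return "LR-3" if size_mm < 20 else "LR-4"
        return "LR-4"

    # Nonrim arterial phase hyperenhancement present.
    if size_mm < 10:
        return "LR-3" if n_features == 0 else "LR-4"
    if size_mm < 20:
        if n_features == 0:
            return "LR-3"
        if n_features >= 2:
            return "LR-5"
        # Exactly one feature: LR-5 only if it is washout or
        # threshold growth; enhancing capsule alone gives LR-4.
        return "LR-5" if (washout or threshold_growth) else "LR-4"
    # Size >= 20 mm.
    return "LR-4" if n_features == 0 else "LR-5"
```

Keeping this table in plain code rather than inside the LLM is the point of the hybrid design: the LLM handles the variability of free-text report language, while the category assignment itself stays auditable and deterministic.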
Project Deliverables
Within a 1-year timeline, the project team will work towards completing the following deliverables:
- Model Development: Develop an AI-based hybrid model to auto-calculate liver lesion scores
- Responsible AI Framework Support: Apply the CDI Responsible AI Deployment framework to ensure fairness and equity in the AI solution, and refine the framework itself
- Develop AutoRADS application: Create a functional minimum viable product (MVP) to generate model outputs for liver and ovarian lesion scoring
- Pilot: Conduct a pilot study to evaluate AutoRADS in the clinical setting
Additional Information
AutoRADS is the winner of the 2024-2025 Princess Margaret Grand Challenge. The Grand Challenge offers support from the Cancer Digital Intelligence (CDI) program in the form of front-end and back-end development, data science, design, and project management support. Winning projects are selected based on responsible use of AI or mitigation of AI-adoption risks, alignment with CDI values and priorities, contribution to the PM community, feasibility of completion within the Grand Challenge funding period, and impact.
This project is a collaboration between Dr. Rajesh Bhayana and members of the CDI team. Team members: Dr. Rajesh Bhayana, Kelly Lane, Tran Truong, Benjamin Grant, Clare Mcelcheran, Sharon Narine, Muammar Kabir, Anton Sukhovatkin, Helena Hyams, Adam Badzynski, Dr. Genevieve Bouchard-Fourtier, Dr. Chaya Ganor-Shwaartz, Satheesh Krishna, Masoom Haider, and Yangqing Deng.
Dr. Rajesh Bhayana is a Radiologist at UHN, and the Radiologist Technology Lead at the Joint Department of Medical Imaging (JDMI).
To learn more about Dr. Rajesh Bhayana and his work, click here https://universitymedicalimagingtoronto.ca/people/rajesh-bhayana/
To learn more about RADS, developed by the American College of Radiology, click here https://www.acr.org/Clinical-Resources/Clinical-Tools-and-Reference/Reporting-and-Data-Systems
To learn more about the CDI team, click here https://pmcdi.ca/