The ACR DSI framework promotes standardization, interoperability, reportability, and patient safety in radiological artificial intelligence development that can help usher in a new era of advanced medicine.
As the hype behind artificial intelligence (AI) gives way to a recognition of the barriers to adoption, radiologists are increasingly intrigued by AI's potential to expedite, augment, and generally improve their interpretation of medical images. To that end, and presumably to pre-empt a Wild West scenario, the ACR has created an AI framework for developers intended to ensure that the resulting tools are reliable, useful, applicable to diverse patient populations, and pose no threat to patients.
A cornerstone of the framework is an initial set of use cases that was distributed to industry several months ago and recently posted on the Data Science Institute (DSI) section of the ACR website for public comment through January 1, 2019. The more than 50 use cases include acute appendicitis, classification of suspicious microcalcifications, TAVR aortic root measurements, scoliosis, and quantification of a variety of left ventricle characteristics.
Software use cases are more than a mere idea; they specify particular data elements, as described in a recent JACR article by Bibb Allen, MD, ACR DSI chief medical officer, and Keith Dreyer, DO, PhD, chief science officer. For AI purposes, the DSI proposes that use cases specify the data elements needed for standardized training, testing, and validation of algorithms, while providing a pathway to clinical deployment and ongoing monitoring. In essence, a use case defines how an AI algorithm will be trained, validated, deployed into clinical workflows, and monitored once in use.
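To make the idea of specified data elements concrete, the sketch below shows one way a use case might be represented in machine-readable form, assuming Python as the implementation language. The field names, DICOM tags, and output measurements are illustrative assumptions loosely modeled on the TAVR example above, not an actual TOUCH-AI specification.

```python
# Illustrative sketch only: these field names are hypothetical and are not taken
# from a published TOUCH-AI document. They show how a use case could pair
# standardized inputs (modality, required DICOM metadata) with the discrete
# output data elements an algorithm is expected to return.
from dataclasses import dataclass, field
from typing import List


@dataclass
class DataElement:
    """A single discrete value the algorithm must produce."""
    name: str          # e.g., "annulus_area"
    value_type: str    # e.g., "float"
    units: str = ""    # e.g., "mm2"


@dataclass
class UseCase:
    """Minimal machine-readable description of an AI use case."""
    title: str
    modality: str                      # DICOM modality, e.g., "CT"
    body_part: str                     # e.g., "CHEST"
    required_dicom_tags: List[str] = field(default_factory=list)
    outputs: List[DataElement] = field(default_factory=list)


# Hypothetical instance for the TAVR aortic root measurement use case named above.
tavr_use_case = UseCase(
    title="TAVR Aortic Root Measurements",
    modality="CT",
    body_part="CHEST",
    required_dicom_tags=["SliceThickness", "PixelSpacing", "ContrastBolusAgent"],
    outputs=[
        DataElement("annulus_area", "float", "mm2"),
        DataElement("annulus_perimeter", "float", "mm"),
    ],
)
print(tavr_use_case.title, [o.name for o in tavr_use_case.outputs])
```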
Subspecialty expert panels, similar to those that inform the ACR's Appropriateness Criteria guidelines, identified and hammered out the use cases with the greatest potential to improve radiological care, while making clear to developers that algorithms must accommodate diverse patient populations and clinical workflows. The use cases include a high degree of clinical detail regarding the anatomical structures and functional measurements involved in each study, as well as the relevant metadata produced by DICOM modalities.
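For developers, one practical consequence is that a study can be screened for the metadata a use case calls for before an algorithm ever runs. The sketch below assumes pydicom and an illustrative tag list; the authoritative requirements live in the published use case itself.

```python
# Sketch only: check that a study exposes the DICOM metadata a use case calls
# for before running an algorithm. The tag list here is illustrative.
import pydicom

REQUIRED_TAGS = ["Modality", "BodyPartExamined", "SliceThickness", "PixelSpacing"]


def meets_metadata_requirements(path: str) -> bool:
    """Return True if the DICOM file contains every required metadata element."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)  # headers only, skip pixel data
    missing = [tag for tag in REQUIRED_TAGS if tag not in ds]
    if missing:
        print(f"Missing DICOM elements: {missing}")
    return not missing


# Example (hypothetical file path):
# meets_metadata_requirements("tavr_series/slice_001.dcm")
```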
In addition to standardized data elements that enable clinical integration with reporting software, imaging modalities, PACS, and EHRs, the use cases specify the data elements needed to automatically populate ACR AI registries, providing the post-market feedback that developers, practices, and the FDA will require.
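Those registry-facing data elements could, in principle, travel as a small structured payload emitted alongside each result. The sketch below is a hypothetical example of such a payload; the schema, field names, and the single agree/disagree feedback flag are assumptions, not the ACR registry's actual interface.

```python
# Hypothetical sketch of packaging an algorithm's discrete outputs for automatic
# submission to a monitoring registry. The payload schema is an assumption and
# does not describe the ACR's actual Assess-AI interface.
import json
from datetime import datetime, timezone


def build_registry_payload(use_case_id: str, algorithm_version: str,
                           results: dict, radiologist_agrees: bool) -> str:
    """Bundle algorithm outputs with the feedback fields a registry would need."""
    payload = {
        "use_case_id": use_case_id,                # e.g., "TAVR-AORTIC-ROOT" (illustrative)
        "algorithm_version": algorithm_version,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "results": results,                        # the discrete output data elements
        "radiologist_agrees": radiologist_agrees,  # post-market feedback signal
    }
    return json.dumps(payload, indent=2)


print(build_registry_payload(
    "TAVR-AORTIC-ROOT", "1.2.0",
    {"annulus_area_mm2": 450.3, "annulus_perimeter_mm": 76.1},
    radiologist_agrees=True,
))
```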
The ACR is already working closely with relevant regulatory agencies. In fact, the FDA has selected the DSI Lung Cancer Screening use case as a demonstration project for the validation and monitoring of algorithms in clinical practice. The ACR is also working with the FDA to align the ACR's algorithm validation service, called Assess-AI, with the FDA review process, so that developers who use the service can expedite approval of new tools.
“The ACR DSI framework promotes standardization, interoperability, reportability, and patient safety in radiological artificial intelligence development that can help usher in a new era of advanced medicine,” said ACR DSI Chief Science Officer Dreyer in a prepared statement.
The feedback from industry has been positive: “Clinical insight from the ACR Data Science Institute validates the impact artificial intelligence can have on radiology,” said John Axerio-Cilies, chief technology officer at Arterys. “The use cases highlight a number of specific opportunities that are supported by value proposition, workflow and dataset details.”
Meanwhile, work on the use cases is ongoing as part of an effort the ACR is calling TOUCH-AI, which stands for Technically Oriented Use Cases for Healthcare-AI. The DSI aims to produce 200 use cases by 2020. To comment on a use case, request panel membership, or otherwise get involved, visit the DSI section of the ACR website.