AI Takes on Radiology

Medical imaging has progressed from a film-based world to a digital one. Major imaging advances, including MRI, CT and ultrasound, have driven this evolution.

Clinicians demanded a way to move digitally acquired images to a central archive and display them from there. For instance, imagine a facility with two different CT scanners; the two devices acquire, store, display and archive studies independently. A study acquired on scanner number one can be neither viewed nor stored on scanner number two, and there is no way to send scans to a centralized system where they can be archived and viewed from any machine.

The initial systems used commercially available network technology such as Ethernet. The first picture archiving and communication systems (PACS) mimicked film alternators (devices for viewing multiple films), and their ability to display and manipulate images was limited by the image-manipulation applications and technical requirements of the time. Today, these systems rely on either a radiology information system or an electronic health record and use commercial components for processing, displaying and storing images.
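
Today the send-to-a-central-archive workflow is standardized around the DICOM protocol. As a rough illustration, a scanner pushing a single CT slice to a central archive might look like the sketch below, which assumes the pydicom/pynetdicom libraries and uses a hypothetical host, port and application entity titles:

```python
from pydicom import dcmread
from pynetdicom import AE
from pynetdicom.sop_class import CTImageStorage

# The scanner acts as the sending application entity (AE).
ae = AE(ae_title="CT_SCANNER_1")
ae.add_requested_context(CTImageStorage)

# Open an association with the central archive and push one CT slice.
assoc = ae.associate("pacs.example.local", 11112, ae_title="CENTRAL_ARCHIVE")
if assoc.is_established:
    status = assoc.send_c_store(dcmread("ct_slice_001.dcm"))
    print("Archive responded with status 0x%04x" % status.Status)
    assoc.release()
```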

The sheer volume of imaging data can saturate internal IT infrastructure. An alternative is storing medical images in the cloud, which can serve as primary storage or as backup storage when internal capacity is exceeded. (http://searchhealthit.techtarget.com/feature/What-makes-up-a-medical-imaging-system)
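
A minimal sketch of the overflow-to-cloud idea, assuming AWS S3 via boto3; the archive path, bucket name and free-space threshold are all hypothetical:

```python
import shutil
from pathlib import Path

import boto3  # assumes AWS S3; other object stores follow the same pattern

LOCAL_ARCHIVE = Path("/pacs/archive")   # hypothetical on-premises image store
FREE_SPACE_FLOOR = 500 * 1024**3        # start offloading below ~500 GB free
s3 = boto3.client("s3")

def offload_to_cloud(bucket: str = "example-imaging-backup") -> None:
    """Copy the oldest studies to cloud storage once local capacity runs low."""
    _, _, free = shutil.disk_usage(LOCAL_ARCHIVE)
    if free > FREE_SPACE_FLOOR:
        return  # still enough local capacity
    oldest_first = sorted(LOCAL_ARCHIVE.rglob("*.dcm"), key=lambda p: p.stat().st_mtime)
    for dicom_file in oldest_first[:100]:  # offload in small batches
        s3.upload_file(str(dicom_file), bucket, dicom_file.name)
        # a production system would verify the upload before freeing local space
```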

The problem is that you have tons of images but limited time and eyes to analyze them. Radiologists look at a new scan every three to four seconds during a normal eight-hour workday. That’s hardly enough time to find the patterns, abnormalities and other markers essential to making a diagnosis. (http://www.modernhealthcare.com/article/20170708/TRANSFORMATION03/170709944)
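
The arithmetic behind that claim makes the scale concrete:

```python
# One image every 3-4 seconds across an 8-hour shift.
shift_seconds = 8 * 60 * 60                     # 28,800 seconds
low, high = shift_seconds // 4, shift_seconds // 3
print(f"{low:,} to {high:,} images per shift")  # 7,200 to 9,600 images
```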

This is where AI is most helpful. Machine-learning algorithms are trained to find patterns in images and identify specific anatomical markers, but they can also go deeper and spot details the human eye can’t catch. Such algorithms are currently in trials and have shown promising accuracy and speed.
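
As a rough illustration of the kind of model involved, here is a minimal convolutional classifier in PyTorch for a toy normal-versus-abnormal task; it is a sketch of the general technique, not any particular product’s algorithm:

```python
import torch
import torch.nn as nn

class ScanClassifier(nn.Module):
    """Tiny convolutional network that learns visual patterns in 2D scan slices."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 64 * 64, 2)  # 2 classes: normal / abnormal

    def forward(self, x):                  # x: batch of 1x256x256 grayscale slices
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = ScanClassifier()
logits = model(torch.randn(4, 1, 256, 256))   # stand-in for four scan slices
print(logits.shape)                           # torch.Size([4, 2])
```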

One example is a collaboration between IBM and the University of Alberta that used machine-learning algorithms to examine MRI scans and predict schizophrenia with 74 percent accuracy. The technology was also able to gauge the severity of symptoms by examining activity in different regions of the brain. The team examined brain MRIs of 95 participants, using the images to develop an AI model of schizophrenia; the model was then able to distinguish participants with schizophrenia from a control group. (http://www.radiologybusiness.com/topics/technology-management/ai-%E2%80%98learns%E2%80%99-predict-schizophrenia-brain-mri)

“The ultimate goal of this research effort is to identify and develop objective, data-driven measures for characterizing mental states, and apply them to psychiatric and neurological disorders,” said Ajay Royyuru, VP of healthcare and life sciences with IBM Research. “We also hope to offer new insights into how AI and machine learning can be used to analyze psychiatric and neurological disorders to aid psychiatrists in their assessment and treatment of patients.”
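
At its core, a study like the IBM/Alberta work is a small-cohort classification problem: one feature vector per participant derived from the MRI, a binary label, and an accuracy estimate. The sketch below shows how such a figure might be estimated with scikit-learn on synthetic data; it is a generic illustration, not the team’s actual pipeline:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(95, 200))      # 95 participants, 200 MRI-derived features (synthetic)
y = rng.integers(0, 2, size=95)     # 1 = schizophrenia, 0 = control (synthetic labels)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)   # cross-validation guards against overfitting
print(f"mean accuracy: {scores.mean():.2f}")
```

Cross-validation matters here because 95 participants is a small sample; a single train/test split could make the reported accuracy look better or worse than it really is.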

There are many other AI imaging applications being tested on the market. But the big question here is: what will be the impact of AI on radiology?

Big data analytics in imaging could lower costs and improve efficiency, but first it must get past some roadblocks.

AI learns much as radiologists do. Developers feed an algorithm thousands of imaging studies of brains affected by stroke in order to train it to recognize one. The algorithm finds patterns in the data and predicts future cases based on those prior patterns, so it can gain new information from new images, learning even more in an ongoing feedback cycle. (http://www.modernhealthcare.com/article/20170708/TRANSFORMATION03/170709944)

This kind of automation “frees up a lot of physician time and brings a huge amount of consistency to imaging and tracking changes over time in a patient,” said Carla Leibowitz, Arterys’ head of strategy and marketing.
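
The feedback cycle described above amounts to an operational loop: the model’s outputs are validated by radiologists, validated cases flow back into the training pool, and the model is periodically retrained. A bare skeleton of that loop, with the model details omitted and all names hypothetical:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class FeedbackLoop:
    """Skeleton of the retrain-on-validated-cases cycle (model details omitted)."""
    training_pool: List[Tuple[str, int]] = field(default_factory=list)  # (image path, label)
    retrain_every: int = 500

    def add_validated_case(self, image_path: str, label: int) -> None:
        """Called once a radiologist has confirmed or corrected the AI's read."""
        self.training_pool.append((image_path, label))
        if len(self.training_pool) % self.retrain_every == 0:
            self.retrain()

    def retrain(self) -> None:
        # In practice: refit the model on the enlarged pool, validate, then redeploy.
        print(f"retraining on {len(self.training_pool)} validated studies")

loop = FeedbackLoop(retrain_every=2)
loop.add_validated_case("study_001.dcm", 1)   # stroke confirmed by the radiologist
loop.add_validated_case("study_002.dcm", 0)   # no stroke; triggers a retrain
```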

A big roadblock for AI is that it has to be integrated into existing systems. Because AI usually produces “discrete” data elements, it is theoretically possible to bring those elements smoothly into clinical workflows: ideally, the algorithm runs on a case automatically, producing discrete assessments that the radiologist can validate and add to.
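
To make “discrete data elements” concrete, here is a sketch of what one machine-generated assessment might look like as structured data that a radiologist validates before it enters the record; the field names and values are hypothetical, not a standard schema:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Finding:
    """One discrete, machine-generated assessment attached to a study."""
    study_uid: str            # DICOM Study Instance UID (hypothetical value below)
    label: str                # what the algorithm thinks it sees
    probability: float        # model confidence, 0-1
    validated: bool = False   # flipped to True once a radiologist signs off

finding = Finding(study_uid="1.2.840.99999.1.2.3",
                  label="intracranial hemorrhage", probability=0.91)
finding.validated = True                        # radiologist agrees with the assessment
print(json.dumps(asdict(finding), indent=2))    # structured output for the RIS/EHR
```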

But theory doesn’t always hold up in practice. “We need to ensure that there’s interoperability. One potential problem is how the algorithms are initially trained. Sometimes the data they’re fed in the learning process come from just one specific model of imaging machine. Because different models have different radiation doses and slightly different technologies, you’ve got an inherent bias that’s built in. Another challenge can be physician acceptance,” said Dr. Bibb Allen, chief medical officer of the American College of Radiology Data Science Institute.
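
One way to catch the single-scanner bias Dr. Allen describes is simply to audit the acquisition metadata of the training set. A small sketch, assuming the pydicom library and a hypothetical folder of training studies:

```python
from collections import Counter
from pathlib import Path

import pydicom

# Tally which scanner makes and models produced a (hypothetical) training folder.
sources = Counter()
for path in Path("training_studies").rglob("*.dcm"):
    ds = pydicom.dcmread(path, stop_before_pixels=True)   # headers only, skip pixel data
    sources[(ds.get("Manufacturer", "unknown"),
             ds.get("ManufacturerModelName", "unknown"))] += 1

print(sources.most_common())
# If a single (manufacturer, model) pair dominates, the training set carries the
# built-in, single-scanner bias described above.
```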

It’ll be a while before all of those issues are worked out. But AI’s potential for public health could really change the way healthcare works: it will make radiologists more effective, lowering costs and improving healthcare overall.

 
