

Revolutionizing Cancer Research with Healthcare AI Tools


The medical field has a tough job when it comes to cancer—two tough jobs, actually: treating patients and advancing research that will prevent and combat illness. On top of that, longer lifespans ensure an ever-growing stream of those patients to treat. Fortunately, advancements in healthcare AI are aiding the evolution of cancer research and treatment, and those advancements are the result of some amazing partnerships between the healthcare and technology industries.

Let’s take a closer look at one of those partnerships, represented by Dr. Johannes Uhlig, Assistant Professor of Radiology and Head of the AI Research Group at UMG (University Medical Center Göttingen) in Germany; and Dr. André Aichert, a Research Scientist for Artificial Intelligence in the Digital Technology and Innovation department at Siemens Healthineers in Germany, who are collaborating on a project called Cancer Scout. Together, they’re bringing technology to the clinic and clinical experience back into the tech, for the benefit of us all (Video 1).

Video 1. Inside the challenges and opportunities for healthcare AI with Siemens Healthineers and UMG Göttingen. (Source: insight.tech)

Which advancements in AI have transformed cancer research?

Dr. André Aichert: The impact of AI on the entire healthcare space cannot be overstated. As a company, Siemens Healthineers has been around pretty much since the first X-rays were taken; we’ve been improving things, like detail and number of images, ever since. At the moment, the use of AI gives clinicians the opportunity to deal with the vast amount of images and data available to them, and to get the most out of them. Clinicians who start using AI solutions today are working at the state of the art of the technology, and they’ll have all the advantages of that.

For the Cancer Scout project with UMG, we’re analyzing a host of different data sources—not just images—trying to define and detect subtypes of cancer. For the radiology subproject, which is where Dr. Uhlig comes in, the question is: Can we recognize certain subtypes of cancer based on radiology images? If we could, then an invasive biopsy might not be required. In general, we’re trying to optimize workflows, to prevent unnecessary invasive procedures, and to do data analysis of all sorts.

Dr. Johannes Uhlig: There are several current challenges in clinical cancer imaging. For one, we have an aging population, and so we see an associated increase in healthcare needs. Additionally, radiology imaging has become more broadly available in the last few decades and is more frequently used. For cancer imaging particularly, we face a massive increase in case numbers, and I believe those images can’t all be assessed by radiologists in the traditional way.

For example, for breast cancer screening in Germany, we use X-ray-based mammography. The most recent data, from 2018, reports that 2.8 million women got mammograms, and 97% of those scans were negative. The current setup is that every mammogram has to be independently assessed by two experienced radiologists. But we have literature suggesting that AI-based methods for mammography assessment are at least comparable in their diagnostic accuracy to those of radiologists. So could we use AI algorithms instead of that second radiologist? From both an ethical and an economic standpoint, is it even imperative to do so?

And breast cancer screening is only the tip of the iceberg. For cancer research, I believe AI is the buzzword of the last decade. For example, our research group has been focused on extracting additional information from CT and MRI images to guide clinical decision-making in patients with suspected kidney or prostate cancer. In the Cancer Scout project with Siemens Healthineers, we use AI algorithms and the syngo.via software to correlate radiology CT imaging of lung cancer patients with a pathology analysis in a large-scale cohort. And we hope that one day these AI algorithms will advance the role of radiology imaging in guiding lung cancer treatment.

How does software aid your research, particularly compared to traditional approaches?

Dr. André Aichert: I should first explain some of the practical problems of doing research in the clinical environment. To start, we are dealing with personal information, so you have to be very careful about that because of GDPR in Europe or HIPAA in the US. Just accessing data and getting to the point where you have the basis for AI algorithms is a much bigger process than you might imagine.

Then, most of the successful algorithms are supervised, which means you need to collaborate with clinicians to give you annotations and give you an idea of what you’re actually looking at in order for the algorithm to reproduce the findings. Therefore, it’s critical to actually get access to this data. But the clinical IT landscape has become scattered between different vendors and departments or sites over time, and sometimes these systems do not communicate. Collecting and harmonizing the data from these systems is actually a lot of work and can be very painful at times.

For example, you have your favorite free program on GitHub, and you just want to run it over some data. You have to make sure you’re even able or allowed to use that software. Then you have to make sure that the data you’re using it with is anonymized. You anonymize it, export it, and copy it over to a different computer where you’re running the software. Then you have to make sure that it actually is anonymized. Then you get your results. But then you have to go back to the original system and reintegrate those results, potentially even with additional information from other IT systems. This is all very different from what I’m used to as a researcher, or as an actual user of IT.
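To make that export step a bit more concrete, here’s a minimal sketch of what de-identifying a batch of DICOM images before copying them to a research machine might look like, using the open-source pydicom library. The tag list, folder names, and pseudonym scheme are illustrative assumptions only; this is not the Cancer Scout pipeline or the syngo.via workflow.

# Hypothetical sketch: strip identifying fields from DICOM files before export.
from pathlib import Path

import pydicom

# Patient-identifying attributes to blank out before export (not an exhaustive list).
IDENTIFYING_TAGS = [
    "PatientName",
    "PatientBirthDate",
    "PatientAddress",
    "InstitutionName",
    "ReferringPhysicianName",
]


def anonymize_file(src: Path, dst: Path, pseudonym: str) -> None:
    """Read one DICOM file, strip identifying fields, and save a copy."""
    ds = pydicom.dcmread(src)
    for keyword in IDENTIFYING_TAGS:
        if keyword in ds:
            ds.data_element(keyword).value = ""
    # Replace the patient ID with a study pseudonym so results can later be
    # re-linked to the original record inside the clinic.
    ds.PatientID = pseudonym
    ds.remove_private_tags()  # vendor-specific tags may also carry identifiers
    ds.save_as(dst)


if __name__ == "__main__":
    out_dir = Path("export_anonymized")  # hypothetical output folder
    out_dir.mkdir(exist_ok=True)
    for i, path in enumerate(sorted(Path("export_raw").glob("*.dcm"))):
        anonymize_file(path, out_dir / path.name, pseudonym=f"STUDY-{i:05d}")

Even with a script like this, the exported copy would still have to be checked again before it leaves the clinical network, which mirrors the verification step Dr. Aichert describes above.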

Then, even if you’ve trained your first models and you want to test them on real-world data, that can also be a problem. You run the risk of having your development team work on a clinical use case and develop beautiful software before they’ve gone through the effort of actually releasing it to the clinicians. Then they try it in the real world and all of a sudden realize that there was a very basic assumption that was wrong.

Then you’ve got a problem, and what you want to do is—like in Silicon Valley—to fail fast. You want to be able to have an early prototype that probably doesn’t exactly solve the problem yet, but it’s something you can bring to the clinician, get feedback, and then shorten this feedback loop. And that is one place where syngo.via Frontier certainly helps.

Basically, the syngo.via Frontier research platform tries to help in all of these steps along the way that I just explained. It’s an end-to-end integrated solution. If you have a syngo.via installation running at the clinic, you can run it on the data that’s available in your PACS and download applications from a marketplace that exists in that system. And that is a very, very big advantage over just getting your own software and trying to integrate it somehow with the process that I described.

“The use of #AI gives clinicians the opportunity to deal with the vast amount of images and #data available to them, and to get the most out of them” – Dr. André Aichert, @SiemensHealth via @insightdottech

How are you working with Siemens Healthineers and its research platform?

Dr. Johannes Uhlig: The syngo.via software is deeply embedded in the clinical workflow in our department. For example, we use it for all cardiac CT scans, for coronary-vessel identification, or as an on-the-fly image viewer and reconstruction software in trauma patients. It really performs robustly in all these scenarios. We also had four researchers working full time for several months on the Cancer Scout project—where we had several thousand patients and the project had to run smoothly—and we used it for data accrual, annotation, and supervision.

For me, it’s crucial to have a one-stop shop; I want to use as few software tools as possible for my whole data pipeline. With syngo.via we have one piece of software, and we can extract data from our imaging database. We can annotate cross-sectional images, and we can anonymize these cases in a way that is compliant with the strict German regulations.

What is the importance of working with the university on this project?

Dr. André Aichert: Cooperation is absolutely essential. If we didn’t have clinical researchers, like Dr. Uhlig, who are willing to cooperate with us and share their knowledge—and also to explain the problems that they have—then it would be very hard for us to make any progress in this field. As an AI researcher I also have to understand a little bit about the clinical problem at hand.

But it’s just as important that the usage of the software corresponds to what clinicians do in their routine, and that the presentation of the data is done in such a way that a physician can actually tell types apart, or tell locations and geometry apart. So you have to come together and define an annotation protocol that actually makes sense.

What do you envision for the future of this project?

Dr. Johannes Uhlig: It’s crucial that these AI algorithms we build are assessed and trained in a clinical setting. They have to work on suboptimal scans; they have to work with different scanner types, different patients. But, as André said, there’s also the question of acceptance by radiologists and clinicians. What is the best way to present the AI results? How should they be visualized? How are outliers reported? Given the mutual trust, I believe that UMG and Siemens Healthineers as partners will find ways to address these challenges.

Dr. André Aichert: One of the essential next steps for the models would certainly be to look at other sites, and scalability is key in this regard. We’ve already used a solution called teamplay to collect the data from UMG, and it could also be used to collect data produced in a similar manner from other sites. That would allow us to integrate or support IT infrastructures in other locations that may be very different from those at UMG.

What final thoughts would you like to leave us with?

Dr. André Aichert: The medical domain is a really exciting field for AI researchers. You have this very diverse set of problems, this very diverse set of modalities and images. You also want to be able to share knowledge and data, and to drive collaboration across all sorts of medical disciplines, supporting this iterative process in order to ultimately develop and deploy applications.

Dr. Johannes Uhlig: For me as a clinician, I believe AI really is the future. I guess there’s no way that we can work without the application of AI algorithms within the next 10 years, just given the caseloads. And I also have to underline that AI research really is a team effort. We need these collaborations between academic institutions like UMG and manufacturers like Siemens Healthineers to advance healthcare, especially given what is at stake in cancer imaging. And only through this ongoing mutual feedback, adjustment, and fine-tuning will we create AI tools that are not only accurate but also accepted by healthcare professionals.

Related Content

To learn more about healthcare AI transformations, listen to Healthcare AI for Cancer Research: With Siemens Healthineers and read The Doctor Will View You Now. For the latest innovations from Siemens Healthineers, follow them on Twitter and LinkedIn.
 

This article was edited by Erin Noble, copy editor.

About the Author

Christina Cardoza is an Editorial Director for insight.tech. Previously, she was the News Editor of the software development magazine SD Times and IT operations online publication ITOps Times. She received her bachelor’s degree in journalism from Stony Brook University, and has been writing about software development and technology throughout her entire career.
