This Week Health


March 23, 2022: Laurence Yudkovitch, Product Manager, and Josh Dagenhart, AI Solutions Architect-Manager at iCAD, join Brad Genereaux, Medical Imaging & Smart Hospitals Alliance Manager for NVIDIA, to discuss the next evolution of radiology solutions with integrated AI, specifically the NVIDIA AI Enterprise on VMware vSphere with VMware Tanzu solution. What do customers look for in an imaging AI solution? What are the infrastructure considerations? What about high availability and disaster recovery? Where is the market going with modern apps?

Key Points:

00:00:00 - Intro

00:03:30 - 40 million mammograms done annually

00:05:00 - Our key value proposition is reducing the amount of radiologist read time by over 50%

00:12:15 - How do you make clinicians more productive? More effective?

00:24:00 - It used to be PACS interoperability. Now it's platform interoperability.

iCAD

NVIDIA

VMware Tanzu

Transcript

Today on This Week Health.

All of these applications with AI, as we start to modernize what we're running in the data center, need a home. We can't just have a single server for every type of AI application that's out there. We need to create a stack in our data center to run all of these different applications.

All right, today we have a solution showcase, and this is one of those moderated discussions that I did from the VMware booth. This one is on the next evolution of radiology solutions with integrated AI. We talked to two people from iCAD around their ProFound AI solution. This is a mammo imaging solution that takes those images, puts them through an AI model, and gives them almost real-time reads and feedback, and it's really physician-assisted AI. It will circle some things and say, take a look at these things, and it reduces the reads by about 55%. Really fascinating solution. This is part of the solutions that we're highlighting from the VMware booth with NVIDIA. Brad Genereaux from NVIDIA joins us. Laurence Yudkovitch and Josh Dagenhart join us from iCAD around the ProFound AI solution. Fantastic moderated discussion. Very well attended. I hope you enjoy.

Thank you very much. Thanks everybody for coming. I think the hall police will be coming shortly as this group is blocking the aisles, but it's all good. Looking forward to this conversation. So today we're going to talk about the next evolution of radiology solutions, and we have three guests here.

I'm going to ask you to introduce yourselves. I would love to have my notes in front of me, but we are actually recording this for a show on This Week Health. Let me introduce myself real quick. Bill Russell, former CIO for a 16-hospital system, and now I do podcasts at This Week Health. We have four channels and we interview CIOs and leaders in healthcare, anyone that can amplify great thinking to propel healthcare forward. If you can introduce yourself and your role, that would be great.

Hi, I'm Laurence Yudkovitch. I'm the technical product manager for iCAD. I work on our technical platform for artificial intelligence, helping radiologists diagnose breast cancer and improve quality of care.

My name is Josh Dagenhart. I'm the AI solutions architect for iCAD. I might say this a lot today, but I like to set expectations. That's what I do, clinically and technically, with the CIO, with the radiologists, with the PACS administrators, about AI, because this is not 2D CAD. So this is AI for breast imaging, and I also assist the sales team. Thank you.

Hey, my name is Brad Genereaux, Medical Imaging and Smart Hospitals Alliance Manager with NVIDIA. What I do is I cover developer relations on anything visualization, AI, virtualization, and analytics as they touch upon these spaces, leveraging our SDKs and our compute to power the next generation of solutions we're seeing in healthcare today.

Fantastic. All right. So be thinking about your questions; this is going to be interactive, but I'm going to start off. Here's how we're going to do this. I was a former CIO for a 16-hospital system. I'm going to do this like you've just come into my office and you're going to try to pitch your solution, and where I usually start is: explain the problem. What problem are we trying to solve?

Sure. So every year most women over 40 or so come in for an annual mammogram. We have 40 million mammograms done annually, and ideally those are read right away, while the woman is still there. But when my wife goes in, typically she spends 10, 15 minutes, they take all the images, and then we get a report seven days later in the mail.

And so if she needed any supplemental screening or to come back, that's a return visit. It's a lot of increased anxiety. So we're trying to do a couple of things. We're trying to help the radiologists read faster. A mammogram traditionally has been two to four images, two views from each breast, and then the radiologist is doing something like, hey, where's Waldo on these.

Now today's mammogram gold standard is 3D tomosynthesis, where we're actually doing a full 3D reconstruction of the breast. So imagine a hundred-page book of Where's Waldo cartoons where you're trying to go through it and find two or three instances of Waldo. What our software does is go through it, like the big older sister, or smart older sister I should say, who went through and put bookmarks or Post-it notes on each of the pages of that hundred-page book that had Waldo in it, and she circled them for you. So instead of the radiologist having to go through and find everything, he or she is just flipping through to the bookmarks and agreeing or disagreeing. So our key value proposition is reducing the amount of radiologist read time by over 50% and helping them improve their accuracy, making sure they're calling the right artifacts as breast cancer and reducing that recall rate, so you're not being called back for unnecessary supplemental screening.

All right. So I heard three things as CIO. I heard the clinician experience: you're going to help them. Correct. We're going to hear AI about a thousand times on this floor, but this is one of those cases where AI is actually looking at the image, circling some things and saying, hey, look over here. Not necessarily doing a diagnosis, but saying, this looks suspect. So essentially we're going to increase their efficiency, we're going to improve their overall experience. But the other thing I heard is that today the patient has to wait seven days for a read. So they're sitting there thinking, hey, I might have something, and it's seven days. And so we're going to improve the patient experience as well.

Right. Yeah, quite a bit. There's an audiobook that I was listening to by Sheryl Sandberg, Option B, where she talks about having just lost her husband, going to parent-teacher conferences, trying to decide which room to go to. And then she gets a call back from her doctor, because she had a mammogram earlier in the day, and the doctor tells her she needs to come back for supplemental screening the next day. And so, having gone through this traumatic event, she flips around and just leaves the school building, thinking, this isn't that important.

I want to spend the time with my kids. Now, that level of anxiety is something nobody wants to go through, and unfortunately it happens quite often these days based on the current standard of care. So if we can get that result to the patient while they're still in the room taking the exam, they can get the supplemental screening right away instead of waiting 24 hours. Sometimes you can't even do the supplemental screening the next day; you may have to schedule it a few weeks out. That's a lot of anxiety that nobody should go through.

Alright. So you had me at hello here. You're going to improve the efficiency, you're going to improve quality, and you're going to improve the patient experience. These are three things we're obviously looking for. So let's start to delve into the solution. What does the solution look like?

So PowerLook is based on a Dell 7920 or a Supermicro rack system. We have NVIDIA RTX 4000 cards, so we are a GPU-based processing device. A DICOM node is created at the mammo system, the DICOM images flow into our GPU-based processing system, and we can process a case in two and a half minutes or less. And my team will build a workflow out based on the total volume per year at your facility. It doesn't matter how many gantries you have. We can centralize, and we can deploy in several different ways: on prem by the mammo system, or in a data center. So we're very flexible in our deployment.
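To give a rough sense of the kind of DICOM receive node being described here, the sketch below is a minimal, hypothetical example built on the open-source pydicom/pynetdicom libraries; the AE title, port, and staging directory are illustrative assumptions, and this is not iCAD's actual PowerLook implementation.

```python
# Minimal sketch of a DICOM C-STORE receive node (hypothetical; assumes pydicom/pynetdicom).
# A gantry or PACS is configured to send studies to this AE title and port, and each received
# image is written to a staging directory for a downstream GPU inference pipeline to pick up.
from pathlib import Path

from pynetdicom import AE, evt, AllStoragePresentationContexts

INCOMING_DIR = Path("/data/incoming")  # hypothetical staging directory


def handle_store(event):
    """Persist each received DICOM instance so downstream AI processing can consume it."""
    ds = event.dataset
    ds.file_meta = event.file_meta
    INCOMING_DIR.mkdir(parents=True, exist_ok=True)
    ds.save_as(INCOMING_DIR / f"{ds.SOPInstanceUID}.dcm", write_like_original=False)
    return 0x0000  # DICOM "Success" status


ae = AE(ae_title="AI_NODE")
ae.supported_contexts = AllStoragePresentationContexts
ae.start_server(("0.0.0.0", 11112), evt_handlers=[(evt.EVT_C_STORE, handle_store)])
```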

Wow. You used a lot of terms very quickly there. So I'm a CIO. You've got to remember, I'm not technical anymore; I really work with doctors. So when I hear that, the first thing I heard was two and a half minutes to process. That's pretty neat. We can almost get to the point of real-time reads.

So that is really fast. I mean, the NVIDIA cards do a great job of processing those tomosynthesis files, and it's really about when we get the data. So if we're not on prem by the gantry, we are dependent on the WAN speed from the gantry to the data center. But again, once we get the data, we will process in two and a half minutes.

So I would love to say that our health system is standardized and we have all the same equipment across the board, but we don't. In some cases we have multiple PACS systems. In some cases we have different devices at the edge. So my next questions would be around just integrating with all those various systems.

Yeah. So when iCAD hired me three years ago, they wanted me to build an interoperability matrix. So as of today we have 68 PACS versions, and we're very flexible; we can display on all the major PACS vendors. We have the DICOM SR, the structured report. We have the DICOM GSPS, the grayscale softcopy presentation state.

And we have the DICOM secondary capture. All three are very flexible. Again, it's been my job to work with the PACS vendors as the liaison to get everyone up to the SR. The SR can toggle, and the GSPS can toggle. The secondary capture is a road map for the radiologist: they build their hanging protocol with the secondary capture on one side and the tomo stacks on the other. And again, that's part of my job, setting expectations with the radiologists at the very first, so they see the product and understand how it's going to display.
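For readers who want to see what distinguishing those three result objects looks like in practice, here is a minimal, hypothetical sketch using pydicom. The SOP Class UIDs are standard DICOM values; the routing function and file name are illustrative only and not taken from this conversation.

```python
# Illustrative only: classify an AI result object by its standard DICOM SOP Class UID,
# the way a PACS-facing router might decide how to display or toggle it.
from pydicom import dcmread

SOP_CLASS_LABELS = {
    "1.2.840.10008.5.1.4.1.1.88.50": "Mammography CAD SR (structured report)",
    "1.2.840.10008.5.1.4.1.1.11.1": "GSPS (grayscale softcopy presentation state)",
    "1.2.840.10008.5.1.4.1.1.7": "Secondary capture image",
}


def classify_result(path: str) -> str:
    """Return a human-readable label for the kind of AI result object in the file."""
    ds = dcmread(path, stop_before_pixels=True)  # headers only, no pixel data
    return SOP_CLASS_LABELS.get(ds.SOPClassUID, f"Other object ({ds.SOPClassUID})")


print(classify_result("profound_result.dcm"))  # hypothetical file name
```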

Yeah. I'm going to go into that, the experience for the radiologists and all the clinicians, in a minute. Brad, tell us about the architecture.

Yeah, no, absolutely. So all of these applications with AI, as we start to modernize what we're running in the data center, need a home. We can't just have a single server for every type of AI application that's out there. We need to create a stack in our data center to run all of these different applications.

As we start to add in breast cancer, lung cancer, liver cancer, we start to look at pneumonia, pneumothorax, and conditions of the brain. All of these applications need to have a home. And what we've done with NVIDIA AI Enterprise is create a stack that runs all of these applications. We partnered with VMware, a fantastic virtualization partner for us, to run all these different workloads at the same time.

So from the bottom layer, we have our certified systems; Dell is one, Supermicro and many others. We run our virtualization stack using virtual GPU, where we can slice that GPU for all these different workloads. And then we have all of our applications on top of that. So whether it's powering our virtual desktops or powering our AI applications like iCAD and others, this is the way that we bring all these applications at scale, resiliently, in the data center.
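As a rough sketch of what slicing a GPU across workloads looks like from the application side, the example below assumes a Kubernetes cluster (such as vSphere with Tanzu) exposing NVIDIA GPUs or vGPU profiles through the standard nvidia.com/gpu resource; the image name and namespace are hypothetical and not from this conversation.

```python
# Hypothetical sketch: request one GPU (or vGPU profile) slice for a single AI inference
# workload on a Tanzu/Kubernetes cluster, using the Kubernetes Python client.
from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() when running inside the cluster

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="mammo-ai-inference", labels={"app": "imaging-ai"}),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="inference",
                image="registry.example.com/imaging-ai/inference:latest",  # hypothetical image
                resources=client.V1ResourceRequirements(
                    limits={"nvidia.com/gpu": "1"}  # one GPU slice dedicated to this workload
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="imaging-ai", body=pod)
```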

As we talked about earlier, when you share some of those things, as a CIO I'm sitting there going, this is pretty interesting. So I'm going to get your infrastructure and stuff in place. I'm going to bring it in via a clinical front door and the clinicians are going to support it. They're going to say, hey, this is going to make me more effective and whatnot.

And then I hear you say, all right, we're going to put these pieces in place. We're going to put in the ability to virtualize essentially that AI stack. And now we're going to be able to layer in other technologies. I'm going to start to be able to look at different things to bring in and potentially change the overall experience in a lot of different areas.

And we're actually going to be on the stage again tomorrow talking about it. Yep. So let's talk about the clinicians. I've got a sell to make here. So I'm going to have to go back to them and say, I'm not going to use the word AI if you don't mind, I'm going to say we are going to make you more productive, we're going to make you more effective, we're going to help you improve quality and outcomes. Talk about the systems that you sat down with and how the process goes from the time you're brought in until the time they actually say, yeah, this makes sense, I see it.

So speaking just about the radiologists: what I do when I talk to them, we talk about their current CAD product, 2D CAD. And we have to start there, because they've been using it so long. And we have to explain that this is AI that is completely different than 2D CAD. The false positives are not there. I mean, 60% of the time they're not going to have a mark with ProFound AI. So when you start talking about that, some of them kind of start laughing, because they don't believe you. They've read with 2D CAD all these years and there are all these false positives. So we have to start there. Then we give them a case. Sometimes we run their own cases: we take 50 of their cases, we run their current and their prior, and we show them our sensitivity and specificity, because we do have a reader study.

Laurence talked about the 52% reduction in reading time. Our sensitivity and specificity are all in the reader study, but they want to see it on their own results. So that's another way that we can do that: we retrospectively run cases and we show the radiologists how we perform.

So again, once we do all that, we've got clinical buy-in from the clinicians. Now, how does it display? All right, what's the buttonology, what tools do they need to click, how's it going to look in their hanging protocol, at what step is it going to come up? Because again, 2D CAD is a second look, but ProFound AI is for a concurrent read.

As soon as they open their case, they can see our product. Okay. And that's a big difference for a radiologist. They have to trust it, they have to use it, they have to understand that this was made to be read at the beginning. So again, it's a tool. We want them to continue to diagnose, but at the end of the day, they understand the buttonology, they can see it in their PACS, and they can use it very quickly.

So the other thing about selling into healthcare: they're going to say, I want to call my friends who are using this and I want to talk to them about it. What kind of stories do you have around that? I need names. I need people that they can call. Have you guys been doing this in a lot of health systems?

We definitely have. So let me know if you want me to name names now. You can drop names if you want. So, Dr. Kathy Shalan down at Boca, she utilizes our product. Dr. Randy Hicks at RMI in Flint, Michigan. Solis Mammography has implemented our product across the board. So those are just three to throw out, and we definitely can offer for you to call those folks.

And so if I'm calling those folks and I give those stats, 50% less time, improved outcomes, those kinds of things, they're going to say, yeah, absolutely, this is what we're seeing.

They're going to give you their metrics for their specific radiologists.

And likely they're going to be as good, if not higher. Exactly. Fantastic. All right. So let's talk about the patient experience a little bit. It would be interesting; I'd like my radiologists to talk to some patients about how this changes the game for them from an experience standpoint. And this is an area where they actually do have choice. They can choose where they're going to go for these images, and we're trying to really create a competitive advantage in our health system. And I'm wondering, what are we hearing from patients?

Yeah. So that's one of the great things, and we actually provide marketing material for our customers to use to help educate their patients about the technology being used. To be honest, the technology is really behind the scenes, so the patient doesn't see it most of the time, but what they feel is a faster read. Depending on the facility, they may be getting the results before they leave, and that's a differentiator that can be marketed. We also talk about the reduction in false recalls, or unnecessary recalls, where the patient doesn't have that increased anxiety that I was talking about earlier.

And one of the novel things that we're working on is a risk estimator. A lot of women are familiar with the risk of breast cancer being linked to family history and other environmental factors. We actually have a risk product that is based solely on the mammogram and has a much higher accuracy than these family-history-based models.

I believe the statistic is that over 85% of breast cancers occur in women without a family history of breast cancer. So what our software does is it looks right at the mammogram and tries to identify things that a radiologist wouldn't read or identify as breast cancer, but that appear earlier. And so, along with the software, we can generate a report that gets delivered to the patient at the end of their mammogram, that gives them an indication of their personalized risk for breast cancer in the next one or two years. And this can either give you increased confidence, or it can give you the indication that you have to pay special attention, for either additional screening or follow-up care.

All right. So we're putting AI in front of my workflows, and I have to ask the question about business continuity and disaster recovery around this. Talk to me about the architecture that's going to allow for that.

Yeah, absolutely. And I definitely want to hear it from the iCAD folks as well.

Actually, yeah, as you're sitting there, if you're thinking about questions, feel free to hold onto them, and we'll come to those in a minute.

Definitely. Absolutely. So as we start to think about putting these solutions in our data centers, we need to be thinking about resiliency. Just like with our EMR, our PACS, and other solutions, all of the other applications that are being supported, as we start to build out AI it needs to have that same sort of failover, high availability, and disaster recovery if something bad happens. What we're seeing with AI a lot today is single-box solutions: I buy an AI solution, I buy a box, I put it in my data center and I forget about it. As I start to build up 10 or 15 or 20 of these, if I lose a power supply, I lose the GPU, I lose the storage, that system goes down and those radiologists are left waiting for results, because they're still using their same applications, they're still using their PACS, which happens to be up. We need to give all of those AI applications the same level of service, where we've virtualized that application.

We put it on top of a VMware stack and we have multiple boxes. And in the event that one system goes down, we just flip those machines, whether they're containers or whether they're VMs, onto the next box. And so no one feels anything. The IT departments are aware immediately through a single pane of administrative glass, but the end users don't even know, because their applications still continue to function.

And we can do that in a hybrid cloud environment. We can do that multi-location. The nice thing about having VMware on top is that the same way we've been able to move workloads around the cloud, around the internet, we can now do that same thing with these AI workloads.

We have to, right. And as we look at scaling up our environments, some applications maybe run during the day, or maybe on certain days of the week, so being able to scale up and do hybrid cloud when we need to makes a ton of sense. Or if we want to try something out, say the next version of a product, without having to deploy separate infrastructure, we can just spin up a VM on the stack, try it out and really get that experience. So absolutely, we have to support those workflows.
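To ground the container side of that failover and scale-out story, here is a minimal, hypothetical sketch using the Kubernetes Python client: an AI service run as a Deployment with two replicas, so a workload lost with a failed host is recreated on a surviving one. The image, namespace, and replica count are illustrative only, not from this conversation.

```python
# Hypothetical sketch: run an AI service as a Deployment with two replicas so that,
# if one host fails, Kubernetes recreates the lost pod on another healthy host.
from kubernetes import client, config

config.load_kube_config()

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="mammo-ai"),
    spec=client.V1DeploymentSpec(
        replicas=2,  # two copies; a pod lost with a failed node is rescheduled elsewhere
        selector=client.V1LabelSelector(match_labels={"app": "mammo-ai"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "mammo-ai"}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="inference",
                        image="registry.example.com/imaging-ai/inference:latest",  # hypothetical
                        resources=client.V1ResourceRequirements(limits={"nvidia.com/gpu": "1"}),
                    )
                ]
            ),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="imaging-ai", body=deployment)
```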

Just to build on what Brad was saying, I was talking with a customer a couple of weeks ago and they have a modern PACS system. It's set up in a VMware environment behind an F5 load balancer, and he set it up so that if anything goes wrong with the PACS, it'll just fail over to the backup data center in under, I think he said, 18 minutes.

So we're talking about how many nines they're going for, and he said he's had to do that a couple of times. The reliability of the PACS obviously is critical; that's a mission-critical system. The AI, though, he said, okay, if I'm down for an hour, that's fine, the radiologist can read something else. But if it's down for more than that, like a day because one of the towers breaks, then they have to change the billing application, and now it's taking you twice as long to read the mammograms. So that's not really acceptable. So does he need the same level of resiliency on the AI?

Not necessarily, but does he need a high level of resiliency? Absolutely. And putting this on the VMware stack in the data center, with that reliability, really gives him the confidence that he's looking for to deliver to his clinical team.

I love technology where it takes a while to get the clinicians to say, yeah, this makes sense, let's do it, but when you take it away from them, they get upset. That's when you know you've done something right with technology. Here's what I'd like to do. I'd love for each of you, as we think about the introduction of these new technologies and what you've been able to do with them, to answer a more general question. Where do you see the market going for these modern apps? How do you see the apps evolving, and how do you see healthcare potentially evolving as a result?

So, very interesting question. With mammo AI, traditionally the AI has been sold along with the gantry, so people almost think of it as an option to the gantry. And that's why we've had a lot of towers that were sold with the systems. As we move forward with AI, and AI increases in prevalence throughout healthcare, we have a lot of these modern AI app stores coming up. So the AI may be running after the PACS, and the PACS will determine which AI application to run, and there are a lot more choices to pick from. So we have a lot of AI stores in general, and we also have the PACS AI stores. Now, mammo AI is a little different. I like to think of it as some of your premium cable subscriptions, like Showtime or HBO, just because we're dealing with very large file sizes.

So if you move in after the PACS, you have to consider the entire flow of information, where the images now have to go from the gantry to the PACS, and if the AI store is up in the cloud, transfer up to the cloud. So that impacts your overall reading workflow. There are a lot of companies and customers who are looking to do that, because it's easy to get your AI subscription as a bundle, the same way you get your cable subscription as a bundle. But in the same way that Showtime and HBO went direct, a lot of customers prefer to get it direct. And I think having the IT infrastructure to support this mixed environment is really crucial for CIOs and CTOs to consider, as to how they service all these different needs going forward.

If I could add: it used to be PACS interoperability, but now it's platform interoperability, because that's what you as a CIO are looking for, and that's what my team is working toward now. That's the future, and that's what we have to start doing today. And that's why we're up here with NVIDIA and VMware: for the future, for the platforms to be embedded inside that.

I'm glad you brought that up, because we throw the word platform around a lot, but I see it here. Right? So I'm going to be able to bring in AI capabilities and to be able to virtualize those AI capabilities. And I'm going to have a single application on top of it, but then I'm going to bring in additional AI applications on that same infrastructure over time. It's interesting, because we can start layering these things in a lot of different ways. And as a CIO, I'm starting to think, all right, I'm building out my AI infrastructure. So now when I talk to other partners, if they're not going to be a part of the platform that I have, they're just going to increase my complexity and my cost. And so I'm going to look at them and say, can you have your application work on this platform? Because this is how we're bringing AI to bear in our environment.

Yeah. And that's something I forgot to mention. We were talking about modern apps, which generally refers to Docker applications, and that's where a lot of the AI marketplaces are looking to get their AI algorithms from. And so we recompiled our software so that it runs under Docker, and we are part of many of these AI marketplaces as well as the direct offering. And it really gives customers a lot of flexibility to implement the best AI on their preferred platform or technology stack.
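As a loose illustration of what packaging an AI algorithm as a "modern app" often looks like, here is a minimal, hypothetical sketch: the algorithm wrapped in a small HTTP service so the resulting container can be dropped onto any orchestration platform or marketplace. The endpoints and the placeholder model call are illustrative assumptions; iCAD's actual containers and interfaces are not described in this conversation.

```python
# Hypothetical sketch: a containerizable inference service exposing an AI algorithm
# behind a simple HTTP interface (Flask shown; the model call is a placeholder).
from flask import Flask, jsonify, request

app = Flask(__name__)


def run_model(dicom_bytes: bytes) -> dict:
    """Placeholder for the vendor's actual inference code."""
    return {"findings": [], "case_score": 0.0}


@app.route("/healthz")
def healthz():
    # Liveness probe used by the orchestrator (Kubernetes, Tanzu, etc.)
    return jsonify(status="ok")


@app.route("/analyze", methods=["POST"])
def analyze():
    # The platform posts the study (here, raw DICOM bytes) and gets structured results back.
    return jsonify(run_model(request.get_data()))


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```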

All right, Brad, where do you think it's going?

AI is going to be everywhere. We've done so much looking at where we plug AI across the entire workflow, from ordering to protocoling, to post-processing, to detecting things in the images, and to delivering reports and getting them to the referring physician. We're seeing AI at the population health level. We're seeing AI at the departmental analytics level, to make sure that our radiology departments are working as efficiently as possible.

We're seeing interoperable AI, where we've got models that are going to do just part of the process. So for example, we worked with one, it was a COVID-19 model, where we had one model that was identifying lungs, and then another model that was identifying disease. And that's something that we chain together as we start to build this up across the spectrum. We use the word platform, but having the ecosystem to run all of these things is what we're going to see in the future. And that's really more about how we power all of these applications in a way that's scalable, that's resilient, and that's ultimately effective in driving the right behaviors in the hospital and the departments.

I'm going to end, probably, on this. The architecture is important to me, because most of the applications I'm talking to people about right now have to do with images. Yeah. I mean, we're talking mammography here, but I heard somebody essentially say, look, we're putting cameras in supply closets and we're running it through an AI algorithm, and it's helping us with our inventory management. We have these cameras in the parking lots that are now just looking at things. But again, those images are being processed through this whole stack, and it's coming up and saying, hey, here's our recommendation, here's what we're seeing. It's actually computer vision being applied to this. That's an awful lot of bandwidth. That's an awful lot of infrastructure. Is there something I need to be thinking about as a CIO at this point?

You need to be thinking about what you need to have in place to make this real, right? You need to look at what that infrastructure looks like for all those different workloads, working with the champions in each department and working with the champions in health records and other places to build out that team, to create a roadmap for where you want to go. But then also making sure that you've got the pieces in place to make that happen. You can't construct this piecemeal effectively. You need a plan, and to really step through what that looks like three to five years out. Absolutely.

All right. So I'm going to put this to you guys. It's not a CIO question, more just a curiosity question. I do this to entrepreneurs all the time. So you're doing this AI stack for mammography. If you weren't in this space, let's assume you've IPO'd, you're out, and somebody else is going to run it, and I'm sitting here with a lot of money going, all right, where do we go next with this? What's an area within healthcare where you're saying, hey, if we applied similar ideas around imaging AI, this is where we'd apply them within the healthcare stack?

It's an interesting question, because it almost feels like a lot of this is being done. Dermatology imaging is another one that makes a lot of sense to me, in terms of taking pictures for skin cancer detection or any dermatology issue. But I know there's a lot of research being done on that.

And you can do that directly with the patients, right? The iPhone camera is so effective. And for dermatology, you take that picture. Maybe it's not 3D, like what you're talking about here, but it's...

Yeah, and I'm sorry I didn't continue elaborating, but that's what my thought was: do it at home. In the same way, I have a Dr. Mom otoscope, so if my kids complain about an earache, I can take a look and decide, do I need to take them to the doctor or not, or is there probably nothing there? It'd be nice to be able to do the same thing for skin issues. Take a picture: does it show something? And if so, then we know whether to schedule a visit or not.

All right. Hey, I want to thank the three of you. I want to thank you for the work that you're doing and I want to thank you for this time. It's been fantastic. Thank you. Thank you so much. Thank you. Thank you.

What a great conversation with Laurence, Josh, and Brad Genereaux with NVIDIA and the iCAD solution. I loved talking to them about this. It's really exciting to see the efficiency gains for the clinicians. It's also exciting to see the quality gains as well. We appreciate VMware making the booth possible, NVIDIA obviously, and the iCAD solution as well. And if you're looking for some more conversations like this, this is the conference channel. We have another channel called This Week Health Newsroom, and we have a bunch of interviews just like this one that I've done over the last couple of weeks at the various conferences. You can head on over there and check those out as well. I want to thank you for listening. That's all for now.
