This Week Health


August 19, 2024: Jacob Hansen, Chief Product Officer at AvaSure, joins Bill for the news. How are AI-driven solutions, like computer vision and natural language processing, reshaping workflows in hospitals? Jacob Hansen sheds light on AvaSure's approach to leveraging AI to augment and automate care, ultimately aiming to reduce the cognitive load on clinicians and improve patient outcomes. The discussion also touches on the evolving role of data analytics in healthcare, particularly in tackling complex challenges like sepsis. As AI continues to advance, what ethical considerations must healthcare providers keep in mind to balance innovation with patient privacy? And how can healthcare systems ensure they’re fully utilizing the potential of AI while maintaining transparency with patients?

Key Points:

  • 02:18 AI and Virtual Care in Healthcare
  • 06:08 Nursing Workflows and AI Integration
  • 12:05 Privacy Concerns in Virtual Care
  • 14:27 Analytics and AI in Healthcare

News articles:

This Week Health Subscribe

This Week Health Twitter

This Week Health Linkedin

Alex’s Lemonade Stand: Foundation for Childhood Cancer Donate

Transcript

This transcription is provided by artificial intelligence. We believe in technology but understand that even the smartest robots can sometimes get speech recognition wrong.

We want to thank our partner, AvaSure. Over 1,100 hospitals are using AvaSure's virtual care platform to engage with patients, optimize staffing, and seamlessly blend remote and in-person treatment at scale, drive measurable outcomes, and augment clinicians with an AI-powered solution that deeply integrates into your clinical workflows.

AvaSure offers virtual care solutions supported by a secure, scalable infrastructure that helps you lead your organization into a future where cutting-edge technology at your fingertips and compassionate care converge. For more information, check them out at thisweekhealth.com/AvaSure.

 Today on Newsday.

You've got real-time flowing data, and now you take this form of AI.

And you put those things together and that can create remarkable outcomes.

My name is Bill Russell. I'm a former CIO for a 16-hospital system and creator of This Week Health, where we are dedicated to transforming healthcare, one connection at a time. Newsday discusses the breaking news in healthcare with industry experts.

Now, let's jump right in.

All right. It's Newsday, and today I'm joined by Jacob Hansen, chief product officer for AvaSure. Jacob, welcome to the show. Thanks so much. Thrilled to be here. Hey, so tell us, chief product officer. What does a chief product officer do day in and day out?

Yeah, good question.

I would say the answer probably depends on the organization. In some organizations, it's very specific to inbound market intelligence, road mapping, prioritization, and strategy. In other businesses, sometimes that'll include technologists and architecture. I've had roles like this that also included UI and UX people.

But at AvaSure, on our product team, our core focus is on gathering and aggregating inbound information from the market. What challenges do our target markets face, and which of those problems make sense for us to solve and why? How do they align to our strategy? How do they shape our strategy?

And then how do we combine that into a roadmap that provides a compelling vision to the marketplace and really guides our business going forward: which technologies to acquire, to build, to partner for, et cetera.

So you're one of the companies that's looking at computer vision, that's looking at AI with regard to cameras and those kinds of things, and bringing that all together.

I don't find health systems saying, hey, give me more AI, but I do find them asking us to solve problems that AI is uniquely suited to solve. What are you seeing in the market?

Yeah, I would say AI is a topic that we are almost constantly asked about, not necessarily from a lens of specific understanding, just a knowledge of, hey, we feel like we need to be doing something with AI.

It helps a lot to get a finer point on what that means, right? Because AI is computer vision, it's ambient listening, it's natural language processing, it's generative AI, it's machine learning. There are so many ways to talk about it. I think it's actually becoming so common to talk about now that our market is getting far clearer and smarter about what they expect.

When you think about our business, we're a virtual care platform. We call ourselves an intelligent virtual care platform. That really means we're a hybrid hardware and software solution in inpatient acute care that connects any virtual care team member into the patient room. Our primary space has been med surg, EDs, spaces like that.

That means that when a health system thinks about adding that technology to the room, they want to layer on as many use cases as possible to drive efficiencies and help their care teams succeed in their roles. And historically, virtual care, especially when we started 15 years ago, was a point solution kind of situation, right?

You were focused on one workflow, i.e., virtual sitting. And that was delivered via mobile devices, with decisions being made on a unit-by-unit basis, right? A nursing director had a budget, they would choose something.

That has changed so much and so fast over the last few years, really driven on the heels of COVID. Most of our conversations now, and most of the new deals that we're doing, are phased but still device-in-every-room kinds of conversations, where they're trying to get this platform in there so that, for nursing, we're bringing joy back to the bedside, right? Take all of these extraneous tasks that don't need to be at the bedside and optimize around a virtual sitter who can monitor many patients, not just one, and a virtual nurse who frees the bedside nurse to really stay dedicated to bedside care while they're on the unit, right?

Because virtual nurses are doing things like admission and discharge or proactive rounding, et cetera. So when we talk about computer vision, circling back to your question, for us computer vision becomes a real dedication to workflows that are either about augmentation or automation. Are we augmenting a human in the loop, i.e., sending them notifications driven by what a camera sees in the room? The best example of that would be a notification about a fall-risk patient who's starting to move their legs over the side of the bed, or an elopement patient trying to leave the room. Or are we automating?

In the case of our newest models, an example of that would be early mobility: did a patient ambulate enough, sit in the chair enough, stand and walk enough to meet a discharge requirement before being discharged? Bed flow is another example of automation, where it's not augmenting a human being who's involved with the patient; it's watching for something so that a manual task doesn't need to gobble up a care team member's time.
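To make that augmentation-versus-automation split concrete, here is a minimal sketch of how a downstream service might route computer vision events: augmentation events page a human, while automation events just update a record. The event names, threshold, and callbacks are invented for illustration and are not AvaSure's actual platform code.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical event emitted by a computer vision model watching the room.
@dataclass
class VisionEvent:
    room: str
    kind: str          # e.g. "bed_edge_legs_over", "elopement", "ambulation"
    confidence: float
    timestamp: datetime

# Events that augment a human in the loop (a virtual sitter gets notified).
AUGMENTATION_EVENTS = {"bed_edge_legs_over", "elopement"}
# Events that are automated (no human notification, just record keeping).
AUTOMATION_EVENTS = {"ambulation", "chair_time", "bed_empty"}

def route_event(event: VisionEvent, notify, record) -> None:
    """Send augmentation events to a person; log automation events as metrics."""
    if event.confidence < 0.8:   # illustrative threshold: ignore weak detections
        return
    if event.kind in AUGMENTATION_EVENTS:
        notify(f"Room {event.room}: {event.kind} detected, please check the video feed")
    elif event.kind in AUTOMATION_EVENTS:
        record(event.room, event.kind, event.timestamp)

# Example wiring with stand-in callbacks.
if __name__ == "__main__":
    route_event(
        VisionEvent("412B", "bed_edge_legs_over", 0.93, datetime.now()),
        notify=print,
        record=lambda room, kind, ts: print(f"metric: {room} {kind} @ {ts}"),
    )
```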

That's a good segue to our first story. So, automation and augmentation. The first story we're going to cover: Abridge partners with Epic and Mayo Clinic to create a Gen AI nurse workflow tool,

to be available by the end of:

The collaboration leverages Abridge's AI capabilities, Epic's development resources, and Mayo Clinic's nursing expertise. Here's the thing I like about what you were just talking about and what this is talking about: nursing. We've talked about these workflows. In fact, if we talk ambient clinical listening, about 98 percent of the conversations are going to be about the doctor and the doctor-patient visit and those kinds of things.

But the nurses and the nurse workflow, especially the nurses who have to put the information into the EHR, that's a significant portion of their day, of their time. And we can make them much more effective, take a lot of the cognitive load and burden off of them, and potentially be more accurate with some of these tools.

I like the fact that the nurses are getting more of a spotlight in terms of what we can do for them.

It's also interesting because this is not a new ask from nursing. I remember when I first started, one of the jobs I had way back when, this is pre-Baxter days, and I spent an inordinate number of hours watching nurses work as we were evaluating ways to think differently about the deployment of nurse communication tools. And I cannot tell you the number of nurses who said, hey, you've got a device here in the room with a mic on it.

Can't you use that device the way we know Dragon's working for physicians? We spend so much time documenting. At the end of my shift, I'm sitting here trying to capture these things; this would make us so much more efficient. Now, these were conversations 15, 16 years ago.

You're saying it's only taken us 15 years to hear the voice of the nurse and respond.

like, I was CIO at St. Joe's:

Now, with that being said, ambient clinical listening in the last two years has just been completely reshaped by generative AI.

I was going to say, it helps a lot that consumer-grade ambient listening products have taken leaps forward. And I think the technology components out there that companies like Abridge, in this case, can take advantage of make a huge difference. Because it's one thing to provide a tool like that to nursing to support them moving faster, but if you have an NLP model that's taking unstructured data from recorded voice and turning it into structured data and 80 percent of it's incorrect, then they're going to spend every bit as much time just correcting the incorrectly captured notes.

So I think it helps that the technology has come so far, and then I think it's also a big deal that we now have organizations with large language models that can really take aim at nursing-specific things, right? The things that a nurse talks to a patient about are different than what a doctor talks about, or some other care team member talks about.

Not completely different, but different enough that you can't take a physician-to-patient focused model, point it at a nursing discussion, and have the same level of accuracy come out of it. So I think all of those things make a big difference. We're excited about this because, again, as a virtual care platform player, one of the things that we've just released is that we've opened up access to our platform through a secure RESTful API to third-party algorithms and applications, so that we are not just a gateway for our own tools. Take an example like Abridge: they can use the hooks from our API to tie into the microphone in the room that we've installed, and now that microphone, which is part of a fixed device on the wall, in the ceiling, what have you, can allow that third-party artificial intelligence to listen. They can consume it and then, on behalf of the health system, do whatever's needed with it, right? So as a platform, seeing these types of advancements is something we're super excited about, so that more value can be captured on the back of that expense to put the technology in the room.
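As a rough sketch of what that kind of integration could look like, the example below shows a hypothetical third party subscribing to an in-room audio stream over a REST API and registering a webhook to receive it. The base URL, endpoint path, field names, and token are placeholders invented for illustration; they are not AvaSure's published API.

```python
import requests

# Hypothetical endpoint and credentials -- placeholders, not a real API.
BASE_URL = "https://api.example-virtual-care.com/v1"
API_TOKEN = "replace-with-a-real-token"

def subscribe_to_room_audio(room_id: str, callback_url: str) -> str:
    """Ask the platform to stream the in-room microphone to our webhook."""
    resp = requests.post(
        f"{BASE_URL}/rooms/{room_id}/audio-subscriptions",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={
            "callback_url": callback_url,   # where audio chunks get delivered
            "format": "pcm16",              # illustrative stream format
            "purpose": "ambient_documentation",
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["subscription_id"]

# Example (would require a real endpoint and token):
# sub_id = subscribe_to_room_audio("412B", "https://thirdparty.example.com/audio-hook")
```

A third-party documentation service would then receive the audio at its webhook, run its own speech models, and hand structured output back to the health system.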



Join us for dynamic sessions, interactive workshops, and keynotes from trailblazing women in the industry. This event offers actionable strategies and fosters genuine connections. Whether you're a health system employee or a vendor partner, SOAR provides unique networking and growth opportunities.

at bluebirdleaders.org/

What are we hearing from a privacy standpoint? So I'm a patient in the room. What kind of requests am I making with regard to either the microphone listening in or the camera watching me?

Yeah, good question. I think there are a couple of things there.

The first thing I would say, the most important, is transparency. When is a human being watching me? Who is it? How long is that happening? Why are they monitoring me? More often than not, the more open and clear we are with education at admission, the more a patient values it, right?

If they know that it's actually helping them stay safe, recover faster, et cetera, then a patient is generally very accepting of technology's role in their care. They just want to know what it's doing.

And then also, for example, in our platform there are indicator lights on the device that will show them: is audio active? Is video active? That way they know, hey, somebody's connected with me right now. And there are some other things that we're looking at to go along with this. When you think about AI continuously running for an automation-focused outcome, in that case, can we do more with a secondary sensor, something that is still computer vision but whose imaging isn't off of a traditional RGB camera?

There are lots of examples out there: thermal, LiDAR, radar, sonar, right? Sensors that produce an image that AI can interpret. We now have four, soon to be five or six, different computer vision models, and some of those could work off of technology like that. So we are exploring things that make a customer feel good about the fact that, hey, even though imaging is being captured, it's not a picture of me.

And then the last thing I'll mention is the difference between live streaming and recording, right? If all you've got is a stream of audio or video, but it's not going anywhere and not being stored anywhere, a patient generally feels a lot better about that. Whereas if an entity is capturing that and keeping it for some reason, you had better find a way to make sure it is anonymized.
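One generic way to act on that last point, making sure nothing identifiable is persisted, is to blur faces before a frame is ever stored. The sketch below uses OpenCV's stock face detector purely to illustrate the idea; it is not a description of how AvaSure or any specific vendor handles anonymization.

```python
import cv2

# OpenCV's bundled Haar cascade face detector (generic, not clinical-grade).
_face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def anonymize_frame(frame):
    """Blur any detected faces so a stored frame is not a picture of the patient."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = _face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = frame[y:y + h, x:x + w]
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
    return frame

# Example: anonymize a single image before writing it to disk.
if __name__ == "__main__":
    img = cv2.imread("room_snapshot.jpg")
    if img is not None:
        cv2.imwrite("room_snapshot_anon.jpg", anonymize_frame(img))
```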

All right, Jacob, I just want you to know that this is being recorded and we are going to release it to the public.

I came in ready for that. Thank you.

Just want the transparency there. Another story: how Tampa General is boosting efficiency through analytics. I find this one interesting because they partnered with what is really a defense contractor, Palantir Technologies, and it's led to significant improvements, including a 30 percent reduction in sepsis patient length of stay and millions in revenue cycle enhancements, leveraging Palantir's analytics.

The hospital has optimized sepsis protocols and improved charge capture by $4 million. It's interesting. We did meaningful use all those years ago, and the promise during meaningful use was, man, if we get all this data, we are going to be able to do some amazing things with it.

Do you feel like we're finally at that point where we're doing some amazing things with the data?

Isn't it funny? This is the second time in our conversation we've referenced us as an industry talking about stuff for decades. It actually makes me think a lot about population health. There was a time at HIMSS you'd walk around and population health or actionable insights was a phrase used in literally every booth you went to.

Yes, a hundred percent. Gen AI, you take an example, I actually think the article here about Tampa General mentions Copilot, things they want to keep doing with ChatGPT and Copilot as examples. There's effort that goes into extracting useful things out of data, and data sets in healthcare are massive.

There's so much that could be interpreted. But when you look at a case like sepsis, I worked with a talented woman who went into the hospital after a business trip with the flu, and it moved into sepsis within 48 hours, and she passed away. There's no reason somebody in her situation should have moved into sepsis that quickly.

Are there things we can extract from data in real time that can help avoid outcomes like that? I think many of us probably have examples like that from our own experience, hard pills to swallow, especially when we work in this industry. We don't want to use too many stories like that to try and motivate us because they're personal. But we see cases like that where we can do better. I think sepsis is a great example, and one where, yeah, there's plenty in there that AI can be pulling out. So here we have these great analytics tools, these assets.

We have one at AvaSure, a market-leading analytics asset, centralized, nationally benchmarked, all of those things. How can we use AI to bring the data to the right people at the right time to change what they do? That's the question.

The quality of the data has always been a challenge. And when you're looking at sepsis, and also computer vision, which we talked about before, these are some of the purest data sets we have.

A camera doesn't lie. A camera gives you information. You can then run models on that. You can then determine how a patient sort of moves through the system, and then you end up with historical data you can look back on. The thing I like about sepsis is the telemetry data from those devices at the bedside is beautiful.

It's beautiful data. It's just, it's a lot of it, but it's essentially a string of data all along the process. Yes. And again, historically, you can look back on things and then you can make predictions based on them. I think the beautiful thing about bringing Palantir in is that they've applied this in defense.

And they've applied this in a lot of different ways, and now I think they're making a significant move into healthcare. But you take those same techniques, those same processes of how you analyze things very rapidly, and what it requires is a steady stream of real high-quality data, and then the intelligence and the mindset and the skills to put together algorithms around that, to see the trends and see how things work out.

And you can drive things like a 40 or 30 percent reduction in sepsis patient length of stay and millions in revenue cycle improvements. The revenue cycle stuff is, I don't want to say easy, but a lot of people are doing it. Low-hanging fruit, that's for sure. Yeah, it's definitely low-hanging fruit, but it's millions of dollars, so you absolutely should be doing it.

But the sepsis improvements, that's the kind of stuff I would like those findings, those algorithms, those practices to be essentially broadcast across the entire industry.
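To illustrate the "steady stream of data plus an algorithm watching the trend" idea Bill describes, here is a toy screening rule that counts SIRS-style criteria over the last few telemetry readings. Real sepsis models are far more sophisticated and clinically validated; every threshold and structure here is only an illustrative placeholder.

```python
from dataclasses import dataclass

@dataclass
class Vitals:
    heart_rate: float          # beats per minute
    resp_rate: float           # breaths per minute
    temp_c: float              # degrees Celsius
    wbc: float | None = None   # white blood cell count (10^3/uL), if available

def sirs_flags(v: Vitals) -> int:
    """Count how many SIRS-style criteria a reading meets (toy screening rule)."""
    flags = 0
    flags += v.heart_rate > 90
    flags += v.resp_rate > 20
    flags += v.temp_c > 38.0 or v.temp_c < 36.0
    if v.wbc is not None:
        flags += v.wbc > 12.0 or v.wbc < 4.0
    return flags

def screen_stream(readings: list[Vitals], window: int = 3) -> bool:
    """Raise a flag if the last `window` readings all meet two or more criteria."""
    recent = readings[-window:]
    return len(recent) == window and all(sirs_flags(r) >= 2 for r in recent)

# Example: three consecutive concerning readings trip the alert.
stream = [
    Vitals(88, 18, 37.2),
    Vitals(96, 22, 38.4),
    Vitals(102, 24, 38.9),
    Vitals(110, 26, 39.1),
]
print(screen_stream(stream))  # True once the trailing readings stay abnormal
```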

The interesting thing will be to see what happens when that level of machine learning to extract intelligence out of data meets other forms of AI, like computer vision.

We're talking with partners now, computer vision organizations who can detect respiration rate and pulse rate from streaming video, and it's remarkable. I had it demoed with me in the room, and the level of accuracy is astonishing, right? It's looking at the patient's forehead and cheeks and extracting those readings in real time.

And if those devices are in the room anyway, now all of a sudden you've got a patient you think needs to be monitored, and it doesn't matter if they're in ICU or step-down or med surg or what have you. Now you're watching a patient, you've got real-time flowing data, and now you take this form of AI and you connect a model that's asking, is this person trending the wrong way or the right way? You put those things together and that can create remarkable outcomes.
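For a sense of how a pulse rate can be pulled from streaming video (remote photoplethysmography), here is a minimal sketch: average the green channel over a face region frame by frame, then pick the dominant frequency in the physiological band. Production systems add face tracking, motion compensation, and clinical validation; this shows only the core principle, with synthetic data standing in for real video.

```python
import numpy as np

def estimate_pulse_bpm(face_frames: np.ndarray, fps: float) -> float:
    """
    Toy remote-photoplethysmography estimate.
    face_frames: array of shape (num_frames, height, width, 3), cropped to the face,
                 in RGB order. fps: camera frame rate.
    """
    # Mean green-channel intensity per frame carries a faint pulse signal.
    signal = face_frames[:, :, :, 1].mean(axis=(1, 2))
    signal = signal - signal.mean()                     # remove the DC component

    # Find the strongest frequency between 0.7 and 3.0 Hz (42-180 bpm).
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                             # Hz -> beats per minute

# Example with synthetic data: a 1.2 Hz (72 bpm) flicker embedded in noise.
fps, seconds = 30.0, 10
t = np.arange(int(fps * seconds)) / fps
frames = np.random.rand(len(t), 8, 8, 3) * 5
frames[:, :, :, 1] += 2.0 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
print(round(estimate_pulse_bpm(frames, fps)))  # roughly 72
```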

My last question is this, and this is a curveball for you.

Gen AI. We have ChatGPT, we have OpenAI. Essentially they're the huge gorilla in the space. They have what feels like first-mover advantage. I'm not sure they did, but they somehow were able to really just catapult into the consciousness. And so they got this big lead, they're doing a bunch of stuff, but everybody's been chipping away at that. And I'm curious, talking to a product guy. You're sitting there, you're going, okay, we're the big gorilla here and that kind of stuff. And then all of a sudden you see, okay.

Big players like Google are starting to chip away, and you have Anthropic starting to chip away, and then you have Meta dropping a free version that is getting comparable results. And now you're sitting there going, oh my gosh. I realize everybody's projecting this to be a multi-billion-dollar space, but man, as a product guy, if I put you in the chief product role at OpenAI right now, how worried are you about that level of competition in that space?

Rising tides float all boats, certainly. Yeah, you think about inbound pricing pressure from platform players who do lots of other things and are just bundling this in. Okay, yeah, that's disconcerting.

We see that model in healthcare, we see that model in lots of other places, and that can be really disruptive. But if you're ahead in the quality of the thing you have, you see the added innovation and competition coming into the space as useful intelligence and information to continue to look for new ways to innovate and disrupt.

For me, as a product guy, that's exciting because it now means I'm likely to go get more money to go do the next interesting thing, right? I'm going to go seek funds to say, look, we created this, we're clearly viewed as an organization that can do really interesting things, and based on the technology we've got, here are adjacent spaces we can go pursue. Maybe there's a temptation at times to be like, ah, we're in deep trouble and it's time to pack up and take my toys and go home.

After you've raised the amount of money they've raised, you don't get to pack up your toys and go home. No, you do not. You get in trouble for that, yep, a lot of trouble. Hey, Jacob, I want to thank you for your time.

It's been a great conversation. Appreciate it. Yeah, it was fun. Thank you so much.

Thanks for listening to Newsday. There's a lot happening in our industry, and while Newsday covers interesting stuff, another way to stay informed is by subscribing to our daily insights email, which delivers expertly curated health IT news straight to your inbox. Sign up at thisweekhealth.com/news.

Thanks for listening. That's all for now.
