This Week Health

Don't forget to subscribe!

Medical AI can’t interpret complex cases yet. The arrival of multimodal large language models will start the real revolution – and real assistance. MedicalFuturist.Com


Today in Health IT: multimodal large language models in healthcare. We're going to take a look at what's coming next. My name is Bill Russell. I'm a former CIO for a 16-hospital system and creator of This Week Health, a set of channels and events dedicated to leveraging the power of community to propel healthcare forward. We want to thank our show sponsors who are investing in developing the next generation of health leaders: SureTest, Artisight, Parlance, Certify Health, Notable, and ServiceNow. Check them out at thisweekhealth.com.

If you get a chance, share this podcast with a friend or colleague. Use it as a foundation for daily or weekly discussions on topics that are relevant to you and the industry. They can subscribe wherever you listen to podcasts.

All right, as you know, we've partnered with Alex's Lemonade Stand to raise money and awareness for childhood cancer. We set a goal of $50,000 for this year, and we have exceeded that goal, but we would still love to exceed it even further. We ask you to join us: hit our website, and in the top right-hand column you're going to see a logo for the lemonade stand. Click on that to give today. We believe in the generosity of our community, and we thank you in advance.

All right, one last thing. This Thursday at one o'clock Eastern time we have a webinar, a really can't-miss webinar, on our AI journey thus far. We have Michael Pfeffer with Stanford, we have Brent Lamb with UNC Health in North Carolina, and

we have Christopher Longhurst with UCSD. We're going to talk about where we are in our AI journey thus far. If you haven't had a chance, go ahead and hit our website; in the top right-hand column, just underneath the lemonade stand logo, you're going to see the image for the webinar. Click to sign up today. We look forward to seeing you there.

All right, today I'm going to talk about multimodal large language models. I talked a little bit about this last week in terms of the AI hype cycle, the curve that exists for AI, and the things that are going on; this is one of the technologies we talked about. I found a great article on MedicalFuturist.com titled "Why It's Important To Understand Multimodal Large Language Models in Healthcare." Really well-done article, September 5th, 2023. Let me give you a little of what it says: medical AI can't interpret complex cases yet. We talked about this yesterday. The arrival of multimodal large language models will start the real revolution, and real assistance.

Okay, so I like this article because they give you a couple of key takeaways. The development of multimodal large language models (MLLMs) is crucial for the future of medicine, as they can process and interpret multiple types of data simultaneously, unlike current unimodal AI systems. This will enable comprehensive analysis in medicine, facilitate communication between healthcare providers and patients speaking different languages, and serve as a central hub for various unimodal AI applications in hospitals.

When you think about it, this is almost like having a single person who's a specialist, or a team approach to medicine. Multimodal large language models take a bunch of different models, bring them together, and give you the ability to interact with them in one way, with the system processing the request based on who the expert is: understanding which expert is going to supply the information and dishing it out accordingly. So you can have a large language model that's trained on medical coding, one that's trained on medical diagnosis, one that's trained on drug-drug interactions. I mean, you get the picture. We could have trained models in a lot of different areas. We interact with one, and it decides what part of the brain is going to provide the best resource.
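That "decide which expert answers" pattern can be sketched in a few lines. This is a hypothetical illustration only: the specialist names and keyword lists below are made up for the example, and a real hub would likely use an LLM classifier rather than simple keyword matching.

```python
# Hypothetical sketch of the "central hub" routing pattern described above:
# one entry point scores an incoming question against each specialist
# model's domain and hands it to the best match. All names are illustrative.

SPECIALISTS = {
    "coding":       ["cpt", "icd-10", "billing code"],
    "diagnosis":    ["symptom", "differential", "diagnosis"],
    "interactions": ["drug", "interaction", "contraindication"],
}

def route(query: str) -> str:
    """Return the specialist whose keywords best match the query."""
    q = query.lower()
    # Count how many of each specialist's keywords appear in the query.
    scores = {name: sum(kw in q for kw in kws) for name, kws in SPECIALISTS.items()}
    best = max(scores, key=scores.get)
    # Fall back to a general-purpose model when nothing matches.
    return best if scores[best] > 0 else "general"

print(route("Check this ICD-10 billing code"))                    # coding
print(route("Any interaction between warfarin and ibuprofen?"))   # interactions
```

In a production system, the string returned here would select which underlying model actually receives the prompt, which is the "one interface, many experts" idea the article describes.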

Again, this is a really good article, so I'm going to give you a little bit more. Here's another key takeaway: while the public debut of large language models like ChatGPT has been a resounding success, current AI systems lack the capability to process multiple types of data,

making them inadequate for the multimodal nature of medicine. The transition to MLLMs will be necessary to substantially reduce the workload of healthcare professionals. Their final takeaway: the journey is challenging but necessary to move medical AI from being a calculator to matching the supercomputers we call doctors. Okay, so let me give you a little bit of the article itself.

The future of medicine is undoubtedly, inextricably linked to the development of artificial intelligence. Although this revolution has been brewing for years, the past few months marked a major change as algorithms finally moved out of the specialized labs and into our daily lives with the public debut of large language models like ChatGPT, which became the fastest-growing consumer application. You get the picture; we've talked about this many times. Large language models will soon find their way into everyday clinical settings simply because the global shortage of healthcare professionals is becoming dire, and AI will lend a hand with tasks that do not require skilled medical professionals. But even before this can happen, before we have a sufficiently robust regulatory framework in place, we are already seeing how this new technology is being used in everyday life.

To better understand what lies ahead, let's explore another key concept that will play a significant role in the transformation of medicine: multimodality. Doctors and nurses are supercomputers; medical AI is a calculator.

A multimodal system can process and interpret multiple types of input data, such as text, images, audio, and video, simultaneously. Current medical AI only processes one type of data, for example text or X-ray images. However, medicine by nature is multimodal, as are humans: to diagnose and treat a patient, a healthcare professional listens to the patient, reads their health files, looks at medical images, and interprets laboratory results. This is far beyond what any AI is capable of today. The difference between the two can be likened to, and they give you some different analogies. At the moment, large language models like ChatGPT are unimodal, meaning that they can only analyze text. Although GPT-4 has been described as able to analyze images as well, for now it can only do so via an API. From The Medical Futurist's perspective, it's clear that multimodal LLMs will arrive soon; otherwise AI won't be able to significantly contribute to the multimodal nature of medicine and medical care.

So they go into the future. AI will handle multiple types of content: image analysis, audio, text analysis, sound, video, complex document analysis. It's going to bridge language barriers: an MLLM will easily facilitate communication between healthcare providers and patients who speak different languages. It can take the input, process it, and then output in the appropriate language, including at the appropriate grade level and those kinds of things. Finally, with the arrival of interoperability it can connect and harmonize various hospital systems. An MLLM could serve as a central hub that facilitates access to various unimodal AIs. That's what I was talking about earlier: a single interface, a single human-language interface

that instructs and harnesses the power of the unimodal AIs used to run the hospital. There could be one in radiology, one in insurance handling, one in the electronic medical record, and essentially you would have all of that controlled by one.

Let me give you the end of this article. The significant step will be when MLLMs eventually become capable of understanding the language and format of all these software applications and help people communicate with them. An average doctor will then be able to easily work with the radiology AI software, the software managing the EMR, and the fourth and eighth, etc., AI used in the hospital. This potential is very important because such a breakthrough won't come about in any other way. No single company will come up with such software, because they don't have access to the AI data developed by individual companies. The MLLM, however, will be able to communicate with these systems individually and, as a central hub, will provide a tool of immense importance to doctors. The transition from unimodal to multimodal AI is a necessary step to fully harness the potential of AI in medicine.

Again, really interesting article, really interesting concept. I keep coming back to this, first of all, because we have the webinar this week on AI, so I'm doing a lot of research on where AI is in medicine. The Medical Futurist is a great resource for that; they publish a lot of articles focusing on what the future of technology is in healthcare. I like this concept.

The so-what for us is this: whenever you're putting together an architecture, it's important to have an idea of what it looks like at the end. Right? If you were going to build a house, it's important to have an image, a vision, of what it's going to look like when it's finished. You don't start by saying, let's put some plumbing together, let's put some electricity together, let's put some concrete together, let's put that down and then see where we're at and build from there. That's a bad way to do architecture. So what I'm suggesting is that it's important to look at all these different AI concepts and try to determine what it will look like when it's done and how your organization will function.

What I'm envisioning is that you will have a lot of unimodal AI systems throughout: one in your EHR, one in your imaging platform, one, as we talked about earlier, in your coding, one in your insurance. You'll have one in each of the 800 applications you're using, a single unimodal LLM that is able to really understand that data. So let's assume there are 800 of them. What you're going to want is one master, and that one master is going to be able to take the input and then interact with the various systems. Think about how powerful that's going to be: essentially, sitting at your desk, you're going to be able to access all the patient information, all the insurance data, all the coding data, and you're going to be able to do that with natural language. It's going to respond and interact with those systems in a way that can create the discrete data in all of those elements, in all of those systems. So I think it's important, from an architecture standpoint, to understand where this is going and to design intelligently around where this is going and what it's going to look like.

All right, that's all for today. Don't forget to share this podcast with a friend or colleague, and we want to thank our channel sponsors who are investing in our mission to develop the next generation of health leaders: SureTest, Artisight, Parlance, Certify Health, Notable, and ServiceNow. Check them out at thisweekhealth.com. Thanks for listening. That's all for now.

Thank You to Our Show Sponsors

Our Shows

Solution Showcase - This Week Health
Keynote - This Week Health
2 Minute Drill with Drex DeFord - This Week Health
Newsday - This Week Health
Today in Health IT - This Week Health

Transform Healthcare - One Connection at a Time

© Copyright 2023 Health Lyrics All rights reserved