Data professionals would love an easy button for making clean, complete, uncompromised data to drive healthcare value. Is this more fantasy than reality? It's definitely not an easy task; it's more of a journey of figuring out how to make data continuously better. We are blessed with advanced technology, but we still have a lot of policy, governance, and people work to do. However, we have moved in leaps and bounds with the help of modern tools like machine learning, which give us the ability to move faster, gain more visibility, and essentially provide an extra set of hands, which is phenomenal in the field of healthcare.
Sign up for our webinar: How to Modernize Your Data Platform in Healthcare: The Right Fit for Every Unique Health System - Wednesday, December 7, 2022, 1pm ET / 10am PT.
This transcription is provided by artificial intelligence. We believe in technology but understand that even the smartest robots can sometimes get speech recognition wrong.
Thanks for joining us. My name is Bill Russell. I'm a former CIO for a 16-hospital system and creator of This Week Health, a set of channels dedicated to keeping health IT staff current and engaged. Welcome to our briefing campaign on modernizing the healthcare data platform with CDW Healthcare's leaders in this space, Lee Pierce and Rex Washburn. Today is episode four, The Promise of Clean, Complete, and Uncompromised Data for Driving Healthcare Value. This podcast series is going to culminate with an excellent webinar panel discussion with experts talking about how to modernize your healthcare data platform: the right fit for every unique health system. That's gonna be on Wednesday, December 7th.
Check out thisweekhealth.com/webinars and click on the link to register. We wanna thank our sponsors, Sirius CDW and Talend, for making this content possible. Now onto the show.
All right. We're gonna talk today about clean, complete, uncompromised data for driving healthcare value. When somebody saw that title, they're like, I want to know what these guys are talking about. And my first question is: is this a fantasy? Clean, complete, uncompromised data in healthcare? Is this a fantasy? Who wants to take that one?
I'll jump quickly into that. From a healthcare perspective, I can tell you that for any of the data professionals who are probably listening to this, wouldn't it be great if we could say we have an easy button to make all of that possible?
But it really is more fantasy than it is reality. So let's be honest: it's not an easy task. It's more of a journey, rather than an easy button, that we are all on, trying to figure out how to make data continuously better.
Is this the age-old marriage of, hey, we have really advanced technology, but we still have a lot of policy and governance and people work to do in order to make this work?
Absolutely. That's part of what makes it hard to get to the reality of complete and clean data. But again, it's hard work. This isn't just a technology problem, right? When you look at the number of applications and data sources a healthcare provider organization has to deal with, let alone the complexity of adding device data and external data sets, even just the internal application data that's generated, it's hard to bring all of that together in a meaningful way.
But there is hope, I guess, is what I'd say. Yes, it's a bit of a fantasy today, but there is hope.
All right, so now they're sitting there saying, don't leave me here! There's a path out, so let's start talking about the path out.
I think, you know, so far, in prior episodes, we've talked about the fabric and the modern data platform, and, right, those are some of the pieces.
So I mean, obviously with governance we have people, policies, and processes, and that's the heavy lifting. That's culture change; we can't buy that. But being able to say, I'm gonna connect to data that I just got access to, maybe new device data, maybe a new feed of third-party data, and bring that in and join it to a well-governed core data set, maybe from your EHR, maybe from other work that has been done and is sitting out there in a nice clean cloud platform, and being able to rapidly iterate and pull it together: that's the starting point.
Because historically, when I go back to the days when I was building data warehouses and data marts (I was a Kimball disciple), every mistake detected in the data cost me days and weeks. If I bring the principles of a modern data platform and fabric together, I can iterate a lot.
Then there's the tooling that's coming out nowadays, where ML is pervasive in everything, right? So machine learning is being brought to bear on data quality, master data management, and metadata management, allowing us both to see what we're pulling together faster and to detect data anomalies and do fuzzy matching: there's a Lee Pierce and a Lee P in two different systems.
We have to pull him together. We see his address is similar: ah, that's who he is. ML looking at that can move things a lot faster than I ever could writing code in the days gone by. So those are some of the pieces that are bringing it forward: the speed of the native tooling that exists in the clouds.
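The fuzzy matching Rex describes can be sketched in a few lines. This is a minimal illustration only, not any particular MDM product's logic; the record fields, weights, and threshold are assumptions, and it uses Python's standard difflib rather than a learned model:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1] of how alike two strings are."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def likely_same_person(rec1: dict, rec2: dict, threshold: float = 0.8) -> bool:
    """Combine name and address similarity to guess whether two
    records from different systems refer to the same patient."""
    name_score = similarity(rec1["name"], rec2["name"])
    addr_score = similarity(rec1["address"], rec2["address"])
    # Weight the address higher: "Lee Pierce" vs "Lee P" is a weak
    # name match, but a near-identical address tips the scale.
    return 0.4 * name_score + 0.6 * addr_score >= threshold

# Hypothetical records from two source systems.
ehr = {"name": "Lee Pierce", "address": "123 Main St, Salt Lake City, UT"}
claims = {"name": "Lee P", "address": "123 Main Street, Salt Lake City, UT"}
print(likely_same_person(ehr, claims))
```

Real ML-driven matching learns these weights from labeled pairs instead of hard-coding them, but the shape of the decision is the same.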
The completeness of data that we talk about: again, I reflect on the days when I, like Rex, was building legacy data platforms, and for various reasons, when we touched a data system, you would initially only bring in the data that was needed. That was kind of the best practice: okay, which tables do we need for this use case, for this dashboard we need to create?
Mm-hmm. Let's grab these 15 tables, because they have what's needed for this project. Those were the days of extract, transform the data, and then load it into your system. Part of what drove that, I think, was also the cost of storage. Yes. Look at the modern data platform these days and what the cost of storage is.
That's the least of the worries in taking a modern data platform approach. Because it is so inexpensive, if you touch the system, bring in as much data as you think you're gonna need for the hundreds of use cases. That's the extract-and-load part of it, and the ability to load and replicate and set up change data capture and replication is easier than ever.
That was complex code we had to write in an ETL job in years past. It's almost plug and play now, and it's cheap to store. So that's one change I've seen around the ability to truly have data as complete as possible. The first step is you've got to be able to access it and load it, and that truly is easier and cheaper than it has ever been.
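The extract-and-load pattern Lee contrasts with old ETL jobs can be sketched as a simple high-water-mark incremental load. This is a stand-in for real change data capture tooling, not an implementation of it; the table schema, column names, and timestamps are all made up, and SQLite stands in for the source and target systems:

```python
import sqlite3

# Hypothetical source system with a last-modified timestamp per row.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE patients (id INTEGER, name TEXT, updated_at TEXT)")
src.executemany("INSERT INTO patients VALUES (?, ?, ?)", [
    (1, "Lee Pierce", "2022-11-01"),
    (2, "Rex Washburn", "2022-11-15"),
])

# Target platform: land everything, no per-use-case table picking.
tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE patients (id INTEGER, name TEXT, updated_at TEXT)")

def incremental_load(watermark: str) -> str:
    """Copy rows newer than the watermark; return the new watermark."""
    rows = src.execute(
        "SELECT id, name, updated_at FROM patients WHERE updated_at > ?",
        (watermark,),
    ).fetchall()
    tgt.executemany("INSERT INTO patients VALUES (?, ?, ?)", rows)
    return max((r[2] for r in rows), default=watermark)

wm = incremental_load("2022-11-10")  # only the row updated after 11/10 moves
```

Real CDC tools read the database's transaction log instead of polling a timestamp column, which is part of why this is now nearly plug and play rather than hand-written job code.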
You guys have thrown out ML, not AI as much as ML, although ML is a form of AI. So this is one of the things that's really changed since I was involved in this. We're using ML tools to look at data, to process it very quickly, and to do tasks that used to be, I guess, programmatic for us. We used to have to write all these programs to do that, and now ML can just do it. What are some of the things ML is doing with the data?
One of the things that's also enabling it is what Lee mentioned: ML requires a lot of data, right? And it wants context. So bringing larger volumes of data into a lake or lakehouse kind of structure, that's where it starts. Now ML can sit there and say, hey, listen, I'm noticing anomalies in this field. We have a pattern that says this should be a patient name field, so why am I seeing numbers in it? Numbers shouldn't be there.
Those kinds of things. I'm seeing a lot of nulls. ML, just at the level of data quality, can start detecting things far faster than we could. I mean, in the old days, we'd probably say, if it's a null, go ahead and drop that record into an extra table and we'll analyze it at a later date.
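The two anomalies Rex names, nulls and digits in a name field, boil down to simple profiling checks once the data is landed. A minimal rules-based sketch follows; real ML-driven quality tools learn these field patterns from the data rather than hard-coding them, and the field name and sample records here are hypothetical:

```python
import re

# Hypothetical extract: a patient-name field, some rows damaged upstream.
records = [
    {"patient_name": "Lee Pierce"},
    {"patient_name": None},
    {"patient_name": "12345"},  # numbers where a name should be
    {"patient_name": "Rex Washburn"},
]

def profile_name_field(rows):
    """Flag nulls, and digits in a field expected to hold letters."""
    issues = []
    for i, row in enumerate(rows):
        value = row["patient_name"]
        if value is None:
            issues.append((i, "null"))
        elif re.search(r"\d", value):
            issues.append((i, "numeric-in-name"))
    return issues

print(profile_name_field(records))
```

Running this on every load, instead of quarantining records for later review, is what collapses the 24-to-48-hour detection lag the speakers describe.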
So we're at least 24 to 48 hours out before we even detect data quality issues. Then there's the MDM that I mentioned earlier: being able to leverage ML to say, hey, listen, this Bill looks like other Bills sitting in the system in different records. Let's bring them together. And if ML says this is a definite match, that this is all Bill, we're good.
If it's not a perfect match, the best MDM systems are also able to raise it to a steward to say, hey, these look similar; should we bring them together? So now ML is making our data stewards a lot smarter, and it's also being used to look at anomalies within the data from a security perspective as well. Lee, did I miss anything that you wanted to chime in on there?
Yeah, just one other example of the application of machine learning, as it relates to a modern data platform, is the cataloging of what data is available and what data you can then bring into the system. The cataloging tools that are available are using machine learning to scan the systems, identify the data, and make inferences around lineage and the potential relationships between data.
Even data classification: identifying what data is PHI, right? And what combination of data is a problem for PHI or not. That's something else where we are seeing machine learning applied in a meaningful way.
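The classification step Lee describes, scanning a column and tagging it as PHI, can be sketched as pattern matching over a sample of values. These regexes (SSN-style, phone-style, and an "MRN-" prefixed identifier) and the hit-rate threshold are illustrative assumptions; real catalog tools use learned classifiers and far richer pattern libraries:

```python
import re

# Illustrative PHI patterns only; these are our assumptions, not a
# complete or authoritative PHI definition.
PHI_PATTERNS = {
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
    "mrn":   re.compile(r"\bMRN-\d{6,}\b"),
}

def classify_column(sample_values, min_hit_rate=0.5):
    """Scan a column sample and tag it with any PHI category that
    matches at least half of the non-empty values."""
    tags = []
    values = [v for v in sample_values if v]
    for tag, pattern in PHI_PATTERNS.items():
        hits = sum(1 for v in values if pattern.search(v))
        if values and hits / len(values) >= min_hit_rate:
            tags.append(tag)
    return tags

print(classify_column(["123-45-6789", "987-65-4321", "n/a"]))
```

A catalog would run this kind of scan across every landed table, so columns get tagged before anyone has to know the source system's schema.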
Yeah, the ability to see where we might have risk within our data, just from the correlation of data sets we're pulling together. They may look fine separately; we may have patient names masked and all those good things. But machine learning can identify certain patterns, and all of a sudden, boom, we know who you are, even when the data's masked. So being able to pull that together can save a lot of time for security and risk officers.
Yep, and it's fascinating. Clean, complete, uncompromised data for driving healthcare value. I'm not sure we've solved the problem, but what we've described is that the tools are giving us the ability to move faster, giving us more visibility, giving us an extra set of hands.
AI and machine learning are giving us an extra set of hands, identifying things a lot quicker. Almost everything you've described means we're able to move a lot faster, which is phenomenal for healthcare people to hear, because healthcare is just moving a lot faster and the data demands on us are pretty great.
Gentlemen, I want to thank you for this conversation. Fantastic. We're gonna close this out in the next episode with a data governance framework for healthcare, and that should be a fun conversation. Thanks again for your time. Thank you.
What a great discussion. I wanna thank our sponsors for today, Sirius CDW and Talend, for investing in our mission to develop the next generation of health leaders. Don't forget that this whole series ends with a great webinar on Wednesday, December 7th. Lee Pierce and Rex Washburn will be joining us, along with Jared Nunez, Executive Director of Informatics and Analytics at Memorial Care. We're gonna take this discussion one step further by including you and your questions. So go ahead and register at thisweekhealth.com; the link is in the top right-hand corner. And don't forget to drop your questions in the form so we can make sure to cover them in the webinar. Looking forward to that discussion. Thanks for listening. That's all for now.