When C-suite executives at Tampa General Hospital first expressed interest in implementing Copilot, the last thing James Bowie wanted to do was to extinguish that spirit of innovation.
“They want to be at the forefront of AI, which is a great position to be in,” he said during an Unhack the Podcast interview. But for a CISO, “it’s scary.”
And so Bowie, who has been in the role for two years, responded by asking for two weeks to “lock everything down.”
Why? Because when Copilot goes live in an environment, it operates with whatever permissions each user already holds, which means that if a document has been improperly shared, such as a patient census spreadsheet or a disciplinary report, “everybody has access to it,” he said. “You may not know you have access to it, but Copilot is going to know as soon as you turn it on. And that allows you to make mistakes on an exponential scale.”
That, of course, is something no one wants. Fortunately, Tampa General’s leaders listened – and quickly learned. While demonstrating Copilot, Bowie asked it to find documents containing nine consecutive digits; it immediately surfaced an employee file with Social Security numbers that had been inappropriately shared.
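The kind of scan Bowie describes can be approximated outside Copilot as well. A minimal Python sketch of that idea, flagging files that contain a run of nine consecutive digits (the directory layout, file extension filter, and function name here are illustrative assumptions, not Tampa General’s actual tooling):

```python
import re
from pathlib import Path

# Nine consecutive digits, not embedded in a longer digit run --
# the pattern that catches unformatted Social Security numbers.
NINE_DIGITS = re.compile(r"(?<!\d)\d{9}(?!\d)")

def flag_files(root: str) -> list[str]:
    """Return paths of text files under `root` containing a nine-digit run."""
    flagged = []
    for path in Path(root).rglob("*.txt"):  # extension filter is illustrative
        text = path.read_text(errors="ignore")
        if NINE_DIGITS.search(text):
            flagged.append(str(path))
    return flagged
```

Production data-loss-prevention tools go further, also matching formatted variants such as 123-45-6789 and validating number ranges; this sketch only catches unformatted nine-digit runs.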
And so, the cybersecurity team got to work, engaging Varonis to resolve roughly 3 million incorrectly shared file permissions and establishing an automated process that triggers alerts and remediates attempted break-ins. “It’s been a win,” he noted.
It has also marked a dramatic departure from the past, when cybersecurity was “an offshoot of IT,” staffed by individuals who were technically skilled but lacking in communication skills. Despite their efforts, cyber folks were often viewed as “no-bots” who constantly rejected requests, or would “come in and shut everything off,” Bowie said. “It was a bunch of nerds in closets trying to keep things safe.”
As a result, he sought to rebuild that image by developing a stronger sense of empathy among his team. “The idea was, ‘we’re here for you. We’re not just here to tell you what you can’t do. We’re here to help you to be safer all around,’” he noted. And that meant “being nice until it’s time to not be nice.”
With that foundation of trust in place, the next step was to build buy-in. His team chose to focus on Active Directory, which has become a common target. “If they’re in, it’s over. They’re going to get your domain,” Bowie said. “They didn’t understand how important that is.”
His team took action, investing $20,000 to have a vendor demonstrate how easily Active Directory can be compromised using LLMNR (Link-Local Multicast Name Resolution) poisoning attacks.
The plan worked perfectly.
“As that class was going on, requests for changes were coming through to disable this and that,” he recalled. “It was a huge win. For $20,000, we probably reduced $5 million of risk in one weekend.”
The exercise also helped prepare them for real events, such as the ransomware attack that threatened the blood supply. “That’s an issue on its own because you already have the PHI components and issues with logistics, but they literally couldn’t deliver blood. That will shut down a hospital [especially a Level 1 trauma center] very quickly.”
Fortunately, the incident command center was immediately activated. But with that comes difficult decisions about which systems to bring up. “You constantly have to weigh options,” Bowie said. “I can sit here as a cybersecurity expert and say, ‘No, we’re not turning this back on,’ but there is a chance that someone could die” if the wrong decision is made.
Although it’s not an easy conversation to have, it’s an extremely important one. “It woke my team up quickly,” he continued. “It helped drive the impact home that yes, we’re dealing with bits and bytes, but at the end of the day, those bits and bytes have people attached to them with real consequences.”