We should embrace the human side of cybersecurity, where knowledge and vigilance become our most potent weapons in the age of digital health. In this episode, Esmond Kane, a CISO at a leading healthcare organization, sheds light on the critical role of human risk management in safeguarding our digital lives. He emphasizes that no matter the technological advancement, the human touch will always be among our most potent weapons against threats. Cybersecurity is a shared responsibility, and together, we can forge a safer digital future. Tune in now and learn more about cybersecurity in healthcare!
Listen to the podcast here
Securing Digital Health: A Journey Into Healthcare Cybersecurity With Esmond Kane
Here, we equip you with the tools and insights to safeguard against digital threats arising from unintended human error. Join us as we dive into cybersecurity threats with Esmond Kane, Chief Information Security Officer at Steward Health Care, an international healthcare delivery organization serving millions of patients annually. Friendly Fire gives you the knowledge to defend your digital realm.
Esmond helps clinicians and leaders deliver world-class care aligned with cybersecurity frameworks. He has many years of experience leading IT and security programs in multiple industries. Before joining Steward, he was the deputy CISO at Partners HealthCare (now Mass General Brigham) and served in various roles for many years at Harvard University. In his spare time, Esmond likes to fret about his spare time and annoy people who read bios, and I am 100% the target of that annoyance. With that, welcome to the show, Esmond.
Thank you for the platform. I’m looking forward to the conversation. Please note that these thoughts and opinions are my own and do not necessarily reflect those of my employer.
When we were prepping for the interview, I saw that tease and was like, “I’m leaving that in there,” because it’s such a wonderful bio tease. It’s fantastic. I don’t always like to dive into having the guest give their background. Let’s take a few seconds and talk to me about how you got into cyber, the TLDR of your career. I know we mentioned some of the companies and places you’ve worked, but a little bit of the why you chose to go down the path you’ve gone down in your career.
The accent tends to come and go, but I’m originally from the West Coast of Ireland, a small rural town called Ballina in County Mayo. I guess it’s where Biden’s ancestry hails from. When I grew up in the ‘70s and ‘80s, it wasn’t a hotbed of computer science. It wasn’t where you would go to pursue those more technical avenues. It was sports-mad, crazy. I was somewhat of an outlier and very much gravitated towards things like science fiction and the burgeoning science around computers, which was fascinating to me, but it was at a distance.
When I got the opportunity in the mid-‘80s to get hands-on, DEC (Digital Equipment Corporation) was a big presence in Ireland at the time. At that time, there were a lot of Apple computers. IBM was getting involved. The IBM PS/1 was a thing, and there were a lot of things like the Commodore systems and the ones that were unique to the UK and Ireland, like the Sinclair ZX Spectrum.
I developed a natural aptitude for this, back when you had to type code in from magazines to watch pixels move around the screen. When I got the opportunity in the ‘90s to specialize and gravitate towards it, I took to it like a duck to water. I remember all of the early systems like Archie, Gopher, and the heavy use of Telnet. In the UK and Ireland, BBSes weren’t as prevalent as they were in the US because you paid per minute for some of these dial-up calls. I missed some of that Cult of the Dead Cow war dialing. I have very fond memories of reading about it, but certainly, I didn’t have the pocket change to get involved.
In the mid-‘90s, I had my own company. The internet was just becoming a thing. I have fond memories of things like HotWired, Mosaic, and some of the early systems. Everybody remembers the walls people wallpapered with those free trial CDs. It was crazy. For the last many years, I’ve been very lucky to work in IT and security. Everybody works in security, in my opinion. It’s everyone’s responsibility; some of us have chosen it as a profession and some had that profession chosen for us, but it’s no end of fun. It rewards curiosity and perseverance.
Everybody works in security. It’s everyone’s responsibility; some of us have chosen it as a profession and some had that profession chosen for us.
I’ve been very lucky to work in academia and healthcare, where you’re advancing science, human knowledge, or the ultimate mission of helping cure patients. They get to leave our institutions with a squirming bundle of joy, or they get to ring that bell when they finish a particular treatment. It’s something I find a lot of value in. Similarly, we’re also there as part of that first response when that bundle of joy perhaps stills or they don’t get to ring that bell.
That’s also the other aspect of healthcare. COVID has been challenging. Healthcare, in the last few years, has gone through a change. Some of it was necessary, and some of it was forced upon it. We had to meet our patients where they were, which was in the home. We had to change the delivery of some of our systems. We even had to make some changes for our workforce. Whether it’s cybersecurity as a discipline or IT as a calling, I find a lot of value in healthcare. If anybody’s interested in a career, reach out to me. I’m happy to walk you through the whys and the hows.
I’ve not had the benefit of having worked on the healthcare side of cybersecurity, but I have worked in academia, so I have experience there. It’s interesting. Given your background in healthcare and academia, you have a unique vantage point, a certain colored lens on cybersecurity, its impact, and its cultural benefits that others in the audience may not have. If I were to ask you about the primary difference in being a CISO, a cybersecurity leader, or an individual contributor in healthcare versus academia versus traditional enterprise companies, what would be the 1 or 2 key things where you’d say, “These are the drastically different deltas between those three universes?”
I’ll start with academia. I worked at Ivies for ten years. Nobody chooses a college based on the caliber of its cybersecurity or infrastructure. Students choose it because of the caliber of the faculty and their fellow students. As an IT or security practitioner, that was something the faculty and the students would tell you explicitly: “We don’t want security. We’re more interested in breakout rooms, big screens, and computational access” [than we are in multifactor authentication].
To overcome that pushback, a lot of it was collaborative. It was socializing controls. A lot of our researchers didn’t want multifactor authentication. It was inconvenient. That lasted until you had those notable incidents where the high-performance computing clusters, which had once been wide open, got compromised. The electricity bills were creeping up everywhere, and all of a sudden, it made sense to talk about multifactor.
Some research efforts were cutting-edge. They were involved with different aspects of the federal government, and there were expectations of a level of maturity that wasn’t common in academia. There were research projects in computational science, biotech, and stem cell research. Some academic institutions have nuclear reactors on site. They’ll have various biohazard disposal mechanisms and things of that nature. If they are involved in animal research, there are other concerns; my former academic institution is going through a review of how it handles some of its cadaver research, which has not been well governed. To answer your question directly, academia requires a lot of socialization, collaboration, and education. It required being able to react on the spot and be creative in trying to improve the status quo so you could appeal to their own interests.
I remember, at one point, for a common control like encryption, we got a sixteen-page diatribe from some of the faculty on why it was going to interfere with their academic progress and freedom. I sympathized with what they were trying to do, but by that same measure, you have to be able to combat things like device theft and misplaced devices. It’s a common control and, if nothing else, it prevents someone from reading the dissertation or research before you get to publish it.
When I think of healthcare, when I was working with the School of Dental Medicine or University Health Services, there was certainly an easing of my way into the process around healthcare before I dedicated myself to it. As much as no one chooses a healthcare organization for its reputation on cybersecurity, there’s an expectation that HIPAA has enshrined.
It’s several decades old. The OCR has enforced it since about 2004 or 2005, and there’s a set of common practices that everybody’s expected to have, things like minimizing privilege: “Make sure that you’re minimizing any exposure to data, and you’re only providing the bare minimum that they need to conduct business.” These common criteria or principles can be articulated in many different ways. The systems that you’re developing or implementing to achieve those controls are built around confidentiality, integrity, and availability, a lot around that trifecta, which is behind everyone’s security framework.
Healthcare in the US has been developing. You’ve had things like meaningful use. They’ve been developing patient-centered electronic systems for well over a decade at this point and the federal government incentivized that creation of converged electronic systems. IT was identified as becoming more critical to the delivery of patient care.
Some thrilling systems were and are leaving research labs. We’ve already seen some of those medical imaging platforms aid early diagnosis. The next generation of that, whether it’s clinician decision support tools driven by AI or things of that nature, it’s a great time to be involved in healthcare.
If you look at the vaccine for COVID, the mRNA one, it’s such a ripe opportunity to target other kinds of pathology and symptoms, in particular, personalizing medicine. There are trials for curing AIDS and various other kinds of vaccines. It’s a thrilling time to be working in healthcare. To go back to your question, like healthcare or academia, you need to be prepared to socialize. You have a little bit of a higher ground or expectation of a higher compliance framework, but it still behooves you to socialize, collaborate, communicate, and educate. Those are the soft skills that will guarantee success in both academia and healthcare.
I could compare and contrast academia to healthcare for the rest of this interview, the whole thing based on this question alone. I’m going to shift a little bit and focus on healthcare because the answers are going to be similar in some ways, which we’ll see by the end. When we’re talking about healthcare, we’re talking about a relatively open system. When we’re talking about academia, we’re certainly talking about a more open universe, system, and collaborative source.
To some degree, healthcare is as well, but they have to be able to tie things down a lot tighter for patient records and things like that. Can you talk to me about insider risk within academia? Insider risk is difficult to quantify because of what is seen inside a largely open system. Let’s set that debate aside and focus on insider risk inside of healthcare or risk that comes from unintentional human mistakes inside of the healthcare environment. How do we go about getting our arms around that problem? Is it a cultural thing? Is it an education thing? Is it an incentive-based problem within the healthcare organization to try to level up cybersecurity in that arena?
Let’s start with a data-driven approach. Protected health information comprises about eighteen different identifiers as defined by HIPAA. It’s a very large definition. It includes things like date of birth and address, but also things like diagnosis and some of the event dates associated with patient care. There’s typically a care team associated with caring for that patient. There’s the doctor and some of the nurse practitioners. There might be a pharmacist involved. There may be a registrar involved in checking someone in and out. Those are typically the four minimums involved in every single patient transaction. The fifth one is one you don’t typically think of, which is the payer, the insurance carrier. That’s reconciling and making sure that the hospital continues to be able to provide a high level of quality care to that patient.
If you go look at Carnegie Mellon’s Healthcare Datamap from 2008 or so, it’s interesting to see the sheer number of parties involved in any single healthcare transaction. Some of that is going to get more complicated with all of these data analytical platforms that are starting to be introduced across healthcare. Realistically speaking, you still need to make sure that you’re only exposing the minimum necessary data to authorized parties and to have good access control mechanisms. When an access determination is being made, you’re including role, entitlement, and device hygiene, and you’re doing some step-up or multifactor authentication when it doesn’t look like you can prove the identity. There are other considerations, including location.
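The access determination Esmond describes can be sketched as a small policy check. This is a simplified illustration, not any specific product's API; the attribute names (`role`, `device_compliant`, `location`) and the care-team roles are hypothetical stand-ins for what a real identity and device-management platform would supply.

```python
# Simplified sketch of a "minimum necessary" access decision for PHI.
# Attribute names and roles are hypothetical; a real system would pull
# them from IAM and device-management (MDM) platforms.

CARE_TEAM_ROLES = {"physician", "nurse_practitioner", "pharmacist", "registrar"}

def access_decision(user: dict, patient_record: dict) -> str:
    """Return 'allow', 'step_up', or 'deny' for a PHI access request."""
    # Role and entitlement: only care-team roles may touch patient records.
    if user["role"] not in CARE_TEAM_ROLES:
        return "deny"
    # Minimum necessary: the user must be on THIS patient's care team.
    if user["id"] not in patient_record["care_team"]:
        return "deny"
    # Device hygiene and location feed a step-up (MFA) decision.
    if not user["device_compliant"] or user["location"] not in ("on_site", "vpn"):
        return "step_up"  # require multifactor authentication
    return "allow"

nurse = {"id": "u42", "role": "nurse_practitioner",
         "device_compliant": True, "location": "on_site"}
record = {"care_team": {"u42", "u7"}}
print(access_decision(nurse, record))  # -> allow
```

The point of the sketch is the ordering: entitlement checks deny outright, while contextual signals like device hygiene and location degrade gracefully to step-up authentication rather than blocking care.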
The other aspect of this is monitoring. You hope to protect yourself from unauthorized access. You hope that only the minimum necessary information is exposed and that you can prove it was to an authorized party. When it’s not, you then need to make sure that you’re minimizing that exposure. You’re running suspicious activity reporting tools that are checking for things. I remember one example from when Madonna had a baby, who is no longer a baby anymore!
I remember talking to fellow IT practitioners where that baby was born. Every single time something like that happens, and this was thankfully not a system I worked within, there’s typically someone who gets let go because they think they can look up that patient, and that’s not the case. You’re not part of the care team. You have no business to be involved in looking that up whether it’s a celebrity or a notable individual associated with a particular event or it’s an athlete or politician. When that occurs, you want to make sure that you detect it pretty quickly, shut it down, and take the appropriate action.
Compliance is a huge part of information security in healthcare, and responding to those compliance incidents to make sure that the offending parties are terminated or reassigned is a huge element of this. Insider threat is something we deal with, whether intentional, unintentional, or inadvertent disclosure. If you go look at the HIPAA wall of shame, for the first ten years, it was device theft and misplaced paperwork. It was misdirected emails and faxes. Those things do still happen.
At a bare minimum, these kinds of incidents are an opportunity for reeducation. Every single incident is reportable, but once you get above the 500-individual threshold, OCR requires you to go on public record within 60 days and be a lot more public about what happened. In the first ten years, the wall of shame was these old-school ways of losing data, like theft and loss. In the last few years, there’s been a phase change. Now it’s application and network security. It’s a lot of ransomware, but some of that is occurring through things like phishing, accidental disclosure of credentials, or reuse of passwords.
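The reporting tiers Esmond mentions can be captured in a few lines. This is a deliberately simplified sketch of the HIPAA Breach Notification Rule's thresholds (breaches affecting 500 or more individuals trigger prompt public notification; smaller ones can be logged and reported annually); a real determination also involves a documented risk assessment, so treat this as illustration only.

```python
# Simplified sketch of HIPAA Breach Notification Rule reporting tiers.
# Real determinations require a risk assessment; this only models the
# 500-individual threshold and the 60-day clock discussed above.

def reporting_obligation(individuals_affected: int) -> dict:
    if individuals_affected >= 500:
        # Large breaches: notify HHS/OCR (and prominent media outlets)
        # within 60 days of discovery; the breach is posted publicly
        # on the so-called "wall of shame".
        return {"notify_hhs": "within 60 days of discovery",
                "public_posting": True}
    # Smaller breaches: still reportable, but may be logged and
    # submitted to HHS within 60 days of the end of the calendar year.
    return {"notify_hhs": "annual log", "public_posting": False}

print(reporting_obligation(1200))  # large breach -> public posting
print(reporting_obligation(40))    # small breach -> annual log
```

The asymmetry is the point: every incident is reportable, but the 500-individual line is where an organization's exposure becomes a matter of public record.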
It’s the threat model that has shifted underneath you in many ways.
Some of it is preventable. I spend a lot of time educating my community on how to detect a phish and how to get better at doing it. The irony is, as you indicated, it’s evolving. Until recently, things like FraudGPT or WormGPT didn’t exist. The actors can now translate something perfectly, whereas previously you would’ve detected poor grammar or punctuation. It’s now being perfectly written by all these GPT tools.
Isn’t a piece of this also the education of your audience? If we were to stereotype both academics and doctors, they tend to be very headstrong, want certain things, and have an idea of what they want to do. How do you incentivize a doctor and academic personality or a professor? How do we incentivize them to do the right thing culturally? Let’s take ransomware as a topic without diving into the risks of ransomware in the healthcare universe. We can talk about that later as well. Part of that is educating the audience, the doctor, and the academic to do the correct thing at the correct times. How would you impact the cultural shift and the cultural change from an incentive standpoint for those audiences?
It takes an army. It requires clear guidelines, clear communication, and as much communication as you can handle early and often. It requires speaking to the limitations of what they can do. You have to be nuanced. When you have a clear obligation to comply, you can’t bend. Some of this is, for example, think of a physician who’s on holiday and needs to access the systems. That physician wants to treat patients. It’s their calling. How do you facilitate the remote care when traveling?
The last thing you want to do is to get in the way of that patient care prerogative. Sometimes, unfortunately, there are things that we need to do to be pragmatic around the risk. Typically, that involves a conversation. To your point, it involves speaking these principles of strength that you can build upon, the same way that a clinician has a duty of care. They walk into an OR or they prescribe medicine to a patient.
They’re not going to be expecting to use every single scalpel out there or have their patients use every single medicine out there. There’s a set of prescribed tools that the doctors are using to treat their patients. There’s also a set of hygiene practices. Every single doctor is very familiar with the principle of an apple a day. These are the things that we try to reinforce. These are the routines and checklists that we do. If all of a sudden, you’re in Mexico and you’re trying to log into our EMR, I can’t trust you are who you say you are. You sent me an email from your personal address. That’s not something I can accept.
You should have let us know in advance. Somebody should be able to prove that you are who you are through a voice or a video, preferably a video call. Hopefully, there’s an admin who knows, for instance, that they were expected to be traveling. These are examples of things you can do to appeal to base interest to engage the community at their level but also to take the discussion into realms that they will understand. It’s about integrity, proofing, and making sure that there are authorized, sanctioned avenues and tools. They understand that.
I appreciate that approach and vantage point. I’m a professor at a college of a four-year university as well. I get the trade-offs that are required there. In your discussion there, you mentioned the word cyber hygiene. In our prep calls, we discuss pretty heavily cyber wellness. Can you help me understand the difference between cyber hygiene, which in cybersecurity circles has been somewhat positively received, somewhat negatively, and everybody has their opinion on it? What’s the difference between cyber hygiene and cyber wellness, and does it matter?
When I think of cyber hygiene, I think of established common practices that you’re doing, things like picking a good identity mechanism, good passwords, MFA, and keeping systems up to date. I think of principles like minimizing data exposure and least privilege. These tend to be transactional. You may do them once a month or once a week. If you’re sending information to a supply chain provider or you’re consulting with someone externally, don’t accept it on trust; try to validate it where possible. For me, the concept of cyber wellness is more about instilling these practices until the behaviors are instinctive. I tend to think of it like exercise: you can go run once or twice a week and hopefully meet some of your recommended goals. Maybe you can get your daily “five a day.”
The thing about wellness is exercising these things frequently enough to make them muscle memory. Cyber hygiene has one core tenet around incident response, which is what you do when something hits the fan. Cyber wellness has a corresponding concept, which is preparing for failure. It is accepting that your system needs to be resilient and that user behaviors will drift, so you must try to reinforce them. It’s not necessarily negative reinforcement. You’re trying to encourage good practice and behavior.
When I think of the two, cyber hygiene may rely on the occasional trip to the ER, where you need to get treated with some element of cyber “Narcan.” Cyber wellness is where you’re trying to get to mindfulness. You’re accepting and treating people as humans who may fail, but you reinforce and exercise their instincts and their strengths, with positive reinforcement where you can. There’s more there around resilience.
I love the delta there. I’ve even seen analogies made or storylines that have gone along the lines of continuous assessment of your environment and not necessarily looking for attacks or threats, but understanding whether you are operating healthily. That could mean the configuration state of certain things, but it also could mean if your processes are appropriate. Do you have the right people in play? It is taking a more holistic 30,000-foot view of the cybersecurity environment.
I want to take another double-click back into something else you brought up, the COVID storyline. COVID has impacted everything across businesses, healthcare, academia, and everything we do. It certainly has, in my opinion, shifted, or at least had somewhat of an impact on, how cybersecurity operates in these environments. The impact on healthcare is fairly clear even to an outsider like me in the sense that we are now doing a lot more telehealth. We’re a lot more remote and distributed, and not necessarily coming into an office. You couldn’t come in for a while simply because there were so many sick people that it was an environment you couldn’t go to. How has that shift in healthcare impacted how we do cybersecurity in healthcare?
There was already a journey that most healthcare was on, this journey to a smart hospital and towards digital health, which involved removing some of these barriers to entry that it was in the home. It was more convenient for the patient. It was promoting not just patient portals but mobile applications and things of that nature. That was already occurring. To your point, during COVID-19, all of a sudden, you couldn’t go attend your PCP. You couldn’t sit in a waiting room. It was a potential super spreader location. You have to minimize the potential for the patient to pick something up that was unpleasant. You were postponing some of these more elective procedures. Rapidly, the whole industry had to flip into this telemedicine mode. They had to accelerate what they had already been planning.
I was very lucky my organization was already thinking in this space and it didn’t require too much. We were already traveling down the route and found a great partner with Microsoft around this digital health effort. By that same measure, we also have to flip not just patient care, but the workforce had to also go work from home. They needed pervasive access to some of these systems that were traditionally on-prem. You had to create bridges. You had to scale out a lot of your remote teleworking solutions.
You may have had a digital health effort that you could accelerate and scale. Maybe you partnered with a cloud vendor to do this. There were lots of conversations in March and up until May and June of 2020 to get these out there, hopefully with vendors you’d already opened a conversation with. Digital health technologies, by their very nature, are a little bit more convenient. You’re leapfrogging that barrier to entry with different conferencing applications. We saw a massive uptick in adoption through COVID. It was relatively accessible and user-friendly for a certain population.
The federal government had to assist as well because there’s a certain population that may not have convenient access to some of these devices. They may not have a smartphone with a camera. In some rural areas, they may not have access to a high-speed internet connection or may not be able to afford one. Our colleagues in the federal and state space created depots at locations like churches and libraries, sites that could provide access to the technology so patients could avoid an on-site PCP visit.
By that same measure, I described the teleworkers earlier. They had to learn to become accustomed to people who only existed from the belly on up. No one had legs anymore. I describe it as we all turned into the British reporter whose children came barrelling into the room. He’s on camera. We all got used to those kinds of interruptions. You may have been speaking to your team or senior executives and your spouse or partner may have opened up the door and said, “Can you take the dog for a walk or can you do the dishwasher?” There was a cultural change there that both the patients and the teleworkers had to become accustomed to. By that same measure, there was still frontline staff coming into our hospitals. They had to isolate themselves. There was a lot of uncertainty around PPE.
Some of them were being kicked out of temporary locations because nobody knew what the initial diagnosis and some of the threat factors associated with COVID were. It was very unpleasant for frontline workers for a long time. There are documentaries out there on it. They’re horrendous, and we owe them all a tremendous debt of gratitude for everything. We all got bored of being on Zoom calls and stuff all the time. They didn’t have that luxury. They still had to come on-site to conduct emergency procedures to maintain those emergency rooms.
We owe front liners a tremendous debt of gratitude for everything.
Those are the ones that were wearing 2 or 3 masks and different kinds of PPE for 12 or 18 hours at a time. Digital health complements it. Hopefully, we are empowered. We’ve leapfrogged the adoption curve. It’s here to stay as I also think remote work is here to stay. It’s a lot more user-friendly and accessible. Care in the home has better outcomes. I’m not a clinician but, certainly, I found the experience a lot more pleasant as a patient myself.
With the increasing adoption of connected health technologies, how do you balance convenience for patients with the need to maintain robust security measures? What I would equate that to on the enterprise side is the balance between user experience, ease of use, and security controls. How do you find a balance of ease of use for all the patients in the medical facility with the security controls that inevitably are going to make it more difficult for them to act, make decisions, and get access to their docs and all those kinds of things?
It’s a fine balance. It’s a tightrope walk. Bleeding edge means something else entirely in healthcare. We can’t necessarily adopt all the latest and greatest; they do come with some peril alongside the promise. It requires careful analysis, a lot of collaboration with our colleagues, a focus on patient quality and outcomes, and discussions with the particular divisions that focus on that. We have to make sure that something is effective across all the different socioeconomic and population health management demographics.
You want to put technologies in play that are convenient and easy to use, but to your point, that aren’t insecure. I do worry about that kind of technology adoption. The OCR did a great job communicating upfront that it was okay to take some shortcuts through COVID, but you had to be able to unroll those. At the beginning of May 2023, when they rang that bell and said it was time to retire the emergency, they gave us until, in some instances, August or December to get rid of some of those public cloud systems or applications that maybe didn’t go through review.
You took a shortcut but then you had to reverse course. There’s always a concern around regulatory compliance, patient quality, and those kinds of things. Inherently, health and safety and risk management are a conversation. You’re appealing to base interest, as I described earlier. The concern I have is these latest technologies and trends. There’s such hype and over-marketing associated with some of these things that they’re not considering some of the downsides, the bias that might be inherent in these, and the data mining associated with them. Everybody may have jumped into the latest AI platform, but if you were using it to analyze patient information, we shut that down very quickly.
With these latest technologies and trends, there’s such hype and over-marketing associated with some of these things that they’re not considering some of the downsides: the bias that might be inherent in them and the data mining associated with them.
We had to because, in the adoption of these personal and non-corporate systems, there was no business associate agreement or patient privacy and confidentiality consideration. In some instances, some of the agreements you were clicking through were giving up rights to that data. You no longer owned it. There’s a great episode in season six of Black Mirror where you circle the drain on what you’re clicking through in an agreement. It’s a hilarious episode about the fact that you do need to pay more attention. At the federal level, some of these privacy obligations and security considerations aren’t well enshrined. They’re not available pervasively. They exist in certain states, but not from a national, strategic perspective. The onus is on the practitioners and the patients to exercise care. Sometimes, that’s hazardous.
I haven’t watched the season of Black Mirror, but I definitely should, given that endorsement. I want to talk a little bit about medical devices and products. Healthcare as an industry has a significant or potential risk level about using third-party medical devices and products. How do you ensure that cybersecurity is integrated into the design and development of third-party products that you aren’t developing or building yourself? How do you recommend healthcare CISOs go ahead and implement something that helps mitigate the risk of that problem?
Whenever I started in a new position, some of the first people I reached out to were compliance, legal, development, and business analytics, but also the procurement team. The people who are responsible for onboarding vendors and bringing solutions to the table are incredibly important to establish a relationship with, so you can start to define criteria for how you assess these devices when they’re coming off the assembly line into your production infrastructure.
There’s an element of this where the medical device manufacturers, the GEs, the Siemens, and those kinds of companies, got a lot better. They’ve gotten religion over the last couple of years. Beginning in October 2023, the FDA requires them to have a software bill of materials and a more established plan around vulnerability and patch management. It came out with binding requirements, so manufacturers can no longer counter that with this concept of shared responsibility.
They’re expecting the device manufacturers to do a better job of making things secure by design. It’s not just the big ones, the MRIs and the CT scanners, that are in medical facilities. Now, you’re also seeing wearables and implantables. To a certain extent, the FDA is assessing them, but the FDA isn’t necessarily getting involved in things like various unnamed watch manufacturers who are doing EKG monitoring and things of that nature. We had to intervene in some of these in-home genetic testing kits as well. They had to step their game up. It’s something we hope our colleagues at the FDA do a better job of. I’m thrilled by the conversations I have with these medical device manufacturers.
In my opinion, the issue is that it’s great for the next generation, but what do you do with the several decades of equipment already out there? That’s where I have cadences with these vendors. I have had discussions with them: “If you’re going to bundle something to prevent tomorrow’s problem, can you also make sure it helps us with the one we bought 5 or 10 years ago?” It’s very self-serving for a manufacturer to ship a device with Windows XP or Windows 7 when they know upfront it’s got a 5-to-7-year lifespan. They don’t do a good job communicating that there’s inherent obsolescence as part of that device lifecycle. As a consumer, I’m going to run that thing until it falls over. I want to benefit from it. It’s like driving a car. Once it gets to 5 years, 60 months, or it’s paid off, I’m at a profit.
Those are the same kinds of things you build around: education, and establishing these assessments, asset management, and visibility platforms. A lot of it is telemetry monitoring for when things go wrong, and making sure you’re correcting any inaccuracies. It’s continuous, to the point you brought up earlier. You need clear policy guidelines. You need established lines of communication. Those lines of sight into when something goes wrong are getting better at the federal and state level. It remains to be seen what we do with the legacy tech. That’s very much a hallmark of healthcare.
I could dive into SBOM discussions with you, given its hotness in the market concerning efficacy and things like that, but we will set that aside in favor of a question I want to end with. It’s a very positive question, and hopefully your answer will be a very positive one. You have had a long and successful career in security and have worked in many different positions and roles. What advice do you have for other CISOs, or for folks who are up and coming into the CISO role, who are looking to enhance the human side of cybersecurity within their organizations? In particular, what are the biggest a-has you learned over your career, where you could say, “If you do these couple of things, you’re going to be better off than you were before”?
People, process, and technology. No amount of technology is going to compensate for a broken process that no one’s reviewing. It’s something I’m very conscious of as a security practitioner. You can aggregate all the logs on the planet, but if nobody’s reviewing them or generating alerts from them, you’ve merely complied. You’ve satisfied a compliance prerogative but have not benefited from any security. To your point, there is due care owed to the human firewall. Starting with the workforce, you need to create those liaisons, champions, and outreaches so that you’re creating a rising tide and the cultural change starts from the bottom up. You also need to reach out to your C-Suite. You need to make sure they understand the value of cyber risk management and of being proactive.
There are lots of statistics you can lean on there. The goal is never to eliminate risk. It’s to manage it so that you can maybe take on more, and so that you have solutions that help protect you should things go wrong. Teaching the C-Suite to engage early and often is also a huge element of this. On the workforce side, we hired hundreds of people over the last couple of years. One of the things I thought was great through COVID was this focus on a diverse workforce.
Appeal to as many people as you can. There’s still a huge debt when it comes to the workforce. There are more jobs than there are security people. Some of that is because people are creating ridiculous requirements to come on board. There is a lot to be said for curiosity, passion, ambition, and perseverance. Having that backbone and sticking with something is huge. Beyond hiring, I, as a leader, need to focus on retention and on sustaining an environment for my staff, whether that’s competitive salaries, continual training, or creating career development roadmaps along which people can grow.
Some people don’t want to become leaders or managers. They want to become distinguished engineers, and that’s fine. You want to create that environment. In summary, the prerogative around collaboration and knowledge sharing starts with awareness and education. I do think you need to meet people where they are. You need to reinforce at a point in time, with a snapshot. If someone reports a phishing email, let them know as soon as possible whether it was a simulation, and reward them.
You need to meet people where they are.
Create these champions and recognize when people are doing things well. Don’t just focus on the negatives. There’s some value in fearmongering and scaremongering, but if you are the bearer of bad news all the time, nobody’s going to want to talk to you. For us to encourage our workforce and change culture, it’s about vigilance. That’s about awareness, positive reinforcement, and building on your strengths, not focusing just on the negatives. Things have to change, but turn it into a positive.
I love that sentiment. Thank you so much for coming to this show and for answering my difficult questions. I appreciate you spending a little bit of time with us.
Thank you very much. Readers, find me on LinkedIn if you want to continue the conversation. I’m happy to chat.
With that, we have come to the end of another episode. I’d like to extend a heartfelt thank you to our incredible guest, Esmond Kane. Your profound knowledge and passion for impacting cybersecurity change have been truly inspiring. Your insights into mitigating unintentional human risk will undoubtedly empower our audience to safeguard their digital lives. Remember, everyone, cybersecurity is a collective effort, and every step we take to fortify our defenses counts. Stay tuned for more compelling discussions on cybersecurity until next time. Stay vigilant and stay secure.
- LinkedIn – Elevate Security
- Facebook – Elevate Security
- Steward Health Care
- LinkedIn – Esmond Kane
About Esmond Kane
Esmond Kane currently serves as Chief Information Security Officer (CISO) at Steward Health Care, an international healthcare delivery organization serving millions of patients annually. In his role at Steward, Esmond helps clinicians and leaders deliver world-class care aligned with cybersecurity frameworks, regulations, and industry best practices.
Esmond has over 25 years of experience leading IT and Security programs in multiple industries. Before joining Steward, he served as Deputy CISO at Partners Healthcare/Mass General Brigham and in various roles at Harvard University. In his spare time, Esmond likes to fret about spare time and annoy people who read bios.