This digital information age leaves businesses vulnerable not only to external attacks but to internal ones as well. The threat is even greater for those in a highly regulated industry with a significant amount of information that needs securing. Suffice it to say, friendly fire happens and must be dealt with as seriously as any other threat. In this episode, Tim Callahan, Senior Vice President and Global CISO at Aflac, dives deep into the potential friendly fire that happens within a business or organization. He talks about defending against unwitting human error and staying aware of both outside and inside attackers, highlighting the importance of risk management and prevention. What is more, Tim lets us in on the challenges he encounters in his role as CISO, working with his teams and the board to ensure their philosophy of prevent, detect, and respond is rooted in everything they do. Tune in to this conversation and gain great insights on navigating the threats in the cyber world, wherever they may come from.
—
Listen to the podcast here
Tim Callahan: Is Any Fire Friendly?
Here on the show, we are bringing you the top experts in the industry for a chat about everything that is interesting and maybe sometimes scary (hopefully not) in the notion of keeping our world secure. Speaking of that, we are very excited to welcome Tim Callahan. Tim is the Senior Vice President and CISO at Aflac. Maybe you’ve heard of them. I promise I’m not going to do any duck jokes. Actually, I can’t promise that. I might do one at some point. He has 25 years in security, including a long stretch in the financial industry with stints at SunTrust Bank and People’s United Financial. He’s also a 23-year veteran of the United States Air Force. They get up to some pretty interesting things when it comes to security. I feel like Tim may have an idea of what he’s talking about. Tim, welcome to the show.
Thank you very much, Matt. I’m happy to be here. It is interesting that, being military, we have an axiom that friendly fire isn’t. In other words, when you’re being fired at, it’s not friendly, no matter who’s doing the firing.
You beat me to the punch on that. It was going to be the first question. Let’s dig into that notion and how that has formed your philosophy over the years. You’ve been doing a lot of interesting things for a long time. I’m not saying you’re old, but the notion is we tend to say, “This was friendly fire,” as opposed to, “This was a malicious attack.” At the end of the day, you and your team are taking this heat. Who cares where it comes from? Do you need and choose to differentiate between an external threat and an internal threat?
You do for categorization purposes, but I’ve often said that IT mistakes or human errors can have the same result as DDoSing yourself. If your systems are down, they’re down, whether someone from the outside did it or someone from the inside made a mistake. When you’re planning contingencies around resiliency and recovery, things like that, you have to plan for both, because when it’s down, it’s down.
I’m sure we’ve all been involved in instances where there was an outage. For at least some time, we didn’t know the cause, whether it was internal or external. Until you can determine that, you don’t know if you’re under attack or not. Again, the result is we’ve interrupted business, so it doesn’t matter. It’s like friendly fire. I won’t go into war stories, but when you have those instances, the consequences and the end result are the same.
You’re certainly welcome to bring the war stories into it because those are great illustrative things for this, but we don’t have to do that. I do want to touch on the idea of human error compared to an unwitting insider threat and a malicious attack. At some point, it doesn’t matter even in postmortem, like forensics.
When you are now on the other side, the breach has happened. Not saying that you guys have been breached, but philosophically, the breach has happened and you are looking at all of the things. You start breaking things down: this happened to be human error, or an internal malicious act, or an external attack. What does that matter when you are looking at the future, as far as laying out your plans for the next thing you need to do to prevent what just happened from happening again?
That’s where you strike the difference. There are nuances. There is internal human error that permits external exploitation. That’s another nuance: clicking on a link, falling for a phish, or whatever, and then the criminal exploits that. That’s a combination of internal human error with an external attack. Basically, it’s a launch-and-leave situation: they wait until they get a hit and it calls back to them through the command-and-control server, however they’ve configured it. That’s when they exploit. Other than that, they don’t even know what’s out there. You see all of these scenarios, but the key is understanding that the defenses are a little bit more nuanced.
There is internal human error that permits external exploitation.
Our biggest defense against unwitting human error is certainly education and process. That’s why we have change control and release cycles. That’s why we put code through so many paces: to try to identify human error before it gets into production and causes harm. A lot of my more modern IT buddies think that’s old. They’re like, “We’ve got DevOps. We don’t need all of those controls anymore.” You do need those controls. You need some separation between putting code into production without testing and testing it before you put it into production. There are processes you can put in place to try to prevent that human error on the inside. Those are very important things. Then there is obviously education, which gets into our security awareness campaigns.
Again, I’m not talking about the boring tests that we do to meet regulatory requirements. I’m talking about good campaigns that help people apply and understand why we do this kind of campaign. Make it real, interesting, winsome, or whatever, so that they’ll pay attention, and then the phish testing and things like that. Those are preventions. When you’re looking at outside attacks, your preventions are more on the technological side: the layers of protection that you have, the threat intelligence, which is becoming more automated through ML and AI mechanisms. You have to have that, so that differentiates. In the postmortem, when you’re looking at it, it’s like, “This has happened.”
Again, you have to treat the outside attack separately or differently from the inside because you have a difference in regulatory reporting, for instance. You don’t necessarily have to report the mistake of an insider that brought about a system outage unless there was data leakage involved. From the outside, you do. Not only are you maybe obligated to by law or regulation, but you should share that with FS-ISAC or one of the communities so that they can gain intel from it. At any rate, that differentiates those different spaces.
You had mentioned all three of the holy trifecta of people, process, and technology, but then you dropped in a fourth, the R: regulation. You are part of a highly regulated industry and need to be on high alert all the time. Is it fair to say that Aflac is at DEFCON 5 at all times, given the amount of information that you are charged with securing? That means you need to be up to the minute, aware of both outside and internal attackers. When we talk about SecDevOps and continuous code improvement, how do you even keep pace, knowing you have to keep things literally up to the minute while also dealing with the regulations that are wrapped around everything you need to do?
There’s a dichotomy there. You have to always be aware and alert. Again, through military experience, readiness, and leading your folks, you know that no team can stay at peak all the time. There have to be periods of downtime. You can’t always be at 100%, or you experience complacency and fatigue. What we have done to stay alert is we’ve started building automated process detections using a threat intelligence platform. There are a whole lot of mechanisms out there so that we can bring ourselves up to DEFCON 5 when we need to. Say we sit at about 3.5 or 4.
I got it wrong. I was just thinking about it. It’s DEFCON 1, not DEFCON 5.
It is.
That’s terrible research on my part. Can we get ChatGPT to change that, so it says the right thing?
It’s fine. The intent is there. You can’t always stay at the peak. That’s important to realize because we’re leading a team of professionals who are great. In my background, when I was in the Air Force, I wasn’t in cyber because it probably didn’t exist back then, at least as a capability.
I saw WarGames. That was the Air Force. Was that before you were in the Air Force?
I was a bomb disposal guy.
Is that all?
We had periods of time where we were in sheer boredom. There was less boredom when we were training. The biggest boredom was the keep-the-lights-on jobs to keep our shop in shape. Then there were periods of intense adrenaline because you’re on a mission, a bomb has been found, or one is found on the runway, or whatever. There’s a live bomb that’s been armed and you have to get out there and deal with it.
That’s the analogy. There are a whole lot of comparisons between bomb disposal and cybersecurity. I’ve written a whole page on the comparisons: a risk management matrix comparing things like the military uniform to the bomb suit, etc., because there’s a whole lot of correlation. At the end of the day, it’s risk management. The one correlation is we are trained to do a job.
Other people would see a terrorist bomb on base or downtown or wherever we are as a negative. To some degree, for us, it was like, “This is what we’re trained for. We finally get to do it.” That’s what I see in a lot of our security ops folks. That passion: “This is what I’ve trained for. This is what we’ve exercised on. This is what I get to do.” There’s this whole adrenaline in there. When you have a huge adrenaline rush, you’ve got to have a release for that at some point. That’s where you want to be able to bring your team back down in a way that they don’t burn out.
You talk about this through the prism of being in a bomb disposal unit, which by the way, wow. It’s not dissimilar to the team that you lead now. You hope your team is always bored because that means you’ve done your job preemptively. You are in prevention mode as opposed to a reaction or protection mode.
When it comes time to talk to the C-Suite, the board, and those people, they’re like, “Your people aren’t doing anything.” You’re like, “Yeah, because we’re that good. We are not getting beat.” How do you work that angle on the idea of, “We need to not only keep the budget that we’ve had; it would be better if we had a little more so they could be even more bored”? I would love to walk into the SOC and everybody’s asleep because they’re so bored.
The way you have to have that conversation, in my view, is with a strong metrics program or key risk indicators program. When I speak to the board, I’m speaking in statistics that say, “Here are the things that we’re doing to be preventive. Here’s how we know that they’re effective. This is maybe where we’re not as effective and we need to increase capability.” That’s the real conversation. It’s helping our board understand the threat that’s out there, how it impacts our company, and what we’re doing to keep the threat from becoming a reality. To some degree, I personally believe very strongly in frameworks to help convey that message, too. We’ve adopted the NIST Cybersecurity Framework. We rate each category and subcategory.
You have to go all the way down to actual capability to score it. We rate our level of maturity against the risk tolerance that the board has given me. We also map what we spend against that, so that you get a clear view: when you look at identify, protect, detect, respond, and recover, we display what we are spending in each category.
We’ve made the corporate decision that our biggest bucket of spend is in prevention, which starts to make sense when you look at the overall program. Our greatest maturity is in prevention. Our philosophy is we prevent, we rapidly detect, and we rapidly respond. That’s the bucket of our spend and our capability of what we concentrate on. If you can rapidly detect and respond, then you can contain where recovery does not become as big of an issue. That’s how I display it to our senior leadership and to the board.
Speaking with senior leadership and the board, are they technical, or do you need to spend some time upfront laying out, “Here are the elements of what I’m about to talk to you about,” help them understand that, and then from there explain things? The most chaotic element of any corporation is, by definition, the people. Technology and process are quantifiable, but people are not.
It’s important that people in my position, CISOs and the like, even risk managers and those kinds of roles, speak to the board in terms that are real to the board. I don’t rely on a lot of technical jargon; that puts things in a negative light. I don’t try to educate them on the tech. I try to tell them what the risk is. I will tell you, board members are very good at risk. That’s what their seat is, whether it’s financial risk or market risk. You relay to them, “This is our inherent risk. Here are our mitigating factors and our residual risk, and that ties into the risk tolerance that we as a company have decided on.” You tie it all back to, “Here’s how we are either meeting or not meeting our risk tolerance.” In the last couple of years, the board did make a decision to retain a more technical person. A former tech executive is now on our board. She has brought a good level of discussion. She doesn’t go out there and say, “I am the cybersecurity executive. I’m the tech expert.” What she does is help as a buffer or as a mitigator. Quite honestly, I get some very good questions from her.
When you come out of those kinds of meetings, how often are they giving you nuggets? Part of your job is that you have to come and present, “Here is everything that has happened.” You’re going to be asked, “What is going to happen?” It’s like, “A balloon floated over North Dakota this week. How do we anticipate those kinds of things?” You come out of those meetings, back into meetings with your team, and say, “Here’s what we need to do,” because we thought of 99 things, but I didn’t think of that one and somebody brought it up.
It is good. Board members tend to talk to other board members. I think that brings a lot of cross-experience into our company. Every once in a while, one of our board members, maybe because of their experience on another board or a conversation, will ask, “Have you thought of this? Have you anticipated that?” We had a lot of conversations in the early days when Russia invaded Ukraine about what that meant for us. I had prepared for that question. I obviously anticipated that it would come up, so I was able to have a good discussion about it. I don’t think that the Russians or the Ukrainians or anybody else want to attack the duck, per se.

You had a good opportunity for a duck joke there and you missed it.
You’re free to make all the duck jokes if you want to. I’m trying to be respectful.
What we have to be prepared for is the collateral effect of an attack because you can’t forget that we are part of an ecosystem and part of a bigger whole. If we lose our power because of a cyber war-type attack on the power grid, what do we do? You have to think of it from that standpoint. Good questions sharpen us, and I’ve always believed that.
What we have to be prepared for is the collateral effect of an attack because you can’t forget that we are part of an ecosystem and part of a bigger whole.
My job is to be very upfront and truthful with the board and to take direction from them. Basically, the board has a fiduciary responsibility to the shareholders. They exercise that responsibility through their questions, direction, and concern. I have a real obligation to make sure that when I get those minutes or that time with the board, I’m there to help them fulfill their obligation.
This isn’t about me or anything else. It’s to help them fulfill their fiduciary obligation. That’s why I try to bring that conversation in a way that is real to them. They can make an assessment, “Are we doing the right things to protect our shareholders and our customers?” I try to always concentrate from that standpoint.
You lead a team with a company that most people associate with consumer products. Aflac, they’ve got the duck, the football coaches on the commercials, and that sort of thing. I go back to Jack Nicholson in A Few Good Men where he says, “I have a larger responsibility than you can possibly fathom.” That extends out into the idea of all of the people, all of the processes, and all of the technology because the ripples from where the rock hits the water extend so far.
How do you even plan for knowing what you don’t know or for the things that we talk about? How far out do you have to look? Maybe even a better question is, how ridiculously do you have to think in order to anticipate the types, whether it’s external attacks or human errors? Is it something as simple as somebody unplugging a machine without even thinking about it?
I’ve always been a student and practitioner of scenario planning. One of the other functions that I lead, which in many cases you see associated with the security team but sometimes you don’t, is the crisis management function. That trickles down into business resiliency or business continuity, as well as planning for all manner of events.
Because of that discipline, my team looks at things from a what-if standpoint. Our core crisis management planning is an all-hazards scenario. We cobble together a plan that can operate as a framework for any hazard. We then have annexes for our higher concerns in particular, like cyber, pandemic, and physical events. We come to that after we’ve gone through many exercises of imagining what could go wrong.
Scenario planning, to me, is a science. There was a professor at Harvard who spoke at a conference one time. Paul Harvey, I think, was his name. He spoke and wrote a book about scenario planning. It made such an impact because it’s such a safe thing to do. We can imagine our worst scenario and then back into how we would prevent it. You can do it in a safe environment because you’re not poking anybody in the eye. You’re just saying, “Nobody made a mistake here. We can all be honest with each other. This is a bad event. Let’s back up from it and figure out how to prevent it.” That comes into all aspects of our planning. I really believe that you have to have a good community, capability, and relationships that foster intelligence.
The cyber business is much like the bomb business: you rely on knowing what the bad guy is doing. You have to have intelligence. Much of our training in bomb disposal was figuring out what the bad guy was doing, the scenarios in which they could build a bomb, and the different firing or timing mechanisms, whether it’s a physical clock, an electronic clock, or whatever.
All of that was figuring out what they could do. Therefore, we think of the scenario to defeat that. We do that in cyber. We figure out, “Here are the things that they could do to us. Here’s the attack path they can take to get to us. Here is our attack surface. Let’s do it to ourselves.” We have a red team that attacks us. Let’s do it to us to figure out how the criminals are thinking. We then know better how to set up our defenses.
I’m going to stick with the bomb allegory here because I think that’s really interesting. It’s one thing when planes fly overhead and the bomb has dropped. You know it’s coming but there’s nothing you can do. You could try to shoot it out of the sky, but there’s still going to be damage. Or there is a bomb fired at ground level coming at you. Again, you know it’s going to hit. Or somebody is coming in wearing a bomb in a vest.
From your perspective, leading a large team that is disseminated around the world, Aflac is obviously huge, does one take precedence over the other? Going back to the DEFCON reference, you are always at 2.5 because anything could blow up at any time. We need to make sure that we are safe, our users, shareholders, and employees are safe.
In my view, you always measure what the penalty is, what bad is going to happen, and then you prioritize based on that. Using the bomb analogy, if the enemy drops a bomb and it detonates, it’s not my problem. There’s another group of people that’s going to repair the hole or the building. It’s detonated. I can’t do anything about it. I can’t un-detonate a detonated bomb. If there is a dud-fired projectile or a dud-fired bomb outside of our hospital, where we have patients in the ICU, all of a sudden that takes on a different priority. Same bomb, different scenario. The penalty is there. I would personally take more risk because of that penalty than I would with a bomb out in the middle of the desert that dud-fired.
If it’s dud-fired out in the middle of the desert, I’m just going to blow it up. There’s no penalty. It’s going to make a hole in the ground, but it’s out in the middle of the desert, so nobody cares. If I did the same thing at the hospital, there would be a higher penalty, even with different types of ordnance. In the cyber scenario, you have to understand the penalty and take the path that brings about the least penalty.
Very early in the DDoS attacks, back when the al-Qassam Cyber Fighters were attacking the financial institutions, I happened to be at a bank. We wanted to figure out how we could defeat the attack outside of our border, so we extended our DDoS protection to the edge. We did black-holing and all kinds of things to drop it at the edge because, as it got closer to our core, our potential for error carried a greater penalty.
When it’s way out there, if we make a mistake, we can recover. When it’s in here and we make a mistake, we can’t, because it’s inside. You have to take that risk mindset: “I will extend the effort and take the appropriate precautions based on risk and penalty.” Even when you’re doing your budget or building your program, that’s got to be in the back of your mind. I’m not going to spend a ton of money to protect one laptop that I can otherwise isolate versus spending the money to protect the whole ecosystem.
You are the leader of a very large security organization for a very important company to lots of people. Insert duck joke here. What’s got your eye right now? Not necessarily it’s like, “We have to stop this from happening,” but what are some of the things that you are looking at and thinking, “That one, maybe. This one, not so much?”
To some degree, all of it is driven by our business strategy. Our technical responses to that business strategy drive where we have to set up our protections and how we have to do it. Certainly, like many companies, we are looking at the cloud more. We’re migrating to the cloud. We already have several of our subsidiaries in a cloud environment. We’re making sure that as we move into the cloud, we’re doing it in a security-first model.
The leadership at Aflac is amazing. I’ve never had a group of executives more dedicated to protecting our customers, our shareholders, and our company. When we made the decision to go to the cloud, I postured that we should first create a strong cloud security model so that as we go into the cloud, we’re doing a security-first kind of thing, and that’s what we’ve adopted. What we’ve found is that as we put different workloads in the cloud, we have to take a little bit different peek at what security-first means. That’s one of our big initiatives as we move more into the cloud. We started small with one line of business, learned from it, brought in another line of business, learned from it, and we’re continuing that progress.
Another thing that we’re passionate about is what’s known as customer identity management. I probably paint with a broader brush on what that is than a lot of people do. That is, how do we make our customer experience very positive but at the same time secure? How do we validate that the transaction that is happening is coming from the appropriate authorized person? Those kinds of things.
We’ve got a very strong basic platform for that but we’re modernizing it to make the user experience better. When we talk about user experience in Aflac, we have to include our partners. In fact, we just stood up a team called Partner Security because our partners are brokers and agents. That’s how we deliver our products to our clients. We have to make that experience a very positive one for them. We’re looking at that from a standpoint of, “Yes I can secure it but do I secure it at the price of experience?” Certainly, you never want to do that.
We are rolling into the season where everybody gets together and glad-hands for a bit. We’ve got South by Southwest coming up, HIMSS, and then RSA. Eventually, we’re going to get into Hacker Summer Camp with Black Hat, DEF CON, and all the different BSides around the country. Shout out to all the BSides because they’re amazing. How do you feel about the industry right now? You’ve mentioned cloud. We had our glorious dance with blockchain, God forbid, NFTs, and all of the things. When you survey the industry, what are you thinking?
It is funny, because when you look at it and forensically back up into most of the compromises or the hacks, whatever you want to call them, most of them happened because we forgot basic IT or basic security hygiene. I do think that with all the different noise going on, if you’re going to have an effective program, you have to boil it down to, “What is the protection that will protect that?” and configuration management, especially in the cloud.
Configuration management is always important, but when you had it behind four layers in your fortress data center, you could probably get away with a mistake here and there. When it’s exposed at the edge or in the cloud environment, it’s so much more visible. People can see it. That’s why basic hygiene is so important, and I keep emphasizing it.
I want to tell you another thing. Any leader of a sizable security organization needs to realize, “What are my personal limitations? Who do I need to surround myself with who can cover off on that?” I’ve come to realize that there are so many things going on in the tech community, with different ideas floating around, that I almost have to have a team where it’s like, “Sam, you go concentrate on this. Sally, you go over there and take care of this. Linda, you can take care of this,” so that we have a holistic picture of it. As you know, there are more tech theories coming out every day. We’ve got the quantum computing thing.
Any leader of a sizable security organization or whatnot needs to realize their personal limitations.
That’s going to save everything just like blockchain was going to, artificial intelligence, machine learning, the cloud, and computers in general.
NSA wrote a paper on the quantum computing threat. It caught my attention. It’s like, “That’s true. Every type of cryptography we have becomes obsolete overnight once you achieve quantum computing.” That’s another thing we need to think about.
You may have just mentioned it, so please feel free to say, “I just answered that question.” To that point, is there anything that you are seeing out of your peripheral vision that you think we may need to be a little more focused on looking straight at it as opposed to just keeping an eye on it?
I could answer that from a few standpoints. One, I’m the chairman of the board of a nonprofit organization called the National Technology Security Coalition. It’s NTSC.org. We’re non-partisan, non-aligned, and nonprofit. What we try to do is advocate public policy around securing technology, our business, and our nation’s defense.
From one standpoint, I think we need to get a handle on public policy around the broader issue of cyber. It’s very difficult with the climate in Washington right now to get things done, but one of the things that we all need to agree on is protecting our people, our country, and eCommerce. Our ability to do business has to be non-partisan. We have to lock hands and say, “I don’t care what else is going on. This is important. We’ve got to do it.” From that standpoint, we really need to concentrate on it. We need appropriate and reasonable oversight over our software manufacturing.
You’re talking about the government. You just said appropriate and reasonable. That’d be great.
It’s a tough one. That’s one reason NTSC exists: to try to bring about appropriate and reasonable. The fact is, if we produced airplanes the way we produce software in this country, there would be crashed planes lying on the ground and we would say, “We need to fix the cable on that aileron.” That’s exactly what we do. We have a terrible history of producing high tech and then having a whole hacker community figure out all the flaws. There has to be a way to get a little more disciplined upfront. I don’t want to stifle innovation, but every company in America should not have to spend the kind of money we are spending to defend the company, our customers, and our consumers because we’re not making good decisions upfront when we’re producing technology.
I’ll get off my soapbox on that one, but it does concern me. Certainly, what plays tightly into that is the nation-state threat and the cyber war threat. You can’t separate those two things when we see adversaries penetrating technology service providers. They’re being very disciplined about getting in at the code level, and then all the companies are putting that code into their environments.
Each company has a responsibility. In the military analogy, we have a national defense to protect us from the horde so that each of us can protect our own community. We need to start thinking about that from a cyber standpoint. We need to be protected from the horde. We need something to take enough of that burden off of our backs so that we can defend ourselves. Taking the analogy a little bit further, if I bought a bunch of missiles and started setting them up on my roof to defend against aircraft, I’d probably get arrested. You’re not authorized to have those missiles.
I would but you’re an Air Force veteran, so you’re probably okay.
If you think about it, that’s almost what we’re doing with cyber. In our present environment, when a breach does happen, who do they come after? It’s obviously the company. If the company is negligent, they deserve it. I get that. When you’re fighting off the horde, it’s hard to do it. The consumer winds up paying for that in product prices and lack of privacy.
Horde is such an amazing word. It’s so visceral. Dear reader, if you have not considered the notion of what a horde is and what people like Tim and his extended team have to deal with, horde is perfect. It’s an anthill. It’s like that zombie movie with Brad Pitt. What was it called? I can’t even remember.
World War Z.
Yes. It’s a horde of attackers.
I was corrected by one of my associates that it’s not Z, it’s Zed.
Europeans call it different things when they’re in Europe. I get it. Let’s shift a little bit and get away from the horrible notion of attacker zombies in the cybersecurity spectrum. Let’s do a little bit of leadership corner. Hopefully, out of the 24 hours of the day, you have a little time when you’re not doing all of these things to keep Aflac’s users, consumers, and everybody involved safe. Do you ever take a break or read a book? What’s on your Spotify list? Do you have a bicycle?
I have all of those. One thing I really enjoy when I get downtime, and that helps in my downtime, is that my family and I live on 24 acres. I have two daughters, their husbands, and their children, and they live on the acreage in two houses separate from our house, so we have our privacy.
They’re on the South 40. Their view isn’t quite as good as yours.
Actually, they get the lake. Anyway, I don’t know how that deal is. I enjoy getting on a tractor and doing tractor work. I’m not professing to be good at it. Sometimes I look at it and say, “I really messed that up,” but it’s fun to do it. In my life, I’ve worked bomb disposal and cyber. We do a lot of work, but oftentimes, we have nothing to show for it.
When I would blow up a bomb, it was gone, or when I would disarm it, I might have the evidence, but what do I have to show for it? Working on the tractor, doing the building and the yard work and all that, I’ve got something I can look at and say, “I did that.” I enjoy doing that. I love reading. I have several books open at once: a novel for enjoyment and a leadership book for professional development. I’ve read all of Dick Clarke’s, Richard A. Clarke’s, books. I really enjoy those. I love theology, so I read theological books at the same time. Those are my hobbies.
Throw me an author or a title. It doesn’t have to be the last one you read, but the one that you’re like, “This is a book that I love.” Fiction, non-fiction, development, or anything like that. Give me one.
I just read a book called Envy by Sandra Brown. Sandra Brown is a novelist. I don’t know that much about her, but I love her books. She’s one of those authors who captures my attention. Especially if I’m reading a novel, if it doesn’t grab me upfront, I’ll throw it away, but she does.
That’s perfect. Anything that makes you feel better after you read it than before you read it? That’s the perfect answer.
It’s mindless. It’s one of those things I enjoy while I’m reading it. I don’t walk away with any great knowledge, but for those times that I’ve relaxed and read it, I just feel refreshed.
That’s why we like ice cream.
Yes.
Shameless plugs for you personally, Tim Callahan. Anything cool Aflac is doing, you’re involved with, charities you like, or something where you’re like, “That was pretty cool. I’d like to say that out loud.”
I’ve got two things I’ve got to mention then. The Aflac Childhood Cancer Center in Atlanta is one of the charities that I most definitely support. It’s an amazing story of how Dan Amos many years ago contributed money to the Atlanta Children’s Hospital to open up an Aflac wing to treat childhood cancer and blood disorders. They do an amazing job there. They touch lives.
It’s not to the scale of St. Jude. St. Jude is another one that I support, but it definitely affects lives and saves lives. It’s a wonderful work. My work with the National Technology Security Coalition, that’s an important piece of work. We invite more CISOs to become involved in that. Mainly because having the CISO opinion on Capitol Hill is so important to help our lawmakers see the importance and the need to make good public policy.
You had mentioned them earlier; that was NTSC.org. I’m pretty sure at Aflac.com, they can find out more about what Aflac is doing with the cancer center. These are the things, because sometimes the guests are so humble that they don’t understand what shameless plugs are. They need to be like, “This is the thing. Here’s where you can find it.” What about you? Are you doing anything? Are you writing? Are you speaking anywhere? Do you want to point at anybody else? You’re a security guy, so there’s also that.
I am doing a session in the next week or so. I can’t even remember the name of it, but it’s around adopting the CISO career path, and there are two or three CISOs doing it. I’m sure if you Google it or search LinkedIn, it’ll give you more details.
I like that you use Google as a verb without referring to Google as a place to go to find information. That’s where we’ve arrived as a society.
That’s right.
This is why we love the guests that we get on the show because they are so humble. They don’t even know where they’re going. They will tell you about the organizations that they support so you can go support them instead of the actual guest. Tim, thank you so much. I’m going back to Jack Nicholson’s quote, “Responsibility that you cannot possibly fathom.” We didn’t even scratch the surface of the surface to scratch anything on this. I hope you’ll come back and we can do this again.
I’d love to, Matt. Thank you so much.
Until next time, thank you for joining us on the show. A friendly reminder that all comments reflect the personal opinions of the participants, not necessarily those of their employers or organizations, producer Sharon, or myself. Although sometimes they might reflect my opinions. For more information on all that’s good in the world of cybersecurity, make sure you check us out on LinkedIn and Facebook. Of course, the mothership at ElevateSecurity.com.
My name is Matt Stephenson. You can find me @PackMatt73 across the socials. The show is everywhere; wherever you go, that’s where we are. All we ask is that you subscribe, rate, and review, and you’ll never miss out on all the great people who are doing things to keep your world secure. People like Tim are also pointing out how to help people fight cancer. If you don’t want to go do that stuff, I don’t know if you’re really a good person. Until then, we’ll see you next time.
Important Links
- LinkedIn – Elevate Security
- Facebook – Elevate Security
- Aflac
- SunTrust Bank
- FS-ISAC
- South by Southwest
- HIMSS
- RSA
- NTSC.org
- Envy
- Aflac Childhood Cancer Center
- St. Jude
- @PackMatt73 – Instagram
About Tim Callahan
Tim Callahan joined Aflac in 2014, bringing more than 30 years of experience in information and physical security, business resiliency and risk management. He was promoted to his current role in January 2016, where he is responsible for directing Aflac’s global security strategy and leading the information security, business continuity and disaster recovery functions across the company to prioritize security initiatives and allocate resources based on appropriate risk assessments.
Prior to joining Aflac, he served as senior vice president of business continuity and information assurance for SunTrust Banks, Inc. He also held leadership positions at People’s United Bank. Tim served in the U.S. Air Force for 23 years and earned an associate’s degree from the Community College of the Air Force and a bachelor’s degree from the University of the State of New York, Albany.