Leaders and entrepreneurs know that many factors go into growing an organization. You must think about structures, making better decisions, and what truly matters. Matthew Stephenson sits down for a conversation with Dr. Shayla Treadwell on the importance of building a culture of trust in your organization to drive excellent results and happy people. Dr. Treadwell oversees information security programs and helps drive stakeholder collaboration on IT governance, risk, compliance, and assurance issues. In this episode, she shares in-depth insights on aligning the values and ethics of the executive staff, because that significantly affects the company as a whole. What mistakes are you making, and how can you manage your people effectively? Tune in to learn more about cybersecurity risk management programs and industry-leading security regulations, frameworks, and certifications.
—
Listen to the podcast here
Understanding The Importance Of Building A Culture Of Trust With Dr. Shayla Treadwell
We are excited to welcome Dr. Shayla Treadwell to the show. I need to read these things because you need to understand how cool it is that we've got some of her time right now. At ECS, she is the Vice President of Governance, Risk, and Compliance. She is also the VP of their Cybersecurity Center of Excellence. She's a member of the Cybersecurity Summit Think Tank.
You need to check this out: that was in October 2021, with over a thousand global thought leaders across sixteen critical infrastructure sectors. She has spent over ten years in security, and she is the Cofounder of the Treadwell Agency. She has a PhD in Business Psychology from The Chicago School of Professional Psychology and, not for nothing, is a fellow Bradley Brave. Don't think I wasn't excited about that. Dr. Treadwell, welcome to the show.
Thank you so much, Matt.
You’ve got a lot going on. I don’t even know where to start. Let’s go ahead and jump right into it. What is your top security priority these days? Like I said, there are lots of things happening, but when you walked in, everyone was like, “What do I have to do first?”
From a top security perspective, there are a lot of things going on, especially as the threat landscape changes and more technologies are introduced. Cloud security is huge right now; it's something that we talk about a lot. Additionally, there's architecture. The architecture push comes from the Federal government: the White House mandate that came out in 2021 started talking about zero trust a little bit more. We all know zero trust is nothing new; however, it's something that people are trying to implement in their architecture, putting those principles in place and understanding what those principles mean.
DevSecOps is up there as well. However, in my world, risk is everything. It is important for me because it helps organizations understand their strategic direction. People are starting to understand now that strategy is not implementing a random framework. Strategy is understanding how you implement that framework. It's the framework for how you make decisions in the company. I tell people all the time, "It's lovely if you're building a house and they tell you that you'll have 3 bedrooms and 2 bathrooms. However, if you put all of the bedrooms on the first floor and all of the bathrooms on the second floor, strategically, that probably wasn't smart." Add a little risk to the question of whether you build it that way or not. That's the top thing for me.
A truly epic analogy that I have not heard before, and probably emblematic of what you have to deal with if you are involved in security when all of the bathrooms are upstairs from the bedrooms. Probably at the bottom of a hill, I might add.
Let’s talk about risk there.
It’s important to understand that every decision that we make from a risk perspective is a business decision.
How do we stack rank these things? Which one do we want to go after first? Risk is a huge topic, and it's all-encompassing because it's the one where you've got to deal with everything: human, physical, and technology. In your position, how do you even define risk before starting to break it down and sort it out among the teams you work with?
The way that I view risk is a little interesting. From a foundational cybersecurity perspective, we have enterprise risk management. From there, you have operational risk. Under operational risk is where you will find information security risk or cybersecurity risk, depending on your definitions, your understanding of resiliency, and things of that nature.
However, it comes down to the organization's risk appetite at the end of the day and what industry you're in. That's where my organizational psychology hat comes into play, because it is thinking about those structures, how you make those decisions, and what's going to matter. My team and I build risk scorecards for folks. When we're looking at our risk scorecards, we try to break it down to the most important things at the most basic level. We're looking at human risk. We're looking at some of our technology risks. We're looking at our applications and things of that nature. However, it comes down to what our contractual obligations look like, what our compliance requirements are against those authoritative sources, and what the appetite of the company is.
At the end of the day, if I work for an energy drink company that makes people sign waivers saying they will take as much risk as possible even if it makes them sick, I'm not going to treat risk the same way a financial organization would. It's having those difficult conversations that executives and board members don't want to have, and helping them understand, "What do you have to do? Is that good enough for you?" That's how we end up attacking risk and understanding what we need to track.
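To make that scorecard idea concrete, here is a minimal, hypothetical sketch of rolling qualitative category ratings up into one weighted score and comparing it against a stated appetite. The categories, weights, and threshold are invented for illustration; they are not ECS's actual model.

```python
# Hypothetical weighted risk scorecard, illustrating the roll-up described above.
# Category names, weights, and the appetite threshold are invented for this sketch.

CATEGORIES = {
    "human": 0.40,        # e.g., phishing susceptibility, insider risk
    "technology": 0.35,   # e.g., unpatched systems, cloud misconfigurations
    "application": 0.25,  # e.g., in-house and third-party application risk
}

def scorecard(ratings: dict, appetite: float = 3.0) -> dict:
    """Roll qualitative ratings (1 = low risk, 5 = high risk) into one weighted score."""
    overall = sum(CATEGORIES[cat] * ratings[cat] for cat in CATEGORIES)
    return {"overall": round(overall, 2), "within_appetite": overall <= appetite}

# Example: high human risk, mature technology controls, average application risk.
print(scorecard({"human": 4, "technology": 2, "application": 3}))
# -> {'overall': 3.05, 'within_appetite': False}
```

The weights are where the contractual obligations, compliance requirements, and company appetite she mentions would show up in practice.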
Can we roll that back for a second? An energy drink company that makes you sign a waiver saying you will drink it, even if it makes you sick? Is that an actual thing?
It could be at this rate. I don’t know.
I need to know if that is one of those, because I may have had 1 or 12 of them, maybe today. The appetite for risk is an interesting phrase because we've heard a lot of different conversations around risk. In your position, how do you leverage risk for defense and for gain? Where do you find the acceptable level of, "We can take this level of risk in order to learn from it and prevent the next thing from happening?"
That's a phenomenal question, and my answer might be controversial. It sounds cool to say you work in cyber. People think about people who sit in basements with hoodies on and a Matrix screen running, "Cyber, lifesavers," and we all have stuff like that. I view myself as an information security professional, and the reason why I say that is because, with information security, at the end of the day, I care about business resiliency a little bit more than I care about cyberbullying. That sounds terrible to say, but in my job, that's what I care about, unless cyberbullying is going to cause us reputational damage, which goes right back to business.
When I start thinking about it through that lens, it's important to understand that every decision that we make from a risk appetite perspective is a business decision. The best example I can give: if we don't do something that we're supposed to do, and it's going to cost me $1.2 million in fines if someone finds out, versus I'm going to gain $5 million from doing it, the risk appetite says, "Go ahead and do it." You've got to make money. If you get caught, you lose the $1.2 million and still come out with $3.8 million. That's cool. Those are the tough conversations that people have to have because it's a business discussion.
Along with that, the business discussion brings in the ethics and culture of the organization as well. I may say, "I make another $3.8 million even if I get caught," but does that align with our cultural values, the ethics of our executive staff, how we make decisions, and, reputationally, what we want people to know about us? That's included in that calculation and in that discussion, and some companies have to decide.
That's why we see plenty of companies getting into a lot of trouble from doing a lot of bad things. Those executives got paid off. However, are they still happy, or do they have to live with that reputation for the rest of their lives? It's a deeper, more complex discussion to understand, offensively and defensively, what you want to implement as you're mitigating risks, or whether you want to transfer or accept them.
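For anyone following the arithmetic in the example above, the back-of-the-envelope calculation is simply gain minus fine: $5 million minus $1.2 million leaves $3.8 million even if you are caught. The probability-of-detection parameter below is a hypothetical extension, not something from the episode, added only to show how the expected outcome shifts when getting caught is not a certainty.

```python
# Back-of-the-envelope risk appetite arithmetic from the example above.
# p_caught is a hypothetical extension for when detection is not certain.

def net_outcome(gain: float, fine: float, p_caught: float = 1.0) -> float:
    """Expected net outcome of accepting the risk, in the same units as the inputs."""
    return gain - p_caught * fine

print(net_outcome(5_000_000, 1_200_000))        # 3800000.0 -- fined for certain
print(net_outcome(5_000_000, 1_200_000, 0.25))  # 4700000.0 -- fined 25% of the time
```

As she points out, the dollars are only half of the calculation; culture, ethics, and reputation sit outside this formula.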
I want to congratulate you for making it all the way to sub-question two of question one and for completely blowing up the rest of the outline by dropping some bombs that I feel need to be addressed. In your position, you have to be business-focused first, but you cannot dismiss the notion of culture and the human factor. When we look at the numbers, 3.8 positive against 1.2 negative, you're coming out almost 3.2X ahead. That feels like a win, but you mentioned reputational damage, and there is a security component to that when it comes to dealing with the human factor.
When you have to have those conversations up the chain, whether it's at the board level, with investors, partners, or whoever, how do you quantify things like that in a way that they get? "Here are the dollars; they cannot be disputed," versus, "This is reputational damage," which is a little different type of conversation. How do you do that?
It's actually very difficult. Anyone who says they can quantify risk, that's a tough one. Is it possible? Yeah, it's possible. Are most organizations doing it? No. You can't quantify risk easily. It's a qualitative discussion; statistically, you have to understand that. I can qualitatively look at risk and then quantify what I qualitatively looked at in order to give you a number on a scale. That's how we end up with maturity models and things of that nature. At the end of the day, it's not that simple.
Something that I do is look at what changes over time look like, because even if we don't get that 3.2 right now, or whatever it may be, over time, if our reputation maintains where it is, qualitatively we can ascertain that in the next five years we may be at a different level. You're going to make money anyhow. It's a time discussion based on the strategic direction of the business.
That’s why I keep talking about risk and strategy going hand-in-hand because it’s not a point-in-time discussion. We’re talking about over time and predicting where people want to be based on indicators of the future. A lot of organizations struggle with that because everybody wants to get things done right now.
Building a culture of trust is a little bit harder than it used to be. Looking at birth cohorts and generations is one factor.
This is not a right-now conversation. It is talking about the puzzle pieces to get you to where you want to go. Sometimes people use predictive analytics for these discussions if you want to get fancy with numbers, or you can take the futurist approach, look at indicators of change, and figure out where you fit in there in order to sustain the business. The goal is sustaining the business.
The idea of predictive modeling is interesting. Artificial intelligence has become a major player in this thing, but the biggest piece of chaos involved in what you do has to be the human element. When you’re looking at risk and the idea of insider threats, humans are threats, whether they choose to be or not. We are who we are. How much does that come into play in what you do? How much time does that take compared to the things that you can quantify as opposed to qualitative analysis?
The human factor is something that takes a lot of time, and it's underappreciated in this industry. A lot of people think that you can buy a piece of technology, turn it on, and it fixes all your problems. You can't do that. I tell people all the time, "Technology does what we tell it to do." People don't, and you've got to deal with that.
A lot of times, I'm taking a psychological or sociological approach to the way that you build teams, the way that you look at how your organization is mapped out, and looking at indicators of behavior in your organization. People love to send these phishing simulations, but there's so much dark data in those phishing simulations. You can get to know your culture. I can figure out what my organization cares about when I compare, "Someone sent you a card that said you did a good job," versus, "Someone signed you up for a dating app account."
They probably won't click on the dating app one, but they care about that card. If I send something saying, "Your PTO is not looking the way that it should," people want to click on it. You start to learn a little bit more culturally about your organization and what they care about. You have to filter that over to your HR and MarCom departments to start looking at how to communicate with your population to continue gaining trust, because trust is a huge factor when you're looking at human risk and how to mitigate it.
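As a rough illustration of mining that "dark data," here is a hypothetical sketch that summarizes phishing-simulation results by lure theme, computing click and report rates per theme. The record format and theme names are invented for the example; real simulation platforms export richer data.

```python
# Hypothetical summary of phishing-simulation "dark data" by lure theme,
# showing which themes people click on or report. Fields and themes are invented.
from collections import defaultdict

results = [
    {"theme": "kudos_card", "clicked": True,  "reported": False},
    {"theme": "pto_notice", "clicked": True,  "reported": False},
    {"theme": "pto_notice", "clicked": False, "reported": True},
    {"theme": "dating_app", "clicked": False, "reported": True},
]

def rates_by_theme(records):
    """Return click and report rates per lure theme."""
    buckets = defaultdict(lambda: {"sent": 0, "clicked": 0, "reported": 0})
    for r in records:
        b = buckets[r["theme"]]
        b["sent"] += 1
        b["clicked"] += r["clicked"]
        b["reported"] += r["reported"]
    return {
        theme: {"click_rate": b["clicked"] / b["sent"],
                "report_rate": b["reported"] / b["sent"]}
        for theme, b in buckets.items()
    }

for theme, stats in rates_by_theme(results).items():
    print(theme, stats)
```

Themes that people click on or report at unusual rates are the cultural signal she describes handing off to HR and MarCom.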
Trust, risk, and culture, I'm sure, are things that all of the quants want to sit down and talk about. How do you build a culture of trust? You're constantly hitting up your team with things like phishing tests. There is a human-nature tendency to assume, "They don't trust that I'm good enough to understand this," but at the same time, you need to be doing your diligence to make sure that they do understand it. That's a tightrope to walk.
Part of your job and your team’s job is to do this thing. How do you do that in a way that gets everybody to lock arms and come in and be like, “We got this, not just for us, but we got it for our customers and investors,” if you’re in government, “for the population.” How do you build a culture of trust in order to create security?
Building a culture of trust is a little bit harder than it used to be. There are a few factors I can discuss. One is looking at birth cohorts and generations. When we look at our Silent Generation and Baby Boomers, they like hierarchy. Trust was innately easier to gain if you were in a certain position of power. I could go on and on about different factors that impacted these birth cohorts.
Let it go. Let’s get weird.
By the time you get to Gen X, they don't trust leadership as much. When you get to Millennials and then Gen Z, they throw it completely out the window because those are the generations that were impacted by financial crises, where companies showed that they were not as faithful to them as they had been to their parents. We've got that factor going on, your birth cohort.
The second thing is leadership, which is key when you're trying to build trust. When I did my dissertation, my studies were centered on this topic. It was interesting because you can look at leadership models, whether transactional leadership, transformational leadership, servant leadership, or authentic leadership; I can keep naming them. We looked at security professionals at the highest level in organizations, and suddenly we had to ask the question, "Are our security professionals leaders or managers?" There's a difference. It was all about authority. Most security professionals do not have the authority to do a lot of things. They have to go to somebody else in order to get it done.
Therefore, when you're looking at trust in that space, most security professionals are not as trusting in a lot of things. The best ones are risk-averse, which is fascinating when you think about who we put in positions in security. Thinking about that, if I have a risk-averse person who is not comfortable sharing a lot of information, because we view it as secretive, in a leadership position in security, how are you going to gain trust if you don't share? There's no authenticity there. They're only authentic with the small group of people who do the same work as they do.
Continuing down that path, the next factor is communication. What do we communicate about security? Our industry is interesting in that we don't tell people when things are going well. You only find out when things are going badly. If you hear something about security, it's because someone has been breached or someone has been hacked by this nation-state actor; it's always something negative and jarring, and we're always putting out little fires.
What about the good things? What about that grandmother who realized that something was a phishing email and didn't get caught? What about that kid who got a FaceTime from someone asking personal questions and realized, "I don't talk to strangers," even if it's digital? We don't talk about the good things in security. In order to get trust within your organization, there's a level of authenticity and openness that we have to embrace to get everybody on the same page. That's something we've got to work on.
It’s like being an umpire in baseball or a bassist in a band. They’re aware of you when you blow a call, or you break a string. Otherwise, it’s like, “We’re good. Keep going. It’s fine.”
That’s what it’s all about. It’s authenticity and communication. I know it’s tough in our industry because you don’t want the adversary to know certain things, but we can communicate a heck of a lot better than we do.
Training and awareness are not the same thing. Awareness is making something known to you. Training is telling you how to deal with it.
I feel like you flexed the PhD in Business Psychology right there, like, "I've got a piece." This is what it would feel like to sit in one of your 700-level classes if I had the good fortune to be there. When we look at how you analyze people and read and react to people being part of the organization (and unfortunately, AI is still going to be here at least for a while), what do you see, not with the people themselves, but with the idea of the human element? What are the biggest weaknesses in securing an organization? I'm an English Lit major, so words are a big deal for me. How do you protect people from themselves in order to protect the organization at a higher level?
That's where you start to see training, awareness, and education coming into play and understanding the difference. A lot of people think that training and awareness are the same thing. They're certainly not. Awareness is making something known to you. Training is telling you how to deal with it. A lot of times, when we think about training and awareness, we're thinking about these little computer-based trainings that go out annually to tell you to do something once a year and follow these processes and procedures. That's not how this should look.
All those great videos that we pay close attention to.
I hate to say nobody cares, but nobody cares. That’s a compliance exercise. If you want to ingrain that in your culture, you’ve got to make it matter to them. I had the wonderful opportunity of working with probably some of the greatest security professionals ever in one of my careers. Most of us at that point in time did not have security backgrounds.
I’m talking about people that were high school educators. I had someone who came from a marketing background. I had someone who was a hardcore techie, someone who did advertising. It wasn’t the people that you thought should be in that room. Building security awareness programs with them, we modeled it out the entire year. We marketed security to the company. That’s what we did.
The first thing we did in our first quarter campaign was securing what matters most. We’re talking about our people. I care about you, your kids, and your parents. Let me deploy educational material to make sure your Social Security number doesn’t go missing. I’m trying to make sure that when you’re on vacation, something doesn’t ruin your experience.
When we made it personal, we found that in the next quarter we could talk about how to secure your team, because you work with your team a lot, and they probably need to have some things too. Let's make sure your department is good. Eventually, it wasn't until the 3rd or 4th quarter that we started talking about the company, but by then, you have people forming habits.
I think the statistic is that it takes, on average, twelve times being told something and doing something to form a habit. We put in those twelve times talking about you and your own personal thing. You bring that with you to your job. A lot of times, I know it sounds odd to a lot of people, but if I can make you care about this for yourself, you’re automatically going to do it for the company. It looks a little different when you do it that way.
The team that you described is amazing. To me, it sounds like the Avengers, because if you have 12 Hulks or 12 Iron Mans, you can only fight one type of bad guy. You cannot win. I also think what's fascinating is that you are coming from a psychology angle. You mentioned there are people in there from marketing, you've got your hardcore tech person, and you even have someone from advertising. That's the one where people are going to roll their eyes, like, "Is anyone going to buy this gross cola if we don't advertise it the right way?" It is how do we get you to want to do it. It sounds like that's the key part of the culture. We can make great security, but if you don't want to use it, you're not going to.
I tell people all the time why TikTok is addicting.
Society is going to hell, but that’s a different conversation for a different day.
I'm like, "Why does my great aunt get her recipes on TikTok?" You can learn a lot watching that journey. Amazon got it right many years ago when they started tracking cookies, but can you psychologically start tracking the cookies in your own company? Yes, absolutely. That goes into me figuring out what matters to you.
That's why I tell security professionals to get to know the members of your SOC and understand incident management. What mistakes are people making in your company? Pure mistakes. What are they curious about? What's happening in the threat landscape? What do people care about? Turn those things on their head. That's the way you do awareness. I need to get your attention, and getting your attention is not saying, "Protect the company's assets." That's not going to work.
Right about now is when I would normally say, "Consider this your official invite to come back for another show." Instead, I'm going to say, "Consider this your invite to host a show." We could do a half-hour a week, minimum, talking about all of this, but let's roll it up. You've built this team. You don't have to get specific to your particular team at ECS, but in general, from your experience, how do you see this organized? You've put this room together of these disparate talents. The next level up is looking and going, "That guy is in here? That woman is in here? What are we doing?" Who owns what? Where does the buck stop and start?
When I built that team, I had a phenomenal leader who trusted me. I did some crazy stuff, and he would probably talk about that to this day. I was buying pop shops. We were looking at March Madness and doing a security event around that. We had a money machine and put fake money in there to teach people, "You protect these dollars. Grab as many dollars as you can." We did crazy stuff. I may be the wrong person to talk about that.
No, you’re exactly the right person to talk about that. Please, keep talking.
Organizations that don’t have fiduciary duties tend to lag slightly in that space, and it’s not until something goes wrong that they start to pay attention a little bit more.
It's all about trust. I tell people there's a method to the madness. There was nothing that we did in our programs then, and nothing we're doing now building out new programs at ECS, that doesn't have metrics behind it. There are metrics behind everything that we do. We're looking at changes in behavior. If something doesn't work, we get rid of it fast. Convincing leadership is about making sure that you're not pitching a bunch of random stuff, but that you're saying, "No, I can show you the change in behavior. I can show you that the report rate is going to go up if I do this." You test that thing out.
That's the reason why I keep saying, "It's advertising." Do you understand what's behind those machines, the data that they crunch to do what they have to do? You have to view your job, especially in security awareness, in the same exact way. When I say security awareness, I'm not just talking about training and awareness for the organization. I'm talking about when you have to brief the board: what does that look like? How do you tell them, "I need more money"? That is a tough conversation. You've got to make sure you have your ducks in a row to show what that looks like. You've got to have one person on your team who is good at numbers and analytics because, if not, you can't tell the story.
You mentioned the board and your experience as you are walking through the various levels of this. At the board level, do you see people who are not just security-savvy or adept, but security experts? How much time do you or your team have to spend explaining the security aspects so that the board understands?
It depends on the board. I find that for critical infrastructure, manufacturing, financial, and organizations like that, their board members seem to be in tune because there are Federal regulations around security that make them a little bit more fiduciarily responsible for it. We're finding that when it comes down to regulations and things of that nature, the board is a little bit more keen because they're like, "What's going on in this space?"
I find that organizations that don't have those fiduciary duties tend to lag slightly in that space, and it's not until something goes wrong that they start to pay attention a little bit more. However, I do think that organizations can do a better job of storytelling security, so that it's not a bunch of tech thrown at people. You have to tell the full story of how it fits into the operational standards and budget of the organization and what it's protecting.
It's difficult talking to a board member about a group that's a debit to the organization. I can never show you how much money I saved you. That's a little difficult. Maybe someone has figured out a formula; I don't know what it is, but I can show you the benefit to the organization in a different way. The conversation takes a little bit because it does get into that space of requirements and appetite.
I’m going to make the host-level decision to cut it here because I have many more questions, which means you have to come back.
I’ll come back anytime.
Let’s move into the leadership corner. Shayla, when you are not doing all of this stuff, when you are not protecting everybody from everything, what’s on your playlist, and what are you reading? Are there magazines in the bathroom? Are you cooking, gardening, or riding horses? What’s going on?
I’m a big gardener. That’s my therapy. I have a little garden in my backyard. I got all my little cucumbers, squash, zucchini, tomatoes, peppers, and all this cool stuff. That’s my jam. I love gardening. Additionally, I like to read a lot. I’m nerdy in the fact that I like to physically hold a book.
I’m noticing that’s a recurring trend in our guests. People remember these things made out of paper and cardboard with words on them.
I like to read physical books. I always have 3 to 4 books in rotation. That's how it works in my head. I'm reading This Is How They Tell Me the World Ends by Nicole Perlroth. I like that one a lot. Talking to Strangers by Malcolm Gladwell has helped me with communication so much. I highly recommend that to people. Billion Dollar Lessons is another one that I love, and Against the Gods, which is about the origins of risk. I like that one too. Those are the four that I've got going on right now.
Producer Sharon, we're doing a book club episode. We're bringing every guest back, and each one is going to get 5 to 10 minutes to talk about whatever books they've name-checked so far, because these are always my favorite answers out of everything that we dig into. Dr. Treadwell, if people want to know more about what's going on at ECS or going on with you, social media, speaking engagements, publications, live events, any of those kinds of things, tell the world where they can find you and all the cool stuff.
You can always go to ECSTech.com. We always have a ton of blogs about different trends that are happening. You'll probably see my name up there. Treadwell Agency is the company that I own. We're always doing cool stuff. There is the cybersecurity summit in Minnesota that you mentioned earlier; I will be there. That's going to be an awesome summit, and, shameless plug, Minnesota is home to headquarters for all of the different types of critical infrastructure in the US. Check that out. I'll be at MSSP Alert Live in Washington, DC on September 12th, 2022. That's going to happen. Uniting Women in Cyber 2022 is happening in Tysons Corner here. It's going to be on September 27th, 2022, and I will be speaking there as well.
It is a food town as well. While you're doing all that cool ICS stuff, you can also get good meals any time of the day. We've got so much more to talk about, plus the book club. Please do come back. Thank you for taking your time with us.
Thank you very much. I appreciate it.
Thank you for joining us. For more information on all that's good in the world of cybersecurity, insider threats, gardening, and cool events all over the world, including in Minneapolis, check us out. You can find us on LinkedIn, Facebook, and ElevateSecurity.com. You can find me @PackMatt73 across all of the socials. As for Friendly Fire, it's everywhere that you go; that's where you can find the show. All we ask is that you subscribe, rate, and review. We've got so much cool stuff to do with so many cool people. I get fired up at the end; you can usually hear it in my voice. Until then, we will see you next time.
Important Links
- LinkedIn – Elevate Security
- Facebook – Elevate Security
- ECS
- Cybersecurity Summit Think Tank
- Treadwell Agency
- This Is How They Tell Me the World Ends
- Talking to Strangers
- Billion Dollar Lessons
- Against The Gods
- @PackMatt73 – Twitter
About Dr. Shayla Treadwell
Dr. Shayla Treadwell is vice president of governance, risk, and compliance (GRC) at ECS. In this role, Dr. Treadwell oversees the company’s information security programs and helps drive stakeholder collaboration on issues related to information technology (IT) governance, risk, compliance, and assurance. Dr. Treadwell is also responsible for the establishment of ECS’s cybersecurity risk management program, ensuring the company remains aligned with industry-leading security regulations, frameworks, and certifications.
In addition to her role as VP of GRC, Dr. Treadwell serves as VP of ECS's Cyber Center of Excellence. In this role, she works with subject matter experts throughout the company to facilitate research and development and support ECS's strategic cyber goals.
Dr. Treadwell is the co-founder of the Treadwell Agency, a small consulting firm dedicated to advising businesses on digital transformation, strategic planning, and IT security. She has a doctorate in business psychology from the Chicago School of Professional Psychology. Her dissertation focused on organizational leadership and examined the correlation between information security and leadership practices.