Many people in workplaces around the world have clicked dubious links thinking they were important emails from work. These people have become victims of spear phishing, one of the biggest threats to workplace security right now. How can we protect ourselves from the bad actors behind these attacks? In this episode, Camille Morhardt, the Director of Security Initiatives and Communications at Intel, shares a three-pronged approach that can help protect us from spear phishing: awareness and training, tools, and community and support.
As part of the development and security world, Camille sees it as her mission to make cyberspace secure so that people can compute with freedom. Tune in for some of her tips on how you can start to protect yourself and your company from these attacks.
—
Listen to the podcast here
Camille Morhardt: Inspiration Is Better Than Motivation
Here on the show, we’ll be bringing you all the top experts in the industry for a chat about anything interesting in keeping our world secure. Speaking of doing exactly that, in this episode, we are excited to welcome Camille Morhardt to the show. Camille is the Director of Security Initiatives and Communications at Intel. For many years, she’s been overseeing a myriad of teams, including product teams in the Internet of Things and business software. She’s part of the Intel Security Center of Excellence and host of the What That Means podcast, which is a fine podcast. Camille, welcome to the show.
Matt, it’s good to be here. Thanks for having me.
One of the great things podcasters get to do is talk to other people who are podcasters. You also are an amazing person with a huge career inside one of the most important companies in our industry. Before we get to any real stuff, staying that many years at Intel is such a rarity. I’ve got to ask this. What is the key to not only staying at a company like Intel, but at any company at all, and wanting to stay that long?
I didn’t start right out of high school or anything, either. I started there after grad school. I moved around within the company and have been lucky enough to do so many different things there. I started off in a marketing role and ended up building a product from scratch, going around the world and talking to the actual end-users who were going to use the product, building something from the ground up.
After that, I went into the Internet of Things division, which had fairly recently been renamed from the Embedded Group. To me, I felt like I was always at the leading edge of things. In that group, I was looking at strategy around things like blockchain, machine learning, and all these technologies that were starting to emerge. I joined the Security Center of Excellence because that is a very hot spot now. I don’t feel like I’ve been in one place the whole time. I feel like I’ve been doing all kinds of different things.
For all our technical brothers and sisters out there who don’t love the marketing department, that’s a prime example of the type of massive things that can grow out of it. Give us a break. We contribute. We build things as well, including incredible careers. Let’s get to the thing we’re here to talk about. The thread of what we’ve been doing on the show is insider threats. You are the leader of a security organization inside a major tech organization. What can you do at the top to help your team minimize the threats that your population poses to the company? When we talked on the prep call, you had a three-pronged approach. I’ll let you open with that, and we can start breaking it down.
We were talking about insider threats that are unwitting. There are also insider threats that are witting, but that would be a different approach. Some of the biggest threats out there, and this is not going to surprise anybody, are poor password protection, devices that aren’t updated, and phishing, whether people are going after a specific individual or sending a broad email where you might click a link before you realize it was not from your IT department, Amazon, or whoever you thought was sending it. The best of us could end up almost clicking, or actually clicking, on some of those things. It’s hard. The biggest thing you’re going to do is awareness. Let people know that these things are out there. With awareness, there are different ways to go about it. Everybody relishes great training.
Please tell me there’s a video too.
There are ways to get people excited to learn about this thing and integrate it. We can get into that later, but you can’t be sitting there thinking, “Let’s launch mandatory training to make people aware of phishing.” Maybe you need to do that also, but you’ve got to do more, and it’s got to be interesting and bring people in. For the three-pronged approach, number one is awareness and training, or methods to make people aware that are more exciting than what we think of as training. Next, you have to give people actual tools. When I think of tools, I think mainly of integration and automation. What I mean by that is tools and practices.
If you have a secure development life cycle in your product line, and you probably have a product development life cycle, you want to integrate those so they’re one and the same. You’ve got various checkpoints within a product development life cycle, and you can’t just get to the end and go, “Let’s do the security review.” That can’t be a stop-gate. It’s got to be there from the beginning, when you’re designing the product.
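To make that idea concrete, the integrated life cycle could be sketched as a pipeline where every phase carries its own security checkpoint instead of one review bolted on at the end. This is a minimal illustration, not Intel's actual process; the phase names and security activities here are hypothetical examples.

```python
# Sketch: a product development life cycle with a security checkpoint paired
# with every phase, rather than a single "security review" gate at the end.
# Phases and activities are illustrative, not any company's real process.

LIFECYCLE = [
    ("architecture", "threat modeling"),
    ("design", "security design review"),
    ("implementation", "static analysis and code review"),
    ("validation", "penetration testing"),
    ("release", "final security sign-off"),
]

def run_phase(phase, security_activity):
    """Each phase only completes once its paired security activity passes."""
    print(f"{phase}: running {security_activity}")
    return True  # stand-in for the real checkpoint result

def run_lifecycle():
    for phase, activity in LIFECYCLE:
        if not run_phase(phase, activity):
            raise RuntimeError(f"security checkpoint failed at {phase}")
    return "shipped with security reviewed at every phase"

print(run_lifecycle())
```

The point of the structure is that security cannot be skipped or deferred: a phase does not complete without its checkpoint, so there is no single stop-gate to hit at the end.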
It’s the same thing with any kind of software, practice, or anything you can give people to help them, like your developers, for example. There’s lots of software out there, across the entire continuum of depth and complexity you want, that will help you search for existing known vulnerabilities that are public information and that you may have incorporated into your software without knowing it.
It’s things like that that remove a burden from the developer or the engineer and allow an automated check. The more things like that you can do to make it seamless and simple, the more your engineers can focus on the stuff that matters and not worry about the things that a piece of software can scan for them. The final thing is community and support. You’ve got the awareness and the training, which can be an isolated event, and then you have the tools and the automation.
Do you have the culture? Do you have the leadership getting in there? Do you have people in your organization who can help an engineer if they’ve got a question? “I’ve hit a decision on how I’m going to architect this. I’m interested in what would be the most secure or getting somebody else’s opinion on that. Is there somebody I can reach out to?” Ideally, you’ve got a security architect or an architect with a specialty in security incorporated into every development team. If you don’t, you might have a few in the company that can float around and help out, weigh in, and support one another, as opposed to leaving people isolated and trying to make a decision.
Decisions are better made with other people anyway. If nothing else, maybe you introduce them to a new friend. Let’s open with prong number one, three words that are practically swear words anymore: awareness, training, and education. The videos I have had the misfortune to sit through may not be quite as laughable as the HR versions, but it feels like there was a real apex of, “It’s time for you to finish your 90-minute corporate insider threat training program.” What can we do to get the audience to buy into this? You are part of a big company and have been on all kinds of different teams, including where you are now. It’s one thing to say, “Be smart,” but it’s another thing to get people to want to be smart. What’s your approach or your philosophy on this?
I’ve seen some very cool things. There’s always going to be some subset of training like you described. I’ve seen interesting things like Capture the Flag competitions, where you’re getting engineers and developers to see if they can find a certain hidden vulnerability, bug, or issue within the software and the hardware. They’re looking for stuff. You can also incorporate some of the training into helping your red team and your own product by getting your engineers to look at it. If people are doing things in a team environment, they’re learning from one another too. Internal conferences can be very successful, especially if you bring in speakers from within or outside your company who can give real examples of something they’ve been up against.
You might be able to do public conferences that way, but sometimes it’s at internal conferences where everybody can let their guard down. You can talk about some difficult stuff you’ve come up against and how you’ve gotten through it, or how to reach out in those cases. Another one is belts. Some people are black belts in this. You give certifications: black belts, green belts, or blue belts. I can’t remember the order. People can move up through them and add something to their resumes. It’s good for them. It’s a transferable skill, not just something they have to do for their company. Now they’re bigger in the world too.
A lot of us who don’t live in the development world assume, “Those are engineers. Why do we need to teach them all of this stuff? They’re smart. They’re developing these things. There’s security built in.” That’s not always the case. If we move out of that floor or building into accounting, HR, or, God forbid, sales, where part of the job is clicking links, we’ve got to inspire them with a better vocabulary for these things so they step their game up and understand better how this works. You can’t just make it, “I can’t click any link anymore,” because there’s stuff they have to do. How can we get them involved in the same way? Odds are, they’re not going to be bug-hunting type people. They might be, with all due respect, but there’s a significant percentage that probably isn’t.
I’ve seen amusing things where IT will send something out, and if you click it, you get a little email back, and now you have to take training or something. It’s not horrible. It’s a quick five-minute “here’s why you don’t click on these links.” There can be a short feedback loop. You click it, and right away: “You shouldn’t have clicked this, for these reasons. You checked the box here.” It’s not this onerous thing. It’s a constant reminder or refresher that things like this are out there.
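That short feedback loop can be sketched in a few lines: a simulated phishing link is sent, and anyone who clicks it is immediately told why and enrolled in a quick refresher. This is a hypothetical illustration of the pattern; the function names, user names, and messages are all invented, not any real phishing-simulation product.

```python
# Sketch of a phishing-simulation feedback loop: IT sends a fake phishing
# email, and a click triggers immediate feedback plus a short refresher.
# All names and messages here are illustrative placeholders.

TRAINING_QUEUE = []

def handle_click(user, campaign_id):
    """Called when a user clicks a simulated phishing link."""
    TRAINING_QUEUE.append({"user": user, "campaign": campaign_id})
    # The immediate feedback is the point: explain the miss right away,
    # while the moment is still fresh, instead of an onerous course later.
    return (
        f"{user}: this was a simulated phishing email ({campaign_id}). "
        "A five-minute refresher has been added to your queue."
    )

msg = handle_click("alice", "sim-q1")
print(msg)
```

The design choice worth noting is the tight loop: the click and the explanation happen seconds apart, which is what turns the exercise into a reminder rather than a punishment.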
I talked to my cybersecurity CTO, and I loved what he said: “It’s not their job. They have to click on links. It’s our job. We’ve got to make it secure. Ultimately, all of us in the development and security world have got to make things secure so that people can compute with freedom.” I like that. I thought that was an interesting perspective. I hadn’t heard it put quite like that before.
We need to make cyberspace secure. The development and security world needs to make things secure so that people can compute with freedom.
It’s a nicer way of saying, “We need to protect them from themselves.” Is that too cynical?
I don’t think it’s protecting them from themselves. I’ve heard stories from people who are deeply embedded in the security world. Somebody was telling me he was walking between buildings, on his way from one meeting to present at another and rushing the whole way, when he got a phone call from his bank that was checking to make sure nothing had gone wrong because they’d seen an unusual transaction. His wife was traveling or something. They said, “We need your account number.” He’s like, “I gave them the first three numbers before I realized what was happening.” People get you at the wrong moment. Your head is not in that game. That’s why those quick feedback loops can help, because they remind you that you’ve always got to be thinking about it.
Moving on to the next thing, when you were talking about tools, I thought that was interesting because, in my head, tools were the things implemented by the structure or the network itself, the tools people have to operate inside. You’re saying it may be tools given to people to use, as opposed to tools they have to operate inside. Am I getting that wrong?
Maybe it’s both. There are best practices that you might already have in your organization or not, because it depends on where you’re starting, how mature your product is, and all different things. You should have a secure development life cycle. If you haven’t thought about that, you should think about that. There’s plenty of information online about it. Everybody has a product development life cycle. You want to make sure those two things are integrated and the checkpoints are one and the same. At every checkpoint, you’re also looking at the security, all the way back to the architecture and all the way forward to when it’s in operation.
You should have a secure development life cycle. If you haven’t thought about that yet, you should.
You can look at a bunch of different tools, everything from free to extremely customized, that can be brought into your environment, like pieces of software that can scan your code and check for common errors made in code, or check for existing known vulnerabilities by looking through the libraries that developers contribute to. These are simple things. A lot of it is that you can start somewhere. You don’t have to have this gigantic plan to implement everything all at once. You can bring in little things at a time, or one thing at a time, and every step you take in that direction is better. It’s a layered thing. It’s not that you’re either insecure or secure. You have some level of security.
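The kind of known-vulnerability scan described here can be sketched minimally: match each pinned dependency against a list of advisories and flag versions released before the fix. The package names and advisory IDs below are invented for illustration; real software composition analysis tools query public vulnerability databases such as the NVD or OSV instead of a hard-coded table.

```python
# Sketch of a dependency vulnerability scan: compare pinned package versions
# against known advisories. Advisory data here is made up for illustration.

def parse_version(v):
    """Turn '1.4.2' into (1, 4, 2) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

# Hypothetical advisories: package -> the first version that fixes the issue.
ADVISORIES = {
    "examplelib": {"fixed_in": "2.0.1", "id": "ADV-0001"},
    "otherlib": {"fixed_in": "1.3.0", "id": "ADV-0002"},
}

def scan_dependencies(deps):
    """Return (name, version, advisory id) for each vulnerable dependency."""
    findings = []
    for name, version in deps.items():
        advisory = ADVISORIES.get(name)
        if advisory and parse_version(version) < parse_version(advisory["fixed_in"]):
            findings.append((name, version, advisory["id"]))
    return findings

# examplelib 1.9.0 predates the 2.0.1 fix, so it is flagged;
# otherlib 1.3.0 is exactly the fixed version, so it is clean.
project = {"examplelib": "1.9.0", "otherlib": "1.3.0"}
print(scan_dependencies(project))
```

This is the "removes a burden from the engineer" point in miniature: the check is mechanical, so a tool can run it on every build while developers focus on the design decisions a scanner cannot make.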
We’re more secure than we were yesterday. We’ll be the most secure at some point. You used a great word earlier, onerous. When you implement new things, there’s going to be natural pushback from a certain percentage of people, because people do things the way they have successfully done them and are like, “Why do we have to implement this new thing?” This probably is going to bleed into the notion of community.
When you add the new tool, you change the training, and they’re like, “Whereas I used to be able to blow through this in fifteen minutes, now we’ve got to do this new thing where we’re all getting together for an in-house day.” In your experience, with the curmudgeon who doesn’t want to do that thing, how can we entice them? This goes back to the earlier question about buying in and seeing that there’s genuine improvement in everything if they would come along and get this done.
That’s pretty simple. You just make the new tool better. Have you ever had the experience of opening a new tool and being like, “That was easy”? I know at one point, at some company I was at, the expense report software was changed to something new. It was so much easier to do an expense report. It was so much faster. You could take a picture with your phone. This was a long time ago. They all do this now. You could take a picture with your phone, and it would automatically go in and attach itself to the report. It made it so simple. You could be standing in the airport, take a picture of your coffee receipt, and it’s already there, and you never think about it again.
It was like, “That is so much better. I’ll do that.” Tools should be streamlining things and automating the mundane tasks that people have to do, like sitting there photocopying your receipt and creating PDF files, as opposed to just picture and attach. If your tools are doing the right thing, they’re not a burden. They’re better. There might be a brief learning curve to using them. If there are people who are not interested in using a tool that’s better, look at the learning curve, because that’s part of it too. It has to be easy to learn or intuitive. Otherwise, it’s not a better tool. You’ve got to look at that.
I’m that guy. That’s always a thing. It’s like, “This is just the system that works. Talk me into it.” I know that’s not the space that you should occupy. Maybe that’s a better question for culture and community, so let’s do that easy segue. I heard a speaker from a different type of industry who was talking about building a community. Part of her approach was getting people to elevate through the stages, going from, “I am great,” to, “We are great,” to, “Life is great.” Can you implement that style of approach inside a corporate culture, specifically inside a development type of culture?
I have to approach it from a different perspective. I’m not sure I understand exactly because I don’t have the context around that one. Do you want to give me more context?
Going from the notion of me as an individual, I like my job. I like what I do. It’s moving out to the idea of like, “I like my team. We all are good at this. We like being here. We like being together,” to the total approach, “Now we are a community. We are all doing this together. Not just my team but the whole company. What we are doing is something that is for the greater good.”
I completely agree with that. I didn’t understand before. Like you pointed out, maintaining security or privacy in an organization is everybody’s job. It doesn’t matter if you’re in accounting or you’re a developer helping to ensure that the products coming out of the company are secure for other people, as well as maintaining the security of the company’s own IP and PII. Everybody is a portal right in. The other thing is, if you have leaders in the company saying something, they have to mean it. People know right away if a leader is saying something and doesn’t mean it. They’re touting, “We’re now going to focus on whatever it is. We’re now going to focus on security.” Then the first time a product has a potential security issue, they say, “Ship it anyway. Otherwise, we’re going to be late.”
There are lots of nuances in those things. A lot of things require careful consideration to make those decisions. If people are seeing the leaders making hard calls in favor of the thing they support, then everybody else follows suit. It’s easy. It’s like, “This is what we do. This is how we are. We care about quality at this company. We’re not just going to do it that way because it’s faster and simpler. The quality has to be there.” It’s the same with security, privacy, or whatever else it is that your focus is.
If people see leaders making hard calls in favor of the thing they support, everybody else follows suit.
When we talk about community, that can be a tough thing. What’s the right way to do this? Can you push too hard to create a community? Does the community need to evolve organically and naturally? As you said, leadership needs to be out front and inspirational to the people in there. Communities form rather than are formed. Am I getting that wrong?
I don’t know. That’s such an individual and personal question. I know that I can tend to resist things if I feel like there’s now a pressure to form this rah-rah group within something. If you’re structuring things right, there should be a natural evolution where people want to innovate and do things together. If you have an environment where it’s okay to take some risks, go out on a limb, and pitch a new idea, people get excited to find friends or colleagues who might complement their skills and put things together. If you’re mandating these things, it’s tricky. If leaders are doing the things they say they care about, that sets the stage. You have to have that.
Providing an environment where it’s okay to have friendly competition helps. It’s not okay if you’ve got a situation where people are concealing data from one another within an organization because they’re terrified they’ll fall behind or whatever it is. If you have an environment where it’s fun to collaborate, come up with something new, and try it out, and where a great idea will be listened to and get an audience and maybe go somewhere, these things naturally form. I’ve certainly experienced it. Sometimes someone has such a different approach that it’s almost frustrating within the team, yet I know I want that person on the team, because we’re so different that they’re going to solve the problem that I’m not, and vice versa.
One of the hardest things about everybody coming together like that is, by definition, you want that type of diversity of approach, which means you’re going to have very different personalities. People aren’t going to like each other, but it doesn’t mean they can’t be part of the same thing. “This is my neighborhood. It is what it is. It’s not just my house versus your house.” It’s the spice that keeps everything dynamic and keeps it moving.
You’ve got to have the visionaries. You’ve got to have the meticulous. You’ve got to have the naysayers. You’ve got to have the optimists. They all have to come together. A team made up entirely of one or the other is probably not going to be as strong as the one that’s got an integration of multiple different people.
We’re going to shift to a different question. You’ve had a very dynamic career and have moved through some interesting portions of Intel. That gives you the opportunity to look at a lot of different types of technology across our industry. Is there anything that’s got your eye now that maybe we should be paying a little more attention to, something in your peripheral vision that’s close to being in the main focus? It doesn’t have to be a product or a company, but a technology that’s almost prime time.
A couple of things come to mind. One is I’m very curious about the intersection, as we roll forward over the next few years, of security, privacy, artificial intelligence, and sustainability. There are going to be some interesting links there that I personally am going to explore more. I interviewed a professor at Stanford University who was talking about the conversion to electric vehicles, even just in the United States, and how the world doesn’t even have enough known reserves of some of the minerals that would be required to create the batteries to do that. He’s using artificial intelligence to look for them. There’s more depth to it, but we’ve got to think about that one.
That’s very interesting. That’s new. The other thing I’m interested in is the supply chain and critical infrastructure, which I’ve looked at a lot. That is expanding. It’s expanding to outer space, with satellites becoming part of our critical infrastructure. It’s expanding to inner space as we see nanotechnology and things like biology interacting with compute. It has, for a long time. You have insulin pumps, pacemakers, and things like that that exist in our bodies and need updates sometimes. There are processes for that. As we move forward, there’s going to be more and more intersection of biology and compute. I’m interested in security and privacy as that happens.
It’s notable that you said outer space because I was going to make a half-joke about it. I’ve read a lot of articles about going back to the moon, specifically to parts of the moon where we’ve never been before, where there is water. It’s like, “Is that the end goal of Artemis 1? The Apollo program was exploration. Now, who knows what’s going to be on the other side of that?” We’re going to start drilling on the moon in order to get electric cars. You never know. Stanley Kubrick would be proud.
The world is looking beyond the world at this point for solutions. We’ll see.
I like it. Shameless plugs: you have a lot going on, as we mentioned, like a very cool podcast. Tell the people where you are, where they can find you, anything cool going on with Intel that’s worthy, charities, whether you’re speaking anywhere, all the things. The floor is yours, and you know what to do with it.
I’m going to condense all my things to one thing. You should check out CyberSecurityInside.com. This is the podcast I do with a colleague, Tom Garrison, VP of Security in the Product Division at Intel. We interview guests and take an outside approach, where he and I will often talk with C-suite executives about a variety of topics. I do a complementary series within that called What That Means with Camille, where I take topics and technical experts at all different levels and dive into them.
Non-fungible tokens might be a topic, or deep learning, federated learning, anything you might hear about in tech. I have one coming out soon on 5G and 6G, talking with a radio frequency engineer about what that means. We try to define the technology, why it matters, and the hot topics, arguments, or emerging trends within it to consider, like machine consciousness. Those are fun things.
I might have rolled my eyes at the NFT, but machine consciousness, now you’ve got my attention. That’s what I would check out for sure. Any social media, anything going on out there, LinkedIn, Twitter, TikTok?
@Morhardt is the Twitter handle.
Camille, thank you so much for taking the time. It’s always nice to talk to somebody who talks to people. This has been great. We are looking at gathering everybody we’ve had on the show for some big, loud, and rowdy round table. Consider this your official invitation, if we can steal you away from your shows and all the things you have going on inside Intel. It’s a nice way to close out here.
Camille is the latest in a long line of very interesting and diverse guests from all over the place. We’ve got some cool ones coming up as well. You can find us anywhere you find your podcasts: Apple, Spotify, and Audible. All the friends are there. All we ask is that you subscribe, rate, and review so you’ll never miss the great folks who are coming on the show. Until then, we will see you next episode.
Important Links
- ElevateSecurity.com
- LinkedIn – Elevate Security
- Facebook – Elevate Security
- Intel
- @Morhardt – Twitter
- CyberSecurityInside.com
- Apple – Friendly Fire: Mitigating Unintentional Insider Risk
- Spotify – Friendly Fire: Mitigating Unintentional Insider Risk
- Audible – Friendly Fire: Mitigating Unintentional Insider Risk
- @PackMatt73 – Matt Stephenson
About Camille Morhardt
With over a decade of experience starting and leading product lines in tech, from edge to cloud, Camille eloquently humanizes and distills complex technical concepts into enjoyable conversations. Camille is host of “What That Means,” a Cyber Security Inside podcast series, where she talks with top technical experts to get definitions directly from those who are defining them. She is part of Intel’s Security Center of Excellence and is passionate about Compute Lifecycle Assurance, an industry initiative to increase supply chain transparency and security.