“Privacy isn’t dead, but people need to be made more aware of the invasiveness of the internet of things, digital devices, and tracking.” – Theresa Payton.
Every company has a duty of care to track its employees on a certain level: to ensure people are who they say they are, that they are working when they say they’re working, and that they are taking care of intellectual property and customer data. In this episode, Theresa Payton, co-author of Privacy in the Age of Big Data, delves into security and privacy in the world of insider threats. Adding that layer of security doesn’t mean you don’t trust people; it is a trust-but-verify process that protects employees too. Nothing is easy with the rise of AI in malicious attacks, so what should security leaders do? Tune in to this episode and hear how Theresa’s 100% women executive team works to mitigate unintentional insider risk.
—
Listen to the podcast here
Theresa Payton: Security And Privacy In The World Of Insider Threats
We’ve been doing this for a while. For those of you that are new to the show, I’m very glad that you are here. For those who have been around for a while in various incarnations, hello and welcome all. On Friendly Fire, we are bringing the top experts in the industry in for a chat about anything interesting in keeping our world safe and secure. Speaking of that, I’ve been chasing this one since day one. The first host gig I got with Friendly Fire, I was like, “We have to get Theresa on the show,” which is why I am so excited to welcome Theresa Payton to Friendly Fire.
Theresa is the CEO of Fortalice. Among other things, our guest is the former White House CIO who brought the first mobile devices inside the network. She is the author of multiple books. The new book is Privacy in the Age of Big Data, which she co-authored with Ted Claypoole. Her previous bestseller is Manipulated: Inside the Cyberwar to Hijack Elections and Distort the Truth.
She has been a featured guest on important television shows, including The Daily Show, all of the cable news networks, and recently the TODAY show. A few years back, CBS did a show called Hunted, where she led the SOC in tracking people trying to stay off the grid as fugitives. All of this leads to the notion of people versus security versus malicious intent. We’re going to get into all of this stuff. Also, she has keynoted at the most important security conferences all over the world. Theresa, I feel like we should have big walk-up music for you, like in ballparks or WWF. All of that being said, welcome to Friendly Fire.
First of all, thanks for having me on the show. I’ve been looking forward to this conversation. Second of all, everybody needs a hype man. I feel like you could be my hype man.
I’m all in on that. You got a new book dropping. It’s a revision of an earlier book that you did with Ted, Privacy in the Age of Big Data. Why revisit this topic now as opposed to doing something entirely new? Why bring this thing back?
It’s interesting because when the first edition came out, we had already sent the final over to the publisher. Long answer to your question: so much has changed since the Snowden revelations, Cambridge Analytica, you name it. Our privacy, or any semblance thereof, has been trodden upon. I don’t believe privacy is dead, but I do believe that people need to be made more aware of the invasiveness of the internet of things, digital devices, and tracking.
The only way to help people be more aware was to update the book and say to people, “You don’t have to throw your hands up and give up. You don’t have to go off the grid to have your privacy.” We all need to have more awareness, and then demand, “I would like to opt into this. I would like to opt out of that. If you’re going to be making some money off of me, I should make some money off of me.”
We wrote the second edition not just to update everybody. People can read the headlines. We don’t want people to think it’s time to crawl under the desk or just shrug their shoulders and give up. We wanted to arm, engage, entertain, pull people in, and say, “We all have to stand up. We can do something about this. We can have our privacy and have our conveniences too.” That’s why we did a second edition. There are some chapters that are completely brand new. For example, the Metaverse chapter. It will be interesting to see how it’s received. I’m looking forward to it.
I love that you use the terms arm, engage, and entertain because each one of those is important. Don’t sleep on entertain because certainly, in the universe that we’re existing in now, it’s got to be cool and fun. Otherwise, nobody is going to sit down and read ProPublica. I wish they would. I think I might every now and again. You might every now and again. I know producer Sharon does.
Telling these stories in a way that impacts the receiver, how hard is that? You get into some egghead stuff because it’s important that people understand what this is. I don’t know how many people got Cambridge Analytica and what that means. It was a great Netflix thing. Everybody talked and tweeted about it for a while. In order to share the story in a way that can be impactful for those who are receiving it, how do you do that?
I try to take a moment and walk in everybody else’s shoes. I think about my family who has to put up with me and live with me. My extended family is used to it. I used to say that I wrote the book for my parents, grandparents, and aunties, but they’ve been living with me so much. I need to write it for everybody else’s aunties, uncles, cousins, nephews, and grandmoms.
What I try to do is talk to everyday people and ask them where their concerns are. I try to talk to people who are not in the cybersecurity industry or the privacy industry. They are not in big tech at all and are consumers of big tech either at work or in their personal lives. I try to understand walking in their shoes by asking questions. Having done that, I try to say, “What would make this the most interesting to them?” I try to use a storytelling narrative and unmask what some of the challenges are. More importantly, what are the tool sets that people can use on their own without calling the Genius Bar or the Geek Squad? What are the things that they can do themselves and feel empowered to take control of their privacy?
You have spent the bulk of your career in the security industry. You have been with some of the most important financial institutions in this country. You were at the White House as the CIO; it doesn’t get a whole lot higher than that. Now, you are a founder and creator, and someone who is making an impact on security. When you look at the intersection of privacy for employees and the need for verification or security inside the organization, how difficult a task is that, respecting both sides of that equation in a way that puts them both in a position to succeed?
This is a great question and it also speaks to some of the topics that you’ve covered on the show, which are things like insider threat issues. I would say to businesses, just because you can create a data element about your employee or about your customer, the question is, should you? Just because storage is cheap and you could keep it sitting around, the question is, should you? Just because you could monetize it or use it to build models, the question is, should you? Assuming that it could all get stolen tomorrow, what are you going to do differently?
I will give you a quick example of how to think about this in the physical world. Where I used to live, we were aware of a lot of daytime house break-ins. Thankfully, nobody at gunpoint, but house break-ins: smash-and-grab everything you can get, and run off. That’s where I used to live, so I started thinking differently.
I wouldn’t leave checks sitting out on the counter that I hadn’t deposited yet. Besides setting the alarm and locking the door, I take a quick look around and ask, “Is there anything sitting out that shouldn’t be, because it’s too easy for the smash-and-grab?” For things that are more valuable, you can use extra measures.
If somebody were to do a smash-and-grab on your company and steal your employee data and customer data, would you think differently about what you’re tracking about them? What’s your duty of care for that data? It’s segmentation, tokenized access, anonymization, and digital shredding strategies. For the moments you need the data, it needs layers and layers of encryption. When that smash-and-grab happens, which it will, you need to render that data useless. It needs to be anonymized and/or have layers and layers of encryption, so it is virtually unusable to anybody except you.
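To make that “render it useless” idea concrete, here is a minimal sketch, not a description of Fortalice’s tooling or any specific product, of two of the layers named above: one-way anonymization with a keyed hash, and encryption at rest using the Python cryptography library’s Fernet. The pepper, key handling, and field names are illustrative assumptions.

```python
# A minimal sketch of "render the data useless": anonymize identifiers
# one-way, and encrypt anything you truly must keep. Illustrative only.
import hashlib
import hmac

from cryptography.fernet import Fernet  # pip install cryptography

PEPPER = b"server-side-secret-pepper"  # stored separately from the data


def anonymize(identifier: str) -> str:
    """One-way tokenize an identifier (e.g., an employee ID).

    A keyed hash means a thief who grabs the table cannot reverse it,
    or even brute-force it, without also stealing the pepper.
    """
    return hmac.new(PEPPER, identifier.encode(), hashlib.sha256).hexdigest()


def encrypt_field(key: bytes, value: str) -> bytes:
    """Encrypt a field you must keep readable (for you, not a thief)."""
    return Fernet(key).encrypt(value.encode())


key = Fernet.generate_key()  # in practice: a managed key, never in the DB
record = {
    "employee": anonymize("emp-10234"),           # useless if stolen
    "badge_log": encrypt_field(key, "08:02 HQ"),  # opaque without the key
}
print(record)
```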
As it relates to employees, I’ve thought about this a lot. Every company has a duty of care to ensure that they are tracking their employees on a certain level to make sure that people are who they say they are and that people are working when they say they’re working. Also, they’re taking care of the intellectual property and the customer data the way they’ve been trained to do, and according to the policies and procedures of the organization.
Some level of monitoring must be there. It doesn’t mean you don’t trust people. It’s a trust-but-verify process. It’s necessary not just to protect the organization and the organization’s client data and intellectual property, but it protects the employee too. If there’s unauthorized access pretending to be an employee, you wouldn’t want to falsely accuse the employee.
I’m not against employer tracking of employee movements and what they’re doing, but I’m a big believer that if you’re going to do that, you need to be transparent about it. Say how it fits into the overall mission, vision, strategy, resiliency, and recoverability plan of the organization. If you tell your employees that and they understand that, they may even mention it to your customers, which could impress your customers.
Ted and I talked about this in the book. Some of this was pre-pandemic, and it’s certainly true post-pandemic with the mandates to work in the office a minimum amount of time. People are finding ways around that. They’re like, “I know you’re checking my badge swipes.” There are people who will swipe in, get a cup of coffee, sit down, leave, and go work from home.
It’s not because they don’t want to work, but because they feel like they get more work done at home than in an open cubicle environment. They can’t work there. All of their setup for the last two years was at their house, so they’re leaving. Some may have more nefarious intent, but for the most part, the people I talked to were like, “I get more done at home.” If you’ve got tracking and surveillance going on, is it serving you well? What is it that you’re trying to get out of it? Be transparent about it, and recognize that you have a duty of care now that you have this tracking data, because it could be used against you and your own employees. Assume the smash-and-grab will happen.
Suppose somebody has all of your employees’ behaviors and wants to pretend to be an employee to conduct an extortionware campaign, ransomware, or something else. If they know those patterns, they can predict what they would look like to your system, so they could be virtually undetected. If you’re going to collect that data, besides being transparent, using it the right way, and being responsible, how do you make it virtually useless if there is a smash-and-grab?
I love that you mentioned patterns because that feels especially relevant now, with the rise of AI in malicious attacks. Nothing is easy. AI is a tool that previously was not refined enough to use. I’m not only talking about malicious intent; there’s the unwitting side, as people went from being in the office all the time, to being home all the time, to now trying to go back. As the patterns evolve, what do you have to do? How do you have to consider this as a security leader, as you come in to sit down and talk with CISOs, CIOs, executive vice presidents, and boards? People are chaotic. There are patterns, though, and there are ways to examine this while also respecting the individuals who make up the larger group.
Take a look at the last eighteen months of breaches that have been reported. Not the breaches that have happened because we don’t know about many of them yet.
Only the ones that are reported. Let’s make sure that we couch that appropriately.
Let’s take a look at those. What’s interesting is that Melissa, Bridget, and I all worked together at the White House. They’re partners in my firm, so we’re a woman-owned business with a 100% women executive team.
We’re going to get to the shameless plugs in a little while, but go ahead and give a shout-out to the website and everything else.
We’ve got an intelligence operation within our firm, and we would see cases where a VIP would reach out to us. Maybe they ran a private firm that’s over $1 billion in revenue. Maybe they, in and of themselves, are the product, and they’re a billionaire because of the different creative ventures they’ve launched. We said years ago that personal account compromise was leading to business account compromise. Melissa is our Chief Strategy Officer, and she goes even further to say that personal email compromise is leading to business email compromise.
Back on the people and monitoring the people, it’s a tough sell to say, “I’m going to monitor you in your personal life,” but there are things about the patterns in your personal life that could be used to create a legit profile of you, so I know when it’s not you. If I can take a pattern of what you’ve been doing, when you log in, when you don’t log in, and when you log into other things, chances are I can probably predict better than you can where you will be in the future.
Based on your patterns of life, I can use algorithms, machine learning, and behavioral-based sciences, and say, “This is Matt’s gym. This is when Matt works. I can tell when he likes to record because I go to social media and he mentions that he had a great session. I can tell when he is doing creative work because I can see he’s reaching out to people and asking questions.”
All of those things can be used to build a legitimate profile of this is or is not Matt. That can be a helpful thing with your personal bank account, and making sure it’s not a fraudster. It can be a great thing at work to make sure it’s really you. It has to be treated with care the right way, so it’s not used against you. I’ve said this for years. We have been lacking industry standards on what constitutes the right privacy framework, and how data can be used and opted out of.
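As a toy illustration of the pattern-of-life idea described above, and not any product’s actual algorithm, here is a hedged sketch: build a baseline of the hours someone usually logs in, then flag a login that falls far outside it as a prompt for extra verification. The login history and threshold are made up for the example.

```python
# Toy pattern-of-life check: baseline someone's usual login hours,
# then flag logins that fall outside that baseline. Illustrative only.
from statistics import mean, stdev

# Hypothetical history: hour-of-day for past successful logins.
login_hours = [8, 9, 8, 9, 10, 8, 9, 9, 8, 10]

baseline_mean = mean(login_hours)
baseline_sd = stdev(login_hours)


def is_anomalous(hour: int, z_threshold: float = 3.0) -> bool:
    """True if this login hour is far outside the established pattern."""
    z = abs(hour - baseline_mean) / baseline_sd
    return z > z_threshold


print(is_anomalous(9))   # False: fits the pattern
print(is_anomalous(3))   # True: a 3 a.m. login deserves a second factor
```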
Each one of us should be entitled to our own privacy identity. We should be able to decide what’s in it, what’s not in it, what’s monetized, and what’s not. We should have a right to assume that companies take our right to privacy as seriously as we do, and to understand how they’re treating our information. We forced industries like banking to do this. We’ve had to do disclosures since before I worked in banking. Anytime customer data goes to a third party for anything, the bank has to disclose that to you in the disclosures it sends in your mailed statements and digital statements.
Other industries don’t have to do the same thing. I find that fairly baffling because it’s the same data. Even beyond what we have to disclose to the regulators, we still have to disclose it to you as a customer, in 4-point or 6-point font in your statement. We do have to let you know certain things. For example, if you deposit over $10,000, it has to run through anti-money laundering.
Your name, the deposit, and how it came in: Was it mobile? Was it at a branch? Was it a check? Was it a wire? All of those things, including who you are and where the money came from, have to go through the anti-money laundering (AML) processes as part of KYC, or Know Your Customer. We also have to disclose to you that if you deposit this amount of money, then this is the entity that we have to send this information to.
That duty of care, transparency, and understanding can be an incredibly powerful tool to make sure that somebody cannot do a digital walk-in on your life. On the other hand, the same data could be used against you; somebody could do a complete digital walk-in on your life. That is the conundrum that we’re dealing with now.
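To put the threshold rule just mentioned into code: a toy sketch of flagging deposits over $10,000 for AML/KYC review, capturing who you are, the amount, and the channel. Real AML programs involve far more than this; the function and record shape are illustrative assumptions.

```python
# Toy illustration of the $10,000 AML reporting threshold mentioned
# above. Real AML/KYC programs involve far more than this. Illustrative.
from typing import Optional

AML_THRESHOLD = 10_000


def needs_aml_review(amount: float, channel: str, depositor: str) -> Optional[dict]:
    """Return the record that would flow to AML review, or None."""
    if amount <= AML_THRESHOLD:
        return None
    return {
        "depositor": depositor,  # who you are
        "amount": amount,        # the deposit
        "channel": channel,      # mobile, branch, check, wire
    }


print(needs_aml_review(12_500, "wire", "A. Customer"))   # flagged
print(needs_aml_review(500, "mobile", "A. Customer"))    # None
```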
An incredible amount to unpack there. First, a shout-out for the reference to font size, because it matters. The smaller it gets, the less likely people are to read it. Second, a shout-out to Nate Dogg: regulators, mount up. As we’re talking about people and patterns, this is psychological behavior. These are things that are scientific and trackable. It’s something you can absolutely learn from.
In the new version of the book, you also talk about biometrics, bringing this back to the intersection of privacy, responsibilities, and security. We could probably pick twelve things for the Venn diagram to knock this down to something the size of a kernel of corn. What about that? If you’re tracking somebody and you can steal their behavior, that’s interesting. That’s philosophy and even theology to a degree.
When we’re talking about fingerprints, DNA, and all of these things, people are either willingly giving them up or being required to give them up to get into whatever company they work for. Theme parks even require it now to get into these places. How do we deal with that in a way that respects the privacy of the user while still creating a secure environment for a company? Especially in your world and where you’ve been: banking, the highest levels of government, and now cybersecurity.
Let’s layer something else onto your question. Let’s add deepfake technology on top of that. You know, because we’ve known each other for years, that I sit down on Black Friday every year and go two years out. After I’ve finished my holiday shopping for the family, I spend some time thinking about where technology is going and how it will play a role in our lives, both at work and at home. I try to imagine what that utopian thing is going to look like two years from now. Then I think about nation-states, hacktivists, fraudsters, criminals, and cybercriminals. What are they going to do two years from now?
Is this what you do, just in time for the holidays?
Yeah. Everybody is still sleeping at this point because I’ve got a whole process down for how I do the holiday shopping. I sit down, I sketch all this out, I think about it, and then I write my predictions for two years out. On Black Friday of 2022, one of the predictions I wrote down was that because of the prevalence of AI and the synthetic fraud going on right now, with deepfake technology and biometrics put together, we will have these Franken-fraud identities working in the workplace remotely. They will do digital walk-ins on people’s lives, and they will be nearly impossible to eradicate.
You take the biometrics, patterns of life, the data that has been collected about us, the intersection points with marketing data to sell us things, and our patterns and behaviors based on algorithms on social media. You then take deepfake audio, video, and photography, and someone could be you. They could interview for a job. They could feed the deepfake audio and video to respond in a way that would be consistent with Matt. They don’t have to do this with a lot of money and a lot of resources. Much of the deepfake audio and video technology out there is free. You just need a good gaming computer and a creative mind, and you’re off to the races.
If people think that my prediction can’t happen, I will jump back in time to 2019. I’ve been doing these predictions for ten years now. Back in 2017, I had predicted that deepfake audio and video technology would accelerate and be used to commit fraud. The case I’m about to describe was the first instance of this, and there has been more than one case since. I was doing a presentation to the European Council with banks, and I was the only American asked to present. I was so thrilled. They asked me to do a Black Swan scenario where a cyber event would take down all of global banking. What an honor to be the big Debbie Downer at an event and be the only American on stage.
“Let me give you the most frightening thing you can imagine, please. Thank you for this opportunity.”
I was so delighted to be asked to do this. More importantly, they wanted, “What can we be doing now? Give us the scenario. Tell us what’s coming. Tell us what we can do now to make sure your scenario never happens.” Nobody has done it. Anyway, I’m there and they asked me after my talk. They were like, “That was great. Can we spend a little bit of time with you to talk about a case?”
A quick story on this case. It was an international holding company that had vertical companies underneath it. You’ve got a global CEO, then each company has its own CEO and its own CFO. They had accidentally wired over 300,000 pounds sterling to fraudsters. Their protocol was: you send the email back and forth, it has to match the vendor list, and then you get a voice authorization.
CEOs have their voices in the public domain. You and I have our voices in the public domain through this show. CEOs typically don’t talk much when doing an authorization for a wire transfer; there’s not going to be a chit-chatty conversation. The fraudsters doctored the CEO’s voice. They had studied his patterns on social media and the things the company had posted about him. He didn’t have a huge digital footprint. He wasn’t a big selfie type of person or anything like that, but there was enough out there that the email was incredibly convincing. The voice authorization came through, and it was his voice, run through a deepfake audio program, saying exactly what needed to be said.
People would say, “That’s amazing.” It’s not that amazing. We forget in cybersecurity to talk about the human user story. We don’t consider them first. That’s why we make twenty-character passwords and garbage like that. You’re the last person we think about. We’re so enamored with all of our technology and all the geeky things we can do and make you do.
We kind of forget what the human user story is. That’s why bad things continue to happen. It’s because we don’t focus on the human first. They’re like, “We just need to train you better and you just need to follow these policies better.” It needs to be, “We need to always start with the human user story first.” On the flip side, fraudsters and cyber criminals could teach a masterclass in human behavior. They know how these processes work. They have their own businesses. They have to wire funds too.
They know there’s an email that happens. They know there’s an account. They know it gets matched to a vendor list. They know there’s a voice authorization because they use these systems too. Because they study human user behavior, they know how this works. They’re like, “All we need to do is get the CEO’s voice, doctor it up, make it match the email, and we’re off to the races.”
They’re not smarter than us. They’re not better than us. They don’t have tons of cash in a barrel to spend on a bunch of sophisticated technology projects. They just study human nature and they’re like, “This will probably work.” It worked and it has happened more than once. That was the first instance of it that I was read into, and that was the fall of 2019. Since then, more of those cases have happened.
That’s a long answer to your question, but with all of the biometrics out there, they will be used against us. The question is not, “That’s terrible,” but, “How are we protecting biometrics?” Voice is going to be harder to do, and voice should not be the only thing you rely on. Fingerprints are a little harder to collect, but Ted and I wrote in the first book that we were able to show you could collect them using a gummy bear. It has to be fresh, not a stale gummy bear. You can collect fingerprints using a gummy bear and present them to lower-level fingerprint-reading technology, though not the super sophisticated kind.
Your face for your phone. There are 3D printers now. We have to be asking ourselves, “Why don’t we study what humans are doing with technology?” Let’s go back to the human user story. Find ways to integrate different things that are not easily guessable by fraudsters and cybercriminals. That’s what we incorporate into the safety and security net that we build behind the scenes around the user.
They typically stand at this geo-coded location. They typically like this browser. They’re typically a fast typer or a slow typer. There are a lot of patterns and behaviors about us that could potentially be used and leveraged, depending on the level of the transaction. “Am I just reading something?” That level of transaction is fairly low risk, so we use a low-risk safety and security net around the user.
As soon as we start talking about moving money or accessing client data or intellectual property, now it’s riskier. What are the elements that we want to look for in a very simple, elegant, seamless way that’s not a burden on the user? We have twenty-character passwords, and attackers are getting past us every time. If you look at the past data breaches in the news, what was the way they got in? Compromised credentials get pumped and dumped on the internet; attackers reuse the passwords and just log in.
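One way to picture the low-risk versus high-risk safety nets described here, offered as a sketch under assumed signals and weights rather than a prescribed design: score each request from the action’s sensitivity plus a few behavioral anomalies, and only step up authentication when the total crosses a threshold.

```python
# Sketch of risk-tiered step-up authentication: cheap checks for reading,
# stronger checks for moving money or touching client data. Illustrative.
from dataclasses import dataclass

# Hypothetical action sensitivities.
ACTION_RISK = {"read": 1, "export_client_data": 5, "wire_funds": 8}


@dataclass
class Signals:
    new_device: bool
    unusual_geo: bool
    unusual_hours: bool


def risk_score(action: str, s: Signals) -> int:
    """Combine action sensitivity with behavioral anomalies."""
    score = ACTION_RISK.get(action, 3)
    score += 3 * s.new_device + 2 * s.unusual_geo + 2 * s.unusual_hours
    return score


def required_step_up(action: str, s: Signals) -> str:
    """Pick the lightest control that fits the risk."""
    score = risk_score(action, s)
    if score >= 10:
        return "hardware-key re-auth + out-of-band approval"
    if score >= 6:
        return "fresh MFA challenge"
    return "none"  # low-risk net: don't burden the user


print(required_step_up("read", Signals(False, False, False)))      # none
print(required_step_up("wire_funds", Signals(True, False, True)))  # strongest
```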
Should we and can we collect the data? Yes, we can. We got a duty of care for that data. How are you storing it? How are you collecting it? Why? What purpose does it serve? How are you treating it in motion and at rest? How can you get creative to put these different layers of security and safety around the user as part of the user story that are things that cyber criminals and fraudsters wouldn’t think about?
I love the notion of how we get asymmetrical in our defense as opposed to the asymmetrical attackers. To your earlier point about deepfake audio technology, this is something that we are all seeing right now. It has gotten very good and it is scary. Go back to one of the most legendary hacker movies ever, Sneakers. That dropped in ’92. They got access to a building by using a mini tape recorder with, “My name is Werner Brandes. My voice is my passport. Verify me.” That’s a deepfake.
Now, you’re talking about all this other stuff. There was a sci-fi movie or TV show a couple of years back where they gained access by breathing into a receiver of some kind. As we look at this notion of biometrics, it’s not necessarily invasive, but nobody wants to give up their fingerprints unless they’ve been arrested. Even then, you definitely don’t want to give them up, but you have to.
With these types of things, it’s what you’re saying about the responsibility of needing it, collecting it, but also protecting it. Once someone gives up something that is intimate, whether it is a fingerprint, a retinal scan, your breath, or your DNA, it’s out there. Everybody is sending their DNA away to find out where they came from 500 years ago. What do we do? All that stuff is accessible to the bad guys. If they’re good enough to get in, they’re going to get in. No pressure on you to solve this problem right now, but you’ve got seven minutes.
Whether it’s protecting intellectual property, people, people’s identities, transactions, money, or whatever it is, the challenge is that we think about it in terms of locking the windows and locking the doors, which we should do. That’s basic blocking and tackling. For example, Security Boulevard covered a study on multi-factor authentication: it blocks over 90% of credential stuffing attacks. That’s good odds. That’s pretty darn good.
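For readers who want to see what that basic blocking and tackling looks like, here is a minimal time-based one-time-password check using the pyotp library, which is my choice of example rather than a tool named in the conversation. A stuffed password alone fails this check because the attacker also needs the code from the enrolled device.

```python
# Minimal TOTP second factor with pyotp (pip install pyotp).
# Stuffed credentials alone fail here: the attacker also needs the
# time-based code from the user's enrolled device. Illustrative only.
import pyotp

# At enrollment: generate a secret and share it with the user's
# authenticator app (usually via a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# At login, after the password check:
code_from_user = totp.now()  # stand-in for what the user types

# valid_window=1 tolerates one 30-second step of clock drift.
if totp.verify(code_from_user, valid_window=1):
    print("second factor OK")
else:
    print("reject: a password alone is not enough")
```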
There are some basic blocking and tackling. It’s not going to stop everything from happening, but it will throw the lower-level operatives with fraudulent intent off their game. There’s that next 10% or more. To me, it goes back to the human user story. It is about how we out-think, out-maneuver, and outsmart cybercriminals.
They’ve studied human behavior. They know the processes. They’re users of the processes themselves, so they look for the workarounds. We know that. We should assume there will always be smash-and-grabs. They will always be breaking in. They will always be successful. Knowing that, what do we want to do differently? This is where people will throw out the term zero trust architecture. I want to be on the record saying I cannot stand that term. It sounds like a destination. It sounds like a product stack and it’s crap. It is a way of living. I haven’t even started yet.
It’s like you can go on a crash diet to get to maybe a weight you want to get to or a fitness goal. Unless you change your way of life, you will not stay at that goal. Zero trust architecture implies that you could do a crash diet to get to a goal and stay there, and that is not accurate. The terminology I always used at the White House was no trust architecture.
I would say, “How do we think about this as no trust architecture?” You’re going to come in and tell me you are who you say you are, and I’m going to do multiple things to make sure that’s you. Now, you’re going to try to access some information. Of course, the White House is going to be a different level of security and things like that, but you can always apply these principles at home and the workplace.
I would say, “When you go to do that thing, I need to make sure it is really you. I’m not going to leave an ongoing open connection for you to access anything you want at the White House just because you told me at the door it was you. Maybe you shouldn’t even access that anyway just because you work here.” Everybody gets stuck on phraseology because they want to talk to the board and the CEO. They want to throw out the same phraseology the big consultants are throwing out.
This is a way of thinking, living, and being. That’s why I like the term “no trust architecture” because when you talk about a no trust architecture, you ask, “What’s the human user story? What exactly are they doing at this moment? How do we storyboard that? How do we make sure they can do their job? How do we make sure that when the data is in motion and at rest, we’ve taken care of it?”
If you go user story by user story, you’re going to build the right enterprise architecture. If you go and say, “It’s a tech stack and it’s a zero trust architecture. We’re going to buy all these products. We’re going to do all these things,” you’re never going to get to your destination. Start with the human user story. Start with the principle of no trust architecture, and review each one of your human user stories. That will tell you where you need to focus and make changes.
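As a rough sketch of that no trust habit, verifying at every sensitive action rather than once at the door: the decorator below re-checks identity and authorization on each call. The verify_identity and is_authorized helpers are hypothetical stand-ins for whatever checks fit your environment.

```python
# "No trust" habit, sketched: verify at every sensitive action, not once
# at login. verify_identity/is_authorized are hypothetical stand-ins for
# whatever checks fit your environment. Illustrative only.
from functools import wraps


def verify_identity(user: str) -> bool:
    # Stand-in: fresh MFA, device posture, behavioral signals, etc.
    return user == "matt"


def is_authorized(user: str, action: str) -> bool:
    # Stand-in: a least-privilege check scoped to this one action.
    return (user, action) in {("matt", "read_report")}


def no_trust(action: str):
    """Re-check identity and authorization on every single call."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(user: str, *args, **kwargs):
            if not verify_identity(user):
                raise PermissionError("identity check failed")
            if not is_authorized(user, action):
                raise PermissionError("not authorized for " + action)
            return fn(user, *args, **kwargs)
        return wrapper
    return decorator


@no_trust("read_report")
def read_report(user: str) -> str:
    return "quarterly report"


print(read_report("matt"))  # passes both checks, every single call
```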
Talking about human behavior: you’re probably tired of me making this reference, but you’ve sat in some of the most important rooms in the world talking about important stuff. Your last book, Manipulated, shines a light on social manipulation. Manipulated was a lot more about manipulating behavior in the course of the election process, but this is also now a key aspect when it comes to cybersecurity.
Whether you are talking with your clients at Fortalice, or you are doing your presentations at these events, or you’re sitting down with some of the most important people in the world in these important companies, how do you talk about something like that and not just be dismissed with, “This dude is on TikTok”? People want to dismiss that and reduce that. When we look at the idea of social manipulation into personality and behavior, how do you understand it and then communicate it to the decision makers, like, “This is something you have to consider”?
It’s interesting. Let me go back to the advent of search engines. I’m going to walk us back, not to dial-up, but just a little bit. You mentioned social media. You mentioned TikTok, the algorithms, and some of the challenges there. I’m going to get to ChatGPT in a minute. For example, once Netscape became available, and we had made some decisions at the bank about who would have access to it, we thought long and hard about our acceptable usage policy and being very scenario-based. I won’t even get into the dial-up world, but we would say, “Just because you can search for terms doesn’t mean you should.”
For starters, these topics are off limits when you’re at work. We’re going to block as much as we can, but that’s a whack-a-mole game. Understand that what you’re searching for will be logged. The regulators, third-party auditors, or a lawsuit could come in at any time and ask for you personally and what you searched at work. Understand that we’re in a heavily regulated environment. I don’t care to look at your search terms, but somebody else might want to. I know high-speed access here is great, but here’s the downside.
The second thing we had to think about was people looking for answers about things that are proprietary to the bank, proprietary problem sets, and proprietary formulas. I’m doing a little walk back for everybody because I want to talk about today’s technology. These are the same problems in new technology, with faster problem sets being a bigger problem.
We had to have conversations with people about, “You cannot type into search engines proprietary banking information. If you are seeking something, you can go to our intranet and our team of experts will help you. If we don’t have the answer, then we will do a research assignment because we don’t know where your request is going.”
Probably most people don’t appreciate this inner look inside my head because it’s scary in there. Fast forward to today with ChatGPT, Bard, or any of those. You could have people typing in things about customers, trying to do a good job for you, and not thinking of the ramifications of their digital footprint and what they’re doing. They might be on TikTok. They might be on social media or other platforms. They might be on ChatGPT. They may be spilling your proprietary secrets, customer data, formulas, and things like that in the quest of being a good employee.
Unwittingly, they’re trying to do better at their jobs looking for information.
I will give you an example. I’ve been thinking a lot about some things that I want to do differently. I always like to play around with Python to keep my brain sharp on some of these problems. I was like, “I wonder what ChatGPT would do with the Python code that I’ve been working on.” I was like, “Can you write this better?”
I didn’t have any Fortalice data in it. I didn’t have any Fortalice use cases or anything like that. It was just an initial idea. I knew full well that by asking ChatGPT, I was potentially entering what I had written, however rudimentary, into the knowledge base, but I did that with foresight and thought. I thought, “What I’m trying to do isn’t that unique. I just want to see what I get back.” I got back something interesting.
I say all that to say it is a risk and this is a very challenging risk. The regulatory frameworks have not kept up with this. The laws have not kept up with this. How we train people, how we talk to them from K-12 through college, how we train people that enter the workforce, and how we do our onboarding have not kept up with this.
This is Wild Wild West stuff. If you think you got a great insider-threat program or you got a great onboarding program, you don’t have anything. With how your employees are using social media and how they can potentially access these tools to do their job better for you in the public domain, you don’t know what you don’t know. I don’t even have all the answers here.
Hold on. That’s why we got you on the show. I thought you had all the answers.
No. It’s Friendly Fire. I’m on the firing line right now. I’m trying to figure out which ones are rubber bullets, jelly bullets, and armored bullets.
This is how we know how serious Theresa is. She knows the difference between bullets. You mentioned Wild Wild West. I know you have a hard stop coming up. Otherwise, we could let this go for five hours, Rogan-style. Part of your illustrious career is as the Chief Information Officer of the White House. It’s one of the coolest things I’ve ever been able to say out loud about anybody that I know. For part of your tenure, if I recall correctly, you were part of the team that brought mobile devices inside the security network for the first time. Do I have that right, or am I making that up a little bit?
The BlackBerry devices were already there when I got there. BlackBerry was already implemented. For any Canadians tuning in, BlackBerry was an incredible Canadian company. They were superior, and in many ways probably still are superior, to what Apple and Google do. I had the dubious honor of integrating social media platforms and other smartphone devices. It wasn’t for President George W. Bush; there were too many risks for him to carry a smartphone, but we knew somebody following him would probably want one. We were prototyping for the next president to have a smartphone on their belt.
Part of your task was integrating this new technology, and then adding social media to that stew of everything, in order to get information from those brains to the public. Is there a comparable sea change happening right now? Especially given what you do, the boards you advise, the company that you run, all of those things. What has your attention that feels like, “This reminds me of that thing I had to do once upon a time”?
My big fear, concern, and worry is what I call the black-boxization of technology. You might think about an internet of things device. There were multiple companies that came together to create Alexa; it’s not all done in-house. You could think about something like that, but you’ve got this black box around how a mobile app works. You’ve got a black box around how customer service chatbots work.
All of these modules and components are not fully vetted, standardized widgets that have been put through the Good Housekeeping seal of approval. In some regards, the way the programming works, we’re still not at true artificial intelligence. It’s highly sophisticated machine learning, behavioral-based analytics, and good heuristic algorithms, but the neural networks are definitely more sophisticated and working a lot faster.
Everything feels like a black box to the implementation people, and it is in those black boxes where the dormant security vulnerabilities lie. SolarWinds came and went, and we dodged a bullet on that one. Although there were incredible challenges with it, we haven’t seen anything yet. That’s what keeps me awake at night. It is a bigger challenge than even some of the things we faced at the White House.
I will give you an example. There are a whole bunch of eyes on WhiteHouse.gov. At one point, I remember sitting down and trying to meet the different vendors that were part of the technology stack. They said, “There are certain times when you are the most attacked website in the world.” It would depend on who was visiting us or the policies we were posting, things like that. It is a dubious honor to have, not every day, but some days.
When I think about that and those technologies, we did almost everything we could in-house. If we didn’t do it in-house, we went to the Department of Defense. If they didn’t have it, we went to the best of the best who had been through the Good Housekeeping seal of approval. We would put them through the paces of, “You can’t outsource that. You can’t offshore your development.” We knew that stack inside and out.
It’s that loss of transparency that exists now. If you use ChatGPT, you don’t know what the engineers put in there. You don’t know who the engineers are now. You don’t know how much of it is outsourced to the community. Is it all an in-house job? That black-boxization of widgets that are now part of the fabric is what keeps me up at night.
What’s going to keep me up tonight is the fact that you made two references to the Good Housekeeping seal of approval. For a couple of different demographics who might be tuning in to this show, Google it.
Ask your grandparents.
We are in show season. South by Southwest, RSA, HIMSS. We’re about to roll into summer where all of the good regional shows are happening, and then eventually Hacker Summer Camp towards the end. How are you feeling about the industry right now? Not necessarily going to the shows, but the transition from pre-COVID to COVID to now. Given everything that we’ve been talking about, how do you feel?
It’s an incredible time to be together. Where some things have to be virtual, participate virtually. The platforms have gotten so much better to be able to participate virtually. If you can be together in person, even better. We have some complex challenges we need to be talking about. Sometimes it’s best done over lunch, coffee, tea, and a whiteboard. If you get that opportunity to go to these conferences, make it a point to be with people and talk about these complex issues. It’s going to take the entire ecosystem. This is not something to go productize and monetize. We all have to be part of the global army to protect and defend businesses, nations, and individuals.
Are you saying faith in humanity?
Trust but verify.
Trust but verify.
That’s the best possible answer. The Theresa-est possible answer. Leadership corner now. We’ve dug into a bunch of deep-dive stuff. Oddly enough, you have a couple of other hours in the day that are available. What are you doing when you’re not doing this? What’s on your Spotify playlist? What magazines are in your bathroom? What books are at the coffee table?
Do you mean when I’m time-traveling so I can get those extra hours?
Exactly. The 25th through 28th hours of your day.
If I’m trying to get in a mode, Childish Gambino. Not the clean version, so I hope you don’t think less of me.
No. Please swear. Are you kidding? Let’s go.
If I feel like I need to do something or I need to channel being annoyed and mad about victims, seeing the same problem over and over again, and no justice, Immigrant Song by Led Zeppelin is definitely a go-to. It’s easy and comfortable. It’s a song I can put on, and I know where I’m going to go with it. I have two rescued Great Pyrenees. I like to spend a lot of time with them. I’m big on the outdoors. As much as I can, I try to do 5 to 6 miles, sometimes 7 miles in the morning running. I like to bake and cook with my kids, trying to make sure that when they go out on their own, they can cook for other people.
I’m picturing you on the road running to Immigrant Song, “We come from the land of the ice and snow,” as you’re pounding the pavement. That is such a great answer. Weirdly, a lot of guests have been a little wrong-footed by that. They’ve never given me specific answers like that. That’s amazing. Shameless plugs. Obviously, you do some things. People would want to find those things and find out more about Fortalice.
It’s a medieval outpost. You can say Fortalice.
It made me feel good seeing on the front page of the website that nobody knows how to pronounce it. I feel less bad about myself, but I still feel bad. Anyway, company, books, keynotes, all of the cool stuff that you are doing. Where is Theresa? What’s going on?
If you ever need me to speak at an event, I work with the best in the business, Keppler Speakers Bureau. They’re amazing. You can book me through them. I run Fortalice Solutions; if anybody needs third-party assistance, we’d love to help you. We’re professional secret keepers. We don’t brag about who our clients are. Three books. Manipulated. I’ve got two books with co-author Ted Claypoole, who’s a fabulous and brilliant privacy lawyer. He has some great stories. You should have him on.
I’ve worked with Ted. We’ve done some fun stuff. That guy is amazing.
He dealt with somebody who was like, “I’m holding the internet hostage for ice cream.” Ted has stories for years. He and I wrote Privacy in the Age of Big Data and Protecting Your Internet Identity: Are You Naked Online? Our second edition of Privacy in the Age of Big Data is coming out soon. That’s pretty much it.
We’ve got a Help A Sister Up group on LinkedIn. It’s an organic group if you want to be a mentor, find a mentor, find a job, have a job opening, want to speak at a conference, or need speakers at a conference. This is for men and women. It’s a safe place for people to share ideas and promote more women to join the STEM fields with a focus obviously on technology and cybersecurity, but it can be for anybody. It has been this fun organic group. Women in my company came up with the idea of Help A Sister Up. It’s on LinkedIn. You can join it and post your jobs or whatever you need there.
You got any keynotes coming up? I feel like if you’re not wandering around on stage in leather waving at the crowd, it’s a dull month for the industry.
For the more conservative groups, I do have houndstooth. What have I got coming up? At some point, I will be speaking at UVA at their knowledge continuum; I can send the details to you. I will be at RSA for the beginning part of it. I will be at the Security Industry Association event, which is in Las Vegas. All kinds of good stuff and never a dull moment, which I love.
I’m embarrassed it took this long to get you on here, but I’m happy we’re doing it at this time because you have all of this amazing stuff happening at once. She is Theresa Payton. She is eminently googleable, and she just mentioned Childish Gambino. Why would you not want to do that? Theresa, thank you so much for taking the time with us.
Thanks for having me on. I appreciate it.
Get the books. They are amazing. One more time, all of your finest bookstores. Privacy in the Age of Big Data, co-authored with Ted Claypoole, who is also amazing. Manipulated: Inside the Cyberwar to Hijack Elections and Distort the Truth. Your life will be better for being out on the other side of that. Until then, though, thank you as always for joining us on Friendly Fire.
A friendly reminder, all comments reflect the personal opinions of the participants and not necessarily those of their employers or organizations. Although it’s her company, so maybe they do. For more information on all that’s good in the world of cybersecurity, make sure you check us out. You can find Elevate on LinkedIn and Facebook, and the mothership as always, ElevateSecurity.com. You can find me at @PackMatt73 across all the socials. Theresa, final things?
Thanks for having me on, Matt. Be safe out there.
Come on. You can do better than that. I had a glass of red wine with you after 11:00 at RSA. I know you could give me one more thing. What’s your favorite Childish Gambino song?
I can’t pick one.
We will fix it in the post. There we go. Get out of here, everybody. Subscribe, rate, and review so you will never miss all the great folks who were coming on the show. All we want to do is try to make the world a little safer and a little more secure. Until then, we will see you next time.
Important Links
- LinkedIn – Elevate Security
- Facebook – Elevate Security
- Fortalice
- Privacy in the Age of Big Data
- Manipulated: Inside the Cyberwar to Hijack Elections and Distort the Truth
- Bard
- WhiteHouse.gov
- Keppler Speakers Bureau
- Protecting Your Internet Identity: Are You Naked Online?
- Help A Sister Up – LinkedIn group
- @PackMatt73 – Instagram
About Theresa Payton
As one of the nation’s most respected authorities on secured digital transformation, Theresa Payton is frequently requested to advise Boards of Global Companies, CEOs, and Technology Executives. Theresa is a visionary in the digital world leading the way as an inventor of new security designs and has an approved U.S. patent in security. She provides advice drawing from her experience as a technologist first and now veteran cybercrime fighter, entrepreneur, and author.
Theresa started her professional career within the financial services sector during a time when unprecedented merger activity and innovative, creative banking solutions were market differentiators. Theresa went from developer to the management ranks quickly, running over 60 significant merger integrations as well as deploying several “first ever” platforms for her banking employers, such as expert systems, artificial intelligence, mobile banking, and more. She started her career in a boardroom: while she was in her early 20s, Barnett Bank (now Bank of America) put her in charge of the CEO’s top strategic technology initiative and asked her to brief the Board of Directors.
Theresa was named the first-ever female White House Chief Information Officer, with vast responsibilities that extended globally and included other duties such as maintaining Federal recordkeeping for the entire operation and building out the first-ever 24x7 security operations center.
She continues to practice her craft as the CEO of a leading, profitable cybersecurity company, Fortalice Solutions. She is a thought leader in key areas of importance to management and boards, including combating fraud and insider threats, improving customer experience, operational resiliency, and securing leading-edge technology deployments leveraging blockchain, artificial intelligence, and big data analytics.