
Phishing For Answers
“Phishing for Answers” brings you insider knowledge from the front lines of cybersecurity. Listen in as we speak with seasoned professionals about overcoming phishing attacks, managing user training, and implementing solutions that work. From practical insights to actionable strategies, this podcast is your guide to strengthening security awareness across your organization.
Cyber Insights: Cody Burrows on Ethical Hacking, Leadership Evolution, and Cultivating a Vigilant Workforce
This episode delves into the critical intersection of cybersecurity and the human element, emphasizing the importance of effective training and mentoring. Cody Burrows shares insights from his extensive experience, advocating for a shift from punitive training to one that nurtures understanding and encourages proactive behavior.
• Cody's journey from pen tester to CISO
• The ego problem within cybersecurity
• The necessity of mentorship and support in pen testing
• Role-based training versus traditional security awareness
• Using behavioral science principles in training
• Importance of positive reinforcement in security culture
• Trusting instincts and gut feelings in recognizing threats
• The balance between carrot and stick approaches in training
Joshua Crumbaugh is a world-renowned ethical hacker and a subject matter expert in social engineering and behavioral science. As the CEO and Founder of PhishFirewall, he brings a unique perspective on cybersecurity, leveraging his deep expertise to help organizations understand and combat human-centered vulnerabilities in their security posture. His work focuses on redefining security awareness through cutting-edge AI, behavioral insights, and innovative phishing simulations.
PhishFirewall uses AI-driven micro-training and continuous, TikTok-style video content to eliminate 99% of risky clicks—zero admin effort required. Ready to see how we can fortify your team against phishing threats? Schedule a quick demo today!
All right. Hello and welcome to another edition of Phishing for Answers. Today I've got Cody Burrows with us. He is a CISO as well as the VP of Information Assurance at Chase, correct?
Cody Burrows:Correct. But just to be clear, all of my comments today are my own and not reflective of Chase.
Joshua Crumbaugh:Yes, got to get that legal out of the way right away, but you have some really good opinions. We were talking a little bit before the show, but maybe before we jump into that, tell us a little bit about yourself. How'd you get into cybersecurity?
Cody Burrows:Like a lot of people in my generation, security didn't even exist when we were starting out; computers weren't what they are today. Honestly, I'm one of those that had a Mac II in sixth grade. They gave me the prerequisite half an hour on it and then said the smart kids were going to be able to do it, and I wasn't one of them. That was enough to make me get my parents to buy me a TI-99/4A, and I started learning how to code from there. So I learned a lot about computers growing up, but then went into the Navy as a fire controlman on submarines, and from there learned a lot about security and the hardware and how that works. So I had a good understanding of coding, the hardware, and security, and then went into Raytheon and started learning how SDLCs worked, how requirements worked, and how a program actually came together, and from there started learning more about security.
Cody Burrows:At the time I learned more about pen testing. It was just starting; we didn't even call it pen testing. It was just, hey, you keep saying all these bad things are going to happen, can you show me? And I started showing them, and the next thing I know I'm doing a lot of testing. Then I ended up going into Visa and being an in-house pen tester for Visa, and then took over a lot of the perimeter security for Nasdaq, and continued my journey of learning everything having to do with security. From there, my pen testing background meant I was strong on the technical side, but I started learning that the managerial side is where the decisions are made. So I started learning how to manage people, manage projects, manage situations, and that's how I got to where I'm at today.
Joshua Crumbaugh:That's awesome. Recurring theme: training or teaching yourself to code in high school. I know I did it, and I've had a ton of other guests that were very similar. I think it's this personality type that wants to know how things work, which leads to, well, I want to write code. Very cool. Now, ethical hacking, that is also a place where I spent a great deal of time, and I know there was a very specific moment for me when I realized that I really wasn't providing the amount of value that I felt like I should be through penetration testing, and that I was almost playing a game of gotcha. That's what really led me down the path to initially go into the CISO role, or into management, and then down the road to found a company. Was there a moment like that for you?
Cody Burrows:Absolutely, and there's a whole bunch of sayings that play in my head all the time. What I think you're talking about is the ego that can definitely come from being a pen tester. The fact is, we see what nobody else sees, because we're going in and we're looking. The point of view that you get as a pen tester, nobody has it unless you're a pen tester, and trying to relay that over is honestly one of the biggest failings that a lot of people in security have. Our egos get in the way a lot. We need to look at security as something where we're an advisor, where we're mentoring those that don't have the knowledge that we do from being in the system. And there's nothing like being on somebody's system when they don't know it, gathering their credentials, and then using them in ways they're not expecting.
Cody Burrows:I had one guy that was kind of funny. We found a domain controller that wasn't consistent with all the other domain controllers, and it was too good. It was fully patched, it was hardened, it didn't have a lot of the normal faults that I was seeing in all the rest of their domain controllers. Is it a honeypot? Well, that was what I was concerned about.
Cody Burrows:I had already compromised all the domain controllers the first day and was in like Flynn, no problems. Then all of a sudden they started reporting that there was another domain controller, but it wouldn't let me in, and I was like, okay, that doesn't make any sense. It took me about two days to get into it, and whenever I finally did, I was like, wait a minute, this thing's too nice. One of the things that people don't think about is that if you're an attacker, one of the ways to be stealthy is to make sure you're not a problem, because what brings a sysadmin quicker than, hey, whatever's down or whatever's broken, right?
Joshua Crumbaugh:Most of the time when we got caught, that was the reason. It wasn't that a user was like, oh, there's something suspicious. Either it was a phishing email where they didn't understand the instructions clearly enough, so they asked whoever we were impersonating, hey, what did you want me to do? Or it was, hey, my computer's not acting right. It was almost never somebody actually reporting what we did; it was the outcomes of what we did.
Cody Burrows:Right. Well, this was one where it looked too nice, and so I went to the CISO I was working with and said, hey, this does not look right. And we got the head sysadmin. It was fairly funny, because he'd never seen this domain controller. Come to find out, one of the contractors working for the company had gotten really tired of dealing with their domain controllers not working, so he promoted one of his own machines to a domain controller and was running it smoothly so that he could get his work done. None of the sysadmins knew about it. One of the funny things about that one was, once we finally got that figured out, I watched him sit down and start working, and then he goes up and looks over, goes up and looks over again, and I realized what he was doing. I look over at the CISO and just kind of nod to him to look, and that admin looks at me and goes, who are you logged in under?
Cody Burrows:And I'm like, you're the head sysadmin, you ought to know. And he goes, what? It looks like you're logged in as me. And he goes, well, you can't do that. And I go, why not? He goes, it's against policy.
Joshua Crumbaugh:Oh, this is a pen test, buddy. Let me tell you about something called dynamic scoping. We use it all the time.
Cody Burrows:Yeah, well, the funny thing is that in the room we've got the CIO, who did not realize what was going on, we've got this head sysadmin, we've got the CISO, and all of a sudden he's really upset. This actually gets back to what I was talking about earlier, where we need to have a mentoring mindset as pen testers, and in general security too. He was that distraught; he left for the day, very upset. Whenever he came back, I actually got onto his computer and then called him, asked him to explain why he was upset, and started working with him, and he actually was responsive.
Cody Burrows:He kept on saying, but I'm the god of this network, nobody knows it and has the controls I do. And I'm like, yeah, but you don't always know that you're compromised. And he's like, of course I know. I was already on the system, so I'm like, okay, if that's the case, why don't you bring up a command line or a browser, I can't remember which, and we'll walk through something real quick. And as soon as he brought it up, I killed it.
Joshua Crumbaugh:Oh no, go ahead and finish that statement. I was just thinking that really leads to another conversation that I think would be really worthwhile to have. But go ahead and finish first.
Cody Burrows:Well, basically, I kept killing it. He got curious about what was going on, brought up Task Manager, started looking through everything, and he couldn't find me on there. Eventually I let him know that I was actually on there. The thing is, I did it in private, so it wasn't a matter of embarrassing him in front of everybody. It wasn't a matter of my ego, showing him how stupid he was and how smart I was. It was more a matter of, in his mind, that stuff goes on in Hollywood; it's not real.
Cody Burrows:And he really had all the keys to the kingdom, and nobody else did. In taking that away from him and helping him understand how vulnerable he was, all of a sudden I mentored him into a better understanding of why the pen test reports were important, and then we started making traction. The point of security, and of a pen test, is not for our egos, to show how stupid everybody else is. The point of it is to actually remediate the issues. And this is something that a lot of us miss: it's making sure the business is able to do what the business needs to do to make us all money.
Joshua Crumbaugh:That's the point.
Joshua Crumbaugh:That's the reason that we're here, and too many people do not understand that. They think it's all about cybersecurity. No, cybersecurity is a means to an end, and the end is profit. So no, I couldn't agree more, and I think that business case is often lost on people in cybersecurity because they get way too into the weeds. But you mentioned that this admin had this idea that hacks really only happen in Hollywood. I really do think that's one of those sacred cows we have to slaughter here, and we have to confront it with our users: listen, it's not just Hollywood. When they realize that, I really do think it changes their mindset a little bit. But to the greater point, that admin, or that story you gave, is part of the reason I'm very passionate about role-based training.
Joshua Crumbaugh:In my experience as an ethical hacker, and I'm curious if you've seen the same thing, the user gets all the bad reputation for letting the bad guys into the network, whether it's by opening an attachment or clicking a link or typing in their credentials or whatever they happen to do. But there's so much more beyond that. I found that it was the mistakes of the IT and network admins that allowed me to get the full keys to the kingdom very quickly once I got in the network, and it was the mistakes of different departments that might allow me into some of those more critical systems, like a cardholder data management platform. And then it was the mistakes of even the information security team, like not staying on top of the latest MITRE ATT&CK framework trends, that allowed us to not get discovered. I feel like so much of the time we're putting all of the focus of security awareness training on the user and a handful of mistakes, but forgetting about all of the other mistakes that can also lead to utter compromise. Is that what you saw?
Cody Burrows:You're touching on one of the things that I think we're all fighting, and that is pointing fingers. Like I said, we're all doing it. We all run into the, okay, something bad happened, something went wrong, we got a report that's telling on us, or we had a breach. The quickest way to stop the bleeding is to stop worrying about who made the mistake, because the fact is the CISO and the C-suite are ultimately accountable, and we delegate the responsibility down through the ranks. If all we're doing is looking for who we're going to beat on about it, and this gets to a carrot-and-stick situation, it's self-defeating.
Joshua Crumbaugh:So I think you misunderstood my question slightly, and I agree with what you're saying, that we don't want to be blaming or pointing the finger. What I was really trying to drive at is that there's so much additional training, beyond what we're typically doing, that needs to happen across all levels of the enterprise in order to more holistically address human error. Right now we focus on just a handful, maybe five, common human errors that happen again and again, but we miss a lot of the other stuff. To me it's not about pointing the finger; it's about designing our training to be holistic, so that it addresses those errors at every level and our people have the training that they need.
Cody Burrows:Look, we need to do training, although a lot of places can get into overtraining. I think at one point, whenever I was at a large bank, I had heading on to two weeks of training; it was like 67 hours of training a year.
Joshua Crumbaugh:I worked for the SEC for a while, and the FDIC, so I know a thing or two about the god-awful amounts of training, and I agree. That's actually been proven through countless academic studies to have zero impact on security.
Cody Burrows:Well, here's the thing. You're talking about role-based training, and this is where you've got to be careful about overtraining. It's social engineering users in a positive way, to get the outcome that we're all looking for, and that is an awareness of what to look for. It's like teaching your kid to not jump into a van because the guy has candy. You don't want them walking around scared to death, running away every time they see a van, but you also don't want them to jump into the van with the candy. So how do we hit that sweet spot? Training is identifying where there is a gap, and then it goes back to what I was talking about earlier, where we're acting as a mentor: okay, here's what you missed. You know, the misspelling in the email, though I don't think that's nearly as common as it used to be.
Cody Burrows:That's almost a sign that it's legit nowadays, actually. But it's a matter of letting people know what they need to be looking for. I actually got in trouble at that same bank because, whenever I found what looked like a legitimate phish, I hit the hey-this-is-a-phish button to send it off to the security team, but I also shot out a wide-distribution email to everybody saying, hey, it looks like we've got a phish; here's the subject header, here's what to look for. And they came back and started getting upset with me because I ruined their test. They even tried to say that I went against the key controls that we're all bound by.
Cody Burrows:And it was kind of funny, because I would go, okay, show me in the key controls where that is. Nobody could do it. And I said, on top of that, you are inhibiting the exact behavior that you are wanting. Because if this had been a real phish, of course you want everybody to start telling everybody else, hey, there's a problem. It's a herd mentality. I actually raise goats, and it's funny: all it takes is one goat to see the fox, and they start telling everybody else, and then everybody faces the fox and addresses the issue. A fox taking out one of the babies is pretty easy; a fox taking on a whole bunch of ticked-off protective mothers is a whole nother thing. Whenever you're dealing with phishing, I absolutely want everybody to start raising alarms and saying, hey, we've got a problem, even if it is a training situation. You want to enhance that behavior, not get upset about it.
Joshua Crumbaugh:Oh, I couldn't agree more. I actually think this is a really good segue into social engineering for good. A little bit about my background: when I first decided to found this company, I asked the question, well, how do we use social engineering for good? That really led me down the behavioral science rabbit hole, which ended up with me speaking at a conference over in Ireland, and the next thing I know, I was asked to co-author some college coursework on exactly this: how do we build effective security awareness training programs? The core theme in there was behavioral science, so I got this opportunity to collaborate, and there are a number of behavioral science principles or tactics I've learned over the years that I would classify as social engineering for good.
Joshua Crumbaugh:One of the key ones, and I think it goes to what you were talking about earlier, about how you've got to be careful with training fatigue, is called spaced learning theory. It's the foundation of advertising, and what it says is that if we break things into tiny, bite-sized chunks and feed them to people at a high frequency, you get really good retention. That's why the advertisements we saw as kids were 60 seconds, but now they're 12 seconds, 10 seconds, 15 seconds, back up to 30 seconds.
Cody Burrows:We still have more words to do.
Joshua Crumbaugh:Even that's long these days. That's the direction we see advertising going. But what I realized early on is that this very much applies to security awareness, and we've got to do more to truly break this down into bite-sized chunks. I say truly because there's been a lot of content that's three to five minutes long that gets called micro-training, and what I found is I can't keep the user's attention more than a minute. So even at five minutes we're already beyond this younger generation's attention span. We've seen it in all the studies: people's attention spans are shrinking, and all theories point to social media, which makes sense. As a result, I think we've got to shrink down our training too.
Cody Burrows:I would agree with that. It was not until later that I started understanding why all those math algorithms and things that we learned actually mattered. Whenever I was going through school, it was just a lot of rote learning: what's the quadratic equation, and how do you break it down into all of its components? I didn't have an understanding of why, and because of that, I didn't learn as much as I should have.
Joshua Crumbaugh:Oh, me too.
Cody Burrows:Yeah, well, I think this is most of us, but where I excelled was actually trigonometry and geometry, because I could look at it and understand it very quickly. I think that's one of the bigger things we miss with most training. In other words, I can sit there and tell you, don't click on email, but do your job, and that's contradictory. It doesn't make sense, and I don't know what it is I'm supposed to be dealing with. An example of positive phishing training is when you actually click on the example and it comes up and says, here's what you should have seen and how you could have caught it, and then explains how that might have affected the rest of the business. In other words, what do you have access to that might be sensitive? You know, the SharePoint that has all the PCI information, which is kind of a nightmare to most of us.
Joshua Crumbaugh:I hope it's not in SharePoint.
Cody Burrows:Yeah, well, that actually goes past it. The reason why I bring that up is because you and I both know that your PCI information, your credit card data, should not be in a SharePoint. But in your pen testing, how many times did you simply type PCI whenever you had access to the SharePoint, and all of a sudden all the Excel spreadsheets started coming up?
Joshua Crumbaugh:Right. Plenty of times.
Joshua Crumbaugh:Well, let's just say we worked for a lot of very large organizations, and so the flag, I guess, was that whoever ran the red team would run a transaction, and you'd have to find their credit card number and present it back.
Joshua Crumbaugh:Because we were definitely testing in a lot of production environments, and I can't tell you how many times it was so easy. We didn't even have to know the person's full name. One time, the person whose credit card was in there went by their middle name, and no one knew that until we got in and started digging for credit cards and were able to find it. But it was always because the data wasn't properly protected, wasn't properly segmented. I think it's not that people don't care about being more secure; it's that they don't know how, and when they don't know how, they take the path of least resistance. That's how we end up with credit cards in SharePoint.
Cody Burrows:Well, and that's why I'm saying, if in our training you catch somebody with a phish and then transition into how this might actually affect them personally, all of a sudden people are going to go, oh God, had that been real, I could have lost a million credit cards, or whatever.
Cody Burrows:Or social security numbers, or whatever sensitive information. Again, going back to that admin: just taking the extra time to train him and get him to understand how vulnerable he really was, I guarantee you that for the rest of his life he went around going, it's not just Hollywood, man, it's really not, I've seen it in real life. And at that point he trained more people. Now all of a sudden it's not just the phishing and the ransomware and, hey, everything will get encrypted. It's taking those opportunities to actually give meaningful training that broadens it, so that they take part and then they train other people. That's how you build a culture.
Joshua Crumbaugh:Yeah, so going back to the CISO you were talking about, who was very against phishing simulations because he felt like they were very heavy with the stick. One of the things that I've seen, and you just hit on it, is that you make sure people get that training the moment they click. One of the things we've realized is that when somebody realizes their mistake, whether it's the IT admin or the end user clicking on a phish, they're actually going through an emotional response, and that emotional response has been shown to increase retention because it's quite literally creating new neural pathways. I bring that up because I see a lot of people run phishing simulations as part of their security awareness program, and then they check whether or not the user will type in their credentials, and they do what I call exploiting the user. It's very different if it's a pen test.
Joshua Crumbaugh:And I think that is one of those sticks that not only makes our training less effective, because we miss out on that just-in-time training, but on top of that, if the user doesn't immediately realize their mistake, things can fester and get out of control. The example I like to use is the person who gets the phish saying they're getting a raise, who was already hoping for a raise or had convinced themselves they deserved one. They go and try to log in, and they can't. In the time it takes for somebody to reach out and slap their hand because they typed in their credentials, they go around bragging to everybody that they got their raise, and then they go from an emotional high to an emotional low very quickly. Those are the types of scenarios that I think we have to avoid.
Cody Burrows:And so very detrimental.
Joshua Crumbaugh:Yeah, and that's one of the reasons I've always been very much a fan of just-in-time training, at the moment the mistake is realized, because that's your best time to train them.
Cody Burrows:I fully agree, and the CISO I'm talking about, it's not that I disagreed with him at all.
Joshua Crumbaugh:Oh, I agree, by the way.
Cody Burrows:Well, I've seen a lot of training where it's just about ego. All you're doing is saying, I got 70% of them to click on it, look how bad we are, we need to spend more on this. It's literally, the beatings will continue until morale improves, which is asinine. You're absolutely correct about just-in-time. But also, I was just dealing with an organization and I got hit by two phishes, and I'm supposed to know what I'm doing, right? For both of them I went back and I'm like, okay, what is it I could have seen? I hover over the link and it comes up with Proofpoint and then a bunch of garbage that there's no chance of me deciphering unless I actually run it through.
Joshua Crumbaugh:That's the reason I'm not a big fan of URL rewriting: it takes away the ability to hover.
Cody Burrows:Right, right. And then I go through and I'm looking for, okay, what is it that I could have caught? I mean, I'm okay that I made a mistake, but I want to learn from it. I go through and I cannot find anything that would have alerted me. It was actually topical to what I was dealing with, because, as part of me helping them, I was thinking, okay, it's another training, and it looked like their other training. I went back and actually asked the guy, what is it that I could have done? And he's like, oh, we copied and pasted it from the real one, and the only difference is the URL, and even that was hidden. In other words, they'd done everything to make sure it was fully obfuscated. I'm like, okay, so what is it that I could have learned from this? How am I supposed to know? And he goes, oh no, that's the whole point.
Joshua Crumbaugh:You're not supposed to. So I might add that you just defeated the whole purpose of phishing in this scenario, because, to me, why do we phish? We phish to train people on the different red flags.
Cody Burrows:That's what we run simulations for, and that's why I bring this up.
Joshua Crumbaugh:Yeah, and if there are no red flags, then there's no learning, and if there's no learning, then all we're doing is abusing our employees for no reason.
Cody Burrows:Actually, all you're doing is getting people to go, it's those jerks in security on an ego trip again. And the next time, when you're not even dealing with phishing or email at all, when you're talking about, hey, we need to do an IP address rewrite or whatever else, nothing to do with it, they're already ticked off at you.
Joshua Crumbaugh:And there's a whole nother factor here that I think people don't think about enough before they play that game of gotcha, and that is learned helplessness.
Joshua Crumbaugh:There is this group of people inside every organization, and it's a significant percentage of people, who go in thinking, hey, I'm really good at spotting scams. But then the IT people use inside information and tricks, and there are no red flags, like you said, and all of a sudden this person who thought they were really good at detecting scams has no confidence in their ability to detect them. Because of that, they become less secure: as soon as they give up on that confidence, they're like, I guess I'm not good, they don't try as hard, and they end up clicking on more phish. So there's a lot of danger there, besides just making everybody mad at the InfoSec team, when, honestly, we should be bridging the divide. We don't want to be Dr. No, like you said.
Cody Burrows:Well, I started off saying we need to be more of an advisor. Well, what is it that you just advised me? That I'm an idiot? Nobody would stick around for more of, I'm an idiot. I know I wouldn't. But that's where I agreed with that CISO. He's right that there are a lot of ways people run email training programs where it's just, the beatings will continue until morale improves, and that's antithetical to what you really want.
Joshua Crumbaugh:Yeah, morale is not going to improve.
Cody Burrows:Yeah, well, exactly, and learning is not going to improve. And honestly, for any number of reasons, there's a lot of backlash that will happen that doesn't need to.
Joshua Crumbaugh:Well, there's another business impact, outside of just security, that people don't think about. When you run a heavily punitive program in any department that people have to work with on any sort of regular basis, and the example I'll use is that three-strikes-and-you're-out, you-fall-for-three-phish-you-get-fired type of approach, what happens is that people go from a growth mindset into a fixed mindset, and that's when your company stops innovating, your sales start declining, and you run the risk of going out of business. So there's a lot of behavioral science that has been studied at great length that we're just not applying to cybersecurity yet, and I think it's natural, because we're still learning cybersecurity in some ways.
Joshua Crumbaugh:And you know, the part that we needed to know all along was behavioral science, yet we're still not training it. There aren't a whole lot of programs that incorporate it, so it's no one's fault, but we've got to incorporate more behavioral science into what we do. One of the things we've found works really well for offsetting that negative reaction is making a game of cat and mouse out of phishing. Hey, let me tell you about this attack that pretends to be from your finance team. Now be on the lookout, because the next time you see it, it might be me or it could be the real bad guys. That creates this hypervigilance. And, more importantly, if they do fall for it, they were told it was coming, so they're not mad, they're not upset anymore, because we changed the script and turned it into a positive experience.
Cody Burrows:Well, I've also seen where you get attaboys for identifying it. In other words, if you say, I think this is a phish, and all of a sudden you get, you know, 15 I'm-a-good-boy bucks, that helps. There's also something that literally doesn't cost a dime: whenever somebody reports one and it actually is a phish, it goes to their boss, and their boss knows them and comes back around and simply says, hey, I saw that you caught that. Good job.
Joshua Crumbaugh:Yeah, our cyber coach literally calls them a cyber hero when they report a phish, whether it's one of ours or a real one. But I think those attaboys are very, very important and too often overlooked. We tell people when they do bad, but we don't tell them when they do good, and to me, telling them when they do good is as important, if not more important, than telling them when they've made a mistake. And when we do tell them about a mistake they've made, it needs to be constructive: not hey, you're an idiot, but hey, here are a couple of ways you could have avoided this mistake. So it really is in how we craft that language too, because we can do all the right steps, but then we say, hey, you need to think more before you click. Well, that's probably going to turn them off, because it's a little aggressive.
Cody Burrows:I mean, how long have we been trying to fight FUD: fear, uncertainty, and doubt? If it's a matter of making people fearful of email, just exactly how does that help the business?
Joshua Crumbaugh:They need to learn how to use it securely. Just like we don't train kids to be scared of the street, we train them to look both ways before they cross the road. And I think that productivity hit needs to be contemplated because, to your point, we can't just be fear and nothing else. They have to know how to use their email in a more secure manner.
Cody Burrows:But I think one thing that is important is, if you have somebody like that sysadmin before I got a hold of him.
Joshua Crumbaugh:That's somebody who's very sensitive. I like how you worded that: before I got a hold of him.
Cody Burrows:Well, whenever I got a hold of him, he was on a path that was very detrimental to the business, right, and the position that he held is very sensitive. I mean, the CEO cannot shut down the company as fast as a rogue sysadmin or somebody with his permissions. So the thing is that whenever you have those sensitive positions, you need to have a balance between the carrot and the stick. Your average user, who has average permissions and is not dealing with hypersensitive credentials or data, you can be a lot more lenient with. But you also need to drive home that it's not Hollywood with those that are dealing with the sensitive information.
Cody Burrows:One of the things, you know, we're talking about how social engineering works: if you walk into a room that smells, you only really smell that bad smell for the first five minutes at most before your brain acclimates to it and says, yeah, I got it, it smells, and usually you can block it out. One of the things we run into is that there are certain positions that are so used to running around with domain admin or social security numbers or credit card numbers that they get blind to it. And with those people, whenever they get hit, you need to take that extra training and give them extra awareness of how critical their role is to what we're doing and how they can affect the business. It's one of those where it's striking the balance. There has to be a balance: it's not all Pollyannish, and it's not all oh, you're a good boy even though you stole the thing. It's more a matter of the roles that are less sensitive can be handled in a more general way and employ a lot of the positives we're talking about.
Cody Burrows:But you also need to identify. In other words, training should be more about identification instead of just, you know, beating on people, three strikes and you're out. Well, you never told me or trained me how to avoid running into this again, and yet I keep stepping on them. Now I'm afraid of land mines, because that's really what you're doing.
Cody Burrows:Well, take them in and have a session, like I did with that sysadmin. Actually do a tabletop where you walk through: hey, you're somebody who works with our PCI data on a regular basis. One of the interesting ones here is call centers. Call centers are considered, you know, the entry-level job, the high-turnover role, all those kinds of things. These are people that generally don't get a lot of training, and yet they are the ones dealing with our credit cards, right? So whenever somebody like that gets phished, you want to take extra time to train them on how the sensitive information they are dealing with can be exploited. So at a certain point you're doing the carrots, but whenever it's more sensitive, you want to use it as identification, not as the stick.
Joshua Crumbaugh:So there's this behavioral science principle called identical elements theory, and it explains why, when somebody buys a new car, for example, they instantly start seeing it all over the road. It talks about how, when you learn about something really well for the first time, you'll start to recognize it everywhere. And I highlight this theory because it's really applicable when it comes to cybersecurity and driving the behavior that we want out of people. We want them to be more secure, and so if we can help them learn the small handful of common red flags, like urgency, authority, sunk costs, things like that, they'll start to see them everywhere, not just in phishing simulations.
Cody Burrows:Exactly.
Joshua Crumbaugh:And this is all happening at a subconscious level. There's this really great book called Thinking, Fast and Slow, and it talks about how we basically have two brains: one that is our logical, slow brain and one that is our subconscious, fast brain. The examples are: if you're doing a math problem, you're using your logical slow brain, but if your reflexes kick in and block something from going into your eye, that's your fast brain. And to put it into perspective, if your conscious brain is moving at one mile per hour, your subconscious brain is moving at about 5,000 miles per hour. So whenever we see incidents, almost every single time, the person that initially let the attackers into the network says, I knew something was off, I should have trusted my gut, and we hear this over and over again when we analyze incidents after the fact. So I think, A, we have to train people that you do have a built-in antivirus, your subconscious or your gut, and you have to trust it. But it requires that we train them on that, and then it requires that we condition them and fill their antivirus with virus definitions, if you will. That, to me, is why we run phishing simulations, why we run threat simulations in general, why we do tabletops: because it helps put these things into their subconscious.
Joshua Crumbaugh:And then, when they avoid a phish because their subconscious said it was risky, they often don't even consciously realize they did it. They just delete it and move on, or report it and move on, whereas when they avoid it consciously, we still run the risk of curiosity getting the best of them, or of them just being in a hurry. But I saw this one study that found there's this group of people who think their work environment is so secure that there's no way it could get hacked, so when they get a suspicious email to their personal address, they just forward it to work and open it there. And to me, that is why we've got to start focusing less on the conscious, knowledge side of this and more on the subconscious, instinct side of this.
Cody Burrows:You'll notice I'm a real big proponent of balance. If we go to one side or the other, we usually miss, whatever it is; it doesn't matter what it is. And we're hitting around all the right spots, right? We're talking about how there needs to be carrot with the stick, there needs to be more training but not over-training, and we need to think of the conscious and the subconscious. You know, I was just talking about the fact that if I don't know why I'm learning about a quadratic equation, then there's no connection to why I care, right? So, consciously, we need to make sure they understand how sensitive their positions are. But we also need to bolster the subconscious, where, if it doesn't feel right, they just hit the I-think-this-is-a-phish button and go on. Right, don't be afraid to hit that.
Joshua Crumbaugh:And it's when you're hitting on all cylinders that you're hitting all of those points that matter. And to your point about the why: countless studies have shown that when you can connect the dots for somebody, it makes that training 15 times more effective, because now it's contextual to their access, their job, their role, and that makes all the difference in the world. One version just doesn't apply to me, and the other is dialed in, and that truly does drive better engagement. Unfortunately, we are running out of time here. Do you have any sort of last thoughts?
Cody Burrows:Yeah, let me finish with what you were just talking about, and that is: nobody knows the employee better than their immediate manager. This is one of those where, if you're focusing just on the employee, you get too much stick and it's impersonal. Getting management to understand the issue and then make it personal for the employee, I think that's one of the keys to success in being able to drive security awareness. Sorry to interrupt; you teed that up beautifully.
Joshua Crumbaugh:No, I agree wholeheartedly, and I think that's a great point. Do you have any sort of closing thoughts that you want to leave with anybody? It really doesn't have to be about security awareness, anything at all.
Cody Burrows:One thing that I push is: if you want to be good at security, don't go into security. It doesn't make sense on the face of it, but one of the things that you and I have been talking about for the better part of the last hour is actually understanding the base of the system that we're trying to hack, right? Hacking is understanding the system to the point where you can get it to do something it was never originally intended to do, and one of the things I find is that people who've been in security understand the underlying system. You keep talking about understanding this theory or that theory, understanding the human system, right? And one of the things you started off asking me was, you know, everybody wants to know how to break into security. Understand the system is my answer to that.
Cody Burrows:Don't just go into security because there's a lot of money there and, you know, I'm going to be the cool kid in the hoodie. It's really a matter of: if you want to break into security, understand the system, and once you understand the system, security is an overlay. It's a point of view, and we talked about that earlier as well. It's a mindset. That's all we've been talking about: changing the mindsets and understanding of that sysadmin, or the employee, or the manager. If you want to break into security, don't go into security; go understand the system and then start learning about the security part of it. It'll come naturally, and those are the best security people I know.
Joshua Crumbaugh:Certainly the best hackers that I've ever had a chance to work with started out in multiple different positions within IT teams, everything from networking to sysadmin to help desk, and that broad understanding made them really good. And I've also seen pen testers that started out on the developer side of things, and they bring a whole different point of view to pen testing. I think it's that understanding you talk about that is so critical, and that is where all of the coolest zero-day exploits come from: a deep understanding of how the thing being exploited works. When you look at the TCP/IP vulnerabilities, the processor vulnerabilities, these crazy ones that we've seen, it's somebody that really understood the system first, and then they were able to exploit it as a result. So I couldn't agree more. All right, well, thank you for joining us for another episode of Phishing for Answers. Cody, if you can stick with me one minute, I'm going to end the stream. Thank you.