
Phishing For Answers
“Phishing for Answers” brings you insider knowledge from the front lines of cybersecurity. Listen in as we speak with seasoned professionals about overcoming phishing attacks, managing user training, and implementing solutions that work. From practical insights to actionable strategies, this podcast is your guide to strengthening security awareness across your organization.
Crafting Security Cultures in the Age of AI with Tim Chase
The episode focuses on the evolving threats posed by phishing scams enhanced by artificial intelligence, particularly in corporate settings. Tim Chase shares real-life experiences and insights into the changing landscape of cybersecurity and the need for tailored training to empower employees in recognizing and combating these threats.
• Discusses a significant gift card scam incident
• Examines the evolution of phishing tactics and AI's role
• Emphasizes the necessity of role-based security training
• Highlights the importance of social engineering awareness
• Advocates for creating a culture of communication about suspicious emails
• Suggests positive reinforcement techniques to promote cybersecurity awareness
Learn how to align future security initiatives with effective training methods that address the challenges posed by evolving threats.
Joshua Crumbaugh is a world-renowned ethical hacker and a subject matter expert in social engineering and behavioral science. As the CEO and Founder of PhishFirewall, he brings a unique perspective on cybersecurity, leveraging his deep expertise to help organizations understand and combat human-centered vulnerabilities in their security posture. His work focuses on redefining security awareness through cutting-edge AI, behavioral insights, and innovative phishing simulations.
PhishFirewall uses AI-driven micro-training and continuous, TikTok-style video content to eliminate 99% of risky clicks—zero admin effort required. Ready to see how we can fortify your team against phishing threats? Schedule a quick demo today!
Hello, hello, and welcome to another edition of Phishing for Answers. Today I am joined by the talented Tim Chase. He is a field CISO over at Lacework. Tim, you were telling me a little bit about a story before we got started. Can you open there, and maybe tell the audience a little bit about yourself?
Tim Chase: Yeah, absolutely. So my background is in AppSec, cloud security, GRC, that space. But I've been a CISO a couple of times before I made the trek over to being a field CISO, and so I've had the opportunity to build, grow, and lead security teams. And I remember, I think it was my second time being a CISO, and this was several years ago.
Tim Chase: I would say, you know, this was back when phishing scams were still kind of rudimentary, where you could spot them just from the grammatical errors. They were semi-obvious, right? But it was kind of at the very beginning of when the whole gift card scam came around. And I remember a certain leader, I don't want to give too much away and point any fingers, but one of the execs of the company came to me in a bit of a panic, and she showed me an email. The email was very short: "Hey, I can't talk right now. I need you to buy some gift cards. Please send them here." And she did, right? She didn't pay attention to the exact address it was coming from. And this was around Christmas time, right?
Tim Chase: So budgets are tight, and we're not talking about $200 worth of gift cards. It was $2,000 or $3,000 worth of gift cards that she went ahead and paid for and expensed, right? And so she was, rightfully, in a little bit of a panic: is there anything I can do? Can I get a refund? In the end the company reimbursed her, and it was a good learning exercise, but it was kind of at the start of when those scams became popular. They still happen, right? And 'tis the season as we get into Christmas and shopping, so watch out for those types of scams, because that one really hit her hard, and it was definitely personal.
Joshua Crumbaugh: Yeah, I actually have a brother-in-law that falls for a gift card scam at least once a week. I joke... you'd think I'd be joking, but it's actually true. He really does fall for these.
Tim Chase: You know what I do for a living, right? Like, come on, brother.
Joshua Crumbaugh: I mean, well, my wife actually works with me here too, and she's just like, uh, hello? And every time, they'll be like, "You just don't trust people."
Tim Chase:Yeah, that's right.
Joshua Crumbaugh: No, but I mean, gift card scams are absolutely still a thing. One thing I found interesting about what you said, and just the evolution of phishing, is that back in the day the typos and all those little things gave it away. What I find funny about that is,
Joshua Crumbaugh: It's actually the opposite now. If you've got typos, it actually indicates that it's almost more human, whereas before it was more of an indicator that it was malicious. But with AI, I see that the complexity of phishing has gone up significantly, and they're even able to do a lot more customization. What are you seeing there? I mean, that's a great segue, we'll just jump right in there.
Tim Chase: Yeah, I think that's great, because, maybe a little bit in response to this, back when I was setting up that security program, we really started to take a look at some of the software that we were using. Is there anything we can do to mitigate that kind of stuff, right? If we look at the headers, if we look at where it's coming from, what can we do? But we were really focused on the typos. For example, if the CEO signs his name this way... he never signs his name that way; he always uses his short name, not his full name. So we were looking at ways to try and mitigate that, right? But, I mean, that stuff still needs to be filtered, and you obviously still have to do that base level of work, but the complexity is there, because now you have the ChatGPT engines that can sometimes speak better English than you can.
Tim Chase: I mean, I know personally, not that I send phishing emails, but I run kind of a travel agency as a side thing, my wife and I do, and when I post on social media I always take my blurb that I think sounds good, put it into ChatGPT, and say, "make this sound better." And holy crap, it does. I'm like, why didn't I think of that, right? So why couldn't you, why wouldn't you, do that with a phishing scam? And that throws that whole easy filtering method I mentioned out the window. Then, on top of that, there's the chatting, the back and forth. It's not even just the initial email and how you craft it; with some of the more sophisticated phishing or social engineering scams, there'll be a conversation. It's not just a one-off, right? Well, ChatGPT can carry on that conversation. And not only that, but, I don't know how many people know this, maybe most people do, but ChatGPT and the other models can take on a persona and sound like a person or a type of person.
Tim Chase: And what I mean by that is, when we were first demoing our AI functionality, our chatbot that was built into our product, one of the things we always liked to do at the booth was say, hey, tell me about this security alert, but tell it to me as a pirate. Tell it to me in an email that a CEO can understand. Tell it to me in an email that an analyst can understand, right? And the persona would flip. It would totally do the pirate, but then it could totally do the CEO as well. So it's not a stretch to say, hey, let's write this email based on how this person would write, because a lot of CEOs and leaders are prolific and post on LinkedIn, so it wouldn't be that hard to understand how they refer to themselves, how they write, things like that. In this day and age, that just makes it a lot harder to identify what's phishing and what's not.
Joshua Crumbaugh: ChatGPT speaks, what, 260 different languages or something ridiculous? Something crazy, still less than what is spoken in New York City. But just the vast number of languages I find very intriguing, because it can very easily translate, and because of the amount of data used to train these models, they're often aware of colloquialisms, particularly if you match the model to the region. So if I'm looking for unique colloquialisms that might be used in Hong Kong, I would probably go to the Baidu large language model, whereas if
Joshua Crumbaugh: I'm over in the Middle East, I'm probably on Falcon, or if I'm in Russia, I'm on the Yandex model. But I do find it very interesting, because these different models allow you to customize down to a level that before would have required a linguist and an analyst who understands the area. Now I don't have to have that; I just ask an AI, and like that, it's done. It can create video, it can create audio. Did you see the NotebookLM podcasting feature from Google?
Tim Chase: I did, I saw them try that. I was like, holy crap, that is crazy.
Joshua Crumbaugh: I mean, it's just absolutely insane. I wish I had it here, I wish I did, because I had it do a podcast about PhishFirewall, and, I mean, it's probably higher production value than this one. It was ridiculously good quality.
Joshua Crumbaugh: But you know, those are the things that I think are a really big concern too, because just as these things can be used for good, they can be used for bad. I feel like we opened Pandora's box without thinking, and by definition you can't close Pandora's box, so there's some catching up that has to be done, and that catch-up is difficult. Everyone's looking at different elements of it, and that actually leads me to my next question. What threats are you most concerned about as it pertains to AI? And then, as a follow-up: what do you think we should be training our users on in terms of AI? Because that's changed too. What we have to train our users on has really evolved in the last 12 months.
Tim Chase: Yeah, I'll answer that a couple of ways, and maybe I can take the conversation into different areas. I am concerned, not to keep talking about the same topic, but I am concerned with AI's effect on phishing and how all that works together. And the reason I'm most concerned about that is because of where a lot of the attacks are these days. I think cloud security and app security are super important; you have to have those, right. But there are a couple of areas that I think need a lot of focus. One is identity. I think identity is a big focus area of any major cybersecurity program. Most of the CISOs I'm talking to are undergoing that, whether it's human or non-human identity.
Tim Chase: But a lot of the attacks that you see, going back to the casinos or whatever, it's not like they found some firewall port open and then managed to weasel their way in. A lot of these attacks these days have some social engineering aspect to them. I don't want to rehash all the casinos, but a lot of times they're able to use those social engineering tactics to get from one person to another, or pretend to be security or the help desk, and get access by taking someone's identity, then creating their own identity and going from there, right? So I do think that social engineering is super important, but the old way of doing awareness training, sending an email once a year and reporting who clicks on it and who doesn't, probably is not going to work. What I would want to do is educate my company on why AI has an effect on these types of social engineering messages. Any sort of awareness training, whether you do it in an online method or in a security champion sort of training you provide to the whole company, even coming up with examples you can record, or whatever, to help people understand exactly how powerful AI is. Because I don't think a lot of people understand, right? I was having a conversation with my wife, who's in the medical field, about a post from Elon saying, hey, Grok can now look at X-rays and PET scans and MRI scans and read those for you, and she was like, what?! So I think laypeople don't completely understand the effect that AI can have on getting around the security controls we have in place. Educating them on that is number one: include that in your awareness training. That covers kind of the entire user base.
Tim Chase: And then, for the second part of the question: if I were leading a security program, I would first want to understand where AI is being used. I think we're kind of in the identification phase of AI, meaning I just need to know where it's being used at the moment, right? We need to put some guardrails around it. We need to understand which models are being used, all of that. But it's just reminiscent to me of when we started going to the cloud.
Tim Chase: If you remember, I think I went to the cloud back in 2011, 2012, something like that, and at that point the answer from security would be no, right? No, you can't go to the cloud. Why are we going to store our data outside of my control? But that doesn't work. So you get past the initial no, and then it's, okay, how am I going to secure it? And I think that's kind of where we're at with gen AI right now. We're getting past it: the business wants to use it, security says no, they use it anyway. Okay, well, let's figure out how to secure it. That's where I think a lot of folks still are, in that identification phase: just understanding that it's being used, and doing our best to secure it.
Joshua Crumbaugh: I might argue all of cybersecurity is still in that investigation phase, and that's probably true. I mean, I feel like we should really equate it more to medicine. They practice medicine; I feel like we practice cybersecurity, because every day I feel like I'm learning something new.
Joshua Crumbaugh: And it's one of the most common questions: how do you keep up, right? The pace of change is so rapid in this industry. But no, I agree, it really is in its infancy. We don't necessarily understand all the use cases, all the different applications, even all the models. I mean, there are a million different models, with new ones coming out daily. Just in terms of some of the tools that I use, I jump monthly between tools, because the market changes that quickly, and if I want to stay on top of things, I have to be that agile too. But also, you know, the complexity here is that it's not just a security problem, it's a privacy problem, right?
Tim Chase: I remember when machine learning came around, right? I was doing machine learning at Collibra, because Collibra is a data intelligence platform and you have the ability to auto-classify data, to say, hey, is this PII, PHI, what do we have here? Well, in order to learn, the question is: do you allow data to mix, or does it have to be in its own tenant? Because you learn best with a larger amount of data, but there are some privacy issues as well, right? And so I think that's the other aspect of this that I've been chatting a lot about: how do you balance the accuracy of the machine learning model with the privacy part? Can you get it accurate enough by having your own tenant? Do you have all of your customers' data going to one model to be analyzed, or do they each have their own? There are a lot of different aspects to it, I think.
Joshua Crumbaugh: Agreed, 100%. From my perspective, this is the first time it really seems real. I say that because AI has been "the next big thing that's going to take over the world" since 1958. But this time it really is, and I see technology exploding like it did in the late 90s and early 2000s; we haven't seen that level and that pace of growth in a really long time. And so it's interesting and it's fun and it's exciting, but it's also scary, because I see how quickly these companies pump out technology. I've both worked in an application security capacity for very large organizations and founded a software company. I know how long it takes to move, and with the rate they're moving, we know they're heavily utilizing AI to write code. So how does that impact security?
Joshua Crumbaugh: Do we need to worry about that? I would assume we do, because there's no way you can push stuff out that rapidly and do it right. So I am concerned about when all of these vulnerabilities that weren't thought about at the beginning start to surface. And I was just thinking about it from a personal perspective the other day: I was in my chat history, searching for something and going through all the different threads.
Joshua Crumbaugh: Man, if somebody got a hold of this, the amount of information they would have on me... So yeah, it was just an interesting thought. But I don't think it's just me. I think everyone has massive amounts of these conversations.
Tim Chase: Mine's full of ninth-grade algebra, where I sneak off and pretend like I know what I'm doing for my daughter's algebra class, and I go over to the side and just ask it how to solve and graph a problem for me, because my daughter still thinks I'm a genius. So that's what my ChatGPT log looks like. But no, you're spot on, right, and I think the struggle is just that. Even in the security industry, if you look at a lot of the products that are out there, how many products are utilizing AI in a way that is advantageous to our industry right now, versus just being able to say they do it?
Tim Chase: There's a lot of chatbots out there, you know. It has the potential, like you said; this feels real. But when does it get to the point where it's something tangible that can deal with some of the problems we have in our industry? Like data overload, vulnerability prioritization, analyzing data lakes, things like that, which I think are serious problems in our industry. Can you use an LLM in a phish filter? Can it detect itself?
Joshua Crumbaugh: We were able to train one. We're bringing it to market right now. But I mean, I think those are the questions we've got to ask. One of my guests that I had on last week, Wendy Nather, was saying we've got to question everything, including, as she pointed out, what she herself was saying.
Joshua Crumbaugh: But even what I'm saying, I mean, we've got to question everything, because there's so much, even in our industry... it's like what you were saying with AI, where we see AI there for the purpose of having AI, without any thought about providing real value. I almost imagine this cartoon character smoking a giant cigar, just like, "give me some of that AI," you know? But I've actually heard people say that: we need AI. Okay, well, what is it going to do? How is it going to help? And as we learn those things and look at the research... because one thing I will point out is that for decades we've known the potential of AI, and for decades academic researchers have been writing papers about that potential across different sectors. We can simply go there and look, and there are a bunch of ideas about how we can effectively use this to provide value. In fact, I look at what we do at PhishFirewall: we built almost all of our technology off of those research papers that said, listen, nothing works in this industry; if you want to actually change human behavior, you've got to do this, this, and this. We were just the only ones that bothered to read the papers. But I do think there's value there.
Joshua Crumbaugh: I don't want to keep the whole thing on AI, there are really so many other things, but I think one of the sub-points of AI, and one of the reasons I feel AI has a lot of opportunities specifically in security awareness, is that it can customize things down to the individual. And when I think about the AI threat and how they're targeting us, I see AI being used to send finance phish to my finance team. I see it being used to send developer phish to my developers. So to me, this is one of those areas where we need more of that contextual, role-based training, and the behavioral scientists of this world have already proven that when we can make something contextual to somebody's role, it is 15x more effective. So, if for no other reason, just because we want our training to be more effective, we need to go down that path. But I think we have to go down that path because the threat has already gone down it too.
Tim Chase: No, I agree totally, right? The biggest part of training is making it something that isn't just a checkbox, something you have to go through every year that's the same for everybody. Because in reality, the risk isn't the same for everybody. Your finance folks have a little bit of a different risk than everybody else, right? Because they're at risk with the money, they're at risk with the money transfers.
Tim Chase: How many times have I worked for companies where they have fallen for the "hey, can you wire me this?" And sure enough, the money goes out and it can't be gotten back, right? So I think, you know, it's about being able to speak in the lingo that they understand. Like you said, the models are already there. But let's meet them where they are.
Tim Chase: You know, I used to work for a company that did healthcare training, right? There was a security aspect to that training. Well, let's talk about that. What specifically do they need to know? My wife is a nurse practitioner; what she needs to know from security training is different from what I need to know, because I have a baseline knowledge that's different from hers, right? So meeting them where they are is beneficial, because it will help, and it will keep their attention as well.
Tim Chase: That's the thing. Micro-training has really taken off. There used to be these 20-minute videos, but now you've either got the live-action training or you've got the micro-training. I do LinkedIn Learning, I've got like four or five courses out there, and the big thing for them is, they're like, hey, make it two, three, four minutes, you don't want to go any longer than that. You've even gotten crazier than that.
Joshua Crumbaugh: Our shortest video is 12 seconds.
Tim Chase:There you go.
Joshua Crumbaugh: Well, I mean, when you look at TikTok, you look at Reels, you look at Shorts: what does every single social media application have in common? They have the short-form video platforms where I can just go and scroll through, and most of them now are giving you more categories. Like, TikTok now has a STEM tab.
Joshua Crumbaugh: So you can go there and stay completely on STEM, and it's all vetted, known, published, reputable authors that aren't just going to give you junk. But I mean, I see this as the direction our world is going in. When all of this started, social media, whether we blame Tom from MySpace or Zuckerberg, regardless of whose fault it is, really did start changing the way that we ingest information. The internet did in general. We went from long form to short form. But when you look at it, it seems to me that content and entertainment and all of that is just a little ways behind advertising.
Joshua Crumbaugh: If we look at advertising years back, the standard length of a commercial was a minute. Now the average length is 15 seconds, a really drastic difference. And to me, that's where our security awareness and our approach to training people has been going: shorter, shorter, shorter, to match the shrinking attention spans our people have. And let's be honest, let's take that healthcare worker: how much time do they have to learn? It's the walk between one patient and the next, and so we want them to be able to do their training and be done before they get in there, and that's how you can keep them more security-aware. If they have to make time for it and go sit down, they're not going to do it.
Tim Chase: It just comes down to that in the end. I would be curious to know, because I agree that the attention span has really come down: is it social media that's brought it down? Is it because we're too busy? I'd be curious to know what exactly has brought it down. That's probably a psychological study somewhere.
Joshua Crumbaugh: Yeah, I mean, I know there are some studies on it. I would have to guess, though, that most of it is anecdotal, because at the end of the day, we don't necessarily know. But the few studies I've read really point to technology in general. It's the instant-gratification aspect of our society that has changed it, from us being patient and more willing to put in the time to read or watch the video, to, hey, I'm used to getting everything now, why don't I have it? It's been 10 seconds already. I joke, but I've been that guy at the drive-thru before.
Tim Chase:Oh, 100% right. Why is my food not ready?
Joshua Crumbaugh:Right. So, along the lines of role-based training, you hit on a few key departments like your finance people, your developers. Are there any other core common departments that you would focus in on when building out those roles?
Tim Chase: Yeah, I mean, I think privacy is a big one too, and legal. Those two have very specific functions, whether it's the worldwide privacy regulations they have to worry about, GDPR, CCPA, there are some very specific things there. And then legal as well; I think both of those teams speak kind of a different language, and so they probably need some specific training. And I just think cybersecurity professionals as well. I know, I mentioned them.
Joshua Crumbaugh: I was hoping you would say that. Yeah, continuous MITRE ATT&CK framework training, so that they know the tactics.
Tim Chase: Yeah, exactly, because they need to know the tactics. Like I said, we have a baseline understanding of cyber, and so with a lot of the stuff, we're like, oh my gosh, we know this, we could have written this, right? Why are we having to go through it? So why don't we up-level that for our cyber folks and provide them some specific training? I would say those are the ones that hit me: finance, legal, privacy, and cyber specifically.
Joshua Crumbaugh: The only one I'd add to that is IT, because, as an ethical hacker, I exploited their mistakes almost more than the average user's.
Joshua Crumbaugh: I couldn't agree more; I think those are really great roles. I know everyone is going down this path, and we're seeing it more and more because the CMMC requirements dictate that you have to have role-based training. Of course, in typical government fashion, they didn't name a single role.
Tim Chase: They just said that you need to have it. Very high level.
Joshua Crumbaugh:Yeah, yeah, I guess it's up to us to define it. I don't know, we'll see.
Tim Chase: But if you don't do it, they'll come at you. That's kind of the way it is, exactly. I hate the ambiguous stuff, right?
Joshua Crumbaugh: Well, I mean, I think the goal is to let industry flesh it out, but when it's ambiguous, my experience is that most places just do the bare minimum. Oh well, if I can interpret it this way or that way, which costs less? Yeah, that one, exactly. Okay.
Joshua Crumbaugh: So, I love to ask one question on every single podcast, and that is: if you only had one tool in your arsenal that you could use from now on to drive engagement and to drive awareness, would it be the carrot or would it be the stick?
Tim Chase: I'm a carrot kind of person. I just think of the stick as the "department of no" sort of security, and I don't want to be Dr. No, I do not. I've kind of lived it, I've done that before, and I just don't think we're there in this day and age anymore, right?
Joshua Crumbaugh: Well, I don't think it was the right way to begin with. We just didn't know any better.
Tim Chase: We didn't know any better, right? Cyber was maturing, people were learning more about cyber, and, like everything you read, and this could go on for a while, a good CISO today is business-aware, right? They're a business enabler, and in order to do that, you've got to speak two languages: the business and the tech. And to do that, you kind of have to get along with people, just to be frank, right? Maybe the DevOps revolution kind of started that. But I think also, security has just gotten more expensive; they're having to prove their value more, which, ironically, they didn't have to do as much before. But in order to both be a business enabler and prove your value, you have to work with other departments, and that's where the carrot comes in, right? It's the, hey, let's work together, let's entice people, let's figure out how to do this, let's get rewarded for doing the right thing. I think that's the way to do it.
Tim Chase: I just think that when you use the stick method, things just don't get done, right? At the end of the day, if you have 5,000 vulnerabilities in your environment because you scanned the code, they've got to get fixed, and if you're just sitting there hitting somebody over the head, they're not going to do it. But hey, what if we have this security champion program that provides training, provides education, provides a way for you to maybe do them a little bit at a time? And then, whoever fixes the most bugs in the next sprint gets something, gets a gift card, I don't know. I just think the carrot is, by and large, the best way to do it.
Joshua Crumbaugh:And I'll point out, let's take the developer and the vulnerabilities in our code base, right? If you're the developer and all I'm using is the stick, you're not motivated to really fix them; you're only motivated to kick the can down the road. So what does that mean? Well, you can just as easily say, hey, I fixed it, and send it back for retesting with one line of code changed, knowing that it's going to fail. And I've seen it happen a million times, where things go through retest after retest after retest, and it's because the people aren't being engaged. There's no motivation to actually fix it; there's only motivation not to get in trouble.
Joshua Crumbaugh:And I think when you use the stick, that's what it becomes: I don't want to get in trouble. And so people are much, much more likely to do the bare minimum, and that's not what we want. We want to build that growth mindset and that growth culture where people go above and beyond, they help their coworkers, there's a team effort. I think that's what we want to cultivate, and it really goes to the whole argument for more people to at least try to be extroverted, or even just more extroverts inside of cybersecurity. Because when we don't go out and talk to the different department heads, the different people in our organizations, we only think we see all the risk.
Tim Chase:I love it. No, you're spot on, right? It's a matter of encouraging them so that they want to take ownership. They don't want it to happen again, right? Not do the bare minimum, but maybe fix it when they find it, instead of having it make its way through. I love that. It's an easier way to make them take ownership of their own security, when you have that carrot method of thinking, which is ultimately what we want. Whether you want to call it shift left or not, I don't care what you call it, but ultimately we want everybody to be responsible for their own security, whether it's a developer or an IT person. We're still going to be there to catch their mistakes; we don't expect anybody not to make a mistake.
Joshua Crumbaugh:But ultimately, that's what the carrot induces: that I-want-to-do-the-right-thing attitude. And I think almost everybody wants to do the right thing. There's this misconception that you've got all these people that just don't care, and that's just not true. Even on the hey-you-clicked-on-a-phish thing, I expected all kinds of people to lie. Almost nobody lies; they're more honest than I ever would have expected. That's because people want to do a good job.
Joshua Crumbaugh:Everyone has a sense of pride about what they do, or most people anyway, and so it's very rare that you get that obstinate person that doesn't want to do good, that doesn't care if they click on a phish. And one thing I've noticed over the years is that by the time that person clicks on too many phish, or rises to a level of risk where you need to escalate it to HR, they've committed a lot of other crimes in the organization. So one thing I'd say is: our job is to communicate risk. If that person needs to be fired, there are other reasons they're going to need to be fired.
Joshua Crumbaugh:But for everybody else they're trying, and so we've got to try too, because the more we try, the easier it is for them to do.
Tim Chase:To some extent, you're right. Usually the ones that are problem children, so to speak, are already known to be problems because they don't work well with others, or there's an ego, or something else going on, and it's not personal. But I feel like if you make it easy for someone to do the right thing, that's when you win their hearts and minds. In my experience, I talked to a CISO, I don't know, four or five months ago, and they were just having problems with their DevOps team. He's like, they won't fix anything, they won't work with me, they won't do any of that. And ultimately it's like, okay, well, let's take a step back and figure out why, because what's probably happening is you're throwing a bunch of stuff over to them and saying, you're not fixing it. Well, how can we make it easy for them to fix?
Tim Chase:In my experience, and this is what I always advise when I'm doing my training, bring in the DevOps team and say, look, we want to make this easy for you. What language are you building in? What IDE do you use? What build process do you use? Are you a Jenkins shop, CircleCI, whatever?
Tim Chase:And let's make this easy for you so that you can integrate it and there's not a lot of work, right, and to me, making it easier on them so that they can still do their work, making them think that they have a, not just something that security has mandated, but making them have a seat at the table, like those two things are, are what really helps in in my um. So it's not a you know 100 solution, but that, the kind of taking that, that general mindset, is really kind of what you were saying. Right, it helps them. They want to do the right thing. Let's make it easy for them to do the right thing and let's empower them in the decision-making process as well.
Joshua Crumbaugh:And I think another thing we can do is praise them when they do the right thing. I see a lot of organizations and security awareness programs that are really, really good at telling somebody when they do a terrible job: hey, you clicked, you suck, man. But they're not as good at telling them when they do a great job. And to me, when they click report on one of our simulations, that's an opportunity to be back there saying, hey, I can't fool you, you're a rock star. Or when they click report on a real phish, that's an opportunity to be in their inbox saying, hey, you truly just saved the company from a real cyber attack.
Joshua Crumbaugh:This is why we need you. And guess what happens the next time they see something suspicious? They get a smile on their face and they get excited because they get to report it. And what's?
Tim Chase:going to happen.
Joshua Crumbaugh:They're going to get kudos, and everybody loves a pat on the back. To me, that's the type of gamification we should be using for adults. It doesn't have to be a gold sticker. Just tell them great job, that's what they want.
Tim Chase:I love it. No, you're spot on. Give them praise, give them a pat on the back, reward them, and, like you said, they'll be your friends and they'll help spread the word, right? That's ultimately what it's about.
Joshua Crumbaugh:kind of a different way of building security champions. It's a different type, right it's? I try to keep this to about 45 minutes. I did want to ask you real quickly though what are you doing for Security Awareness Month? I know we've got what two days left. But have you done anything special, anything fun, went and given any talks anywhere. You know, tell us, tell us what's up this month.
Tim Chase:Yeah, I'll, I'll tell you what I have done besides, you know the posting on LinkedIn which you know. Whatever I had the opportunity, I posted about it a little bit, but I've really taken the opportunity to take the take it to the schools. So, you know, I've got four kids my oldest is 12, going on 13 and cyber security is a little bit really. Yeah, busy, really busy. Luckily tonight we only have one soccer game to go to instead of two or three.
Tim Chase:But but, like one of the things I went and spoke at their school on on cyber security and it, it, it is, it's continually enlightening to me um about um, the fact that they don't, kids don't always know what cybersecurity is right and in 2024, you know, they still think you know it's missiles and it's it's it's national cybersecurity and things like that. They don't understand necessarily the fish they don't understand. You know people are going to want to take their information like that's not really on their minds and so you know, I I've kind of taken up the, the, the cause, if you will like. I really want to do more with my kids and my school this month and in other months as well, to like all right, let's, let's, let's, let them know, like not only here's good cyber security, but here are the different careers that you can have in cyber security kind of kind of doing both high five man no, I, I think that's a really good one actually.
Joshua Crumbaugh:Uh, bz, he's the navy reserve cyber warfare commander. Um, he's doing a series where he's asking all kinds of people how they got into security and he's doing a tedx in silicon valley where he's just playing all these videos. Uh, but I thought that was really cool. That is cool. But one of the fun, uh, I guess one of my passion side projects, uh, charity initiatives, whatever you want to call it we got an opportunity to work with cyberorg, funded by DISA, and the whole goal of cyberorg is to take that training into K through 12 schools and start getting it in front of people really young. And so at our you know, youngest sort of, I guess, themed stuff, we created like these little short videos under 60 seconds they can play in the classroom, but you know it starts off with, you know, a focus on second grade and talking about who they're going to meet online and what types of dangers there are and game chat and the types of scams they're going to get hit with and then it works its way up as they get older and older.
Joshua Crumbaugh:So, um, I really do feel we need more of that. Look at just, I guess uh, I don't know just are people entering the workforce. The youngest people are as insecure as the oldest people.
Joshua Crumbaugh:It almost doesn't make sense to me that the people that grew up on technology are just as insecure as the people that didn't have technology until they were quite a way into their adult life. Yes, it's at a very missing mark somewhere, because they have this false sense of security, and so if we want it to be second nature like looking both ways before we cross the street that starts.
Tim Chase:It does it because they live in a different world to some extent, right, I do, I'm older, right, and so, like I'm sitting here, you know, like every time my daughter wants to install an app on her phone, I'm like, is there a chat feature? Like who can you talk to? Like you know, that's something you have to think about, because a lot of these games and things like that, like there's a lot of things that can happen in the background of who they can talk to and chat with, and so, like I'm pretty strict with explaining to her like I'll let you have life 360, but um which we probably can have a whole other discussion about whether that's okay or not but like you can't chat and you can't be in it with your friends, right? Or, uh, you can be, you can have um, you can do your your Spanish duolingo, but like we're not going to do the chat function and things like that. You know, just just stuff like that, right? So just meeting them where they're at is is kind of important, and that's what I like working with, with the kids and trying to understand what they do day in and day out, cause I don't, I mean, I don't, I don't always know I don't do video games or all their social media apps as much as I do.
Joshua Crumbaugh:I. I mean, I used to feel like I was always up on things, and now I'm starting to feel old, so there you go Slang, I don't even get. But, jokes aside, this has been an amazing episode because it's Security Awareness Month. Do you have any advice for the non-technical user who might stumble across this program?
Tim Chase:I mean, the best advice that I can have is to as cliche as it might be is to always think right, always double check. My wife we've been married for 15 years now and I've been doing this for a while, so she'll come to me and she'll have me look over something, like if it smells fishy no pun intended like she'll come and say Tim, will you just look at this to make sure, cause I think something's up with this. I just get a second set of eyes, Like if you think something is is strange or fishy, there's a good chance that it is, and so don't be afraid to just step up and just ask somebody else hey, what do you think about this? Right, that's the best advice I can give is is don't be, don't be ashamed or scared to do that.
Joshua Crumbaugh:Yeah, and and trust your gut. Uh, you know, I think a lot of people don't realize that that gut feeling that they've gotten, that's probably saved their lives, didn't wait for me when I was a teenager many times. But that gut instinct that you get is your subconscious that moves, at a rate, much quicker, conscious brain just warning you about things, and it's your body's built in antivirus. So the more you can learn about the types of attacks you're targeted with, the better your subconscious is at detecting them.
Tim Chase:I love it, spot on.
Joshua Crumbaugh:Well, hey, thank you so much for joining us To the audience. Thank you once again. We'll be back tomorrow with another episode of Phishing for Answers. Thank you and have a great day.