One of the job perks of working in the cybersecurity industry is getting to know the community. From social media to conferences, I’ve been fortunate enough to meet and talk with several security awareness program managers, software developers, professionals of many hats, IT admins, and even some government officials.
Their stories are all different, but they share a common theme: information security is a human issue. Even as security technology gets better, most breaches come down to a human failure. Someone clicking, someone making assumptions, someone not following policy. Mistakes get made. It’s clear end-users need continuous awareness education.
Yet there are still some people who believe we should ignore the human side of security and focus instead on developing and maintaining technology to defend against cyber-attacks. A few years ago, Rick Howard, CSO (Chief Security Officer) of Palo Alto Networks, authored an article titled “When It Comes to Cybersecurity, Look Past Your Employees”. He opens by saying:
“Many security professionals are quick to blame the user. I’ve been in many a conference hall or closed-room meeting when the speaker, while sharing a story about how a user opened an email they shouldn’t have, turns to the audience and says with a knowing smile, ‘You can’t fix stupid.’ This kind of thinking really chaps my hide.”
I agree. That kind of thinking is a poor premise to build on. The idea that “humans are the weakest link in cybersecurity” may be backed by plenty of evidence, but it’s also an unfair cop-out. For that matter, any organization that refers to its users as “stupid” should probably audit its hiring process.
Howard’s point, however, is that even cybersecurity experts have a hard time preparing for every attack method criminal hackers dream up. Expecting the same of normal users is foolish. A fair point, to be sure.
But you know what kind of thinking chaps my hide even more? This:
“We should not be spending time trying to make employees experts at spotting phishing emails or determining which websites are good or bad based on how the URL looks.”
Here’s a fun fact: according to the Anti-Phishing Working Group, the total number of phishing attacks increased by 65 percent in 2016. In the fourth quarter alone, there were an average of 92,564 attacks per month.
Why so many?
Probably because phishing works. It seems that security technology has yet to eliminate 100 percent of phishing attacks. And it never will. Unless, of course, cybercriminals decide to stop leveraging phishing campaigns against end-users. Until that happens, we would be foolish to stop training employees about the most prevalent attack method worldwide. Frankly, it’s absurd to even suggest it.
Howard wasn’t necessarily saying we eliminate awareness training altogether. Rather, he wants us to focus training more broadly at policies:
“By no means am I saying that employees shouldn’t receive cybersecurity training. But that training should be focused on making them aware of the organization’s security policy and procedures, not training them to be cybersecurity experts.”
It should be noted that any program tailored towards making employees “cybersecurity experts” is going to fail. That’s the wrong direction. And I’m guessing Howard was exaggerating a little, loosely using the term “experts”.
Security awareness programs need to be made personal and easily understood. Staying alert for phishing and other social engineering attacks is not a technical skill. Quite the opposite. Cliché as it may sound, all the security tech in the world won’t save any organization if their employees lack common sense. That’s not an attack on end-users. That’s a realistic snapshot of why social engineering is successful. Without awareness training, the click-happy employee stays click-happy. Intrusion detection systems aren’t going to change that.
To Howard’s point, training absolutely should focus on making employees aware of policies and procedures, with consequences enforced when policy is not followed. That is part of developing a culture that’s resilient to cyber-attacks. Always follow policy. But policy cannot stand alone.
And to be clear, I’m not plucking one article from the interwebs and cherry-picking a few quotes to push an agenda. The line of thinking Howard suggests has been around for a while, which means others share his viewpoint.
In fact, he makes some good points in his article. His overall argument is that the “security community should be designing systems that protect their employees.” I 100 percent agree with that. But where he loses me is by saying, “Protecting the enterprise is the security team’s job. If one of your security team’s best security controls is relying on an end-user to stop the bad guy, then your program has some serious issues.”
Nothing could be farther from the truth. Making your organization resilient to attacks is everyone’s job. The security team’s. The executive’s. The IT admin’s. The end-user’s. If your program is relying strictly on threat prevention, then your program has some serious issues. And if your program is trying to train end-users to be security experts, then your program has some serious issues. And if your program fails to include execs in awareness training, then your program has some serious issues. You cannot isolate one part of your organization, and hope the rest of it stays safe.
Howard also fails to acknowledge that security tech can and will fail. New companies show up every day claiming to have the latest and greatest software, insisting they “could have prevented WannaCry” or whichever massive breach is currently in the headlines. But it’s only a matter of time before those products become outdated and leave the back door open. In some cases, investing in them is risky business.
Investing in your users, however, is smart business. Yes, we need all the help we can get from technology. Yes, policy is important. But the strongest, most resilient organizations are those that tie everything together. That’s the entire point. Cybersecurity isn’t binary. It isn’t just cyber or just physical or just human. It’s a cycle of training, testing, identifying, preventing, reporting, patching, and training again. Simply throwing tech at it is a lazy, fragile approach. This is a human issue. And it’s not changing any time soon.
So you’re welcome to take Howard’s advice and look past your employees if you want, but I can promise you, cybercriminals are staring right at them.