
Testing Your People – Creating a Culture of Security

November 8, 2016 | Ashley McDaniel

We have long known that People are the weakest link in security. Organizations today realize they can have the best security technology money can buy, but if one employee clicks a link or downloads a malicious attachment from an email, all that security gets thrown out the window. In response, organizations are providing more training and education to their employees to help ensure everyone does their part to keep attackers from gaining access to the network and stealing information.

Testing Your People

There’s one caveat to more training and education: how do you know if it’s working? The answer is to test your People. We already test our Technology via Vulnerability Assessments and Penetration Tests. Most organizations are required to test their Processes (typically through an annual IT Audit). But if we know that People are our weakest link, shouldn’t we be testing our People at least as much as, if not more than, our Technology and Processes?

Creating a Culture of Security

Testing your People is not difficult, but it’s only half the battle. The difficult part comes after individuals fail an assessment. How does your organization handle failed Social Engineering Assessments? Most organizations either don’t take the results seriously or simply respond with additional training. However, the key to creating a stronger culture of security is to hold your People accountable. A failed Social Engineering Assessment shows that your organization has serious weaknesses in its People, which poses a significant risk to your business.

Holding your People accountable for failing Social Engineering Assessments or creating security weaknesses is a struggle for most organizations. However, to create a culture where everyone is expected to take cybersecurity seriously, accountability for adding risk to the business has to be a priority.

Don’t Create Fear of Security

We do have to be careful not to simply punish undesired behaviors such as clicking on bad links. Ruling with an iron security fist typically leads to a fear of security in which your employees hide their mistakes from you rather than reporting potential security incidents.

“How do we get our People to report security incidents rather than attempting to cover up their tracks?” you might ask. There are a few different ways to get to the desired result, but it’s simply a matter of rewarding desired behavior and penalizing undesired behavior.

If we analyze the traditional “don’t click the link” model, it usually looks something like this:

[Figure: Social Engineering behavior model]

You’ll notice the only incentive for employees not to click is fear: fear of getting in trouble, fear of reprimand, or fear of losing their job. Now, an employee clicking on a link creates a HUGE risk to your organization, so there should be some form of penalty for the undesired behavior. However, fear alone will not create a culture of security where employees buy in and value cybersecurity.

If we take the process a step further and talk about rewarding desired behaviors, our flowchart looks like this:

[Figure: Social Engineering behavior model with rewards]

The best outcome of a Social Engineering Assessment is not simply for an employee to ignore the attack but to report it to your Information Security or IT Department. Consider providing a reward (the “carrot”) to employees who report attempted phishing attacks on your organization. The reward can be large or small; it is completely at your discretion. Many organizations use a points system (gamification) that allows employees to collect points toward rewards such as an extra day of PTO or branded clothing. There are a lot of ways you can reward your employees.
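If you go the gamification route, the bookkeeping can be very lightweight. Below is a rough, hypothetical sketch (not an SBS or KnowBe4 tool) of a Python points tracker; the point values and reward names are made up and entirely at your discretion.

```python
# A hypothetical points tracker for reported phishing attempts.
# Point values and reward names are illustrative only.

POINTS_PER_REPORTED_PHISH = 10  # set whatever value fits your program

REWARD_CATALOG = {
    "branded_hoodie": 50,   # points required to redeem
    "extra_pto_day": 200,
}


class RewardTracker:
    """Accumulates points per employee and checks reward eligibility."""

    def __init__(self):
        self.points = {}

    def report_phish(self, employee: str) -> int:
        """Credit an employee for reporting a suspected phishing email."""
        self.points[employee] = self.points.get(employee, 0) + POINTS_PER_REPORTED_PHISH
        return self.points[employee]

    def eligible_rewards(self, employee: str) -> list:
        """Return the rewards the employee currently has enough points for."""
        balance = self.points.get(employee, 0)
        return [name for name, cost in REWARD_CATALOG.items() if balance >= cost]


# Example: six reported phishing emails earn enough for the smaller reward.
tracker = RewardTracker()
for _ in range(6):
    tracker.report_phish("jdoe")
print(tracker.eligible_rewards("jdoe"))  # ['branded_hoodie']
```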

However, we still have an issue with the fear of clicking bad links and hiding the behavior until confronted. Employees who cover up their behavior do not help the organization in the event of an actual attack, where you’ll be stuck playing catch-up from the start. What happens if we look at the “undesired behavior” of clicking and take it another step further?

[Figure: Social Engineering behavior model with penalties]

While clicking the link is still an undesired behavior, we want to promote a positive action once an undesired behavior occurs. To do that, we want to encourage our People to report their mistake rather than cover it up or simply not tell anyone. If you are performing your own SE Assessment, you’ll be able to tell who clicked, but if it’s an actual attack, you won’t have this insight.

Therefore, we want to “reward” this desired behavior (reporting mistakes rather than covering them up) by effectively negating the undesired behavior of clicking the link. There’s no actual reward in this scenario; rather than being penalized, the employee simply gets back to a neutral state. You can apply this concept to whatever reward process you have implemented as well.
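To make the whole model concrete, here’s a rough sketch of how the net effect of these behaviors might be scored. The event names and point values are hypothetical; what matters is the shape of the incentives: the only way back to neutral after a click is to report it.

```python
# Hypothetical scoring for the behavior-based model described above:
# reporting a phish is rewarded, clicking is penalized, and self-reporting
# a click offsets the penalty, returning the employee to a neutral state.

SCORE_CHANGES = {
    "reported_phish": +10,       # desired behavior: reward
    "clicked_link": -10,         # undesired behavior: penalty
    "self_reported_click": +10,  # offsets the click -> back to neutral
}


def net_score(events):
    """Return the net score after a sequence of assessment events."""
    return sum(SCORE_CHANGES[event] for event in events)


# Clicked a simulated phish but reported the mistake right away:
print(net_score(["clicked_link", "self_reported_click"]))  # 0   (neutral)

# Clicked and hid it until confronted:
print(net_score(["clicked_link"]))                         # -10 (penalized)

# Spotted the phish and reported it without clicking:
print(net_score(["reported_phish"]))                       # +10 (rewarded)
```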

The implementation of a behavior-based model for Social Engineering Assessments will help build a strong culture of security at your institution. Be sure to focus on rewarding the desired outcomes (training your People to respond correctly) rather than simply creating the fear of being punished.

How Broadtek and SBS Can Help

At SBS, we want to help make sure you are building a culture of security at your organization. SBS has always had a training and education focus, and we’d be happy to help provide some cybersecurity training to your People. Additionally, SBS has partnered with KnowBe4 to provide your organization with the ability to test your People through automated phishing email assessments.

For more tips on building a cybersecurity culture at your organization, take a look at the 10 Key Ideas to Build a Cybersecurity Culture infographic.

This article was written by our partner, Secure Banking Solutions.

John Waldman, CISA, CRISC – Vice President of Business Development – SBS Institute

