Leveraging Existing Audience Beliefs

When it comes to security awareness, there’s no such thing as a blank canvas. Your audience will already have preconceived notions about your topic. The language, tone and media you use will evoke associations in people’s minds, both helpful and unhelpful. These associations will influence how people view the root causes, likelihood and potential outcomes…

Comprehensive versus Comprehension

I had a very strange encounter with a PCI auditor recently. On viewing my client’s security awareness portfolio, he refused to sign it off as meeting PCI requirements because it didn’t cover ‘everything’. It got me thinking. There are two schools of thought when it comes to communicating risk. The first is the comprehensive approach, where all the facts are presented. As part of this mindset, most organisations require their staff to read and agree to a security policy, which is usually long and written in formal, contractual language. The majority of employees probably only skim their security policies, and even if they did read them in full, would they understand them? Information security can be difficult to understand at the best of times without adding the complexity of overly formal, legalistic phrasing. The second school is comprehension: presenting less, but making sure it is actually understood. The Pareto Principle, or 80/20 rule, suggests that only a small percentage of content really matters, but the comprehensive approach usually means hiding it.

Personas For Security Awareness

Large-scale awareness programs can be challenging, with so many topics to cover, so many communication options and such varied audiences to consider. Your communication efforts will also be competing with background noise: every day, people are bombarded with advice. Exercise more, eat more greens, don’t click on dodgy links. The question is how to make the most of the limited time and attention available. The Pareto Principle, also known as the 80/20 rule, proposes that 80% of consequences come from 20% of causes. Applied to security awareness, it implies that 80% of the risk comes from 20% of topics. The problem is knowing which 20% of topics and users this applies to.
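To make the 80/20 idea concrete, here is a minimal sketch in Python. The incident categories and counts are invented for illustration, not real data; the point is simply that ranking topics by observed incidents and cutting off at 80% cumulative coverage often leaves a short list of the “vital few”.

```python
# Minimal sketch: rank hypothetical incident categories by share of
# observed incidents and find the smallest set covering ~80% of them.
# All category names and counts below are illustrative assumptions.

incident_counts = {
    "phishing": 450,
    "password sharing": 230,
    "lost devices": 120,
    "misdirected email": 90,
    "unapproved software": 70,
    "tailgating": 40,
}

total = sum(incident_counts.values())
running = 0.0
vital_few = []

# Walk categories from most to least frequent until we cover 80%.
for topic, count in sorted(incident_counts.items(), key=lambda kv: -kv[1]):
    vital_few.append(topic)
    running += count / total
    if running >= 0.8:
        break

print(f"{len(vital_few)}/{len(incident_counts)} topics cover "
      f"{running:.0%} of incidents: {vital_few}")
```

With these invented numbers, three of six topics account for 80% of incidents, which is where a programme with limited attention to spend would focus first.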

ISSA Security Awareness Column Feb 2013 – Innovation in Information Security Awareness

Here’s a trivia question for you – how did President George Washington die? No points for anyone who thought he died in battle, fell from a horse or was poisoned. Actually, he had an infection and suffered massive blood loss. Why he suffered massive blood loss is fascinating. For thousands of years people were convinced that blood could become stale and that ‘bad humours’ could cause illness, for which bloodletting was the solution. When Washington became sick, his staff did the natural thing at the time and bled him. When he didn’t improve, his staff bled him some more. Then the doctor was called, and when he arrived Washington was bled again. All told, Washington lost some 6 pints of blood in a 16-hour period. He had a severe infection to be sure, but it’s likely that the massive blood loss significantly contributed to his demise.

Sometimes, how we define a problem limits our ability to solve it. Innovation counts for nothing if the approach itself is the problem. Physicians focused on how to let blood more effectively for thousands of years. Elaborate rituals developed to define where on the body blood could be taken from to fix specific ailments. Contraptions such as scarificators were invented to help people administer their own bloodletting – you didn’t have to visit someone to get them to do it for you (ever wondered what the red on a barber’s pole stands for?).

ISSA Security Awareness Column November 2012 – Why Do People Ignore Risk Advice?

How frustrating is it when you point out the risks to people and they just don’t listen? Every day around the world, millions of people smoke, drive too fast and click on strange emails, even though they’ve been repeatedly told about the dangers. They are ‘risk aware’ in the technical sense of the word, and yet their behaviour continues. This is a big problem, since the mainstream approach to security awareness assumes that all that’s needed to achieve behavioural change is an understanding of the risks. Traditionally, when encountering non-compliant behaviour, we security technocrats reiterate the facts and increase the threat of sanctions. But there is another way.

Luckily for us, this is a problem that safety risk communicators have been grappling with for decades. The safety risk communication field has a number of explanatory frameworks that predict how people will react to risk communications. One of the most interesting is the Extended Parallel Process Model (EPPM), which seeks to explain why people fail to take action once aware of a threat. This is a goldmine for security professionals looking for a more structured, formal approach to promoting behavioural change.
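As a rough sketch of the EPPM’s core prediction: the model holds that people weigh perceived threat (severity and susceptibility) against perceived efficacy (response efficacy and self-efficacy), and only act on advice when both are high enough. The `Appraisal` class, the 1–5 scales, the threshold and the simple comparison rule below are our illustrative assumptions, not part of the model’s formal specification.

```python
# A minimal sketch of the EPPM's core logic: outcome depends on
# perceived threat (severity + susceptibility) versus perceived
# efficacy (response efficacy + self-efficacy). Scales, threshold
# and the comparison rule are illustrative simplifications.

from dataclasses import dataclass

@dataclass
class Appraisal:
    severity: int           # "how bad would it be?" (1-5)
    susceptibility: int     # "could it happen to me?" (1-5)
    response_efficacy: int  # "does the advice actually work?" (1-5)
    self_efficacy: int      # "can I realistically do it?" (1-5)

def predicted_response(a: Appraisal, threshold: int = 6) -> str:
    threat = a.severity + a.susceptibility
    efficacy = a.response_efficacy + a.self_efficacy
    if threat < threshold:
        return "no response (threat not taken seriously)"
    if efficacy >= threat:
        return "danger control (adopts the recommended behaviour)"
    return "fear control (denial, avoidance, reactance)"

# A user who believes phishing is dangerous but feels helpless:
print(predicted_response(Appraisal(5, 4, 2, 1)))  # -> fear control
```

The interesting implication for awareness work is the third branch: ramping up fear without raising efficacy is predicted to push people into fear control, not compliance.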

ISSA Security Awareness Column June 2012 – Security Awareness in Crisis

You wouldn’t know it by looking at it, but the information security awareness industry is in crisis. Humans are increasingly seen as the weak link in information security defences, and human factors have become a preferred exploit. Time after time we see expensive technical solutions bypassed by a simple call to the helpdesk, or by someone just asking users for their password. A cynic might say that’s because mistakes are inevitable when humans are involved. But have we really made our best attempt at managing human information security risks? In this series of columns on awareness and risk communications, we’ll take a fresh look at the ways we attempt to manage human risk.

Technical information security solutions have advanced in leaps and bounds over the last two decades. We now have real-time anti-virus, local firewalls and automated patching. It’s a far cry from the old days, when we had to remember to load anti-virus manually after starting our computers. By comparison, human security management remains largely unchanged. We create information security policies and publish them on intranets. We hold mandatory training sessions. If the problem is getting worse, then what is the solution? More policies? More mandatory training? Or is there a fundamental problem in how security professionals are approaching the problem?

Remind me again, what is the problem we’re trying to solve? Our implicit assumption seems to be that insecure behaviour is caused by a “lack of facts” on the part of the audience. Hence we distribute information in the hope that behaviour improves. But what if people have heard our message before and it didn’t fix anything? Telling people again what they have likely already heard can only have a marginal return at best.

Death by a Thousand Facts: Criticising the Technocratic Approach to Information Security Awareness

Recently I co-authored a paper, “Death by a Thousand Facts”, with David Lacey for the HAISA conference, exploring how technical experts choose what content is included in risk communications. A copy of the proceedings is available here. Basically, mainstream information security awareness techniques are failing to evolve at the same…

Mental Models

One of the problems with the current approach to information security awareness is that methodologies such as ENISA’s are detailed about the logistics of planning security awareness but have little to say about the content of security awareness.

So, how would you determine what information an audience needs to know so that they can manage the risks they face? Mental models offer a structured way of approaching risk communications rather than just “broadcasting facts”.

A mental model is a pattern of understanding held by an individual. It consists of the beliefs they hold, the strength of those beliefs and the connections between those beliefs. Safety experts note that when risk communication takes place, the audience will have some degree of pre-existing knowledge which forms their mental model:

“…for most risks, people have at least some relevant beliefs, which they will use in interpreting the communication. They may have heard some things about the risk in question. It may remind them of related phenomena.” (Morgan et al. 2002)
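To make that definition concrete, here is a minimal sketch (our illustration, not drawn from Morgan et al.) that represents a mental model as a small graph: beliefs are nodes carrying a strength score, and links record which beliefs the holder connects together. The `MentalModel` class and the example employee beliefs are hypothetical.

```python
# A minimal sketch of a mental model as a graph: beliefs are nodes
# with a strength score, and links record which beliefs the holder
# associates with one another. All example beliefs are invented.

from dataclasses import dataclass, field

@dataclass
class MentalModel:
    # belief -> strength of that belief (0.0 = weak, 1.0 = firmly held)
    beliefs: dict[str, float] = field(default_factory=dict)
    # (belief_a, belief_b) pairs the holder links together
    links: set[tuple[str, str]] = field(default_factory=set)

    def related(self, belief: str) -> set[str]:
        """Beliefs the holder associates with the given belief."""
        return {b for pair in self.links for b in pair
                if belief in pair and b != belief}

# A hypothetical employee's model of phishing risk:
model = MentalModel(
    beliefs={"phishing is an IT problem": 0.8,
             "I can spot a fake email": 0.6,
             "attacks only target executives": 0.7},
    links={("phishing is an IT problem", "I can spot a fake email")},
)
print(model.related("phishing is an IT problem"))
```

Mapping an audience’s model this way, then comparing it against an expert’s model, is what lets a communicator target the specific gaps and misconceptions rather than just broadcasting facts.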