Any endeavour is made doubly difficult when pursued with a lack of metrics and without a clear understanding of cause and effect. When stumbling in the dark, facts are the flashlight of comprehension, illuminating the way forward when the path is unclear. Information security is often required to function in the dark with little in the way of facts to guide us. We hear noises and bump into things but can never be certain if we’re going in the right direction.
When security fails, how do we know? While failures of integrity and availability are obvious, failures of confidentiality can be silent and insidious. Some actors such as LulzSec boast about their exploits and derive their benefits from the resulting publicity. Other actors quietly go about their ‘business’, and organisations may not realise they’ve been breached. Often, even when we do discover failures of confidentiality, the organisational interest is to bury them. As a result, our profession is rich in rumours but poor in facts, which makes it difficult to understand the effectiveness of security controls.
But what if other industries have the data that we lack? The objectives of safety risk communicators in changing behaviours are in many ways the same as those of security awareness, but they have significant advantages. Safety failures are usually noisy, announcing their presence with the clanging of falling pipes, the scream of a burn victim or the crash of car bonnets. Reporting is also enforced by legislation in many countries, and government agencies make the data freely available. This has made it far easier to understand how safety communications have affected accident rates. Safety risk communicators have been able to learn lessons and improve their methods using decades of high-quality research. So what can we learn from them? Some key points are:
- Changing behaviour requires a concentration of effort. There’s a reason you don’t see TV adverts showing ‘Top 10 suggestions for safe driving’. It’s because they don’t work. Covering multiple topics simultaneously has been shown to result in a dilution of effort where little or no behavioural change results.
- Understanding the perceptions of your audience is critical. We throw around words like ‘virus’ and ‘firewall’, and often these words are taken to mean something different from what we intended. We need to do better at checking vocabulary with our audiences before communicating.
- As technical experts we often over-rate the importance of the ‘facts’. Actually, people don’t respond to facts to the degree that we expect. ‘Facts tell, stories sell’ is a powerful marketing adage which is equally true in risk communication. We need to make risks personal as part of a story.
- Managing your audience’s attitudes to risk is important. It’s been shown time and time again in safety studies that behavioural change is a function not only of risk awareness but also of risk tolerance. We need to spend more time managing our audience’s perceptions of due care by using moral appeals. Behavioural appeals in a moral context are a powerful motivational tool.
It’s astonishing to think that if even some of the lessons learned in the safety field are true, then we’ve been wasting a huge amount of organisational treasure on ineffective communication techniques. Why isn’t this wastage self-evident from our own data? The Head of Information Security at Network Rail, Peter Gibbons, recently said to me that he wished the security profession was more like Alcoholics Anonymous. I thought he was making a comment about the ISSA London chapter, but actually he was observing that the first stage to managing a problem is to admit it exists. Many organisations won’t talk about security problems, which means the industry as a whole is forced to rely on cocktail anecdotes. Saint Marcus of Bellwether recently criticised the use of unsourced rumours in security industry newsletters, pointing out that this kind of hearsay wasn’t helping anyone.
It seems that it will be difficult to get the statistics we need from our own industry, given the constraints of unreliable incident detection and organisational motivations for not sharing security failures. In that case, the answer isn’t a neighbour’s cousin’s friend’s story over drinks but reliable statistics from parallel disciplines that have the data we lack. We need to stop reinventing the wheel on our own and jump-start our communications effectiveness by learning lessons from communications experts outside our field.
Published in the October 2012 ISSA Security Journal