Senior management support is often cited as critical to the success of an information security awareness campaign, and for good reason. Firstly, senior management help direct the use of resources within the organization. Without their support you won’t get much of a training budget or permission to take staff away from productive duties, and you might even struggle to get a room booking. Secondly, managers set the tone for behavior in the organization, and it’s common for staff to imitate their managers: they dress like their managers and they behave like their managers. Do your managers scoff that ‘the rules’ are for everyone else? That training is only for the IT-illiterate, and don’t bother to show up? The bad news is that many of your staff will copy these behaviors.
A common objective of information security awareness is to encourage whistleblowers to use internal mechanisms to report their concerns. External whistleblowing and the airing of concerns in public view risk brand damage and the exposure of sensitive information. The Snowden affair has shown how divided we are on the ethics of external whistleblowing. To date, much of the debate has been speculation about Snowden’s character flaws. Sometimes, when trying to understand a controversial decision such as Snowden’s, it helps to understand the chain of events leading up to the decision, since failures in complex systems can rarely be done justice in a single soundbite. In this case there was a series of failures that occurred before the employee of a subcontractor decided to flee the country and leak sensitive information to foreign journalists:
Trust is an incredibly important concept in information security and a vital component of influencing an audience. We know from safety risk communication research that it’s not enough to be an expert in your field. It’s not enough to be correct. You also need to be trusted by your audience. Otherwise your level of influence will be reduced and people may decide to act in ways that challenge your mission objectives.
When I wrote the July column as satire, imagining what a GCHQ letter to a supportive member of the public might look like, I was poking fun at the unrealistic expectations of our intelligence services that were being perpetuated: that, as ‘big brother’, they knew better and were always looking out for our best interests. I recognize now that what I was also doing was challenging the notion that intelligence services innately deserve a high level of trust.
I’m amazed at how many people are offering advice on what information security topics I should be deploying. They seem to know what training is needed despite never having met me or my beautiful users, and knowing nothing about my organisation or its goals. There are plenty of top-ten lists of awareness topics. Numerous generic training packages are available on the internet. I’ve got nothing against generic awareness materials or topic lists as such. In fact, some of it is very professional and far better than individual organisations could create. But while it might be easy to use someone else’s training package or their list of recommended training topics, that doesn’t necessarily make it a good idea. I worry that we haven’t properly defined the problem that we’re trying to solve. If training material X is the solution, what was the problem?
Dear Michael Burgess of Tunbridge Wells in the UK, we at GCHQ read with interest your recent letter to the Guardian newspaper in which you state that you’re not bothered if the Government knows what web sites you’ve been visiting. It is refreshing, sir (and we know you are one from the scanners at Heathrow airport), to find a true patriot who welcomes the state’s determination to know everything about everyone. Corporate security awareness programs have been advising for years that personal privacy is something that can’t be ‘fixed’ once lost, so your willingness to permanently surrender your privacy (and the privacy of anyone you communicate with) is appreciated.
If your organisation were an animal, what would it be? Is your organisation a risk taker? Short-sighted? Perhaps it’s slow to react? I’ve worked for elephants, giraffes and even a hyena. Animals and organisations both have their behavioural quirks and ways of optimising their survival chances in their particular environment. However, what worked in the past isn’t always the best survival tactic in the present. Sometimes organisations need to adapt due to factors such as customer demand, regulatory changes or new environmental risks. Behaviours adopted in the mistaken belief that they are helpful can even be self-harming and may need to change.
Last month we discussed information security culture and the shared underlying unconscious assumptions of staff that frame it. This month we talk about how to go about trying to change security culture. Changing the culture of an organisation can be a significant challenge and I’ve seen many efforts fail.
There are three things you need to know before you start. Firstly, you need to identify what problematic behaviours exist. Secondly, you need to understand what beliefs, attitudes and unconscious assumptions are enabling them. Thirdly, you need to know what cultural values you’re aiming for in order to re-align the organisation’s behaviour towards its key goals. Potentially, this means the ‘un-learning’ of one set of beliefs and the learning of a new set.
As I escorted him to his desk I became conscious that everyone was looking at me. I did all the usual self-checks (fly, food on face, freaky hair) but came up negative on all counts. When someone had tailgated me through a secure door I had challenged him. Rather than leave him outside when he didn’t have his pass with him, I offered to walk him to his desk. I found his manager, who told me with an expression more serious than a budget facelift: ‘Yes, of course he works here – he’s hardly here for the view’. What I had encountered amongst the engineers at this small satellite office was a very different security culture from the one I was used to with my head office, ivory-tower view of the world. The culture I had encountered worked on high levels of trust. They all trusted Dave, so they couldn’t understand why I didn’t (even though I’d never met him). I was less than a block from the head office of this organisation and yet the security culture was completely different. For me, the experience was an eye-opener: effort is needed to understand not just whether people are following security policy, but the extent to which policy is reflected in security culture.
It was a children’s birthday party. He cried and whinged and pleaded with tears streaming down his face. For about two minutes his mother said no, but eventually she pulled a chocolate biscuit from her bag and gave it to him before turning to me and saying, “I just don’t know why he cries so much”. Operant conditioning is a phrase coined by B.F. Skinner that many security awareness professionals may not have heard before. Broadly, it means that ‘behaviour is a function of its consequences’. If the consequence of a behaviour is positive, there is a chance of increasing the magnitude or likelihood of that behaviour. Conversely, negative consequences have the opposite effect.
My young friend at the birthday party had been trained to whinge and cry because he had been regularly rewarded with a treat for doing so. Just as rewards and punishments influence children’s behaviour, they are also an important factor in governance, risk and compliance. However, there are interesting quirks of rewards and punishments that need to be understood by anyone trying to influence behaviour.
We spend a lot of time talking about how to raise security awareness. We fill entire books, columns and conferences with it. However, anything that can go up can also go down. How about we turn the phrase on its head and ask what lowers security awareness? Just as there are behaviours that raise security awareness, there are also some that lower it. But what can we do about it? Name and shame was an important step in getting software vendors to deal with security vulnerabilities in their products. We should be equally critical when human vulnerabilities are created through the promotion of unsafe attitudes and behaviours. In this column I’m going to name and shame particularly egregious examples which I think reduce security awareness.
Here’s a trivia question for you – how did President George Washington die? No points for anyone who thought he died in battle, fell from a horse or was poisoned. Actually, he had an infection and suffered massive blood loss. Why he suffered massive blood loss is fascinating. For thousands of years people were convinced that blood could become stale and that ‘bad humours’ could cause illness for which bloodletting was the solution. When Washington became sick, his staff did the natural thing at the time and bled him. When he didn’t improve his staff bled him some more. Then the doctor was called and when he arrived Washington was bled again. All told, Washington lost some 6 pints of blood in a 16 hour period. He had a severe infection to be sure, but it’s likely that the massive blood loss significantly contributed to his demise.
Sometimes, how we define a problem limits our ability to solve it. Innovation counts for nothing if the approach itself is the problem. Physicians focused on how to let blood more effectively for thousands of years. Elaborate rituals developed to define where on the body blood could be taken from to fix specific ailments. Contraptions such as scarificators were invented to help people administer their own bloodletting – you didn’t have to visit someone to get them to do it for you (ever wondered what the red on a barber’s pole stood for?).