I’m always genuinely excited to find someone doing something new in the field of security awareness. This month I caught up with Sarah Janes, Managing Director at Layer 8 Ltd. Sarah started her career running security awareness at British Telecom (BT) and has delivered award winning behavioural change programmes to FTSE 100 companies. Her team…
Long suffering readers of this column will be familiar with the importance of security culture in driving behavioural change. This month I caught up with Kai Roer, founder of the Roer Group and author of Build a Security Culture. Kai has created a free resource called the Security Culture Framework and runs a blog at…
If your organisation was an animal, what would it be? Is your organisation a risk taker? Short sighted? Perhaps it’s slow to react? I’ve worked for elephants, giraffes and even a hyena. Animals and organisations both have their behavioural quirks and ways of optimising their survival chances in their particular environment. However, what worked in the past isn’t always the best survival tactic in the present. Sometimes organisations need to adapt due to factors such as customer demand, regulatory changes or new environmental risks. Behaviours adopted in the mistaken perception that they are helpful can even be self-harming and may need to change.
Last month we discussed information security culture and the shared underlying unconscious assumptions of staff that frame it. This month we talk about how to go about trying to change security culture. Changing the culture of an organisation can be a significant challenge and I’ve seen many efforts fail.
There are three things you need to know before you start. Firstly, you need to identify what problematic behaviours exist. Secondly, you need to understand what beliefs, attitudes and unconscious assumptions are enabling them. Thirdly, you need to know what cultural values you’re aiming for to re-align the organisation’s behaviour towards its key goals. Potentially, this means the ‘un-learning’ of one set of beliefs and the learning of a new set.
As I escorted him to his desk I became conscious that everyone was looking at me. I did all the usual self-checks of fly, food on face and freaky hair, but came up negative on all counts. When someone had tailgated me through a secure door I had challenged them. Rather than leave them outside when they didn’t have their pass with them, I offered to walk them to their desk. I found his manager, who told me with an expression more serious than a budget facelift: ‘Yes, of course he works here – he’s hardly here for the view’. What I had encountered amongst the engineers at this small satellite office was a very different security culture from the one I was used to with my head office, ivory tower view of the world. The culture that I had encountered worked on high levels of trust. They all trusted Dave, so couldn’t understand why I didn’t (even though I’d never met him). I was less than a block from the head office of this organisation and yet the security culture was completely different. For me, the experience was an eye-opener: effort is needed to understand not just whether people are following security policy, but the extent to which policy is reflected in security culture.
There’s no denying that some people are impervious to our attempts at security awareness and refuse to listen to warnings or instructions. There is a temptation when things go wrong to label such people as ‘bad apples’. I think that this saying is overused. Originally, the expression ‘bad apple’ referred to a rotten apple in a barrel that would spoil the good apples. Usage of the phrase has changed and it’s now often used to explain failures of scale. The perception is that when there are many apples you have to expect some of them to be bad.
I often hear the phrase used when a governance failure is attributed to human mistakes. Frequently, however, I think the phrase ‘bad apple’ is a convenient cover for poor management, where processes and procedures were badly designed or supervised. The bad apple narrative can suit prejudices that humans are the weak link, and any narrative is more comforting than no narrative at all. However, bad apple narratives rarely withstand serious scrutiny.
One of the small mercies of being a security consultant is that I’m usually spared the ordeal of attending information security induction sessions. Recently, however, I was asked to review the induction process for a European organisation. It was classic death by PowerPoint. It included organisational charts of the security function, strategic plans for ISO certification and pages and pages of security policy requirements. The conclusion of the session was a quiz on facts from the security policy.
Why do we do this? Why do we make people’s first contact with information security an ordeal for insomniacs? Consider that in people’s first week at a new job they’re usually nervous and on edge. Accompanying this will be elevated levels of adrenaline and cortisol (a stress hormone), which is not conducive to learning. In some ways we’ve picked the worst week to deliver training.
What is it that we’re trying to achieve with induction sessions? Is there a benefit to users being able to describe the organisational structure of the security department? Surely they only need to know how to contact the security department in the event of an incident? What benefit is there in users knowing the ISO certification strategy? These might be things we want to tell them, but do they care? As technical experts, we seem to make the mistake of selecting the information we want to tell people, not the information people need to know or are disposed to listen to.
Many of you will be familiar with the footage of Ian Tomlinson apparently being struck by a Metropolitan Police Officer in London on the day of the G20 protests. After the footage was aired, senior members of the Met Police were quick to promote the narrative of a “bad apple”. They pointed out that the Met Police is an organisation which includes some 50,000 people.
You have to have some sympathy for the police. They do a difficult job. The problem with the bad apple narrative is the video footage of the incident. Although the attack on Ian Tomlinson took place immediately in front of at least three other members of the Met Police, none of them appear concerned enough to go to the aid of Tomlinson. Neither are they seen to remonstrate with their colleague.
Recently I co-authored a paper “Death by a Thousand Facts” with David Lacey for the HAISA conference where we explored the nature of how technical experts choose what content is included in risk communications. A copy of the proceedings is available here. Basically, mainstream information security awareness techniques are failing to evolve at the same…