There’s no denying that some people are impervious to our attempts at security awareness and refuse to listen to warnings or instructions. When things go wrong, there is a temptation to label such people as ‘bad apples’. I think this saying is overused. Originally, the expression ‘bad apple’ referred to a rotten apple in a barrel that would spoil the good apples around it. Usage of the phrase has changed, and it’s now often used to explain failures of scale: the perception is that when there are many apples, you have to expect some of them to be bad.
I often hear the phrase used when a governance failure is attributed to human mistakes. Frequently, however, I think ‘bad apple’ is a convenient cover for poor management, where processes and procedures were badly designed or supervised. The bad apple narrative suits the prejudice that humans are the weak link, and any narrative is more comforting than no narrative at all. However, bad apple narratives rarely withstand serious scrutiny.
For example, when Ian Tomlinson died after being struck by a Metropolitan Police officer in London in 2008, the Police Commissioner invoked the bad apple narrative, pointing out that the Met is an organisation of some 50,000 staff. It later emerged that the officer in question had unresolved previous allegations of assault, and that the investigation of the incident was bungled, with wider allegations of professional misconduct. What was initially presented as a bad apple was in fact a systemic failure.
While bad apple narratives may have their place in public relations, for us as risk practitioners they are a hindrance to understanding the root cause of incidents and making improvements. Other mindsets are far more useful. Herbert William Heinrich was an American pioneer in workplace safety who proposed that there was a predictable relationship of one major incident to 29 minor incidents, 300 near misses and some 3,000 unsafe acts. This became known in the safety field as ‘Heinrich’s Law’. Heinrich represented this relationship as a pyramid.
While the exact ratios he proposed have been subject to debate, there’s no denying that failures do not happen in a vacuum. When investigating security incidents, I’ve found that there are usually a number of recorded previous minor incidents or near misses. Processes were not fit for purpose or fail-safe, and it was only a matter of time until a major incident occurred. Underpinning this is usually a set of beliefs and attitudes amongst staff and managers that contributes to the continuation of unsafe acts. I used to be surprised when people said things to me like ‘What do we need screen savers for? It’s not like we’re the CIA’. That’s why in my version of the pyramid I’ve added unsafe beliefs and attitudes as a foundation that underpins unsafe behaviour.
There is a temptation amongst security awareness professionals to focus on equipping people with specific security skills rather than actively managing people’s security beliefs and attitudes. Since the basis of Heinrich’s Law is unsafe acts, we need to look at the beliefs and attitudes that underpin those acts, not just focus on competencies.
Heinrich’s Law could also help us recognise the likely rate of disclosure incidents. Some disclosure incidents are discovered because the perpetrators have an interest in the resulting publicity; others are only discovered by accident, which means we have no real idea what proportion of disclosures we are actually detecting. Counting unsafe acts, beliefs and attitudes could help predict the actual rate of major and minor incidents.
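To make that idea concrete, here is a minimal sketch of a ratio-based projection using Heinrich’s 1:29:300:3,000 pyramid. The exact ratios are debated, and the tier names and function here are illustrative assumptions, not an established method.

```python
# Heinrich's pyramid as illustrative ratios (widely debated, not gospel).
HEINRICH_RATIOS = {"major": 1, "minor": 29, "near_miss": 300, "unsafe_act": 3000}

def estimate_incidents(observed: dict) -> dict:
    """Project the other tiers of the pyramid from observed counts.

    `observed` maps a tier name to a recorded count, e.g. {"near_miss": 600}.
    Uses the largest implied scale factor as a conservative estimate, since
    lower tiers are typically under-reported.
    """
    scale = max(observed[tier] / HEINRICH_RATIOS[tier] for tier in observed)
    return {tier: ratio * scale for tier, ratio in HEINRICH_RATIOS.items()}

# e.g. 600 recorded near misses imply a scale factor of 2, projecting
# roughly 2 major incidents, 58 minor incidents and 6,000 unsafe acts:
print(estimate_incidents({"near_miss": 600}))
```

The point is not the precision of the numbers but the direction of the reasoning: recorded near misses and unsafe acts are a leading indicator, so counting them gives an early, if rough, estimate of the incidents we are not yet seeing.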
The next time you hear someone assign the cause of a security incident to a ‘bad apple’, stop and ask yourself: Is it too soon to know what the cause is? Have we looked for previous near misses and bad practices? Is it really an isolated incident, and who benefits from the cause being assigned to bad apples?
Published in the Jan 2013 ISSA Security Journal