Security policies are a great opportunity to influence behaviour. Unfortunately, for a variety of reasons they’re not usually as effective as they could be. Despite our efforts to sell the value of information security, actually reading a policy is less popular than a fart on a plane. There’s a reason that a security policy has…
Whoever said that there’s no such thing as a stupid question, only a stupid answer, has probably never seen a feedback survey for a security awareness training session. Questions such as “Did you learn anything?” and “Do you feel more secure?” are as common as they are idiotic. I guess it’s largely shaped by the motives of whoever is asking the question. The trainers involved are primarily interested in demonstrating that they are good trainers, and the questions are designed to elicit complimentary feedback. Feedback surveys are a great chance to obtain valuable feedback, but only if we’re asking the right questions.
In this column we’re going to look at training feedback surveys in more detail. Getting useful feedback from training sessions is challenging, but not impossible. For a start, you need to be aware of people’s biases. Surveys measure ‘declared preferences’, since they rely on people expressing their views. While easier to gather, declared preferences carry inherent biases that need to be acknowledged and allowed for when interpreting the results. ‘Revealed preferences’ are what people actually do, but measuring what people do accurately and efficiently can be difficult, especially if people know they’re being observed. Here are some suggestions for allowing for people’s biases while obtaining reliable survey data.
Here’s a trivia question for you – how did President George Washington die? No points for anyone who thought he died in battle, fell from a horse or was poisoned. In fact, he had an infection and suffered massive blood loss. Why he suffered massive blood loss is fascinating. For thousands of years people were convinced that blood could become stale and that ‘bad humours’ could cause illness, for which bloodletting was the solution. When Washington became sick, his staff did the natural thing at the time and bled him. When he didn’t improve, his staff bled him some more. Then the doctor was called, and when he arrived Washington was bled again. All told, Washington lost some six pints of blood in a 16-hour period. He had a severe infection to be sure, but it’s likely that the massive blood loss significantly contributed to his demise.
Sometimes, how we define a problem limits our ability to solve it. Innovation counts for nothing if the approach itself is the problem. Physicians focused for thousands of years on how to let blood more effectively. Elaborate rituals developed to define where on the body blood could be taken from to fix specific ailments. Contraptions such as scarificators were invented to help people administer their own bloodletting – you didn’t have to visit someone to get them to do it for you (ever wondered what the red on a barber’s pole stood for?).
Any endeavour is made doubly difficult when pursued without metrics and without a clear understanding of cause and effect. When we stumble in the dark, facts are the flashlight of comprehension, illuminating the way forward when the path is unclear. Information security is often required to function in the dark, with little in the way of facts to guide us. We hear noises and bump into things but can never be certain we’re going in the right direction.
When security fails, how do we know? While failures of integrity and availability are obvious, failures of confidentiality can be silent and insidious. Some actors, such as LulzSec, boast about their exploits and derive their benefits from the resulting publicity. Other actors quietly go about their ‘business’, and organisations may not realise they’ve been breached. Often, even when we do discover failures of confidentiality, the organisational interest is to bury them. As a result, our profession is rich in rumours but poor in facts, which makes it difficult to understand the effectiveness of security controls.
During World War Two in the Pacific, numerous isolated cultures on remote islands came into contact with Westerners for the first time. Islanders were particularly impressed with the cargo that the visitors brought with them. At the conclusion of the war most of the visitors left and the cargo stopped arriving. Across multiple islands separated by thousands of miles, a strange phenomenon occurred. Islanders attempted to invite new cargo by imitating the conditions under which the cargo had been arriving. They cleared spaces for aircraft landing strips, and “controllers” dressed up with vines for wires and sticks for microphones. Bizarre ritualised behaviour developed around the use of artefacts like uniforms and insignias. The physicist Richard Feynman popularised the phrase “cargo cult” to describe activity in which appearances are superficially imitated: a result is pursued without actually understanding the underlying mechanisms of cause and effect, and prerequisites are mistaken for causes. The pattern recurring across so many independent island cultures suggests that this confusion is part of human nature. A well-known parody of this confusion is the claim that the decline in pirates causes global warming.
I’m really looking forward to RSA Europe 2012 next week, where I’ll be taking part in a debate about whether or not organisations should train their staff in security awareness. It is being organised by Acumin and the RANT community. Participating with me will be: Christian Toon, European Head of Information Risk, Iron Mountain Europe; Thom Langford, Director Global Security…
Senior management support is something often mentioned as critical to the success of an information security awareness campaign. There are a number of reasons for this. Firstly, senior management direct the usage of resources within the organisation. Without their support, you won’t get much of a training budget or permission to take staff away from productive duties, and you might even struggle to get a room booking. Secondly, managers set the tone for behaviour in the organisation, and it’s common for staff to imitate their managers – in the way they dress and in the way they behave. Do your managers scoff that ‘the rules’ are for everyone else? Do they treat training as something for the IT-illiterate and not bother to show up? The bad news is that many of your staff will copy these behaviours.
The National Institute of Standards and Technology (NIST) is updating 800-16 (A Role-Based Model for Federal Information Technology/Cybersecurity Training). Many will be familiar with NIST 800-50 (Building an Information Technology Security Awareness and Training Program), which was published in 2003 and has aged badly. In many regards, the problems with 800-50 stem from how the security…