Whoever said that there’s no such thing as a stupid question, only a stupid answer, has probably never seen a feedback survey for a security awareness training session. Questions such as “Did you learn anything?” and “Do you feel more secure?” are as common as they are idiotic. I suspect this is largely shaped by the motives of whoever is asking. The trainers involved are primarily interested in demonstrating that they are good trainers, so the questions are designed to elicit complimentary feedback. Feedback surveys are a great chance to obtain valuable information, but only if we’re asking the right questions.
In this column we’re going to look at training feedback surveys in more detail. Getting useful feedback from training sessions is challenging, but not impossible. For a start, you need to be aware of people’s biases. Surveys measure ‘declared preferences’, since they rely on people expressing their views. While easier to gather, declared preferences carry inherent biases that need to be acknowledged and allowed for when interpreting the results. ‘Revealed preferences’ are what people actually do, but measuring behaviour accurately and efficiently can be difficult, especially if people know they’re being observed. Here are some suggestions for allowing for people’s biases while obtaining reliable survey data.
Selection Bias. By definition, the population available to fill out awareness training feedback forms is usually those who actually attended. Therefore, the results do not include those who chose not to attend. Consider carefully what the people who didn’t attend might say. That the training was too long? Too basic? Too boring? If perceptions are holding people back from attending, it’s important to find out why. It’s not necessarily about the session itself; it’s about people’s perceptions of the session, which also need to be managed. You may want to consider a separate survey targeted at people who didn’t attend to ask them why.
Confirmation Bias. When we signal the desired answer in the phrasing of the question, we deserve the answers we get. It’s human nature to avoid confrontation or disappointing people, and there is a tendency for people to tell us what we want to hear. To counter this bias, try to avoid questions phrased in moral terms. Look out for the word ‘good’, as it normally signals a moral norm and therefore an expected answer.
Intention Bias. People have all sorts of good intentions. Go to the gym. Lose weight. Stop smoking. However, there is a big gap between intent at a point in time and what people actually do in the following days and months. It’s all very well people declaring their intention to take security more seriously, but you should have a glance at your own 2012 New Year’s resolutions for a reality check. If you’re going to bother asking people about their intentions after training, then you should have a way of measuring later how many people followed through.
Question Phrasing. Questions should be as short as you can make them without becoming vague, and you should only ever ask one question at a time. For example, ‘Was the training clear and easy to follow?’ actually mixes up two different concepts that mean different things: training clarity and training pace. Where questions are unclear or confusing, the temptation will be to abandon the survey (which reduces completion rates) or skip through (which reduces data quality).
Be Specific. Avoid subjective words that invite different interpretations. For example, the word ‘often’ will mean different things to different people. Instead of a word like ‘often’, try setting out a specific time frame, such as ‘at least once a week’.
Consider Vocabularies. The use of convoluted linguistic structures (complex sentences) and TLAs (three-letter acronyms) will cause problems by hurting both completion rates and data quality. Consider trying out your draft questions on some volunteers and asking them to repeat back to you, in their own words, what each question is asking. You may be surprised at how your questions are interpreted. When you reliably get people repeating back your questions as you intended, then you’re ready to go.
Designing effective surveys does take time and effort, but it is worth it to obtain valuable feedback, and it is important to allow for people’s biases and tendencies in the design. If you’re judging the ‘success’ of your security awareness training by feedback from slackers who hang around to gossip after training sessions and tell you what you want to hear, then you’re probably wasting your time.