This month I caught up with Angela Sasse, Professor of Human-Centred Technology in the Department of Computer Science at University College London, UK. She has had a huge impact on the field of usable security, having worked in this field since 1996. Her background in psychology has enabled her to look at human security problems from a novel perspective that has led to multi-disciplinary projects with economists, mathematicians and crime scientists. She became Director of the UK Research Institute for Science of Cyber Security (RISCS), co-funded by the EPSRC and GCHQ, in 2012, and was elected a Fellow of the Royal Academy of Engineering in 2015.
You’ve been involved in some fantastic papers. Looking back, which one are you most proud of and why?
It’s very difficult to ignore the impact that ‘Users are not the enemy’, co-authored with Anne Adams in 1999, has had. It has become the most cited paper in usable security research, was recently declared one of the ‘most influential’ computer science papers (by Semantic Scholar), and is one that many practitioners have heard of.
But many who cite it refer to one part of it only: that many users don’t have an accurate understanding of risks, and that security education needs to address that. This has been cited as evidence for a ‘fix the user’ narrative, while ignoring that the main point of the paper is that many security policies are impossible to follow in the context of productive work environments.
In your work bridging business and academia, do you notice any major differences in how the security awareness challenge is perceived?
I think the idea that all we need to do is ‘fix’ employees’ behaviour by raising awareness, while ignoring the fact that many security policies and mechanisms aren’t fit for purpose, is equally embedded in both communities. Most academic research has been quite theoretical, talking about applying theories such as Fear Appeals, Protection Motivation, or the ‘Nudge’ approach. But there have been too few proper trials applying these to a concrete set of security behaviours and measuring the impact.
If anything, I have found that experienced practitioners are more aware of why even well-intended and well-designed campaigns fail. They collect feedback and thus know what the ‘blockers’ to behaviour change are. But when they suggest re-designing security to make it possible or easier for staff to be secure, they often meet pushback from technical security specialists who are reluctant to change from ‘Best Practice’, and from business leaders who only care about ‘being compliant’ or being ‘no worse than our competitors’.
What advice do you have for security professionals who are struggling with managing human behaviour?
I have a slide in my talks with a picture of Captain Jean-Luc Picard on it that says ‘Engage!’ – with the staff whose behaviour you are trying to change, but also with those responsible for security policies and mechanisms, and with the business leadership. Everyone in the organisation has specific responsibilities for security – we use the metaphor of ‘Security as a Team Sport’ in a recent paper. The bad guys are out there, and they are collaborating to cut the cost of gathering intelligence and delivering attacks. The good guys all need to work together to run effective and efficient defences.
What future research ideas do you have?
We have squandered much user attention and credibility on security mechanisms that are difficult or impossible to use and are not effective. SSL warnings that have a false positive rate of 15,000:1 have trained people to ignore warnings. Outdated ‘have a strong password’ exhortations come from system owners who don’t take steps to protect password files … we need to name and shame ineffective security, and promote only behaviours that have a proven effect. I am working on a ‘test’ for security policies and mechanisms that measures the workload of security behaviours (including the ‘friction’ that comes from a bad ‘fit’ with the business activity), and the risk mitigation the mechanism can achieve.
There are some really interesting themes that have come out of Angela’s work. A key takeaway to think about is the degree to which your advice or requirements are actually achievable. If you work in behavioural compliance and haven’t already read ‘Users are Not the Enemy’, then you should do so now.