We’re struggling to manage cyber security risk. Partly it’s because attackers are more agile than defenders. But it’s also because we struggle to get the accurate information needed for good risk management decisions. To make those decisions we need to understand who the threat actors are, what their capabilities are, how likely it is that we…
Have security professionals helped make the privacy of citizens around the world ‘collateral damage’ in the hunt for terrorists?
Thanks to Edward Snowden’s disclosures we now know that millions of people have been unwittingly monitored by systems of indiscriminate surveillance. Many of these systems, developed in secret, were only possible with the support of a large number of security professionals. We can suppose that the creators and operators of these systems are attempting to achieve legitimate objectives on behalf of their respective societies. What is less clear is whether these systems do more harm than good, and whether the costs and risks of these activities have been fully understood, let alone accepted, by the societies that bear them. Do the costs and potential harms of indiscriminate surveillance outweigh the benefits? Has privacy been compromised without due cause? If so, is it ethical for security professionals to support such systems?
A common objective of information security awareness is to encourage whistleblowers to use internal mechanisms to report their concerns. External whistleblowing and the airing of concerns in public view risk brand damage and the exposure of sensitive information. The Snowden affair has shown how divided we are on the ethics of external whistleblowing. To date, much of the debate has been speculation about Snowden’s character flaws. Sometimes, when trying to understand a controversial decision such as Snowden’s, it helps to trace the chain of events leading up to it, since failures in complex systems can rarely be done justice in a single sound bite. In this case, a series of failures occurred before the employee of a subcontractor decided to flee the country and leak sensitive information to foreign journalists:
Trust is an incredibly important concept in information security and a vital component of influencing an audience. We know from safety risk communication research that it’s not enough to be an expert in your field. It’s not enough to be correct. You also need to be trusted by your audience. Otherwise your level of influence will be reduced and people may decide to act in ways that challenge your mission objectives.
When I wrote the July column as satire, imagining what a GCHQ letter to a supportive member of the public might look like, I was poking fun at the unrealistic expectations of our intelligence services that were being perpetuated: that as ‘big brother’ they knew better and were always looking out for our best interests. I recognize now that I was also challenging the notion that intelligence services innately deserve a high level of trust.
Dear Michael Burgess of Tunbridge Wells in the UK, we at GCHQ read with interest your recent letter to the Guardian newspaper in which you state that you’re not bothered if the Government knows what websites you’ve been visiting. It is refreshing, sir (and we know you are a sir from the scanners at Heathrow Airport), to find a true patriot who welcomes the state’s determination to know everything about everyone. Corporate security awareness programs have been advising for years that personal privacy is something that can’t be ‘fixed’ once lost, so your willingness to permanently surrender your privacy (and the privacy of anyone you communicate with) is appreciated.
There’s no denying that some people are impervious to our attempts at security awareness and refuse to listen to warnings or instructions. When things go wrong, there is a temptation to label such people ‘bad apples’. I think this saying is overused. Originally, the expression ‘bad apple’ referred to a rotten apple in a barrel that would spoil the good apples. Usage of the phrase has changed, and it’s now often used to explain failures of scale: the perception is that when there are many apples, you have to expect some of them to be bad.
I often hear the phrase used when a governance failure is attributed to human mistakes. Frequently, however, I think ‘bad apple’ is a convenient cover for poor management, where processes and procedures were badly designed or supervised. The bad apple narrative can suit prejudices about humans being the weak link, and any narrative is more comforting than no narrative at all. However, bad apple narratives rarely withstand serious scrutiny.