Home to Geordie Stewart's blog on information security awareness, risk communication and security ethics.

Risk Intelligence
Information Security Awareness

Bounded Rationality

Are humans rational? When we see computer users do silly things that place themselves or their information at risk, it’s easy to take the view that people are illogical. The problem is that logic can’t be examined separately from perception.

There is significant debate within the psychology literature as to the extent to which humans can be described as rational. Rationality is sometimes described as the ability of individuals to select the “best” option when confronted with a set of choices. The best option is also referred to as the “value maximising” option: the one that obtains the most benefit for the least expenditure of resources or exposure to risk.

The problem is that people routinely fail to select a “value maximising” option and exhibit apparently illogical behaviour. Commonly, the option mathematically modelled as the best choice by technical experts isn’t the one chosen by information system users when responding to risk.
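To make the idea concrete, here is a minimal sketch (my own illustration, not from the post) of what a “value maximising” choice looks like if each option is scored as expected benefit minus expected cost; the options and figures are invented purely for illustration:

    # Hypothetical illustration of a "value maximising" choice: score each option
    # as expected benefit minus expected cost and pick the highest-scoring one.
    # The options and numbers are invented for illustration only.
    options = {
        "use a password manager": {"benefit": 100, "cost": 10},
        "reuse one memorable password": {"benefit": 60, "cost": 5},
        "write passwords on a sticky note": {"benefit": 40, "cost": 2},
    }

    def net_value(name: str) -> int:
        option = options[name]
        return option["benefit"] - option["cost"]

    best = max(options, key=net_value)
    print(best, net_value(best))  # the "value maximising" option under this model

On this simple model the rational choice is always the one with the highest net value; the point of the post is that real users routinely don’t choose it, because their perception of benefit, cost and risk differs from the expert’s model.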

9th April 2011 · Blog, Risk Psychology, Security Economics · By rskadmin

Criminals and Moral Codes

Should we try to reason with criminals? Is the threat of punishment the only influence that criminals will respond to? What should we do when we suspect people are taking data with them when they leave a company, leaking to the competition or stealing equipment from the office but can’t prove it? As well as…

1st July 2015 · Blog, Mental Models, Risk Psychology, Security Awareness · By Geordie

Definition of Security Awareness

I’ve studied it for years, I’ve delivered it and I’ve even sat through it but I’m still not really sure what “it” is.

We talk about raising “security awareness” but what does that actually mean? The dictionary definitions I’ve seen commonly refer to awareness as a state of knowledge about risk. Thousands of articles and books have been written on increasing security awareness but very little time has been spent trying to define it.

The ISF Standard of Good Practice defines security awareness as “the extent to which staff understand the importance of information security, the level of security required by the organisation and their individual security responsibilities.” This seems like a reasonable definition but note that there is no behavioural component. People can (and do!) continue with unsafe behaviour despite their knowledge of the risks. Empirical evidence from outside of information security tells us that just knowing about a risk isn’t enough. Consider smokers and people who drive without using a seat belt. They’re surely all “aware” of the risks but somehow their behaviour continues.

5th March 2012 · Blog, Risk Psychology · By rskadmin

If You See Something, NSA Something

A common objective of information security awareness is to encourage whistleblowers to use internal mechanisms to report their concerns. External whistleblowing and the airing of concerns in public view risk brand damage and the exposure of sensitive information. The Snowden affair has shown how divided we are on the ethics of external whistleblowing. To date, much of the debate has been speculation about Snowden’s character flaws. Sometimes, when trying to understand a controversial decision such as Snowden’s, it helps to trace the chain of events leading up to it, since failures in complex systems can rarely be done justice in a single newsbyte. In this case, a series of failures occurred before the employee of a subcontractor decided to flee the country and leak sensitive information to foreign journalists:

16th November 2013 · Blog, Privacy, Risk Psychology, Security Awareness, Surveillance, Trust · By Geordie

Information Security Culture

As I escorted him to his desk I became conscious that everyone was looking at me. I did all the usual self-checks of fly, food on face and freaky hair but came up negative on all counts. When someone had tailgated me through a secure door I had challenged them. Rather than leave them outside when they didn’t have their pass with them, I offered to walk them to their desk. I found his manager, who told me with an expression more serious than a budget facelift: ‘Yes, of course he works here – he’s hardly here for the view’. What I had encountered amongst the engineers at this small satellite office was a very different security culture from the one I was used to with my head office, ivory tower view of the world. The culture I had encountered worked on high levels of trust. They all trusted Dave, so they couldn’t understand why I didn’t (even though I’d never met him). I was less than a block from the head office of this organisation and yet the security culture was completely different. For me, the experience was an eye-opener: effort is needed to understand not just whether people are following security policy but the extent to which policy is reflected in security culture.

5th July 2013 · 1 Comment · Blog, Organisational Culture, Risk Psychology, Security Awareness · By Geordie

ISSA Security Awareness Column December 2012 – Security Awareness Training Feedback Surveys

Whoever said that there’s no such thing as a stupid question, only a stupid answer, has probably never seen a feedback survey for a security awareness training session. Questions such as “Did you learn anything?” and “Do you feel more secure?” are as common as they are idiotic. I guess it’s largely shaped by the motives of whoever is asking the question. The trainers involved are primarily interested in demonstrating that they are good trainers, so the questions are designed to elicit complimentary feedback. Feedback surveys are a great chance to obtain valuable feedback, but only if we’re asking the right questions.

In this column we’re going to look at training feedback surveys in more detail. Getting useful feedback from training sessions is challenging, but not impossible. For a start, you need to be aware of people’s biases. Surveys measure ‘declared preferences’, since they rely on people expressing their views. While easier to gather, declared preferences have inherent biases that need to be acknowledged and allowed for when interpreting the results. ‘Revealed preferences’ are what people actually do, but measuring behaviour accurately and efficiently can be difficult, especially if people know they’re being observed. Here are some suggestions for allowing for people’s biases while obtaining reliable survey data.

2nd March 2013 · Blog, Risk Psychology, Security Awareness, Security Metrics · By Geordie

ISSA Security Awareness Column Feb 2013 – Innovation in Information Security Awareness

Here’s a trivia question for you – how did President George Washington die? No points for anyone who thought he died in battle, fell from a horse or was poisoned. Actually, he had an infection and suffered massive blood loss. Why he suffered massive blood loss is the fascinating part. For thousands of years people were convinced that blood could become stale and that ‘bad humours’ could cause illness, for which bloodletting was the solution. When Washington became sick, his staff did the natural thing at the time and bled him. When he didn’t improve, his staff bled him some more. Then the doctor was called, and when he arrived Washington was bled again. All told, Washington lost some 6 pints of blood in a 16-hour period. He had a severe infection to be sure, but it’s likely that the massive blood loss significantly contributed to his demise.

Sometimes, how we define a problem limits our ability to solve it. Innovation counts for nothing if the approach itself is the problem. Physicians focused on how to let blood more effectively for thousands of years. Elaborate rituals developed to define where on the body blood could be taken from to fix specific ailments. Contraptions such as scarificators were invented to help people administer their own bloodletting – you didn’t have to visit someone to get them to do it for you (ever wondered what the red on a barber’s pole stood for?).

2nd March 2013 · Blog, Mental Models, Risk Psychology, Security Awareness, Security Metrics · By Geordie

ISSA Security Awareness Column March 2013 – Lowering Security Awareness

We spend a lot of time talking about how to raise security awareness. We fill entire books, columns and conferences with it. However, anything that can go up must also go down. How about we turn the phrase on its head and ask what lowers security awareness? Just as there are behaviours that raise security awareness, there are also behaviours that lower it. But what can we do about it? Name and shame was an important step in getting software vendors to deal with security vulnerabilities in their products. We should be equally critical when human vulnerabilities are created through the promotion of unsafe attitudes and behaviours. In this column I’m going to name and shame some particularly egregious examples which I think reduce security awareness.

11th March 2013 · Blog, Risk Compensation, Risk Psychology, Security Awareness, Security Economics · By Geordie

ISSA Security Awareness Column November 2012 – Why Do People Ignore Risk Advice?

How frustrating is it when you point out the risks to people and they just don’t listen? Every day around the world there are millions of people who smoke, drive too fast and click on strange emails, even though they’ve been repeatedly told about the dangers. They are ‘risk aware’ in the technical sense of the word and yet their behaviour continues. This is a big problem, since the mainstream approach to security awareness assumes that all that’s needed to achieve behavioural change is an understanding of the risks. Traditionally, when encountering non-compliant behaviour, we security technocrats reiterate the facts and increase the threat of sanctions. But there is another way.

Luckily for us, this is a problem that safety risk communicators have been grappling with for decades. The safety risk communication field has a number of explanatory frameworks to predict how people will react to risk communications. One of the most interesting models to arise is the Extended Parallel Process Model (EPPM), which seeks to explain why people fail to take action once aware of a threat. This is a goldmine for security professionals looking to apply a more structured, formal approach to promoting behavioural change.
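As a rough illustration of the decision logic EPPM describes, here is a simplified sketch (my own rendering, not taken from the column); it assumes 0–1 ratings for perceived threat and perceived efficacy and a single threshold, which are assumptions for illustration only:

    # Simplified rendering of EPPM's core prediction:
    #   low perceived threat        -> the message is ignored
    #   high threat + high efficacy -> "danger control" (adopt the safe behaviour)
    #   high threat + low efficacy  -> "fear control" (deny or avoid the message)
    # The 0-1 ratings and the 0.5 threshold are illustrative assumptions.
    def eppm_response(perceived_threat: float, perceived_efficacy: float,
                      threshold: float = 0.5) -> str:
        if perceived_threat < threshold:
            return "no response: the threat isn't taken seriously"
        if perceived_efficacy >= threshold:
            return "danger control: adopt the recommended behaviour"
        return "fear control: deny, avoid or ignore the message"

    print(eppm_response(0.8, 0.9))  # aware of the risk and feels able to act
    print(eppm_response(0.8, 0.2))  # aware of the risk but feels unable to act

The second case is the one security awareness keeps running into: people who are fully aware of a risk but, lacking a sense of efficacy, end up managing the fear rather than the danger.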

14th November 2012 · Blog, Mental Models, Risk Psychology, Security Awareness · By rskadmin

ISSA Security Awareness Column October 2012 – Learning From Safety Risk Communications

Any endeavour is made doubly difficult when pursued with a lack of metrics and without a clear understanding of cause and effect. When stumbling in the dark, facts are the flashlight of comprehension, illuminating the way forward when the path is unclear. Information security is often required to function in the dark with little in the way of facts to guide us. We hear noises and bump into things but can never be certain if we’re going in the right direction.

When security fails, how do we know? While failures of integrity and availability are obvious, failures of confidentiality can be silent and insidious. Some actors, such as LulzSec, boast about their exploits and derive their benefits from the resulting publicity. Other actors quietly go about their ‘business’, and organisations may not realise they’ve been breached. Often, even when we do discover failures of confidentiality, the organisational interest is to bury them. As a result, our profession is rich in rumours but poor in facts, which makes it difficult to understand the effectiveness of security controls.

2nd October 2012 · Blog, Risk Psychology, Safety, Security Metrics · By rskadmin
Recent Posts
  • Getting Permission To Use HaveIBeenPwned From Your Legal Dept
    4th April 2018
  • The Craziest Information Security Stories of 2017
    4th January 2018
  • Rumor Has IT: How Fake News Damages Cyber Security
    7th June 2017
  • The Craziest Information Security Stories Of 2016
    11th February 2017
Categories
  • Blog (61)
  • Conferences (2)
  • Featured (1)
  • Mental Models (9)
  • Organisational Culture (8)
  • Privacy (8)
  • Risk Compensation (2)
  • Risk Psychology (19)
  • Safety (4)
  • Security Awareness (38)
  • Security Economics (11)
  • Security Metrics (8)
  • Surveillance (8)
  • Trust (6)
Copyright © 2015 Risk Intelligence Ltd.