Liu Tienan didn’t start out taking millions of dollars in bribes. His first bribe, in 2002, was for three thousand dollars. Over time the bribes escalated. When he was caught in 2013, it’s estimated that he had accepted nearly six million dollars for abusing his position as Deputy Head of China’s National Development and Reform Commission (NDRC). The interesting aspect of this isn’t that officials can be bought. Nor is it the sad story of how a talented economist lost his way. It’s the story of how the path to major crimes is paved with smaller misdemeanours.
Liu Tienan’s criminal story is unlikely to have started with three thousand dollars. Perhaps it began with some dinners or corporate gifts. Maybe a paid holiday. At some point, why bother with the dinner or the holiday? Why not just take the money instead?
People are much more likely to commit a crime or breach of ethics if they can rationalise the action as small or insignificant. In contrast, people can be surprisingly honest when confronted with a major opportunity to profit from being unethical. If in 2002 Liu Tienan had been offered six million dollars in bribes, he might well have refused. Consider how many people go to great lengths to return money if the amounts are large enough. In contrast, who would go to great lengths to find the owner of twenty dollars found on the street?
Small steps are far more palatable for the ethically dubious. The infamous Milgram experiments involved an authority figure directing participants to deliver what they thought were electric shocks to another participant who was out of sight. The voltages they were asked to administer started out low but increased in intensity to a setting marked lethal. The experiments didn’t just show that people would do awful things if directed by an authority figure. They also showed that people would compromise their ethics if asked to do so in a series of small steps. Tavris and Aronson call this the ‘Pyramid of Choice’, where small steps can lead people to adopt extreme positions they would never have adopted outright.
Everybody with responsibility for behavioural compliance should be concerned about the slippery slope effect. People rarely plan to end up immoral. People carry an internal narrative of their own moral beliefs in which their behaviour is rationalised as reasonable. Even criminals usually consider themselves to be ‘good’, righteous people. If a course of action causes intolerable conflict with this sense of self, people are likely to act according to their self-belief of righteousness. Whether they are right or wrong in their beliefs is irrelevant here. The point is that it’s a belief they hold, and its effect is to give them pause when confronted with a course of action that conflicts with it. If we’re worried about how a series of choices might lead people into major wrongdoing, there are three aspects we need to focus on.
Firstly, it’s not just people’s choices that we need to worry about. It’s also the choices we take away from people by setting impossible compliance goals. Many organisations survive by ignoring inconvenient parts of their security policy, which normalises non-compliance. Would full compliance with your security policies actually mean closing the trading floor or putting patients at risk? Setting an impossible compliance goal starts people on the slippery slope. It also makes holding someone to account when something goes wrong far more difficult, because they have the ‘but everyone did it’ defence.
Secondly, rule-breaking without consequences sends a message to the organisation that patronage matters more than policy. Security policies need to be enforced without fear or favour. If you can’t imagine ever enforcing the security policy on your Human Resources department, you’re in trouble.
Thirdly, consequences for rule-breaking need to be publicised. If people don’t hear about consequences, they will likely conclude that there aren’t any, which greatly weakens your governance. This is why the Romans held their crucifixions in public. Many organisations are uncomfortable talking about security incidents, but they need to talk about the last one if they are to have a chance of preventing the next one.
People rarely go from loyal employees to malicious insiders overnight. As security professionals, we have opportunities to influence which way people go on the pyramid of choice.