
For those not familiar with the case, The T.J. Hooper was a landmark in tort law that established an important standard for negligence. The case was decided in 1932 and concerned liability for lost cargo. A tug towing the cargo on a barge had put to sea in good weather, but later that night a storm blew up and the barge sank. The cargo owner argued that if the tug had been equipped with a radio, its crew could have checked the weather reports and had the opportunity to seek shelter behind a nearby breakwater before the storm hit. The tug owner disagreed and mounted a prevailing practice defence: that tugs at the time were not usually equipped with radios, and this was considered normal practice in the industry.
In a landmark decision handed down by Judge Learned Hand, it was found that prevailing practice did not completely shield the tug owner against a claim of negligence. In one of the most beautiful legal phrases ever uttered, the rationale was summed up as: “There are precautions so imperative that even their universal disregard will not excuse their omission.” Common prudence, therefore, was not always the same as reasonable prudence. In this case, the value of the cargo, the likelihood of a storm and the relatively low cost of a radio meant that it was negligent to put to sea without one.
So what does this mean for information security awareness? It means that many organisations may be sitting on a liability time bomb. In the event of a security incident, will their security awareness programmes be considered adequate to shield them from third-party claims of negligence? There is a universal practice argument to be made for mediocrity: most organisations barely go through the motions, with computer-based training (CBT) and a few security slides as part of the induction process. Some organisations don’t even have a security policy. Sooner or later this sad state of affairs will be put to the test.
If I had a contract with a third party which suffered a security breach related to human failings, I’d be asking whether their security awareness programme was adequate given the risks. Not whether it met the industry minimum, but whether the effort the other party invested in security awareness was commensurate with the likelihood of a security incident, the value at risk and the benefits of security awareness done properly.
Why hasn’t the adequacy of security awareness programmes been repeatedly challenged in court? Part of this is the unknown unknown argument. In the Hooper case, it was easy to see that damage had been done: the barge was below the water instead of on top of it. Even a PCI auditor could pick up on that one if given half an hour and a detailed checklist. Failures in the information security field are different. Some organisations don’t know they’ve had a breach. Or they do know, but don’t want to make it public by engaging in litigation. Two important aspects of this are changing. Firstly, mandatory breach reporting requirements mean that news of a breach is almost certain to become public, since organisations can’t place confidentiality restrictions on their notification process. Once the breach is public, there’s no incentive to refrain from litigation.
Secondly, we’ve seen a rise in organisational ‘doxing’, where leaks are intentionally made public. Think of Sony, Hacking Team, Manning and the diplomatic cables… the list goes on.
It may well be that the next step change in professionalising security awareness campaigns isn’t new standards, certifications or qualifications, but the lawyers getting involved. Consider how your security awareness programme would fare if put under the spotlight. If it’s a token gesture that merely goes through the motions of the industry minimum, you could be in trouble.