Insights in Risk Assessments

Cory Hardman · August 9, 2009

Bruce Schneier has a great insight on risk. In a recent post to his blog he wrote:
People have a natural intuition about risk, and in many ways it's very good. It fails at times due to a variety of cognitive biases, but for normal risks that people regularly encounter, it works surprisingly well: often better than we give it credit for.

This struck me as I listened to yet another conference presenter complaining about security awareness training. He was talking about the difficulty of getting employees at his company to actually follow his security policies: encrypting data on memory sticks, not sharing passwords, not logging in from untrusted wireless networks. "We have to make people understand the risks," he said.

It seems to me that his co-workers understand the risks better than he does. They know what the real risks are at work, and that they all revolve around not getting the job done. Those risks are real and tangible, and employees feel them all the time. The risks of not following security procedures are much less real. Maybe the employee will get caught, but probably not. And even if he does get caught, the penalties aren't serious.

Bruce hits risk assessments right on the head in this post. The risk inherent in what is considered risky Internet behaviour is not as bad as many make it out to be. People routinely surf the web on untrusted wireless networks and store data on unencrypted memory sticks, and nothing bad ever seems to happen because of it, at least at the personal level. We may read a news article once a week describing how some sensitive data was lost on a stolen laptop, or some other similar story. The thing to take away from these stories is that they are news: if it happened to everyone, it wouldn't be newsworthy. Bruce goes on to say:
"Fire someone who breaks security procedure, quickly and publicly," I suggested to the presenter. "That'll increase security awareness faster than any of your posters or lectures or newsletters." If the risks are real, people will get it.

This is exactly correct. People need a real risk before they become aware of the riskiness of their behaviour. However, network administrators also need to take into account the actual risk, likelihood, and cost of each policy. You can't stop business in the name of security.
