
Human Factors in Information Security - Errors & Violations

Human failures are often described as Slips, Lapses, Mistakes and Violations. These fall into two categories: Errors and Violations. The difference is intent - violations result from conscious decisions to disregard policies and procedures, whereas errors have no malicious intent. Violations also often involve more than one form of misconduct, whereas errors tend to be isolated incidents.

Don Turnblade has stated that in his experience "well trained staff had a 3.75% unintentional non-compliance rate; they did not realize that installed software compromised data security. About 0.4% of end users were intentionally non-compliant, generally willful persons with strong technical skill or organizational authority who were unaccustomed to complying with computing restrictions."

So what are the different types of error? Dealing with each in turn, we have Slips, Lapses and Mistakes.
  • Slips - actions not carried out as intended, e.g. pressing the wrong key by accident. Slips usually occur at the task execution stage.
  • Lapses - missed actions or omissions, e.g. forgetting to log out, or skipping a step in a configuration process.
  • Mistakes - arise from an incorrect intention that the person believes to be correct, i.e. they are deliberate actions with no malicious intent, e.g. misconfiguring a firewall. Mistakes usually occur at the planning stage.
So who causes the error or violation, and how do we combat them? Slips and Lapses are usually the fault of the user, but can be mitigated by making the error harder to commit, e.g. by adding confirmation dialogs for slips and better training for lapses. Mistakes tend to be the fault of designers and are somewhat harder to combat, as designer education is required or outside technical expertise needs to be brought in; even then, the problem isn't solved if the designers still lack the necessary skills and knowledge. Finally, violations can often be laid at the door of managers: it is often the case that a culture of violations is accepted by senior management, who fail to impose proper sanctions or to take the threat seriously.
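The confirmation-dialog mitigation for slips mentioned above can be sketched minimally. One common pattern is retype-to-confirm, which forces attention at the execution stage - where slips occur - rather than offering a reflexive OK/Cancel click. The function name and rule identifier below are illustrative, not from any specific product:

```python
def confirm_destructive_action(target: str, typed: str) -> bool:
    """Return True only if the user retyped the target exactly.

    Requiring the user to retype the name of the thing being
    destroyed interrupts automatic behaviour, making a slip
    (e.g. clicking the wrong button) much less likely to succeed.
    """
    return typed.strip() == target

# Example: confirming deletion of a firewall rule
assert confirm_destructive_action("rule-42", "rule-42") is True
assert confirm_destructive_action("rule-42", "rule-24") is False
```

The same idea appears in many real systems (for instance, services that ask you to type a resource's name before deleting it); the point is that the friction is targeted at the exact moment the slip would happen.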
All of these have to be dealt with to have a secure system, and most of it boils down to having proper user education and training in place.

Comments

  1. It's interesting that you frame the discussion in Human Factors terms. A big issue in designing interactive systems is the 'mental model' that users have - that is, the internal representation a user has of a system. Mental models give some depth to a user's understanding of how the different parts of a system interrelate and, consequently, how it will behave given novel inputs or conditions.
    Mental models can be difficult to establish in a domain as intangible and complex as software, but without them people's understanding (or their ability to predict outcomes) is very brittle - a system appears either to do what it's always done or to be inexplicably different.
    I think the consequence of lacking good mental models in security is that people are unable to make judgements about the risks associated with their actions. Judgements become very binary, with risks being either under- or over-estimated, neither of which is helpful.
    The response of the security functionality in systems often compounds this difficulty, turning decisions into OK/Cancel choices with little effort to inform the user of the potential consequences.
    I think if there were one goal for helping manage 'lapses' and 'violations', it should be to help users make informed decisions (informed in the sense of an awareness of the risks) rather than a paradigm based on just controlling and simplifying. Neither dumbing things down nor automating too much in the background to second-guess user intent appears to be a sustainable strategy. If anything, they just make the impact of users less predictable.


