Personal Mobile Devices Violate Compliance

Computer Weekly recently conducted a survey via Twitter on how many organisations allow their users access to corporate email from their own private phone. Unfortunately, I haven't seen any results from this survey as yet, but it made me think about organisations that do allow private devices to attach to the network, not just mobile phones. I have also had many comments on my blog post entitled 'Mobile Device Data Breaches', which have fed into this post.

In one of those comments, someone pointed out that in their experience users are often a weak link. Isn’t it always the case that users are the weakest link? A poorly educated/trained user can compromise the best security. Unfortunately, I have seen so many organisations that do not adequately train their users or make them aware that there are policies, let alone what they mean to their daily usage of the corporate systems. I have also come across one organisation where a top executive had all the system passwords stored, unencrypted, on his PDA. He didn’t see a problem with this as he always carried it with him!

How many organisations these days push email to mobile devices? How many of those organisations send sensitive documents around via email? Do they enforce encryption and password access on those devices? Not many that I’ve seen. The typical BlackBerry users I see have no password or PIN set on their phone, yet it has full access to the corporate mail server. These devices can also store, and even sync, corporate documents. What policies do you have to cover them?

Quoting from ISO/IEC 27002:2005, section 11.7.1: A formal policy should be implemented, and appropriate security measures adopted, for mobile computing and communications activities. Controls should apply to laptop, notebook, and palmtop computers; mobile phones and "smart" phone-PDAs; and portable storage devices and media. Controls include requirements for:

  • physical protection;
  • data storage minimization;
  • access controls;
  • cryptographic techniques;
  • data backups;
  • anti-virus and other protective software;
  • operating system and other software updating;
  • secure communication (e.g., VPN) for remote access; and
  • sanitization prior to transfer or disposal.
The problem is that most organisations do not have adequate policies covering mobile devices. Moving away from mobile phones, are you allowed to plug a USB device into your corporate machine? Many of these devices can store sensitive data and even access the Internet themselves. What about an insecure iPhone connecting to the Internet and leaking data? Most organisations aren't even aware that you can lock down USB usage via tools, but policies should definitely be in place. Alan Goode, from Goode Intelligence, said the following:
"I feel that you can lock down with security policy and tools but this is a complex problem as the combination of mobility and technology diversity, e.g. I can use my iPhone to connect to the enterprise network and store sensitive data on it, is creating a major headache for infosec professionals. As well as the problem with laptops and USB drives we are also seeing a growing use of employee-owned mobile devices, netbooks, games consoles, smart phones, all having IP and WiFi capabilities and all capable of picking up enterprise data and email."
There are a number of things we can do to stop these devices from compromising the network. We can block USB devices from connecting unless they are a managed resource, so users can't just plug in anything they bring from home. Every USB device reports a vendor and product ID (and usually a serial number), which can be registered with a central server that is checked before a computer allows the device to be used. This needs third-party software, but it can be done quite easily. We can also block unknown devices from obtaining an IP address or connecting to the corporate network in the first place. We shouldn't have a free-for-all attitude on the network: it should be locked down to approved devices only, and every managed device that connects should have to authenticate.
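To illustrate the allow-list idea, here is a minimal Python sketch. The device IDs and the `is_approved` helper are hypothetical examples of mine, not any vendor's API; a real deployment would enforce this with endpoint-management tooling (udev rules on Linux, Group Policy on Windows, or a commercial device-control product) rather than a script:

```python
# Sketch: allow-listing USB devices by their (vendor ID, product ID) pair.
# The IDs below are hypothetical placeholders for registered, managed devices.

APPROVED_DEVICES = {
    ("0x0781", "0x5567"),  # e.g. a corporately issued USB stick
    ("0x046d", "0xc52b"),  # e.g. an approved wireless receiver
}

def is_approved(vendor_id: str, product_id: str) -> bool:
    """Return True only if this vendor:product pair is registered centrally."""
    return (vendor_id.lower(), product_id.lower()) in APPROVED_DEVICES

# A device not on the list would simply be refused by the endpoint agent:
print(is_approved("0x0781", "0x5567"))  # registered device -> True
print(is_approved("0x1234", "0xabcd"))  # unknown device    -> False
```

The same check-before-allow pattern applies at the network layer: 802.1X authentication or a NAC product plays the role of the central lookup, refusing an IP address to anything that isn't a managed, authenticated device.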

I think it’s asking for trouble to allow users to connect their own private devices to the network or services. I don’t see how you can comply with any standards or your own security policies when allowing this, as you don’t know what’s connected or how it’s configured. Even if they are secure (a very big IF), by not knowing the configuration or being able to audit it, you are surely in violation of any accreditation or certification that you may have because you cannot test or 'prove' your compliance.
