It’s all in the mind

Get the psychology wrong in your security and users will simply find ways to work around it. But get it right and they will work with you, as Wendy Grossman reports

Balancing the need for security with the needs of users is a huge challenge. Too often the winner is technology, typically controlled by people who are good at technology but less so at communicating with – or understanding the needs of – people and the business.

This leads to common mistakes such as banning new technologies from the workplace, installing surveillance mechanisms to ensure that workers comply with the rules, and thinking of security as a set of technology products rather than as an evolving system incorporating both technology and people.

The temptation to do these things is understandable, yet they are mistakes all the same. They are attempts to turn back the clock to a time when organisations could rely on a controllable perimeter to keep their systems safe, but that option is long gone. The attack on RSA’s SecurID system, for example, was tightly targeted at a few low-profile individuals inside the company, but it opened security holes around the world.

Adrian Davis, a principal analyst for the Information Security Forum (ISF), warns: “Control everything and you end up controlling nothing.” He recommends against blanket bans. Even if you lock down users’ computers and rip out Internet access entirely, workarounds are as easily available as the smartphone in a user’s pocket.

At November’s Westminster Forum on e-crime, cyberthreats, and protecting the national critical infrastructure, Martin Smith, chairman and founder of The Security Company, told the story of an organisation that banned Facebook – only for a follow-up check to find 1.5 million Facebook hits. The ban had created an enormous, uncontrolled hole where what was posted was unknown and not backed up or archived.

The seemingly logical reaction is to install comprehensive surveillance and monitoring systems. Three basic legal rules apply, according to Chris Pounder, co-founder and director of the information law training company Amberhawk. The most important is the principle that, absent legal direction to the contrary, all surveillance monitoring must be overt. As far back as the 1992 Ergonomics Directive, the EU banned the installation of keylogging software without the knowledge of the employees being monitored.

Similarly, an employer who wants to install a hidden camera to watch for employees dealing drugs in the restrooms can only do so if it is required by the police. Otherwise, the company is on the wrong side of the Regulation of Investigatory Powers Act. The second general rule, therefore, is that non-covert surveillance must be transparent and justified.

The price, however, is that employees tend to resent it. Legally, monitoring that records data and is not directed by law is classed as processing personal data and falls under the Data Protection Act. In the UK and the EU generally – though not in the US – aggrieved employees may challenge such monitoring by filing a complaint with the Information Commissioner’s Office.


“If an employer wants to interfere with an employee’s privacy, they have to be able to justify it as transparently needed,” says Mr Pounder. The third general rule is the long-established data protection principle that data should be kept for no longer than necessary. If, for example, Oxford’s purpose in installing audio recording devices in all of the city’s taxis is to keep cabbies from being beaten up, then under this principle the data should be deleted the instant the unharmed cabbie drives away after being paid.