With greater technological power comes greater responsibility. Firms adopting the latest systems need to do more than ensure regulatory compliance to retain the trust of an increasingly sceptical public.
With applications ranging from biometric ID checks to hyper-personalised marketing, cutting-edge technologies such as big-data analytics and machine learning are powering business transformation. Yet, as key questions concerning the ethics of using artificial intelligence remain unanswered, this can be dangerous territory for businesses.
For boardrooms lacking detailed technical knowledge, hurtling into the next big tech project can be a tempting way to boost operational efficiency and keep up with the competition. While recent advances in IT have offered employers many new powers – monitoring homeworkers’ productivity by logging the number of keystrokes they make, for instance – actually wielding them will not always be the smartest move.
“Just because the algorithm says ‘yes’, it doesn’t mean that the board must slavishly follow its lead,” says Dr Ian Peters, director of the Institute of Business Ethics. “It’s worth remembering that, if a digital adoption goes wrong and your staff or customers suffer as a result, blaming the technology can never be an option.”
A company should first ensure that it has enough in-house technological expertise at the senior level – a chief information security officer, for instance – to future-proof itself, he says. The next step, if the firm is considering any tech innovation that would have a direct impact on employees, customers or any other group of stakeholders, would be to engage them in meaningful consultations.
“Each business has its own unique culture, purpose and set of values. Any transformation project that flies in the face of those three elements is doomed to failure,” Peters argues.
Organisations that break the law on data protection, even inadvertently, face hefty penalties. Experian, for instance, could be fined £20m if the Information Commissioner’s Office rules that the credit reference agency sold users’ personal information without their consent. But is legal compliance alone enough at a time when transparency and trust are as fundamental to firms as the General Data Protection Regulation’s ‘privacy by design’ ethos?
Rightmove’s CFO, Alison Dolan, would argue that it isn’t enough. She believes that in-house compliance functions need to extend their remit, given the high level of public frustration with data security breaches and intrusive online marketing practices.
“While the GDPR took a lot of critical decisions about data privacy out of the hands of businesses, they can’t afford to let their guard down, considering the number of security problems we all see,” she says. “My advice to all content-based organisations is to hire the best compliance specialists you can find, look at their role in the context of the entire ecosystem of the business and promote good dialogue between them and your tech people.”
Although the demand for compliance experts notably outstrips supply, the growing determination of consumers to hold businesses to account has made their input vital, Dolan argues. These professionals will not only understand what the tech team is doing and be familiar with the legal ramifications. They will also be able to “inform the board of approaching ethical problems and set out the implications of any data misuse”.
Guarding the ethics
Instead of relying on in-house experts, biometrics software provider Yoti asks external ethics ‘guardians’ to hold it to account for the commercial decisions it makes. Ethics is hugely important to the company, which provides age-verification systems for clients ranging from governments to online casinos.
The firm’s guardian council is an independent panel of people with expertise in fields such as human rights and data privacy. Last year, it vetoed the management’s plans to extend Yoti’s services to e-voting, citing concerns about “the politicised nature of these processes” and the high level of reputational risk attached to them.
“For an organisation to build a set of core principles from the get-go is quite unusual in our sector,” says Julie Dawson, director in charge of regulatory and policy matters at Yoti. “It makes the company a very refreshing place to work. Having an ethical approach to something as potentially sensitive as biometrics is a clear advantage to us in attracting the right sorts of clients. It enables us to employ people who share our values too.”
Yoti also has an internal trust and ethics group, which invites representatives from all corners of the business – from receptionists and security guards to HR officers and marketing executives – to play their part in shaping the firm’s policies.
“It’s a bit like jury service: everyone gets a turn,” Dawson explains. “Group members are encouraged to use their antennae and report anything – anonymously, if they like – that doesn’t look right.”
While many organisations look to their senior experts in technology and compliance to balance their appetite for innovation against their wider ethical responsibilities, both Peters and Dolan believe that the buck must always stop with the chief executive.
“Facing the prospect of losing money to be the sort of business you ultimately want to be is probably the biggest test of your ethical values. A firm’s readiness to do that is an attitude that needs to come from the top,” Dolan says. “Money-spinning opportunities that appear to harm nobody but may not be fully GDPR compliant can and do arise. It puts me in mind of the old definition of integrity being what you do when no one’s watching.”