How lenders are using tech to show a more human side

Under regulatory pressure, financial services providers in the UK are adopting new tech to identify and support vulnerable customers. These automated systems are enabling lenders to show a more human side

If you received a text from your credit card provider recently, you may have noticed that the tone of it seemed a little friendlier than that of previous messages.

The way that financial institutions are communicating with customers has changed, driven in part by the Financial Conduct Authority (FCA), which has urged them to do more to support the many UK consumers whom it deems vulnerable. Research by the FCA has classified 27.7 million adults as having some “characteristics of vulnerability, including poor health, experience of negative life events, low financial resilience and low capability”.

The issue is about more than making text messages to customers sound more cordial, of course. Companies are using sophisticated tech to gather and analyse data about customers, so that their staff can offer tailored advice and support. Debt-collection company Lantern uses tools supplied by MaxContact, for instance. These can draw information about any given customer’s history from numerous systems and present it to the firm’s call-centre agents while they’re talking to that individual. 
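
In practice, that aggregation step amounts to joining records from several back-office systems on a shared customer identifier and presenting the result to the agent. The Python sketch below shows the general idea; the field names and the two source "systems" are illustrative assumptions, not MaxContact's actual data model or API.

```python
# Illustrative sketch of the 'screen pop' idea: pull a customer's history from
# several back-office systems into one view for the agent. The record fields
# here are assumptions, not MaxContact's actual data model.

def build_agent_view(customer_id, account_system, contact_system):
    """Merge records keyed by customer_id into a single summary for the agent."""
    accounts = [a for a in account_system if a["customer_id"] == customer_id]
    contacts = [c for c in contact_system if c["customer_id"] == customer_id]
    return {
        "customer_id": customer_id,
        "total_outstanding": sum(a["balance"] for a in accounts),
        "previous_defaults": sum(a["defaults"] for a in accounts),
        "recent_contacts": sorted(contacts, key=lambda c: c["date"], reverse=True)[:5],
    }

view = build_agent_view(
    "CUST-042",
    account_system=[{"customer_id": "CUST-042", "balance": 830.0, "defaults": 2}],
    contact_system=[{"customer_id": "CUST-042", "date": "2023-05-14", "channel": "sms"}],
)
print(view["previous_defaults"])  # 2
```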

“All this data gives us a picture of the customer. It answers questions such as: have they defaulted before? How often? Do we have details of their income and expenditure? From there, we can offer a better service,” says Lantern’s CEO, Denise Crossley. “For instance, there would be no point in getting someone to agree to something they couldn’t afford, because they’d simply end up in default. This is about establishing the sort of relationship in which customers will call if something goes wrong because they know we’re here to help them and pause payments if we need to.” 

In some cases, firms can integrate their own information on a customer with third-party data for further insights into an individual’s potential vulnerability. REaD Group, for instance, offers a system, widely used in the utilities industry, that builds its vulnerability measures from postcode-level averages covering factors such as age, income, socioeconomic status, employment and rates of long-term illness. 
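
The article doesn't detail how such postcode-level factors are combined, but a rough illustration of the idea might look like the sketch below. The weights, thresholds and sample figures are invented for the example and are not REaD Group's methodology.

```python
# Illustrative only: the postcode data, weights and caps below are assumptions,
# not REaD Group's actual model.
POSTCODE_AVERAGES = {
    "M1 1AA": {"median_age": 61, "median_income": 19500,
               "long_term_illness_rate": 0.24, "unemployment_rate": 0.09},
}

def postcode_vulnerability_index(postcode, averages=POSTCODE_AVERAGES):
    """Combine postcode-level averages into a rough 0-1 vulnerability index."""
    stats = averages.get(postcode)
    if stats is None:
        return None  # no data held for this postcode
    score = 0.0
    score += 0.3 * min(stats["median_age"] / 80, 1.0)             # older populations
    score += 0.3 * (1 - min(stats["median_income"] / 40000, 1.0)) # lower incomes
    score += 0.2 * stats["long_term_illness_rate"]                # health factors
    score += 0.2 * stats["unemployment_rate"]                     # employment
    return round(score, 2)

print(postcode_vulnerability_index("M1 1AA"))
```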

Going one step further, some lenders are using a form of AI called natural language processing to predict a customer’s vulnerability based on their calls, emails and webchats. This software tracks words and phrases along with audio cues such as the pitch, tone and pace of a conversation. It can then crunch thousands of data points to alert users at the first sign that a customer is becoming distressed or confused. 
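
In production these tools are trained statistical models running over large volumes of labelled calls, but the underlying idea of blending language cues with audio cues can be sketched simply. The phrases, thresholds and weights below are assumptions made for illustration, not any vendor's model.

```python
import re

# Hypothetical trigger phrases and prosody thresholds; real systems learn
# these from labelled calls rather than hard-coding them.
DISTRESS_PHRASES = [r"can't cope", r"i don't understand", r"losing my job",
                    r"struggling to pay", r"what does that mean"]

def distress_score(transcript, pitch_hz, words_per_minute):
    """Blend language cues and audio cues into a rough 0-1 distress score."""
    text = transcript.lower()
    hits = sum(bool(re.search(p, text)) for p in DISTRESS_PHRASES)
    language_signal = min(hits / 3, 1.0)                   # cap the phrase contribution
    pitch_signal = 1.0 if pitch_hz > 250 else 0.0          # raised pitch (assumed threshold)
    pace_signal = 1.0 if words_per_minute > 180 else 0.0   # rushed speech (assumed threshold)
    return round(0.6 * language_signal + 0.2 * pitch_signal + 0.2 * pace_signal, 2)

if distress_score("I'm struggling to pay and I don't understand the letter",
                  pitch_hz=265, words_per_minute=190) > 0.5:
    print("Alert agent: possible vulnerability indicators")
```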

At Lantern, the MaxContact system scans conversations for particular ‘language triggers’ from customers that can prompt an agent to react in a specific way or share certain information. Crossley cites an extreme example: “If someone were to mention suicide, that call might be flagged to a senior manager, who could offer support.” 
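
A minimal version of such trigger handling is little more than a lookup from phrases to recommended actions, as in the sketch below. The trigger list and on-screen prompts are invented for illustration and are not MaxContact's configuration.

```python
# Illustrative rule table, not MaxContact's configuration. Each trigger maps
# to the prompt an agent (or supervisor) would see on screen.
TRIGGER_ACTIONS = {
    "suicide": "Escalate immediately to a senior manager and offer support lines",
    "bereavement": "Offer a payment pause and signpost bereavement support",
    "redundancy": "Offer an affordability review before agreeing a plan",
}

def check_triggers(utterance):
    """Return the prompts an agent should see for this part of the call."""
    text = utterance.lower()
    return [action for trigger, action in TRIGGER_ACTIONS.items() if trigger in text]

print(check_triggers("I've just been told about my redundancy and I can't face this"))
# -> ['Offer an affordability review before agreeing a plan']
```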

This approach helps to reassure and mollify customers, but it also makes business sense for lenders, she adds. “In our market, what happens in the relationship between the company and the customer ultimately affects the pricing of our debt purchases. If we provide a better customer experience, we see lower default rates and higher recovery rates. This in turn enables us to price more competitively in the future.” 

The Key Group is a provider of equity-release mortgages to people aged over 50. Anyone signing up for such a loan is required to meet one of the company’s approved independent advisers to check that the product is suitable for them. The company recently invested in a speech analysis tool from Aveni, which scans audio recordings of these meetings to identify any potential vulnerability issues. 

“These meetings can last up to 90 minutes, so it would be a huge task for someone to listen to all of those recordings,” notes Key’s CEO, Will Hale. 

The Aveni system considers factors such as the language, pace and clarity of conversations. It then generates a report that will identify anything of potential concern. While age isn’t considered a vulnerability factor, the software does look for signs that a customer may not understand what’s being explained. If that happens, the adviser can arrange to speak to the customer again with a relative present, or refer them to one of Key’s so-called vulnerability champions – agents with advanced-level training in how to deal with customers who may need extra support. 
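
Conceptually, the post-meeting report is a pass over the transcript that flags segments where the customer may not have followed the explanation, then recommends a next step. The cues, threshold and output format below are assumptions for illustration, not Aveni's.

```python
# A toy post-meeting report, not Aveni's output format. It flags transcript
# segments suggesting possible confusion so an adviser can follow up.
CONFUSION_CUES = ["i don't understand", "can you say that again", "what does that mean"]

def review_meeting(segments):
    """segments: list of {'minute': int, 'speaker': str, 'text': str} transcript chunks."""
    flags = []
    for seg in segments:
        text = seg["text"].lower()
        if seg["speaker"] == "customer" and any(cue in text for cue in CONFUSION_CUES):
            flags.append({"minute": seg["minute"], "reason": "possible comprehension issue"})
    return {
        "flag_count": len(flags),
        "flags": flags,
        "recommendation": ("refer to vulnerability champion" if len(flags) >= 2
                           else "no further action"),
    }
```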

Over the next six months, Key will be rolling out a new customer relationship management (CRM) platform in its contact centre. Then it will use Aveni on calls to provide instant support to its front-line staff. 

“Ultimately, we want to get to a point where we have a full account history integrated with our CRM and telephony systems, so that we have full AI support of all communications,” Hale says. “It’s very much a tool that supports agents and it helps to ensure that we aren’t approaching conversations with the wrong preconceptions.” 

Although the systematic detection of vulnerable customers can be helpful to those individuals if it means that they receive more support, people are entitled to privacy, stresses Simon Thompson, head of data science at IT consultancy GFT. 

Financial institutions that use this type of analysis in contact centres need to establish proper procedures specifying how the results will be used, he argues. “You can’t just stir this data into the mix and hope for reliably positive results. You need to monitor and review both the interests of the customers and how well this technology serves them.” 

It’s vital that people are aware that their conversations are being analysed and the extent to which the analysis affects the firm’s decision-making, Thompson adds. “Ultimately, technology can tell you only so much. Does a customer sound distressed on the phone because of the way their query is being treated, or because a toddler is jumping on their feet? We all need to remember that there’s no such thing as an ethical machine. The ethics always rest with the people and the organisation. It’s a responsibility that cannot be outsourced.”