Back in 2006, the FSA’s Treating Customers Fairly initiative ushered in a new era. From that moment on, ‘buyer beware’ ceased to be a justification for selling what might prove to be an inappropriate product.

Lenders were required to adopt more of a caretaking role to ensure the right fit of product to client. 

Since then, every few years has brought a further escalation of rules, pushing responsibility ever further back into the hands of the providers of financial products and services.

Now we have the FCA’s Consumer Duty, which takes caretaking to a much more demanding level. With it, onboarding has entered a whole new world of complication. Knowing your client is beginning to require granular insight. 

The sort of granularity that we are in for can be seen by considering the emphasis on ‘vulnerability’. The regime demands that providers identify vulnerable customers from the outset and focus on their needs at every touchpoint - from written comms, text and telephone channels to websites and online portals - to ensure Consumer Duty outcomes. This may mean that a host of existing touchpoints fail. How can an elderly customer with few digital skills be said to be supported by a chatbot? Can someone who has lost their job be properly supported by a tone-deaf arrears-chasing protocol?

Given that a recent study put 2.4 million UK households at 'high' risk of vulnerability and 6.3 million at 'elevated' risk, ensuring compliance is clearly not going to be easy.

From this one area we get a flavour of what’s in store. It might not be immediately apparent, but what we are facing is nothing short of a revolution. The trend towards ever more client scrutiny has only just started - and it is not just going to be limited to the applicant. 

In addition to affordability, vulnerability, and the other applicant characteristics, we can expect similar focus on many other areas of the transaction. 

Already there are signs of this occurring. Mortgage lenders are now demanding that applicants’ properties meet set standards of energy efficiency. Nationwide Building Society and NatWest want 50% of their mortgage customers’ homes to be EPC-rated ‘C’ or above by 2030, even though the government has backtracked on making this a legal requirement for rental properties.

Elsewhere, the FCA is working with the Trustees of the International Financial Reporting Standards (IFRS) Foundation, exploring ‘sustainability reporting’ so that the environmental footprint of every transaction can be assessed by decision engines. 

Apparently, the FCA envisages an onboarding process in which loan application software takes into account far more variables than we have been used to.

The Farage de-banking furore revealed just how far the Environmental, Social and Governance (ESG) agenda, which encompasses social as well as environmental goals, is being embraced. How long before providers are required to show that their loan book ticks a lengthening list of ESG-type requirements?

Lenders need to realise that they may currently be at the relatively thin end of what will become a very wide wedge. The transition to automated loan decisioning can no longer be seen simply as the digitisation of existing scorecard processing. It is going to encompass far more than that.

Once you accept that this is where we are headed, you begin to appreciate that access to ever more data streams, along with decisioning software’s ability to analyse them rapidly, is going to be key. Banks and third-party risk management providers are going to have to access much wider data flows quickly and have the capability to extract, digest and analyse at pace.

It is pretty obvious that most lenders’ data processing capabilities are not currently up to this standard. Many have information effectively spoon-fed to them from data sources, including niche providers and credit reference agencies (CRAs).

They have become used to bureaus responding to their requests in the form of ‘compiled characteristics’ that only summarise raw client data. These could simply comprise the number of CCJs a prospect has, the number of defaults, or the extent of arrears over a 24-month period.

This and other forms of rather standardised credit decisioning need to be replaced by sophisticated, company-specific credit risk analysis.

It will no longer be a case of making do with the scores that the bureau provides. They may be predictive, but as a minimum you are going to have to understand why they are predictive for your specific customer base - to be able to say ‘the data is predictive when these questions are asked and these elements are present’.

From such a position lenders can finesse scorecard design over time so that risk assessment software becomes fine-tuned to their target customer demographics. In this way, they can extract the intelligence that they need for their business, rather than take the pieces of data plus the predetermined questions that a bureau has arbitrarily decided they need. So, in our example above, rather than the bureau merely telling the lender that an applicant has missed 'X' payments over the past two years, the lender should be able to take 24 months of raw data from the bureau and create their own characteristics to decision against.
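To make that concrete, here is a minimal sketch of the idea in Python. It assumes, purely for illustration, that the bureau supplies a 24-month payment-status history as a simple list (0 = up to date, N = N months in arrears); the data format and the characteristic names are hypothetical rather than any particular CRA's schema.

```python
# Illustrative only: the payment-status format and characteristic names are
# assumptions for this example, not any particular bureau's schema.

def build_characteristics(payment_history: list[int]) -> dict:
    """Derive lender-defined characteristics from 24 months of raw payment statuses.

    payment_history: most-recent-first monthly statuses, where
    0 = up to date and N = N months in arrears.
    """
    last_24 = payment_history[:24]
    last_6 = last_24[:6]

    return {
        # The kind of pre-compiled characteristic a bureau might already supply...
        "missed_payments_24m": sum(1 for s in last_24 if s > 0),
        # ...plus characteristics the lender defines for itself from the raw data.
        "missed_payments_6m": sum(1 for s in last_6 if s > 0),
        "worst_arrears_status_24m": max(last_24, default=0),
        "currently_in_arrears": bool(last_24 and last_24[0] > 0),
        # Crude trend check: more arrears in the last six months than the six before?
        "arrears_trend_worsening": sum(last_6) > sum(last_24[6:12]),
    }


# Example: two historic missed payments, none in the last year.
history = [0] * 13 + [1, 2] + [0] * 9
print(build_characteristics(history))
```

The calculations themselves are trivial; the point is that the lender, not the bureau, decides which characteristics get built and decisioned against.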

We have to move to a situation where all lenders have the ability to interrogate data across all of their existing and newly developing data streams.

Traditionally, asking CRAs to undertake such extra analysis has come at a significant cost, and if you wanted to undertake higher-level analysis in-house, you would need the sort of dedicated IT resource that only the big banks have.

Thankfully, advances in data technology are making sophisticated in-house analytics achievable for every lender, even those that lack IT and data science know-how.

LendingMetrics’ innovative new ADP Data Orchestrator, for example, allows users to interrogate the data they receive from third parties in a way that is optimal for their business and in line with the latest regulatory or other stipulations. Whenever a package of data arrives, the user can create bespoke characteristics via a straightforward editor interface. The lender’s underwriting rules engine determines the sets of predictive questions, across multiple data inputs, that it wants to decision against, and then lets the Data Orchestrator get on with it. Decisioning becomes instant and 100% compliant with the lender's policies.

It is a case of ‘provide the raw data and the Data Orchestrator can do the rest’.
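To illustrate the general pattern only - this is a hypothetical sketch rather than the Data Orchestrator's actual interface - a rules engine decisioning against lender-defined characteristics drawn from several data streams might look something like this:

```python
# Hypothetical sketch of the pattern only: the rule names, characteristics and
# decision() function are invented for this article, not the product's API.

from typing import Any, Callable

Characteristics = dict[str, Any]

# Lender-defined policy: each rule is a named predicate over the combined characteristics.
RULES: list[tuple[str, Callable[[Characteristics], bool]]] = [
    ("no recent missed payments", lambda c: c["missed_payments_6m"] == 0),
    ("affordability buffer",      lambda c: c["disposable_income"] >= 200),
    ("not flagged as vulnerable", lambda c: not c["vulnerability_flag"]),
]


def decision(sources: list[Characteristics]) -> tuple[str, list[str]]:
    """Merge characteristics from multiple data inputs and apply the lender's rules."""
    combined: Characteristics = {}
    for source in sources:  # e.g. bureau data, open banking, the application form
        combined.update(source)

    failed = [name for name, rule in RULES if not rule(combined)]
    return ("decline" if failed else "accept", failed)


# Example run with characteristics drawn from three separate data streams.
outcome, reasons = decision([
    {"missed_payments_6m": 0},
    {"disposable_income": 350},
    {"vulnerability_flag": False},
])
print(outcome, reasons)  # -> accept []
```

Because the rules sit with the lender, a new compliance measure - an ESG check, say - becomes one more characteristic and one more rule, rather than another request to the bureau.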

As additional layers of compliance are introduced, taking in an ever-widening range of regulatory and other measures such as ESG, such a capability is going to become crucial.

Finance providers need to realise that their ability to thrive in the future will increasingly depend on becoming experts in data handling and analysis, not just lending.