Building Trust in Open Banking with Behavioral Biometrics and Machine Learning

Strategies for fraud prevention in payments are having to evolve quickly, as new technologies emerge and digitalization of the banking ecosystem continues at pace. I spoke with Giselle Lindley, Principal Financial Crime Consultant at ACI Worldwide and Tim Dalgleish, Head of Threat Analytics, Asia Pacific at BioCatch to understand how financial institutions can use payments intelligence to build trust in this challenging environment.

Rachael Tomaney: How is the move to a digital ecosystem impacting fraud prevention in payments?

Giselle Lindley: Our lives are conducted more and more online as we utilize digital devices and services. There are lots of benefits that result from being more connected, such as increased choice and convenience when it comes to products and services, but the flipside is that we are sharing more of our data, willingly and otherwise. The challenge for consumers is understanding and controlling what data is shared, how, and with whom, especially when that data forms part of their identity.

Personally Identifiable Information (PII) and other identity data is now more valuable to fraudsters than the details of our payment cards or accounts. Identity thieves can use this data to access lines of credit that total more than your savings, with an impact that goes well beyond repaying funds stolen from your account. Consumers might find it super convenient to take out a loan with their provider of choice, straight from their smart device, but digital banking presents a new set of Know Your Customer (KYC) paradigms for all players in the payments value chain.

Those payments players must protect data at every stage of the customer lifecycle if they want to protect their customers and their own reputation – and maintain the positions of trust they have enjoyed in the past. Customer trust is more crucial than ever for banks. At Money20/20 Asia, we asked the audience: “Which type of institution do you most trust to protect your personal data?” 60 percent responded in favor of banks, and only 3 percent opted for fintechs. That speaks volumes about the trust that customers place in banks. The value of this relationship is immeasurable, but it is easily broken; it is precious and must be recognized as such. It should be the driver at the forefront of any product or service development, and of every interaction with customers.

RT: How can banks prevent fraud in the digital banking channel?

Tim Dalgleish: It’s critical that banks catch fraudsters attempting to use stolen or made-up identities to open new accounts in digital settings.

Traditional banking fraud controls have focused on the accuracy of the data that has been submitted. Given the huge volume of compromised personal data globally, this is no longer a sufficient strategy. As an industry, we need to focus on the steps that happen before data verification: how was the data submitted? Behavioral biometrics techniques can ascertain whether the data was even entered by a human being, or whether a fraudster was entering stolen or synthetic identity details. Identifying bots in digital banking is also an important part of a robust, modern fraud prevention strategy.

Beyond stopping new account fraud, creating a behavioral biometrics profile of the account holder is an effective way of preventing account takeover attacks. BioCatch uses behavioral data specific to the customer to identify whether the entity entering the data into the digital channel is the actual owner.

RT: How can banks turn behavioral biometrics data into a positive customer experience?

GL: The critical puzzle piece is for banks to be able to consume advanced fraud prevention data, such as behavioral biometrics, into machine learning models and to turn these combined data sources into actionable insights.

Once a flag is raised in your payments intelligence system, you need to verify whether it is a false positive. For example, if you need to verify whether it is a bot or a human completing an online application for credit, UP Payments Risk Management (PRM) can send an action to the user interface to check for a natural behavior response. What that means is perhaps temporarily suspending the ‘mouse’ movement on a PC screen to see whether the user then shakes that mouse furiously. A bot does not know the mouse has stopped working and will not respond like a human.
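As a minimal sketch of how such an “invisible challenge” might be evaluated, the following Python function classifies the pointer activity observed just after the cursor is frozen. The event format, thresholds, and function name are illustrative assumptions for this article, not how UP PRM or BioCatch actually implement the check.

```python
import statistics

def classify_challenge_response(cursor_events, freeze_start, window=2.0):
    """Classify the reaction to an 'invisible challenge' in which the
    on-screen cursor is briefly frozen. A human typically reacts with a
    burst of erratic mouse movement; a scripted bot does not notice that
    the cursor has stopped responding.

    cursor_events: list of (timestamp, dx, dy) raw pointer deltas.
    freeze_start:  timestamp at which the cursor was suspended.
    """
    # Pointer activity observed in the window right after the freeze begins
    reaction = [(dx, dy) for t, dx, dy in cursor_events
                if freeze_start <= t < freeze_start + window]
    if not reaction:
        return "bot"  # no attempt at all to move a dead cursor
    speeds = [abs(dx) + abs(dy) for dx, dy in reaction]
    # Humans 'shake the mouse furiously': many samples, high speed variance
    if len(speeds) > 5 and statistics.pstdev(speeds) > 2.0:
        return "human"
    return "inconclusive"
```

In a real deployment the verdict would feed back into the payments intelligence system as one more signal, rather than serving as a standalone decision.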

With the rise of smart devices, there are a whole range of sensory capabilities in these devices that can be used to feed these natural behavior checks. Even if we think we would all react in the same way, we all have nuances in the particular way we do it, and the way we use our device. Are you left or right handed? Does your smartphone have a cracked screen? Do you tend to use your device in portrait or landscape mode? And, if you’re like two percent of our respondents at Money20/20 Asia, how loud do you shout at the screen when it freezes?

Polling the audience at Money20/20 Asia, where ACI and BioCatch shared fraud prevention and behavioral biometrics insights from the stage.


RT: What other kinds of fraud can behavioral biometrics data help combat?

TD: Behavioral biometrics are crucial in tackling modern banking fraud, specifically vishing and Authorized Push Payment (APP) scams. The technology can be used to identify customers’ usual behavior patterns, just as banks already do from a transaction point of view. These scams are hard to detect, but there are subtle behavioral traces to work with: how a user navigates a page, their usual typing speed and cadence for entering information, or hesitations. When a user logs into their digital banking app, they usually have a deliberate reason to do so, such as paying a bill or viewing a statement. During a vishing or APP scam, however, they are often being ‘coached’ through that session and behave differently. That coaching might be verbal, from a fraudster on the telephone, or written, via an email with ‘new’ account details.
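The traces described above can be pictured as a simple comparison between a live session and the account holder’s stored behavioral profile. This is a hedged sketch under assumed feature names and thresholds; it is not a real BioCatch API.

```python
def session_risk_score(session, profile):
    """Score a live session against the account holder's stored behavioral
    profile. Feature names and weights are illustrative assumptions.

    session/profile: dicts with mean typing interval (ms), hesitations per
    form field, and time (s) spent navigating to the payment page.
    """
    score = 0.0
    # Typing cadence: coached victims often type dictated details more slowly
    if session["typing_interval_ms"] > 1.5 * profile["typing_interval_ms"]:
        score += 0.4
    # Hesitation: unusual pauses before entering the 'new' account details
    if session["hesitations_per_field"] > profile["hesitations_per_field"] + 2:
        score += 0.3
    # Navigation: an unusually long or meandering path to the payment flow
    if session["navigation_s"] > 2 * profile["navigation_s"]:
        score += 0.3
    return score  # a high score might hold the payment for review
```

Each individual signal is weak on its own; it is the combination of deviations from the customer’s own baseline that makes a coached session stand out.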

Our unconscious behaviors are unique to us as human beings and they can be used to better protect customers.

RT: How do real-time payments and open banking impact fraud prevention?

GL: With real-time payments, once the transaction is ‘pushed,’ it’s irrevocable. This is why scammers have begun to target consumers via social engineering to create scenarios where the customer genuinely pushes the payment to the fraudsters. The challenge is further complicated for banks by the rise of open banking and the opportunity for customers to initiate payments via a Payments Initiation Service Provider (PISP), or approve the sharing of their data with an Account Information Service Provider (AISP). It’s therefore critical that banks protect their customers to maintain their trust. Fraudulent payments must not be authorized and identity data must not be shared with fraudulent parties.

The liability of banks in these kinds of real-time payments and APP scams varies globally. The framework for open banking, especially outside of the European Union, is not clear on who is responsible for authenticating third parties that are accessing customers’ financial data. Asia-Pacific is considering its framework and the identity protection advice it offers to consumers, but it needs to move faster.

Ultimately, it’s a question of customer experience (CX). Even if a bank is not liable, if fraud happens on their account, customers will look to their bank to help them. They trust and expect to be protected when interacting with their bank online. The way to balance the CX with protection is to enable passive controls, while at the same time engaging the customer in the fraud prevention cycle. This includes everything from awareness and education campaigns, to clear data sharing controls, as well as advanced fraud prevention solutions that look at the enterprise view of the customer.

RT: What can banks do now to protect their customers against new kinds of fraud threats?

GL: In a more complex ecosystem, banks need an intelligent network comprised of fraud prevention partners that can feed specialist information into their artificial intelligence and machine learning models. The models should be orchestrated by a payments intelligence solution that enables insights to be drawn from the models themselves.

TD: Exactly. BioCatch is one such partner that can bring accurate and rapid identification of genuine customers to these models in a digital identity context. It’s a critical part of the decision-making for omnichannel fraud prevention, particularly in the real-time payments world.

GL: From our perspective, it’s important to combine inputs from partners like BioCatch to create actionable intelligence. This requires orchestrating non-financial and payments data from across the business to rapidly identify risk and respond to it. The non-financial data element is particularly important as banks respond to the challenges and opportunities of open banking. They need to retain the primary customer relationship, which means allowing access into the account, but it must be in a secure and managed way. The banking ecosystem needs to be real-time-ready for both payments and data analysis in order to protect customers and enhance the CX. This enhanced CX is going to be the way they leverage the value of the trust they have built with customers.
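The orchestration Giselle describes might be sketched as follows: a behavioral score and non-financial partner signals are combined with the payment itself into a single decision. The signal names, weights, and thresholds here are illustrative assumptions, not ACI’s actual PRM configuration.

```python
def orchestrate_decision(txn, behavioral_score, partner_signals):
    """Combine payments data with non-financial partner inputs, such as a
    behavioral-biometrics score, into one actionable decision.

    txn: dict describing the payment (here just an 'amount').
    behavioral_score: 0..1 risk score from a behavioral partner feed.
    partner_signals: other specialist feeds, each contributing a 0..1 flag.
    """
    risk = behavioral_score
    # Each specialist feed contributes a weighted signal (names assumed)
    risk += 0.5 * partner_signals.get("device_reputation", 0.0)
    risk += 0.5 * partner_signals.get("mule_account_match", 0.0)
    # Real-time payments are irrevocable, so larger pushes get less slack
    threshold = 0.6 if txn["amount"] > 1000 else 0.8
    if risk >= threshold:
        return "hold_for_review"
    if risk >= threshold - 0.2:
        return "step_up_auth"  # just enough friction for the scenario
    return "approve"
```

The graded outcome reflects the friction point made below: most sessions pass untouched, borderline ones get a step-up challenge, and only the riskiest are held.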

When we think about fraud prevention, it often includes an element of friction. The key is to provide either no friction, or just enough friction for the scenario, so that the customer feels secure rather than frustrated. You must not only protect customers without impeding their ability to transact; you should protect them in a way that enhances their financial services and their lives. That is when the bank is truly deserving of trust.


Discover more about leveraging machine learning for payments intelligence. Listen to this webinar from Marc Trepanier, Principal Fraud Consultant at ACI Worldwide, and Julie Conroy, Research Director at Aite Group.