The Financial Inclusion/Financial Crime Nexus
Through our work in anti-financial crime (AFC), we’ve witnessed firsthand how FinTechs have used new technologies to support traditionally underbanked customers - from international students to migrant workers. FinTech leadership in this arena is essential: nearly 2 million people in the UK are still considered financially excluded. And that isn’t the full picture; the underbanked, who lack access to the full range of financial resources and support, also warrant greater efforts toward inclusivity. In the US, for example, where around 6% of the population is unbanked, nearly another 20% is considered “underbanked,” indicating the scale of the problem.
Reporting from the FCA directly connects the financial exclusion of the unbanked and underbanked to the wider issue of vulnerable customers. Nearly half of people in the UK display one or more characteristics of vulnerability, such as mental or physical health difficulties, debt or financial distress. Vulnerability can clearly harm an individual’s capacity to navigate financial services. Without financial knowledge and support, a vicious cycle may develop, in which financial anxieties worsen existing vulnerabilities, increasing the likelihood that someone is pushed further and further from the traditional financial ecosystem.
In July, the FCA published guidance for consultation on the treatment of vulnerable customers. This guidance aims to help firms better understand vulnerable customers, ensure their staff have the skills to engage with and support vulnerable customers, and build their products, services and processes to be more inclusive. We at FINTRAIL are impressed with the steps the FCA is taking to tackle this issue head-on, by providing clearer expectations, recommendations and examples of best practice.
That said, the guidance offers limited discussion of the intersection between vulnerability, financial inclusion and AFC efforts. This comes in spite of strengthening AFC controls that, while designed to prevent criminals from exploiting and profiting from victims, may unfortunately disadvantage some vulnerable individuals as well. For instance, customers whose identities have been stolen and whose names have consequently been added to fraud databases may struggle to access financial products - especially if they don’t know their identities have been stolen. Customers could also fall victim to common financial crime scams, such as romance fraud or authorised push payment fraud, or could be manipulated into becoming money mules - unaware that their actions constitute money laundering. Another example comes from the 5th anti-money laundering directive (5MLD), which will reduce the threshold for applying simplified due diligence to prepaid card customers from €250 to €150 - making it more difficult for many financially excluded individuals to access one of the key financial products they rely on. As we near the close of the FCA’s consultation period on October 4, here are a few of our impressions of how we can improve financial inclusion while also championing best practice in our AFC controls.
When FinTechs onboard a customer, we often see electronic address verification and selfie + ID matching used to facilitate a smoother onboarding process. While selfie + ID matching can sometimes help firms identify vulnerable customers through visual indicators (e.g. evidence of coercion or injury), both tools can still struggle to identify or verify certain types of vulnerable customer. For instance, someone fleeing domestic violence may not have access to their standard documentation or proof of address. A recent immigrant to the UK may struggle to understand the onboarding requirements due to limited English, and may also fail an electronic address check if they are not on the electoral roll. Young customers from financially disadvantaged backgrounds may not have a passport or a driving licence. A customer with mental or physical health disabilities may have someone assisting them with onboarding, such as helping them take a selfie, which could look suspicious.
Aside from requesting additional pictures of ID or proof-of-address documents, small-to-medium sized FinTechs may not have a specific, codified response for a customer who fails their initial attempt at onboarding, which can lead to genuine customers who lack ID for legitimate reasons being de-risked. The JMLSG provides useful information on how best to formulate your approach, so that you maintain a risk-based approach while also practising financial inclusion. For instance, other documentary evidence could be accepted - such as a social services letter, a confirmation-of-studies letter or evidence of an asylum application. Many FinTechs already use data about their customer as well as data from their customer - and a mix of data on the customer’s online presence, email address and phone number can support decisions not only on a customer’s genuineness but also on their vulnerability. Whether due to the additional steps needed to verify identity or due to overlapping factors between vulnerability and financial crime risk (e.g. high levels of debt), it may be useful to apply enhanced transaction monitoring controls to such customers, even once you onboard them. We’ve already seen FinTechs taking steps in this direction, such as with customers engaging in gambling, so it would be great to see these efforts pursued even further within the sector.
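To make the idea concrete, a codified fallback might look something like the sketch below. This is purely illustrative, not a statement of JMLSG requirements: the document names, categories and decision labels are all invented, and a real policy would be far richer and risk-weighted.

```python
# Illustrative sketch only: a tiered fallback for customers who fail standard
# identity checks, loosely inspired by the alternative-evidence approach
# discussed above. All document names and outcomes are hypothetical.

STANDARD_DOCS = {"passport", "driving_licence"}
ALTERNATIVE_DOCS = {"social_services_letter", "confirmation_of_studies",
                    "asylum_application"}

def onboarding_decision(docs: set, passed_address_check: bool) -> str:
    """Return 'approve', 'approve_with_enhanced_monitoring',
    'request_alternative_docs' or 'manual_review'."""
    if docs & STANDARD_DOCS and passed_address_check:
        return "approve"
    if docs & STANDARD_DOCS or docs & ALTERNATIVE_DOCS:
        # Genuine customer with non-standard evidence: onboard them, but
        # apply enhanced transaction monitoring rather than de-risking.
        return "approve_with_enhanced_monitoring"
    if not docs:
        # No evidence at all yet - prompt for the alternative document list
        # instead of silently rejecting the application.
        return "request_alternative_docs"
    return "manual_review"
```

The key design point is that a failed first attempt routes to a defined next step rather than a dead end, which is exactly the gap the paragraph above describes.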
FinTechs and other financial institutions typically deploy transaction monitoring tools designed to spot unusual customer behaviour. However, what would be unusual for a standard customer may be normal behaviour for a more vulnerable customer with non-standard needs. For example, vulnerable customers may be prone to sudden, impulsive purchases, unusual or large-value payments to legal firms or health suppliers, confusing financial patterns designed to repay debts, or atypical rent arrangements. It is therefore important to consider vulnerable customers not only when evaluating alerts but also when designing rules. While no two customers are the same, transaction monitoring tools, especially those relying on machine learning, should be calibrated with an eye to avoiding false positives related to vulnerable customers. This may be difficult to achieve fully given the relatively small customer base of many FinTechs, but considering vulnerability indicators when working on rules and calibration is at least a good place to start. FinTechs may also want to consider allowing their customers to set their own behavioural flags, such as for gambling, binge drinking or shopping sprees. FinTech products like Toucan have been spearheading developments in this area: within their platform, customers can link their bank accounts and set up personalised vulnerability rules and thresholds that can also trigger a general message to a “trusted ally,” who may be able to contact the vulnerable customer and check in on their overall health.
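A minimal sketch of how customer-set behavioural flags could sit alongside a generic monitoring rule is shown below. The categories, thresholds and alert labels are all assumptions for illustration (and are not how Toucan or any specific product works); the point is simply that self-set limits are evaluated per customer, before a deliberately high-threshold generic rule.

```python
# Illustrative sketch: per-customer self-set behavioural flags evaluated
# alongside one generic rule. All thresholds and labels are invented.

from dataclasses import dataclass, field

@dataclass
class CustomerProfile:
    customer_id: str
    # Self-set daily limits, e.g. {"gambling": 50.0} means flag if daily
    # gambling spend exceeds £50. Empty by default.
    self_set_limits: dict = field(default_factory=dict)

def evaluate_day(profile: CustomerProfile, transactions: list) -> list:
    """Return alert labels for one day of transactions.
    Each transaction is a dict: {"amount": float, "category": str}."""
    alerts = []
    totals = {}
    for tx in transactions:
        totals[tx["category"]] = totals.get(tx["category"], 0.0) + tx["amount"]
    # Customer-defined vulnerability flags are checked first.
    for category, limit in profile.self_set_limits.items():
        if totals.get(category, 0.0) > limit:
            alerts.append(f"self_set_limit:{category}")
    # A generic rule calibrated to a deliberately high threshold, so ordinary
    # but non-standard spending does not generate false positives.
    if sum(totals.values()) > 5000:
        alerts.append("high_daily_spend")
    return alerts
```

Because the self-set limit lives on the customer profile rather than in a global rule, the same spending pattern can be flagged for one customer and ignored for another - which is the calibration problem the paragraph above describes.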
One requirement under the FCA’s new guidance is for firms to take a ‘proactive approach to understand the nature and extent of vulnerability’ within existing customer bases. FinTechs could engage in best practice to abide by this expectation by designing rules tailored to specific patterns of behaviour indicating vulnerability, generating a ‘soft stop,’ or a flag that is retroactively reviewed. These flags could be assigned to a person or team responsible for identifying and understanding vulnerability, and the results of the exercise could then be used to help tailor and refine rules that better separate the vulnerable from the suspicious. Vulnerable customers who may have had their accounts taken over or who may be victims of authorised push payment fraud should still face ‘hard stop rules,’ however, to prevent money from being laundered.
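The soft-stop / hard-stop distinction described above can be sketched as follows. The rule names and the shape of the triage result are hypothetical; the idea is that soft stops let the activity proceed while queuing a flag for retroactive vulnerability review, whereas hard stops block the payment outright for financial crime review.

```python
# Illustrative sketch of soft stops vs hard stops. Rule names are invented.

SOFT_STOP_RULES = {"erratic_spending", "atypical_rent_pattern"}      # review after the event
HARD_STOP_RULES = {"suspected_account_takeover", "app_fraud_pattern"}  # block immediately

def triage(triggered_rules: set) -> dict:
    """Decide whether a payment proceeds and which team reviews it."""
    hard = triggered_rules & HARD_STOP_RULES
    soft = triggered_rules & SOFT_STOP_RULES
    return {
        # Hard stops block the payment to prevent funds being laundered.
        "block_payment": bool(hard),
        # Soft stops generate flags for the team responsible for
        # identifying and understanding vulnerability.
        "vulnerability_review": sorted(soft),
        "financial_crime_review": sorted(hard),
    }
```

The output of the retroactive vulnerability reviews can then feed back into rule design, refining the separation between the vulnerable and the suspicious, as the paragraph above suggests.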
The FCA guidance provides strong recommendations on ensuring staff are trained in how to deal with potentially vulnerable customers. This is a particular concern when investigating or speaking to a customer suspected of financial crime. For many FinTechs, two types of outreach can be used for financial crime-related investigations - automated messaging from robo-advisors and manual messaging from a live human. In the case of automated messaging, the FCA gives examples of how robo-advisors have been set up to detect potential flags for vulnerability (e.g. typing or response speed); this could be expanded into the financial crime investigation space to ensure customers demonstrating signs of vulnerability receive more tailored messages and, where necessary, are escalated to a human.
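As a rough sketch of the escalation idea, an automated investigation chat might track simple behavioural signals and hand over to a trained human when they cross a threshold. The signals and thresholds below are invented for illustration; the FCA's typing/response-speed example is the inspiration, not the specification.

```python
# Illustrative sketch: deciding when an automated financial-crime chat
# should escalate to a trained human. Thresholds are invented.

def should_escalate_to_human(response_times_s: list,
                             message_lengths: list) -> bool:
    """Crude heuristic: consistently very slow responses or very short
    answers across a session may indicate confusion, distress or coercion."""
    if not response_times_s:
        return False
    avg_response = sum(response_times_s) / len(response_times_s)
    avg_length = (sum(message_lengths) / len(message_lengths)
                  if message_lengths else 0)
    # Over two minutes on average to reply, or near-monosyllabic answers,
    # triggers a handover rather than another automated prompt.
    return avg_response > 120 or (0 < avg_length < 5)
```

In practice such signals would need careful validation against false positives, but the routing decision - tailored automated message versus human handover - is the part that matters for the tipping-off and training concerns discussed next.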
For messaging done by a person, it is imperative that front-line customer relations and compliance staff receive training on how to handle vulnerable customers, as the FCA suggests. However, this training must be especially precise in the financial crime space, to prevent tipping off. Another concern we have noticed is fraudsters feigning vulnerability in order to play on the sympathies of front-line staff and benefit financially, for example by having their account unblocked. Front-line staff should maintain a positive but firm stance when interacting with customers and be wary of how known or suspected criminals may try to influence them.
As much as we wish it were easy to draw clear lines between vulnerable customers and suspicious customers, the waters are undeniably murky. Only through robust efforts can we truly understand the nature of our customers and build meaningful solutions that support those who are vulnerable while shutting out the very actors who prey on vulnerability. Here are a few steps you can consider taking today to improve financial inclusion going forward:
Consider defining more specific approaches regarding vulnerable or potentially vulnerable customers, particularly in relation to customer due diligence, customer interaction and transaction monitoring. This should be even more robust for FinTechs specifically targeting the financially excluded. A good approach should start with the identification and confirmation of the customer’s vulnerability - ask yourself, is there a good reason they wouldn’t have the documents required for onboarding or would be transacting this way?
Once vulnerability has been identified, there should be clear escalation channels, training for front-line staff on engaging with vulnerable customers for KYC, as well as defined expectations around supplementary documents and enhanced monitoring where required.
Consider designing transaction monitoring rules with ‘soft stops’ to help identify patterns of behaviour for vulnerable customers, as part of your proactive approach to understanding vulnerability indicators on your platform and as part of your efforts to distinguish vulnerable from suspicious.
Tailor automated outreach messaging tools to detect signs of vulnerability and to escalate potential cases of vulnerability to trained human staff. Ensure all robo-advisor communication is friendly, respectful and easy to understand.
Ensure front-line staff training covers not only how to deal with vulnerable customers, but also how to avoid tipping off and how to handle customers who may fake vulnerability for financial gain.
The FCA consultation ends soon. Click here if you want to provide your opinion directly. Or if you want to discuss these issues more and work to make your AFC controls support financial inclusion, contact the team at: email@example.com