It is no discovery that artificial intelligence and machine learning can be used to automate decision making and robotic process automation, reducing the manual work performed by analysts.
Although Large Language Models have produced many well-known errors and mistakes, conventional statistical models remain informative about the risks that clients and transactions bring to financial institutions. Humans, in turn, take a more comprehensive approach to identifying and verifying anomalies than a model limited to a specific task.
When it comes to transaction monitoring, we would rather focus on eliminating time-consuming tasks and directing people's attention to the unique skills where they are genuinely irreplaceable.
How do we do that?
Before we start this journey, we need to take care of a few foundational elements. The better they are, the faster and more efficiently we will be able to accomplish our tasks.
Alerts are generated when a trigger fires, a rule matches or a score is exceeded, indicating red flags in a customer's activity. To evaluate the risk of actual transactions, we need all the necessary information at hand. It is not only about who the originators and beneficiaries were or how large the amount was; it also concerns the countries involved, the types of transfers, the time of execution and much more.
Apart from transactional data, all data points covered by Know Your Customer (KYC) policies are necessary for a comprehensive investigation conducted by analysts.
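To make this concrete, here is a minimal, illustrative sketch of a rule-based trigger of the kind described above. The field names, thresholds and country codes are assumptions made for the example, not a reference to any particular monitoring system.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Transaction:
    originator: str
    beneficiary: str
    amount: float
    country: str          # counterparty country code
    transfer_type: str
    executed_at: datetime

# Illustrative rule parameters; in practice these come from the institution's risk policy.
HIGH_RISK_COUNTRIES = {"XX", "YY"}   # placeholder country codes
AMOUNT_THRESHOLD = 10_000.0

def red_flags(tx: Transaction) -> list[str]:
    """Return the rules triggered by a single transaction."""
    flags = []
    if tx.amount >= AMOUNT_THRESHOLD:
        flags.append("amount_above_threshold")
    if tx.country in HIGH_RISK_COUNTRIES:
        flags.append("high_risk_country")
    if tx.executed_at.hour < 6:      # unusual execution time
        flags.append("off_hours_execution")
    return flags

def should_alert(tx: Transaction) -> bool:
    # An alert is raised when at least one rule fires.
    return bool(red_flags(tx))
```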
It is easy to imagine how this would look for a single client, but, to paraphrase a well-known song, "imagine there's no database". In reality there always is one: all data is stored somewhere, whether in a relational database, a data lake or the cloud.
Until now, financial data has typically been stored in relational databases, which means we need infrastructure, governance and good data quality.
One of the most irritating activities is finding a piece of information in one application, documenting it, and copying and pasting it into another.
Many financial systems have their origins in the late 1980s, when general ledger software was first built. To smooth the process of analyzing client activity, we need all the necessary information in one place, so that at a glance we can see all transactions, the counterparties and countries involved, and the red flags blinking in the system.
If we could connect the system presenting this information with external repositories of sanctioned parties and watchlists, we would obtain a true 21st-century system for alert handling.
All information at hand, red flags highlighted and all KYC data about customers available - no copying and pasting.
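As a rough illustration of that connection, the sketch below screens a counterparty name against an in-memory watchlist using simple fuzzy matching. The list, names and threshold are invented for the example; real screening relies on official sanctions and watchlist feeds and far more robust matching.

```python
from difflib import SequenceMatcher

# Placeholder watchlist; in practice entries come from external repositories
# of sanctioned parties and watchlists, refreshed on their own schedule.
WATCHLIST = ["ACME TRADING LTD", "JOHN DOE"]

def name_similarity(a: str, b: str) -> float:
    """Crude fuzzy-match score between two party names."""
    return SequenceMatcher(None, a.upper(), b.upper()).ratio()

def screen_party(name: str, threshold: float = 0.85) -> list[str]:
    """Return watchlist entries whose similarity to the name exceeds the threshold."""
    return [entry for entry in WATCHLIST if name_similarity(name, entry) >= threshold]

# Example: a counterparty name taken from transaction data
hits = screen_party("Acme Trading Limited")   # -> ["ACME TRADING LTD"]
```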
Reviewing alerts generated by a transaction monitoring system is the main task of analysts. Yet their work is often consumed by searching for information in different systems, managing simple tasks in databases or performing system administration.
Instead of looking for AI that will do the work for us, it is probably more realistic to look at AI as a tool for process automation.
It is not only about connecting existing data sources; it is also about performing simple tasks in place of humans: automating client screening, filling in missing data and checking data quality.
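One of those simple tasks - checking data quality and spotting missing KYC fields - can be sketched in a few lines. The required fields below are assumptions made for the example; a real KYC schema is far richer.

```python
# Required KYC fields assumed for this example only.
REQUIRED_FIELDS = ["customer_id", "name", "country_of_residence", "date_of_birth"]

def missing_fields(record: dict) -> list[str]:
    """List required fields that are absent or empty in a customer record."""
    return [field for field in REQUIRED_FIELDS if not record.get(field)]

def data_quality_report(records: list[dict]) -> dict[str, list[str]]:
    """Map each incomplete customer to its missing fields, ready for automated follow-up."""
    return {
        record.get("customer_id", "unknown"): missing_fields(record)
        for record in records
        if missing_fields(record)
    }
```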
Perceiving Artificial Intelligence as an escape from work is tempting, though it is rather an interface between humans and machines, one that may have a significant impact on our productivity.1
Highly sophisticated AI solutions use billions of variables we are not aware of. The variables fed into statistical models, by contrast, are usually easy to interpret: transaction amount, country, number of accounts, etc. These are the tools that can be used to limit the population of unproductive alerts.
Statistical models use well-known scientific methods, based on the analysis of historical data, to calculate the probability of particular events occurring. Using linear or logistic regression, you can not only calculate the probability of a particular client behavior but also estimate errors and examine how the tool is performing in production.
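As a minimal sketch of that idea - not a production model - the example below fits a logistic regression on a handful of made-up, labeled alerts and scores a new one. Feature names, values and labels are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Invented historical alerts: [transaction amount, country risk weight, number of accounts]
X = np.array([[12_000, 0.9, 3], [300, 0.1, 1], [8_500, 0.7, 2], [150, 0.2, 1],
              [25_000, 0.8, 4], [900, 0.1, 1], [5_000, 0.6, 2], [60, 0.3, 1]])
y = np.array([1, 0, 1, 0, 1, 0, 0, 0])   # 1 = the alert proved productive

model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)

# Probability that a new alert is productive; the coefficients stay interpretable.
p_new = model.predict_proba([[10_000, 0.8, 2]])[0, 1]

# In-sample AUC shown only to illustrate the metric; in production the same score
# would be monitored against labeled outcomes on fresh data.
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
```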
On top of statistical models, there are additional tricks that can be applied, such as putting low-risk alerts on hold or grouping alerts by customer.
Such ideas add extra conditions for generating an alert, so fewer alerts are created for analyst review. An approach defined in this way ensures that low-risk alerts are put on hold and, if additional events increase their riskiness, they are moved back into review.
With alert grouping, cases generated for the same customer are assigned to the same analyst. Aligned this way, one analyst reviews all of the customer's suspicious alerts, even those generated under different AML scenarios, at the same time, thereby saving his or her time.
Furthermore, this approach helps establish a holistic view of the customer's activities and the risks they present, leading to efficient risk rating and coherent record keeping. Only one analyst checks the client's activities and applies this knowledge to all generated cases - a simple but time-saving approach.
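A minimal sketch of that grouping logic, with invented alert records and a deliberately naive round-robin assignment policy, could look like this:

```python
from collections import defaultdict
from itertools import cycle

# Invented alerts; only customer_id and scenario matter for the grouping.
alerts = [
    {"alert_id": 1, "customer_id": "C001", "scenario": "structuring"},
    {"alert_id": 2, "customer_id": "C002", "scenario": "high_risk_country"},
    {"alert_id": 3, "customer_id": "C001", "scenario": "rapid_movement_of_funds"},
]
analysts = cycle(["analyst_a", "analyst_b"])

# Group every alert for the same customer, regardless of the AML scenario...
grouped = defaultdict(list)
for alert in alerts:
    grouped[alert["customer_id"]].append(alert)

# ...and hand each customer's whole group to a single analyst.
assignments = {customer: next(analysts) for customer in grouped}
# e.g. {"C001": "analyst_a", "C002": "analyst_b"}
```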
When focusing on enhancing the transaction monitoring alert review process, the following points should be considered:
Taking care of data - appropriate governance, data quality and availability
Eliminating unnecessary tasks within alert review process
Using automation (AI, ML) tools to limit simple tasks done by humans
Applying simple AI methods to decrease the number of alerts created for review
As we know, Rome wasn't built in a day - and neither are Transaction Monitoring automations.
Source:
1. Briggs, J. and Kodnani, D. (2023) The Potentially Large Effects of Artificial Intelligence on Economic Growth. Global Economics Analyst. Available at: https://www.goldmansachs.com/insights/pages/generative-ai-could-raise-global-gdp-by-7-percent.html (Accessed: April 13, 2023).
Check out other publications in the “Transaction Monitoring” series:
Partner, Financial Crime Unit, PwC Poland
Director, Financial Crime Unit, PwC Poland