Can AI combat complex cross-border payment crime?

Swift is deploying machine learning so corporations can add AI-powered fraud fighting tools.

Banks are adding new artificial intelligence to combat payment fraud and email phishing and to strengthen identity verification, document analysis and other defenses, but using AI to protect cross-border payments presents different challenges.

AI is a great tool to mitigate financial crime, especially in domestic consumer payments due to the repetitive nature of consumers' transactions, said Stephen Grainger, head of data and analytics and FCC at the Society for Worldwide Interbank Financial Telecommunication, or Swift. Swift provides the main messaging network through which international payments are initiated.

"An individual tends to make the same types of repeatable payments," Grainger said. "It would be easy for your issuing bank of your credit card or your debit card or for Venmo to see [recurring payments]. You can start to model that type of behavior." 

Cross-border payments, however, largely involve corporations and lack the regular cadence prevalent in domestic payments, which makes those transactions harder to model. Corporate payments are also usually larger than domestic ones, meaning higher potential losses.

"That's the complexity dynamic that we need to try to find a way to manage," Grainger said. "How can we reimagine the way we think about fraud and anomaly detection?" 

In addition, a majority of cross-border payment fraud is operational fraud, Grainger said. For example, a fraudster might impersonate a CEO and ask an employee to wire money to a new client.

"That's where it also becomes harder to spot what a cross-border fraud looks like because it's more than likely part of an internal operation," he said. 

Swift plans to use AI in "several ways," Grainger said. The organization is deploying machine learning so corporations can embed rules into workflows. For example, a company may set a rule that says, "When I pay someone new, hold the payment." 
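A rule like "when I pay someone new, hold the payment" could look like the following toy workflow. This is a minimal sketch of the concept, not Swift's product or API; the class and function names are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Payment:
    payee: str
    amount: float
    status: str = "pending"

@dataclass
class PaymentWorkflow:
    """Toy workflow into which a corporation can embed fraud rules."""
    known_payees: set = field(default_factory=set)
    rules: list = field(default_factory=list)

    def submit(self, payment):
        for rule in self.rules:
            if rule(self, payment):
                payment.status = "held"  # route to manual review
                return payment
        payment.status = "released"
        self.known_payees.add(payment.payee)
        return payment

# Rule: "When I pay someone new, hold the payment."
def hold_new_payees(workflow, payment):
    return payment.payee not in workflow.known_payees

wf = PaymentWorkflow(known_payees={"acme_supplies"}, rules=[hold_new_payees])
print(wf.submit(Payment("acme_supplies", 1200.0)).status)  # released
print(wf.submit(Payment("new_vendor", 5000.0)).status)     # held
```

Keeping rules as small, composable predicates lets a corporate treasury team add or remove controls without touching the payment pipeline itself.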

The La Hulpe, Belgium-based company is also testing federated AI and machine learning, in which models are trained on banks' local data and only the models, not the underlying data, are shared, in the hopes of reaching banks that are reluctant to share proprietary information.

"You take the model [and] train it on a dataset, and rather than transferring the data, you transfer the model." Grainger said.  

Swift is working with a "few banks" to define what that model might look like, Grainger said. "We believe that [federated learning] model is starting to gain traction with more regulators as they start to think about how to resolve the challenges and frictions associated with cross-border payments," he said. 
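The "transfer the model, not the data" idea can be illustrated with a stripped-down federated averaging round. This is a teaching sketch under strong assumptions (a one-parameter linear model, two participants), not a description of what Swift and the banks are building:

```python
def local_train(xs, ys):
    """Fit y ~ w * x on a bank's private data; only w leaves the bank."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def federated_average(weights, sizes):
    """A central server combines local models, weighted by data size,
    without ever seeing the banks' transaction records."""
    total = sum(sizes)
    return sum(w * n for w, n in zip(weights, sizes)) / total

# Each bank trains on data it never shares.
bank_a_x, bank_a_y = [1.0, 2.0, 3.0], [2.1, 3.9, 6.0]
bank_b_x, bank_b_y = [1.0, 2.0], [1.9, 4.1]

w_a = local_train(bank_a_x, bank_a_y)
w_b = local_train(bank_b_x, bank_b_y)
w_global = federated_average([w_a, w_b], [len(bank_a_x), len(bank_b_x)])
print(round(w_global, 2))  # shared model approximates y = 2x
```

Real federated learning iterates this exchange over many rounds with neural networks and adds privacy protections on the model updates themselves, but the data-residency property Grainger highlights is exactly this: parameters travel, records do not.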

Finding ways to share information between the parties involved in the cross-border transaction is paramount to effectively using AI to fight fraud, said Ben Turner, president and CEO of payments fintech Verituity, which works with Mastercard and BNY among others. 

"The way we use [AI] is to identify and establish and test the fidelity of the relationships throughout that transaction," Turner said. 

"Whether it's the relationship between the buyer and the supplier, the relationship between the supplier's admin and the supplier, [or] the relationship between what the historical payment activity is versus today, you're trying to detect something that's anomalous," he said.  

Sharing information between payment parties can help combat the rise in deepfake fraud by leveraging data only known by those two parties, similar to out-of-wallet questions that financial institutions use to combat consumer fraud, Turner explained. 
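One way to turn "data only known by those two parties" into a verification step is a challenge-response keyed on shared transaction history, so a deepfake caller who lacks that history cannot answer. This is one possible interpretation of the approach, sketched with Python's standard `hmac` module; the secret and challenge values are hypothetical:

```python
import hashlib
import hmac

def challenge_digest(shared_secret, challenge):
    """Answer a verification challenge using data only the two payment
    parties hold (e.g., details of their last settled transaction),
    analogous to an out-of-wallet question."""
    return hmac.new(shared_secret, challenge, hashlib.sha256).hexdigest()

# Both parties derive the secret from private shared history.
shared = b"last_settled:2024-11-03:ref-8841:24150.00"  # hypothetical record
challenge = b"verify-payee-change-request-17"

caller_answer = challenge_digest(shared, challenge)
expected = challenge_digest(shared, challenge)
print(hmac.compare_digest(caller_answer, expected))  # True for the real party
```

Unlike a voice or video check, this test cannot be defeated by generative AI, because the answer depends on records the impersonator has never seen.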

AI is also frequently deployed to check not only the direct beneficiary of a payment but also the beneficiary's associates, a process called "entity resolution," said John Meyer, managing director at Cornerstone Advisors.

"Now there's more sophisticated AI that's going out there and saying, 'The person you're sending money to isn't on any of these known watch lists … but they're really good friends with them,'" Meyer said. 

But fraudsters are also using generative AI and other new forms of machine learning to advance fraud through deepfakes. For example, fraudsters can take a sample of someone's voice and use it to ask an employee to wire a payment to a new client, or to retrieve vital information, such as passwords, by exploiting known details about the person, Meyer said.

With cross-border payments, which can often include numerous financial institutions across multiple countries and jurisdictions, the threat of fraud increases with each added party, said Luke Penca, executive director at consultancy firm Capco. 

That makes data communication between parties that much more important, Penca said. 

"We've had a rules-based system for a long time, but I think the AI really helps the banks ensure that they've got a really good, vigilant process there. The bad actors have learned as well, and they're equipped with AI," Penca said. 
