Commonwealth Bank of Australia made an alarming discovery in 2020 — thousands of transactions were being sent with abusive language attached. And the bank wanted no part of this aggressive new form of cyberbullying.
"Some customers were being sent a large number of low-value transactions that contained abusive language, words, phrases or threats in the description field of those payments, essentially the payment as a messaging service," said Caroline Wall, head of customer vulnerability at Commonwealth Bank.
These messages are often a precursor to financial abuse, a problem that plagues a large share of the Australian population.
The bank defines financial abuse as when money is used to gain control over another person, often a romantic partner, family member or an older person. The language, which is in the messaging fields of digital payments — similar to the "memo" field on a paper check — isn't necessarily vulgar; instead, the aggressor uses coercive language to gain financial leverage.
Because the abusive nature of the message is often subtle, abusers can circumvent more traditional controls designed to vet language. The message can be designed to bully or shame people into sending money, or can use payments to dispense non-financial abuse, such as a person adding toxic language to a child or spousal support payment.
CBA needed a new approach to this problem because law enforcement agencies don't immediately look at payment messages for signs of domestic abuse, and payment fraud vetting doesn't look for signs of relationship abuse.
"Not all abusive language uses certain keywords which we can detect as being abusive," Wall said.
In 2021, CBA enabled the CommBank mobile app and NetBank digital bank to block consumers from sending abusive words or phrases in transaction descriptions. The bank has since blocked about one million transactions.
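The description-field blocking described above can be illustrated with a minimal sketch. CBA has not published its filtering logic; the denylist, the normalization steps and the function names below are all hypothetical, chosen only to show how a simple phrase filter might resist light obfuscation such as punctuation or extra spaces.

```python
import re

# Hypothetical denylist for illustration only; CBA's actual
# word and phrase lists are not public.
DENYLIST = {"you are worthless", "pay me or else"}

def normalize(text: str) -> str:
    """Lowercase and collapse punctuation/whitespace so simple
    obfuscation (extra symbols, spacing) does not evade matching."""
    text = text.lower()
    text = re.sub(r"[^a-z0-9\s]", " ", text)
    return re.sub(r"\s+", " ", text).strip()

def should_block(description: str) -> bool:
    """Return True if the payment description contains a denylisted phrase."""
    norm = normalize(description)
    return any(phrase in norm for phrase in DENYLIST)

print(should_block("PAY me... or ELSE!"))   # True
print(should_block("Happy birthday!"))      # False
```

As Wall notes below, keyword matching alone misses coercive language that uses no obvious trigger words, which is why the bank layered statistical and machine-learning methods on top.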
CBA's system uses a combination of machine learning, natural language processing and large language models on public data, text analysis and graph concepts to identify abusive relationships. Graph concepts, or network models of how accounts are connected, help the bank map repeated interactions between a sender and a recipient.
CBA's approach is similar to Chase's. The Australian bank is analyzing evidence of sustained abuse across criteria in payments, such as the value of the transaction, the frequency and velocity of transactions and the types of messages.
In Australia, 40% of the adult population has suffered or knows someone who has suffered from financial abuse, according to research from CommBank.
Financial institutions take a variety of approaches to fight the abuse that can result from transaction messaging.
CBA built its model in partnership with AI firm H2O.ai. The bank has made the source code and model freely available to other institutions.
"This means that any bank can choose to use the source code and model to monitor and detect high-risk transactions that may constitute financial abuse," Wall said. "From there, they can investigate and take further action if they choose. Helping to address financial abuse is an issue for everyone. And the benefit will be for everyone."
AI is widely used to fight financial crimes such as money laundering and payment fraud.
Some digital payment systems, such as the New Payments Platform Australia, are able to include emojis as well as text, Lodge said.
"While many are innocent — 'we'll have a blast at the party tonight' — others are more sinister, and sadly there are cases of harassment using the text fields," Lodge said. "Understanding [the good from the bad] is something that AI will be able to help with."