Fintechs warned to step up anti-fraud efforts using AI


"Fintechs who do not protect their customers will not have customers."

This stark statement sums up the sentiment at a Fintech Security Summit hosted by Fintech is Femme in New York on Wednesday.

The speaker, Frances Zelazny, has a vested interest: Her company, Anonybit, offers identity management software. But she's not wrong. 

"Fintech is scaling faster than ever, but our threat landscape is scaling even faster," Zelazny said. "Consumers lost $12.5 billion through fraud last year alone, thanks to cyber threats, scams, regulatory roadblocks and systems that were never built for the speed, scale and complexity of today's economy."

If cybercrime were a country, it would have the third-largest economy on the planet, after China and the United States, said former cybercriminal turned cybersecurity consultant Brett Johnson. 

The criminals are winning because they collaborate and share information better than banks and fintechs do, Johnson said. 

"Ninety percent of attacks use known exploits," he said. "They're not zero-day attacks, not unknown vulnerabilities, not computer geniuses like the media would have you believe. It's typically the stuff you know about that you've not done anything about, that creates that threat landscape that's out there."

Johnson built and ran one of the first organized cybercrime rings, called ShadowCrew. He was on the U.S. most wanted list, racked up 39 felonies, served time and worked for the Secret Service. He now consults with companies on fraud and security. 

Bank and fintech information sharing

Johnson stressed the need for fintechs and banks to share information with each other and to analyze the transaction and web traffic information they have. This is what cybercriminals do and it's a big part of their success, he said.

In the early 2000s, Johnson built a trust mechanism that criminals could use to share information and learn from each other. 

"You knew by looking at someone's screen name what the skill level of that individual was and if you would trust that individual, learn from that individual and network with that individual," he said. 

Vouching, review and escrow systems helped establish trust among people who never met in person nor knew each other's real names. 

"That type of trust mechanism is still used today throughout the dark web and much of criminal environments online," Johnson said. "We understood as we were building these platforms, that at the end of the day, people have to work together, and you have to trust each other in order to work together. By working together, everyone becomes more educated, and more educated means more profitable. So we were all about sharing and exchanging information."

Cybercriminals are simply very good social engineers, Johnson said. "We know what it takes to manipulate you to give up information, access data or cash."

A fraudster might find out about an advanced replacement warranty for Microsoft tablets and share it with a cybercriminal network. Someone in the group might offer a tutorial for exploiting the warranty, which also spreads through the criminal community. Another might offer malware-as-a-service to automate the technique.

Using AI to catch fraud

Johnson urged the fintechs in the audience to use AI to watch user behavior for signs of fraud. "That's absolutely helpful," he said. "No one's going to do business if they cannot trust you." 
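As a rough illustration of the kind of behavioral monitoring Johnson described (not any particular vendor's product), a fintech could score each user session against past behavior with an off-the-shelf anomaly detector. The session features, values and threshold in this sketch are hypothetical.

```python
# Illustrative sketch only: scoring user behavior with a generic
# unsupervised anomaly detector. Feature names, values and the
# flagging threshold are hypothetical examples.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row is one user session:
# [login_hour, transfer_count, avg_transfer_amount, new_device_flag]
historical_sessions = np.array([
    [9, 3, 42.10, 0],
    [13, 1, 15.00, 0],
    [20, 5, 88.25, 0],
    [10, 2, 30.50, 1],
])

# Fit on past sessions assumed to be mostly legitimate.
model = IsolationForest(contamination=0.01, random_state=42)
model.fit(historical_sessions)

# Score a new session; lower scores are more anomalous.
new_session = np.array([[3, 40, 999.99, 1]])  # 3 a.m., 40 transfers, new device
score = model.decision_function(new_session)[0]
if score < 0:
    print(f"Flag for review (anomaly score {score:.3f})")
```

In practice a production system would train on far more sessions and route low scores to a review queue rather than blocking customers outright.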

This idea of earning trust through better fraud protection was echoed by Dan Ganopolsky, head of core banking, payments, fraud and servicing product and technology at Forbright Bank in New York.

"We want to not only protect our bank, but protect our customers, because the most important thing for us is trust, and that's going to take a very long time to build and only seconds to lose," he said.

Forbright uses AI throughout its infrastructure, Ganopolsky said. It uses Alloy's AI-based onboarding and fraud-monitoring software, but it also creates its own AI models. 

"At the end of the day, we're the only ones that really know our data as well as we should, and we should leverage every data point, from onboarding, from the call center, from online banking usage and other sources," he said.

Many banks lack a clear view of fraud because their information sits in silos, Ganopolsky said. "That's a huge issue, especially when it comes to fraud."

Forbright is gathering data across its entire organization, including marketing campaigns, new customer onboarding and transaction monitoring, and building models off that. 
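As a generic sketch of what breaking down those silos can look like in practice (not Forbright's or Alloy's actual pipeline), per-customer signals from marketing, onboarding, the call center and transaction monitoring could be joined on a shared customer key before a model is trained. Every table and column name below is invented for illustration.

```python
# Generic sketch: join siloed per-customer signals on one key, then train
# a simple in-house model on the combined view. All names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

onboarding = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "id_verification_score": [0.95, 0.40, 0.88],
})
call_center = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "password_reset_calls_30d": [0, 4, 1],
})
transactions = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "avg_daily_transfers": [1.2, 9.5, 0.7],
})
labels = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "confirmed_fraud": [0, 1, 0],
})

# One view of the customer instead of three separate silos.
features = (
    onboarding.merge(call_center, on="customer_id")
              .merge(transactions, on="customer_id")
              .merge(labels, on="customer_id")
)

X = features.drop(columns=["customer_id", "confirmed_fraud"])
y = features["confirmed_fraud"]

model = LogisticRegression().fit(X, y)
print(model.predict_proba(X)[:, 1])  # per-customer fraud probability
```

The point of the join is the one Ganopolsky makes: the same individual shows up in every silo, so a model that sees all of the signals at once has a better chance of catching them.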

"At the end of the day, it is that same individual that's trying to commit that fraud," he said. "So when we start opening up that data and start sharing that information, we're not only saving money for the bank, we're making our employees' lives a lot easier, but also just making people a lot happier too."

If a fraud attack does occur, the bank can react more quickly because of this shared repository of real-time data. "Then we can start reacting and start figuring out how to get rid of these fraudsters and start monitoring our models and so on," Ganopolsky said. "We've been able to react within minutes, as opposed to days or weeks."

Reporting fraud

Johnson also stressed the need to report crime to law enforcement. Victims often don't complain because they're embarrassed or justifiably afraid of being judged. 

"People like me, we feed on that," he said. "We thrive on that, and we succeed on that."  

Two fraud victims at the event shared their stories of being conned, one by being offered a fake job on Craigslist, another through theft on his Zelle account. Both said their banks reacted by blaming them for falling for fraud. 

Victim blaming helps make fraud successful, Johnson said, "because that victim doesn't rely on that safety net or support group of friends, family members and associates that they would usually talk to, they are scared of being judged. Because, guess what? They are going to be judged."

When investigating fraud attacks, Johnson advises companies to provide as much information as they can to law enforcement. The FBI has only about 300 agents focused on cybercrime, he said.

Johnson advised fintechs to think about all the ways their services could be gamed. 

"No matter what product or services you're using, there are predators out there," he said. "If you understand that, you will start to inherently build the situational awareness toward these products and services that you're developing. You'll start to start to say in the back of your head, OK, how can a criminal or an attacker profit from this? How can they use this to victimize people?"
