Fraud continues to evolve, with technology making it easier to get started and thrive, Sift’s new Digital Trust and Safety Index shows.
It’s becoming easier to buy and sell stolen information, Sift trust and safety architect Jane Lee said. Finagle an invite to a Telegram forum, and you’re in. Posts advertising these discussion groups are easy to find.
Fraud gets the ‘as a service’ treatment
Technological changes complicate efforts to identify and stop fraud, Lee, who has spent a decade in the field, added. Ten years ago, scams were one-dimensional, revolving around stolen credentials. Misuse was easier to detect and combat.
Synthetic identities make that detection more arduous. Fake profiles proliferate, but they’re built on real credentials: the identity looks like you yet is run by someone else, which mutes the impact of some traditional safeguards.
Even as it becomes easier to commit fraud, Lee sees larger, more organized groups forming. Like other digital services, financial crime now has its own “as-a-service” label, with groups offering fraud as a service (FaaS). Developers sell on-demand services to less experienced criminals on the dark web. Folks can even hire scammers to deliver free goods and food right to their door.
Digital economies are ripe for fraud
The digital economy is getting hammered by fraud. Digital goods and services fraud is up by 27%. Fintech fraud has risen by 13%. Cryptocurrency exchange fraud has surged by 45%.
Buy now, pay later (BNPL) providers would kill for numbers that low. BNPL fraud has exploded by 211%, with the product’s structure partly to blame. Often, when a BNPL account is created, the account holder receives a list of other merchants that accept BNPL. That’s a Christmas list for criminals.
“Once a bad fraudster gets access to a buy now, pay later account, they have a laundry list of places where they can go and make purchases,” Lee explained. “I was… not surprised, given the level of information available once the wrong person gets access to an account.”
Card hopping explained
Many fraudsters engage in “card hopping,” which can mimic legitimate behavior. Real users sometimes rotate between cards to take advantage of welcome points, Lee said. When fraudsters do it, they’re working through handfuls of stolen but validated credentials, which they use for BNPL purchases. When an item arrives, reselling it is pure profit.
The process begins with fraudsters running a series of low-dollar test transactions. The cards that validate become a shortlist for more significant thefts.
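As a rough illustration of that validation pattern, the sketch below flags cards that show a burst of low-dollar authorization attempts. It is a minimal example under assumed field names (card_id, amount, approved), not a description of any vendor’s detection logic.

```python
from collections import defaultdict

# Hypothetical transaction records; the field names are illustrative only.
transactions = [
    {"card_id": "4111-01", "amount": 1.00, "approved": False},
    {"card_id": "4111-01", "amount": 1.00, "approved": True},
    {"card_id": "4111-01", "amount": 1.00, "approved": True},
    {"card_id": "9022-17", "amount": 54.99, "approved": True},
]

def flag_card_testing(txns, low_dollar=2.00, min_attempts=3):
    """Flag cards with a burst of low-dollar authorization attempts,
    the pattern fraudsters use to validate stolen credentials."""
    attempts = defaultdict(int)
    for t in txns:
        if t["amount"] <= low_dollar:
            attempts[t["card_id"]] += 1
    return [card for card, count in attempts.items() if count >= min_attempts]

print(flag_card_testing(transactions))  # ['4111-01']
```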
This process has also been simplified by technology, Lee noted. Fraudsters can automate their transactions, generating many times more attempts. Any practical response needs the same level of automation.
“They’re becoming automated, and so fraudsters are leveraging automated scripts to run these frauds at an inhumane speed,” Lee said. “If you rely on a team to manually review certain transactions, you won’t be able to keep up with it. (Scripts) can easily overwhelm teams that don’t have the right tools to address attacks like that at scale.”
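To make the scale problem concrete, here is a hedged sketch of the sort of automated velocity check a defense team might run instead of manual review. The threshold and data shape are assumptions for illustration, not Sift’s implementation.

```python
from collections import defaultdict
from datetime import timedelta

def scripted_accounts(events, window=timedelta(minutes=1), max_per_window=10):
    """Return account IDs submitting more attempts per window than a human
    plausibly could -- a simple signal of scripted, automated activity."""
    by_account = defaultdict(list)
    for account_id, timestamp in events:  # events: (account_id, datetime) pairs
        by_account[account_id].append(timestamp)

    flagged = []
    for account_id, stamps in by_account.items():
        stamps.sort()
        for i, start in enumerate(stamps):
            attempts_in_window = sum(1 for s in stamps[i:] if s - start <= window)
            if attempts_in_window > max_per_window:
                flagged.append(account_id)
                break
    return flagged
```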
Take a turbulent economy that’s hurting many and mix in technology that simplifies theft, and it’s no wonder more people are committing fraud. Roughly one in six consumers admit to committing fraud or knowing someone who has. A similar share has seen online offers to participate in fraud. More than 60% of us (62%) have experienced payment fraud between two and four times.
Sift’s response
Sift’s response lies in recognizing when behavior deviates from the norm. Most consumers (64%) use only one or two payment cards each month, and fewer than five percent use five or more. By detecting identities that use many cards, Sift can zero in on the most suspicious transactions.
Once the associated IP addresses are identified, other transactions linked to them are analyzed. Machine learning can assess the sequences: a run of failed attempts and one-dollar transactions may suggest card testing.
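As a simplified sketch of those two heuristics, under assumed field names (identity_id, card_id, ip), the code below flags identities using many distinct cards and then pulls in other identities that share an IP address with them. Sift’s production models are machine-learned; this is only a hard-coded illustration of the signals described above.

```python
from collections import defaultdict

def suspicious_identities(txns, max_cards=5):
    """Flag identities using an unusually high number of distinct cards,
    then link in other identities that share an IP address with them."""
    cards_by_identity = defaultdict(set)
    ips_by_identity = defaultdict(set)
    for t in txns:
        cards_by_identity[t["identity_id"]].add(t["card_id"])
        ips_by_identity[t["identity_id"]].add(t["ip"])

    # Fewer than 5% of consumers use five or more cards, so a high card
    # count is a strong anomaly signal.
    flagged = {i for i, cards in cards_by_identity.items() if len(cards) >= max_cards}

    # Expand to identities sharing an IP with a flagged identity so the
    # linked transactions can be reviewed together.
    flagged_ips = {ip for i in flagged for ip in ips_by_identity[i]}
    linked = {i for i, ips in ips_by_identity.items() if ips & flagged_ips} - flagged
    return flagged, linked
```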
How ChatGPT complicates fraud detection
Programs like ChatGPT will make fraud harder to detect, Lee suggested. She recently identified a cryptocurrency scam run through a dating site. Scammers wooed match seekers and, over time, asked for cryptocurrency. The victims were directed to a fake crypto site where they were relieved of their money.
One way fake sites have been identified is through their grammatical errors, but AI writing tools eliminate that tell.
“I enjoy the technology, but it also makes my heart sink a little bit when I think about what this means for fraud,” Lee said. “I always say that when there’s something shiny and new like that, the fraudsters will be there.”
Sift releases product updates
Sift recently released updates to its fraud automation and orchestration capabilities. Workflow Backtesting lets clients experiment with candidate controls against historical data, so they can find the right mix before making changes live.
“Workflow Backtesting allows our customers to… go retroactively and see how much of their customer base (a potential change) would impact (operations),” Lee said. “That allows them to make more accurate decisions before implementing a business rule.”
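The idea behind backtesting a rule can be illustrated in a few lines. The sketch below replays a hypothetical business rule over historical transactions to estimate how much legitimate and fraudulent traffic it would have affected; the field names are made up, and this is not Sift’s API.

```python
def backtest_rule(history, rule):
    """Replay a candidate rule over historical transactions and report
    how much legitimate versus fraudulent traffic it would have blocked."""
    blocked_good = blocked_fraud = 0
    for txn in history:
        if rule(txn):
            if txn["was_fraud"]:
                blocked_fraud += 1
            else:
                blocked_good += 1
    total = len(history) or 1
    return {
        "blocked_fraud": blocked_fraud,
        "blocked_good_customers": blocked_good,
        "share_of_traffic_affected": (blocked_fraud + blocked_good) / total,
    }

# Example candidate rule: block orders over $500 from identities using 5+ cards.
candidate = lambda t: t["amount"] > 500 and t["distinct_cards"] >= 5
```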
Percentile Scoring helps companies better assess risk, adjust decisions, and accept more trusted transactions. For example, it can flag a transaction as falling within the riskiest slice of a business’s traffic.
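One simple way to picture a percentile score, as an assumption about the mechanics rather than Sift’s documented method, is ranking a transaction’s risk score against a window of recent traffic:

```python
from bisect import bisect_left

def risk_percentile(score, recent_scores):
    """Return the share of recent traffic scoring below this transaction;
    0.99 means it is riskier than 99% of recent activity."""
    ordered = sorted(recent_scores)
    return bisect_left(ordered, score) / len(ordered)

recent = [0.02, 0.05, 0.07, 0.11, 0.40, 0.63, 0.88, 0.95]
print(risk_percentile(0.90, recent))  # 0.875 -- riskier than 87.5% of recent traffic
```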