One minute is all it takes: Deepfakes and the new era of financial fraud

As AI advances, so do the tactics of cybercriminals. This article explores how deepfakes and emotion-driven scams are reshaping the fraud landscape.

Rina-Maria Ingemannsen / June 24, 2025

Artificial intelligence delivers smarter solutions, safer banks, and more efficient services. But the same technology also provides fraudsters with unprecedented opportunities. AI is used to prevent crime, but also to perpetrate it.

Today, scammers don't need extensive resources. With just one minute of recorded audio, AI can generate a deepfake: a convincing imitation of your voice. Add images and information from social media, and anyone can convincingly pose as you, a colleague, or a family member. Deepfake videos, fraudulent investment advice, and AI-generated romantic relationships have become a reality - and they work. “Criminals have realized it’s no longer most profitable to hack systems. Instead, they’re hacking people,” says John Erik Setsaas, Director of Innovation, Financial Crime Prevention, at Tietoevry Banking.

Tietoevry Banking is a leading provider to the financial sector in the Nordics and globally, offering solutions in core banking, credit, fraud prevention, transaction banking, and wealth management. With more than 3,500 experts, the company helps financial institutions improve both agility and efficiency.

The new face of fraud: Emotion and precision

AI has made fraud cheaper and more effective. The most powerful tool in a scammer’s arsenal is no longer technical sophistication, but psychological insight.

“When fraudsters trigger emotions like stress or fear, they activate the brain’s fast, intuitive response system. This is what Daniel Kahneman calls ‘System 1’. And that’s when we act before we think,” explains Setsaas.

With deepfakes and AI-generated content, scammers can craft scenarios that appear immediately credible: a colleague asking for a quick transfer, a child who’s lost their phone, or a familiar face endorsing an investment. These are designed to provoke instinctive (not rational) reactions.

Behind this lies extensive social profiling. The quizzes we take, the photos we like, the posts we comment on - all of them help scammers understand how we think and feel. “On the internet, nobody knows you’re a dog,” a famous 1990s cartoon once quipped. Today, Setsaas points out, “With AI, you can be anyone online.”

Fraud is no longer improvised - it’s organized

On digital marketplaces, “Cybercrime-as-a-Service” is now readily available: phishing kits, money laundering services, fake IDs, and deepfake generators - complete with customer support, reviews, and money-back guarantees. This professionalization has dramatically lowered the threshold for launching attacks and significantly increased the threat.

One of the most pressing challenges in the current fraud landscape is authorized transactions, where the victims themselves are tricked into voluntarily transferring the money.

“When the user clicks ‘pay’ themselves, it doesn’t matter how secure our systems are,” emphasizes Setsaas.

Traditional safeguards like biometric authentication, two-factor verification, security tokens, and 3-D Secure have been essential in stopping account takeovers and identity theft. These types of fraud have declined significantly, largely due to technologies that make direct access to accounts much harder for criminals. As a result, fraudsters have shifted their attacks to a more vulnerable target: the human being.

Real-time user presence: Key to secure identity

Tietoevry Banking’s solutions are built on biometric core services that enable precise and flexible risk management, especially through the distinction between user presence and device presence. For low-risk transactions, it’s sufficient for users to identify themselves with a fingerprint or facial recognition on their mobile device. But as the risk level increases, for example with large transfers, the bank can require verified user presence.

This involves a real-time facial match against a validated image stored during onboarding, ensuring that the correct person is physically present and authorizing the transaction at that moment. This allows banks to strengthen security where needed, without adding friction to everyday actions.
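
To make the distinction concrete, here is a minimal sketch in Python of what such a risk-based step-up could look like. It is purely illustrative: the thresholds, the Transaction fields, and the facial_match helper are invented for the example and are not Tietoevry Banking’s actual implementation or API.

from dataclasses import dataclass

@dataclass
class Transaction:
    amount_eur: float
    new_payee: bool

def facial_match(live_selfie: bytes, onboarding_reference: bytes) -> bool:
    # Placeholder for a real-time face comparison against the image
    # validated during onboarding (verified user presence).
    raise NotImplementedError("integrate a face-verification service here")

def risk_level(tx: Transaction) -> str:
    # Toy classification: large amounts, or sizeable payments to a new
    # payee, count as high risk. Real engines weigh many more signals.
    if tx.amount_eur > 10_000 or (tx.new_payee and tx.amount_eur > 1_000):
        return "high"
    return "low"

def authorize(tx: Transaction, device_biometric_ok: bool,
              live_selfie: bytes, onboarding_reference: bytes) -> bool:
    # Low risk: a fingerprint or face unlock on the device is enough
    # (device presence). High risk: additionally require a real-time
    # facial match against the onboarding image (user presence).
    if not device_biometric_ok:
        return False
    if risk_level(tx) == "high":
        return facial_match(live_selfie, onboarding_reference)
    return True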

“Most people expect more friction when larger sums or higher risk is involved,” notes Setsaas.

This differentiation gives banks better control and greater flexibility without compromising the user experience. At the same time, the technology lays the foundation for future digital identity services, where identity and authentication are closely connected over time. Read more about our Identity Proofing service here.

In fraud types such as romance scams and investment fraud, we see how victims are manipulated over time, often by AI-driven assistants that appear to be trustworthy partners or advisors, long before they willingly make the payment themselves. In these cases, it’s not the technology that fails, but rather the trust that’s exploited.

Combining human insight with AI precision

Tietoevry Banking addresses this reality by combining advanced technology with human insight. The results speak for themselves:

“In 2024, Tietoevry Banking’s Defence Centre blocked over 90% of all fraud attempts through our monitoring services - that is equivalent to more than €400 million,” says Setsaas.

The company is also developing a Large Financial Model (LFM) to analyze transaction patterns, relationships, and customer history. Explainable AI (XAI) will be used to interpret suspicious transactions in real time and reduce false positives.
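
As a simplified illustration of the explainability idea - pairing every risk score with human-readable reasons an analyst can review - consider the toy Python sketch below. It is nothing like a Large Financial Model; the field names and thresholds are invented for the example.

def score_transaction(tx: dict, history: dict) -> tuple[float, list[str]]:
    # Return a toy risk score together with the reasons behind it,
    # so a suspicious transaction can be explained, not just flagged.
    score, reasons = 0.0, []
    if tx["amount_eur"] > 5 * history.get("avg_amount_eur", 1.0):
        score += 0.4
        reasons.append("amount far above the customer's historical average")
    if tx["payee_iban"] not in history.get("known_payees", set()):
        score += 0.3
        reasons.append("first payment to this payee")
    if tx.get("minutes_since_login", 999) < 2:
        score += 0.2
        reasons.append("payment initiated immediately after login")
    return min(score, 1.0), reasons

# Example: a large payment to an unknown payee right after login.
score, why = score_transaction(
    {"amount_eur": 9_500, "payee_iban": "DE89370400440532013000",
     "minutes_since_login": 1},
    {"avg_amount_eur": 450.0, "known_payees": set()},
)
print(score, why)  # a high score plus reasons an analyst can act on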

But technology alone is not enough. Preventing fraud also requires oversight, judgment, and human decision-making. In the gray zones, where patterns aren’t immediately clear, it’s people who make the critical difference.

In a sector that has long championed seamless user experiences, Setsaas raises a provocative question: “Perhaps we sometimes need more friction, especially when dealing with large sums or critical decisions?”

As human beings, our expectations for security and control tend to rise with the value or importance of what we’re doing. An extra confirmation, a short delay, or one last verification before a major transaction can be reassuring rather than obstructive.

That small pause gives users the chance to reflect and possibly catch warning signs before it’s too late.
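
Purely as an illustration of what that deliberate pause could look like in practice, the Python sketch below holds high-value transfers for a short cooling-off window instead of executing them instantly; the €5,000 threshold and one-hour window are invented for the example.

from datetime import datetime, timedelta, timezone

HIGH_VALUE_EUR = 5_000            # illustrative threshold
COOLING_OFF = timedelta(hours=1)  # illustrative cooling-off window

def submit_transfer(amount_eur: float, now: datetime) -> dict:
    # High-value transfers are queued with a release time instead of
    # executing immediately, giving the customer a chance to reconsider
    # or cancel before the money actually moves.
    if amount_eur >= HIGH_VALUE_EUR:
        return {"status": "held",
                "executes_at": now + COOLING_OFF,
                "note": "can be cancelled until the release time"}
    return {"status": "executed", "executes_at": now}

print(submit_transfer(12_000, datetime.now(timezone.utc)))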

Many consumers say they’re willing to share financial data for improved security, but skepticism grows when AI is part of the equation. Understandably so. Trust must be earned over time, and banks must get better at explaining how AI protects customers, rather than surveils them.

The question of responsibility is complex. Is it up to individuals to protect themselves? Should banks shoulder the burden? Or is it a collective societal obligation?

“We need to elevate this to a societal level. This isn’t just about money - it’s about preserving the trust that underpins our entire financial infrastructure,” says Setsaas.

He draws a parallel with how society previously tackled major challenges like seatbelt use and smoking. Once these issues were framed as collective concerns, both awareness and behavior changed. In the same way, financial fraud and manipulation must be viewed as a shared responsibility, involving governments, businesses, and individuals alike.

AI presents enormous potential for innovation and protection - and for those looking to exploit the system. The most important lesson is this: technology alone is not enough. It’s equally about awareness, education, and collective preparedness.

Read more about our Identity offerings here

Rina-Maria Ingemannsen
Marketing Communications Specialist, Financial Crime Prevention

John Erik Setsaas
Director of Innovation, Financial Crime Prevention

With over 25 years’ experience in digital identity, John Erik Setsaas is a pioneer in this space. He has deep knowledge of digital onboarding, authentication, electronic signatures and seals, time stamping, and digital identity wallets.

He is a prolific speaker at fintech industry events around the world.
