Deepfake technology: is the threat to financial systems real?

Alyona Shevtsova
3 min read · Jan 13, 2022


Technology is evolving, making people’s lives easier and more comfortable, but the ways to cheat are evolving with it. Progress in artificial intelligence has put algorithms to work in new forms of fraud. Today, ordinary services and applications let you swap your own face onto a movie actor’s in a video clip, or even animate a photo. These are publicly available entertainment technologies, but the coin has another side. The ability to create fake photos, audio, and even video indistinguishable from the original, combined with the unlimited reach of the media, is a cause for concern. Among these risks is the threat that deepfake technology poses to financial systems.

Deepfake is a technology based on artificial intelligence and machine learning. It uses available images, audio, and video of a person to create a convincing visual and vocal likeness. The main character of a deepfake video can speak and act while impersonating someone else, and such videos become more realistic every year. For example, an account featuring a deepfake of Tom Cruise playing golf and performing tricks became popular on social media. Once the video went viral, it drew widespread attention to the technology, but it also highlighted the problems identity falsification can cause. The strength of AI-based technology is that it collects more data and gets smarter over time. It is worth remembering how, in 2019, cybercriminals faked the voice of the CEO of a British energy company and stole €220,000.

There are many scenarios for the use of deepfake technology, from cyber extortion to falsified government statements, but some pose a direct threat to financial systems. Among the most common are logging into personal accounts on online financial services and bank accounts, and applying for loans, by passing identity checks with real-time face- and voice-replacement technology. As artificial intelligence spreads and improves, this method of defrauding financial institutions is becoming ever more accessible.

Fraud with new accounts. This type of scam usually operates through apps: stolen IDs combined with deepfakes can be used to open new bank accounts and bypass identification technologies. If attackers gain the ability to create fake IDs at scale, such attacks could become a global problem for financial services.

Fraud by means of synthetic identities. This method is extremely difficult to detect because instead of stealing or faking an existing identity, attackers combine real and fabricated information to “create” an entirely new person. As with the previous method, they can then open accounts at financial institutions and extract money through newly opened credit lines.

Deepfake technology helps bypass security systems at the identification stage. That is why many banking services now use multiple layers of protection, where biometric data alone is not enough. Some services run a “liveness” check on the person: you are asked to turn your head or make simple gestures. One example of this technology is the Ukrainian self-isolation monitoring service “At Home”.
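A simple “liveness” rule of this kind can be sketched in a few lines. The sketch below is hypothetical: it assumes a per-frame head-yaw estimate (in degrees) is already available from some face-landmark model, and it passes the check only if the head actually turned while the user followed the prompt. A static photo held up to the camera produces a near-constant yaw and fails.

```python
# Hypothetical liveness sketch: yaw estimates per video frame are assumed
# to come from a face-landmark model (not shown). The check passes only if
# the yaw varied enough to indicate a real head turn.

def passes_liveness(yaw_degrees, min_range=15.0):
    """Pass only if the head turned by at least `min_range` degrees."""
    if len(yaw_degrees) < 2:
        return False  # a single frame can never demonstrate motion
    return max(yaw_degrees) - min(yaw_degrees) >= min_range

# A static photo yields near-constant yaw across frames:
photo = [0.4, 0.1, -0.2, 0.3]
# A live user following the "turn your head" prompt:
live = [-2.0, 8.0, 21.0, 14.0]
```

Production systems combine many such signals (blinking, texture, depth), since a real-time deepfake can also animate a head turn; the range check only illustrates the idea.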

Artificial intelligence technology is a weapon for fraudsters who specialize in financial crime. However, this does not mean that fraud has become easy profit for the perpetrators. Deepfake technology is based on the Generative Adversarial Network (GAN), in which one neural network generates forgeries while another learns to recognize them. So, paradoxical as it may be, the solution to the problem is hidden in its very design. The U.S. Defense Advanced Research Projects Agency (DARPA) is already developing countermeasures and has prepared two programs to detect deepfakes. They find inconsistencies in images of people and analyze content for digital, physical, and semantic integrity. A mismatch in paired jewelry, uncharacteristic facial features, or even the background can give a fake away. Such details are difficult for a human reviewer to notice.
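The generator-versus-discriminator dynamic described above can be illustrated with a minimal sketch. Everything here is a toy assumption: the “real” data is a 1-D Gaussian, the generator is a linear map, and the discriminator is a logistic classifier; real deepfake models are deep networks, but the adversarial training loop has the same shape.

```python
import numpy as np

# Toy GAN sketch (assumed setup, not a real deepfake model):
# "real" data ~ N(4, 1); generator g(z) = w_g*z + b_g maps noise to fakes;
# discriminator D(x) = sigmoid(w_d*x + b_d) scores samples as real or fake.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-np.clip(x, -60.0, 60.0)))

w_g, b_g = 0.1, 0.0   # generator parameters
w_d, b_d = 0.0, 0.0   # discriminator parameters
lr = 0.05

for step in range(2000):
    z = rng.normal(size=32)                        # generator input noise
    real = rng.normal(loc=4.0, scale=1.0, size=32)
    fake = w_g * z + b_g

    # Discriminator step: ascend E[log D(real)] + E[log(1 - D(fake))],
    # i.e. learn to tell forgeries from genuine samples.
    d_real = sigmoid(w_d * real + b_d)
    d_fake = sigmoid(w_d * fake + b_d)
    w_d += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    b_d += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator step: ascend E[log D(fake)] (non-saturating loss),
    # i.e. learn to produce samples the discriminator accepts as real.
    fake = w_g * z + b_g
    d_fake = sigmoid(w_d * fake + b_d)
    w_g += lr * np.mean((1 - d_fake) * w_d * z)
    b_g += lr * np.mean((1 - d_fake) * w_d)
```

After training, the generator’s output distribution drifts toward the real one, which is exactly why detection is hard: the forger is trained against the detector. It is also why DARPA-style detectors look for side channels (jewelry, background, physical consistency) rather than the statistics the GAN was optimized to match.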

Deepfake technologies are developing in Ukraine as well: it was Ukrainian startups that created one of the most popular face-replacement applications. However, it is too early to talk about mass threats to financial services here, because while biometric identification technologies are widespread in the West, this trend is only beginning to reach us.



Written by Alyona Shevtsova

CEO of the international payment system LEO, the shareholder of IBOX Bank
