How GenAI Could Render KYC Processes Obsolete

KYC, or "Know Your Customer," explained

The KYC process is essential for financial institutions, fintech startups, and banks to verify customer identities. A common method of KYC authentication involves “ID images”: selfies, cross-checked against identity documents, that confirm individuals are who they claim to be. Notable companies like Wise, Revolut, and cryptocurrency platforms such as Gemini and LiteBit use ID images to secure onboarding.

However, the rise of generative AI poses potential risks to the integrity of these checks.

The potential dangers of generative AI

Recent viral posts on X (formerly Twitter) and Reddit illustrate how attackers could exploit open-source and readily available software to download a person’s selfie, use generative AI tools to modify it, and pass a KYC test with a manipulated ID image. While there have been no confirmed reports of using generative AI to deceive real KYC systems yet, the ability to create reasonably convincing deepfaked ID images raises serious concerns.

Understanding KYC ID image authentication

In a standard KYC process, a customer submits a picture of themselves holding an ID document—like a passport or driver’s license—that only they should possess. An individual or algorithm then cross-references this image against existing documents and selfies to prevent impersonation attempts.
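The algorithmic cross-referencing step is commonly built on face embeddings: a model maps each face image to a feature vector, and two vectors are compared for similarity. The sketch below is illustrative only; it assumes the embeddings have already been produced by some face-recognition model (not shown), and the 0.8 threshold is a made-up value that real systems tune against false-accept and false-reject rates.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def same_person(selfie_vec, document_vec, threshold=0.8):
    """Flag a match when the two embeddings are sufficiently similar.

    The threshold is illustrative; production systems calibrate it
    on labeled match/non-match pairs.
    """
    return cosine_similarity(selfie_vec, document_vec) >= threshold

# Toy vectors standing in for embeddings of the submitted selfie
# and the photo already on file (e.g. from a passport scan).
selfie = [0.9, 0.1, 0.4]
on_file = [0.88, 0.12, 0.41]
print(same_person(selfie, on_file))  # near-identical vectors -> True
```

The attacks described below target exactly this comparison: a convincing deepfake produces an embedding close enough to the genuine one to clear the threshold.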

ID image authentication has never been foolproof, as fraudsters have sold forged IDs and selfies for years. However, the advent of generative AI significantly broadens the scope of these malicious activities.

Creating realistic deepfake ID images

Online tutorials reveal that tools like Stable Diffusion, a free and open-source image generator, can produce synthetic images of a person in various settings (for instance, a living room). With some experimentation, an attacker can create a rendering that appears to show the individual holding an ID document, then edit that image to include either a genuine or a fabricated document.

Achieving the best results with Stable Diffusion entails installing supplementary tools and acquiring around a dozen images of the intended target. A Reddit user known as "harsh" detailed their workflow for generating deepfake ID selfies, stating it typically takes one to two days to create a convincing image.

Despite these challenges, the learning curve is much less steep than in the past. Generating ID images that feature realistic lighting, shadows, and environments used to require advanced photo editing skills, which is no longer the case.

Exploiting KYC processes is becoming simpler

Submitting these deepfaked KYC images to an app is even more straightforward than generating them. Android apps running on desktop emulators like BlueStacks can be tricked into accepting deepfakes in lieu of a live camera feed. Furthermore, web-based applications can be misled by software that transforms any image or video source into a virtual webcam.

The growing threat to identity verification

Many apps and platforms have integrated “liveness” checks as an extra security measure to confirm identity. These checks typically require users to record a short video in which they perform actions like turning their head or blinking, to demonstrate that they are indeed human.
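The blink portion of a liveness check is often implemented with the eye aspect ratio (EAR): the eye's height relative to its width, computed from six landmark points, drops sharply when the eye closes. Below is a minimal sketch of the EAR formula, assuming the landmark coordinates (p1..p6, with p1 and p4 at the eye corners) have already been extracted by a face-landmark detector; the coordinates and the 0.2 threshold are illustrative values, not tuned settings.

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def eye_aspect_ratio(p1, p2, p3, p4, p5, p6):
    """EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|).

    p1 and p4 are the eye corners; p2/p3 sit on the upper lid,
    p6/p5 on the lower lid. Open eyes give a noticeably larger
    ratio than closed eyes.
    """
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

# Illustrative landmark coordinates (in pixels).
open_eye = [(0, 0), (3, 3), (6, 3), (10, 0), (6, -3), (3, -3)]
closed_eye = [(0, 0), (3, 0.5), (6, 0.5), (10, 0), (6, -0.5), (3, -0.5)]

BLINK_THRESHOLD = 0.2  # illustrative; tuned per detector in practice
print(eye_aspect_ratio(*open_eye) > BLINK_THRESHOLD)    # open eye
print(eye_aspect_ratio(*closed_eye) > BLINK_THRESHOLD)  # closed eye
```

A check like this only confirms that the eye landmarks moved; a sufficiently good deepfake video reproduces that motion, which is why such checks are not a reliable defense on their own.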

However, even liveness checks are susceptible to generative AI manipulation.

Last year, Jimmy Su, the chief security officer for the cryptocurrency exchange Binance, indicated to Cointelegraph that current deepfake technology is capable of passing these liveness checks, including those requiring users to perform real-time physical actions.

The critical takeaway is that KYC, which has always had its vulnerabilities, may soon be rendered ineffective as a security measure. While Su does not believe that deepfaked images and videos have yet reached a level where they can deceive human reviewers, it is likely only a matter of time before this capability develops further.
