
### Revolutionizing KYC: How Gen AI Could Make Traditional Verification Obsolete

Gen AI tools like Stable Diffusion threaten to make KYC checks effectively useless by creating convincing synthetic ID images.

KYC, short for “Know Your Customer,” is the process banks, fintech companies, and other financial institutions use to verify their clients’ identities. During KYC authentication, individuals typically confirm their identity with “ID images,” such as selfies taken while holding an identity document, which are cross-verified against records on file. Platforms that rely on ID images for security validation include the cryptocurrency exchange Gemini and the fintech services Revolut and Wise.

Nevertheless, the emergence of advanced AI technology poses a potential threat to these verification processes.

Recent viral posts on social media platforms like X (formerly Twitter) and Reddit illustrate how individuals could manipulate selfies using readily available software powered by AI algorithms to deceive KYC checks. While there is currently no concrete evidence of widespread exploitation of AI tools to bypass authentic KYC protocols, the ease with which convincing deepfake ID images can be created raises significant concerns.

#### KYC Vulnerabilities

In a standard KYC flow, a customer submits a photo of themselves holding an ID document, such as a passport or driver’s license, to prove both who they are and that the document is in their possession. To catch impersonation attempts, a human reviewer or an algorithm then cross-checks the submitted image against the documents and selfies already on record.
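
To make the cross-checking step concrete, here is a minimal sketch of an automated selfie-vs-ID comparison. It assumes the open-source face_recognition library (built on dlib); the file names and the 0.6 distance threshold are illustrative, and real KYC vendors use proprietary, more robust pipelines.

```python
# A minimal sketch of an automated selfie-vs-ID cross-check.
# Assumes the open-source face_recognition library (built on dlib);
# file names and the 0.6 distance threshold are illustrative only.
import face_recognition

def selfie_matches_id(selfie_path: str, id_photo_path: str,
                      tolerance: float = 0.6) -> bool:
    """Return True if the face in the selfie matches the face on the ID photo."""
    selfie = face_recognition.load_image_file(selfie_path)
    id_photo = face_recognition.load_image_file(id_photo_path)

    selfie_encodings = face_recognition.face_encodings(selfie)
    id_encodings = face_recognition.face_encodings(id_photo)

    if not selfie_encodings or not id_encodings:
        return False  # no detectable face in one of the images

    # Compare the first detected face in each image by embedding distance.
    distance = face_recognition.face_distance([id_encodings[0]],
                                              selfie_encodings[0])[0]
    return distance <= tolerance

# Example usage (hypothetical file names):
# print(selfie_matches_id("submitted_selfie.jpg", "passport_scan.jpg"))
```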

Historically, ID image verification has never been foolproof, with counterfeit IDs and manipulated selfies circulating for years. However, the advent of AI introduces new possibilities for fraud.

Online tutorials show how tools like Stable Diffusion, a freely available image generator, can be used to produce synthetic portraits of a person against an arbitrary background, such as a living room. With further tweaking, an attacker can render the subject so that they appear to be holding an ID document, and a genuine or fabricated credential can then be composited into the image with standard graphic-editing software.

The integration of AI is poised to accelerate the adoption of decentralized ID systems and robust cryptographic measures.

> Explore this Reddit “verification” post and an ID crafted using Stable Diffusion. Reliance on applied cryptography will only increase as traditional visual verification methods become less reliable. [Link to post]
>
> — Justin Leroux (@0xMidnight), January 5, 2024
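
As a rough illustration of what “applied cryptography” could mean here, the sketch below verifies a digitally signed credential instead of trusting a photo. It assumes Python’s `cryptography` package and an Ed25519 key pair; the issuer, payload format, and names are hypothetical simplifications, not any particular decentralized-ID standard.

```python
# A rough sketch of credential verification by digital signature rather than
# by visual inspection. Assumes the `cryptography` package; the issuer,
# payload format, and key handling below are simplified and hypothetical.
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# In practice the issuer (e.g. an ID authority) holds the private key and
# publishes the public key; here we generate both for the demo.
issuer_private_key = ed25519.Ed25519PrivateKey.generate()
issuer_public_key = issuer_private_key.public_key()

# A credential is just bytes that the issuer signs.
credential = b'{"name": "Jane Doe", "dob": "1990-01-01", "doc": "P1234567"}'
signature = issuer_private_key.sign(credential)

def verify_credential(payload: bytes, sig: bytes,
                      public_key: ed25519.Ed25519PublicKey) -> bool:
    """Return True only if the signature is valid for this payload and key."""
    try:
        public_key.verify(sig, payload)
        return True
    except InvalidSignature:
        return False

print(verify_credential(credential, signature, issuer_public_key))         # True
print(verify_credential(credential + b"x", signature, issuer_public_key))  # False
```

Unlike an ID selfie, a forged or altered credential fails verification no matter how convincing it looks.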

Achieving convincing results with Stable Diffusion requires multiple photos of the target subject and some additional tooling. A Reddit user going by _harsh_ has shared a guide for creating doctored ID selfies, a process that typically takes one to two days to yield a convincing result, as reported by TechCrunch.

Still, the barrier to entry has dropped sharply. Crafting an authentic-looking ID image with accurate lighting, shadows, and backgrounds once required sophisticated software and editing skills; that is no longer the case.

Feeding deepfake KYC images into applications is also simpler than ever. Virtual-camera software can present any image or video file as a live camera feed, tricking online verification programs, and Android apps running in desktop emulators such as BlueStacks can likewise be made to accept manipulated images in place of a genuine camera stream.

#### Escalating Risks

Some platforms add “liveness” checks as an extra layer of identity verification. These typically require the individual to record a short video in which they perform actions such as turning their head or blinking to show they are physically present.
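
To illustrate the kind of signal a simple liveness check might rely on, here is a toy sketch based on the eye aspect ratio (EAR), which drops sharply when the eyelids close, so blinks show up as short dips below a threshold. The landmark-extraction step is omitted, and the names and threshold values are illustrative rather than drawn from any specific vendor’s system.

```python
# Illustrative only: a toy eye-aspect-ratio (EAR) blink counter of the kind a
# naive liveness check might use. Extracting eye landmarks from video frames
# is assumed to happen elsewhere; thresholds below are illustrative.
import math
from typing import Sequence, Tuple

Point = Tuple[float, float]

def eye_aspect_ratio(eye: Sequence[Point]) -> float:
    """EAR from six eye landmarks ordered p1..p6 (corners, then lid points)."""
    def dist(a: Point, b: Point) -> float:
        return math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

def count_blinks(ear_per_frame: Sequence[float],
                 threshold: float = 0.21,
                 min_consecutive: int = 2) -> int:
    """Count blinks as runs of frames where EAR stays below the threshold."""
    blinks, below = 0, 0
    for ear in ear_per_frame:
        if ear < threshold:
            below += 1
        else:
            if below >= min_consecutive:
                blinks += 1
            below = 0
    return blinks

# Synthetic EAR trace: open eyes (~0.3) with two brief closures (~0.1).
trace = [0.30] * 10 + [0.10] * 3 + [0.31] * 10 + [0.09] * 3 + [0.30] * 5
print(count_blinks(trace))  # -> 2
```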

However, even liveness checks are susceptible to manipulation using advanced AI techniques.

> Breaking news: our latest study is out! We identified a significant vulnerability to real-time deepfake attacks in ten prominent biometric KYC providers. Your financial institution, insurance provider, or healthcare service could be at risk too.
>
> — Sensity AI (@sensityai), May 18, 2022

According to Jimmy Su, Chief Security Officer at the cryptocurrency exchange Binance, today’s deepfake tools can already defeat liveness checks, even those that demand real-time actions such as head movements. This underscores the risk that KYC procedures, never perfect security measures to begin with, could be rendered obsolete by AI-driven manipulation. Su suggests that while deepfake images and videos can fool many reviewers today, verification providers will have to adapt their methods soon.

Last modified: January 10, 2024