The Attack

In early 2024, an employee at a Hong Kong finance firm received a message to join an urgent video call. The meeting was supposedly called by the UK-based CFO to discuss a confidential transaction.

When the employee joined, they saw and heard the CFO — along with several other executives they recognized. The video quality was good. The voices sounded right. The participants discussed a sensitive deal and instructed the employee to wire funds.

The employee initiated 15 separate wire transfers totaling US$25 million.

It was all fake. Every person on that call was an AI-generated deepfake, created using publicly available video from earnings calls and interviews.


Why It Worked

The attack succeeded because it combined convincing synthetic media with classic social engineering. The deepfakes were built from publicly available footage, so the faces and voices matched what the employee already knew. Several recognized executives appearing together reinforced the illusion, and the framing of an urgent, confidential transaction discouraged the employee from verifying the request through another channel.

The lesson: in the AI era, seeing is no longer believing. Video and audio can be faked convincingly, so organizations need verification processes that don't rely on visual or audio confirmation alone.
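One way to verify identity without trusting what you see or hear is a challenge-response check over a pre-shared secret. The sketch below is illustrative only — the function names and the secret are hypothetical, and a real deployment would manage secrets properly (e.g., via a hardware token or enterprise MFA) rather than a hard-coded value.

```python
import hashlib
import hmac
import secrets

def make_challenge() -> str:
    """Generate a fresh random challenge for this request."""
    return secrets.token_hex(16)

def respond(shared_secret: bytes, challenge: str) -> str:
    """Compute the response only someone holding the secret can produce."""
    return hmac.new(shared_secret, challenge.encode(), hashlib.sha256).hexdigest()

def verify(shared_secret: bytes, challenge: str, response: str) -> bool:
    """Check the response in constant time to avoid timing leaks."""
    expected = respond(shared_secret, challenge)
    return hmac.compare_digest(expected, response)

# Hypothetical example: the employee issues a challenge over the call;
# only someone holding the pre-shared secret (the real CFO) can answer.
secret = b"pre-shared-out-of-band-secret"
ch = make_challenge()
assert verify(secret, ch, respond(secret, ch))
assert not verify(secret, ch, respond(b"attacker-guess", ch))
```

The point is not this particular construction but the design property: the proof of identity travels over a channel the attacker cannot forge by imitating a face or a voice.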

What Could Have Prevented This

Simple process controls would likely have stopped the fraud: calling the CFO back on an independently known number, agreeing on code words or challenge questions in advance for sensitive requests, and requiring dual approval for large wire transfers. Each of these verifies identity through a channel the attackers did not control.

Source: SensCy 2025 Threat Intelligence Report / CNN

Prepare your team

We'll discuss your verification procedures and identify gaps.
