
You receive a call from your "boss" requesting an urgent money transfer to a partner. The voice is identical, the speaking style familiar. But it isn't your boss: it's AI.
How Voice Cloning Technology Works
Modern AI voice-synthesis tools such as ElevenLabs and Resemble AI can build a near-perfect clone from as little as 3-10 seconds of sampled speech, pulled from TikTok videos, Facebook posts, or recorded phone calls. Once cloned, the voice can be made to say anything.
Common Scam Scenarios
- Boss impersonation: Calling accounting staff to request an urgent payment to a partner. In one widely reported case, the CEO of a UK energy firm was tricked into transferring $243,000 this way.
- Child impersonation: Calling parents in a crying voice ("I've been kidnapped", "I've been in an accident") and demanding money immediately.
- Friend impersonation: Combined with a hacked social media account, calling to "borrow money" in what sounds exactly like the friend's own voice.
Prevention
- Set a family password: Agree on a secret word known only to the family, and ask for it whenever a caller's identity is in doubt.
- Call back: Hang up and call back using the number saved in your contacts.
- Limit voice sharing: Reduce posting videos with your voice on public social media.
Golden Rule
Any call requesting an urgent money transfer, no matter how familiar the voice sounds, must be verified through a different channel before you act.