AI-powered voice cloning has leapt from science fiction to everyday reality, bringing with it groundbreaking opportunities and unsettling risks.
“Just a few seconds of audio is now enough to create a near-perfect copy of someone’s voice,” according to Nicolene Schoeman-Louw, Managing Director of SchoemanLaw Inc.
“It’s astonishing technology, but also incredibly dangerous when misused.”
From celebrity deepfakes to fraudsters mimicking CEOs, the technology is spreading faster than the law can keep up, Schoeman-Louw warned.
Voice cloning uses artificial intelligence and machine learning to create synthetic speech that’s almost indistinguishable from the real thing.
While businesses are tapping it for marketing, customer service, and accessibility solutions, bad actors are using it for scams, defamation, and impersonation schemes.
“Voice is personal, it’s part of your biometric identity,” Schoeman-Louw explained.
“But right now, there’s very little stopping someone from taking a short recording of your voice and using it for purposes you never agreed to.”
There’s no dedicated “voice cloning law” in South Africa yet, but several existing frameworks can offer partial protection, according to Schoeman-Louw.
Data Privacy (POPIA)
"Your voice can be classified as biometric data,” Schoeman-Louw said.
“Under the Protection of Personal Information Act, using someone’s voice without consent can be unlawful. Consent isn’t optional, it’s a legal requirement.”
Digital Takedowns (ECTA)
If a cloned voice is used in online content, victims can request its removal under the Electronic Communications and Transactions Act. But enforcement can be slow and uneven, she cautioned.
Defamation and Misrepresentation
“If a cloned voice damages your reputation or falsely links you to a product or statement, you can sue,” Schoeman-Louw noted.
This falls under the common-law categories of defamation or “passing off”.
Fraud and Cybercrime
Using cloned voices for scams, like tricking a bank into releasing funds, could lead to criminal charges.
“The problem,” she warned, “is that cybercriminals often operate across borders, making prosecution complicated.”
Schoeman-Louw urged businesses to act proactively:
Secure Written Consent: Never use someone’s voice, even internally, without clear, documented permission.
Update Contracts: Include specific clauses about AI use and voice/image rights.
Train Teams: Teach employees to verify voice-based instructions, especially for financial or sensitive requests.
Adopt AI Ethics Policies: Don’t wait for the law to tell you what’s ethical. Set your own guardrails now.
Change is coming, slowly. In June 2025, Denmark moved to become the first country to give individuals copyright over their faces and voices, meaning unauthorised AI clones could be treated as direct copyright violations.
Schoeman-Louw believes similar reforms are inevitable elsewhere.
“South Africa and many other jurisdictions need to update their laws,” she said.
“Until then, businesses have to navigate this space carefully, because one bad deepfake can destroy trust overnight.”