AI Voice Cloning: When It's Useful, When It's Not, How We Use It
Voice cloning is one of the most powerful — and most ethically loaded — tools in the AI production stack. Here's how we decide when to use it, how we get consent, and what guardrails we put in place.
The four uses we actually approve
We use voice cloning for four specific things, with a clear consent paper trail for each:
- Cloning the client's own voice for content they create themselves but don't have time to record. The voice belongs to them, the content is theirs, and the deletion right is theirs.
- Cloning a hired voice actor's voice under a contract that explicitly licenses cloning, with usage caps and revocation rights.
- Synthetic voices that don't claim to be anyone real — a "narrator" or "host" character with a unique synthetic identity used consistently across a brand.
- Voice cloning for accessibility — restoring or extending the voice of someone who has lost it, with their consent or their family's.
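The "consent paper trail" above can be kept as structured data rather than loose documents. Here is a minimal sketch of what one consent record might look like; the field names and the `ConsentRecord` class are our illustration, not a standard schema:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ConsentRecord:
    """Hypothetical paper-trail entry for one approved voice clone."""
    voice_owner: str
    use_case: str                # which of the four approved categories applies
    approved_scopes: list        # what the cloned voice may be used for
    expires: date                # consent is time-bound, not perpetual
    revocable: bool = True       # the owner can revoke at any time

# Example: a client licensing their own voice for their own content
record = ConsentRecord(
    voice_owner="Jane Client",
    use_case="client's own content",
    approved_scopes=["podcast narration", "course videos"],
    expires=date(2026, 12, 31),
)
```

Keeping the record machine-readable makes the later audit and deletion steps straightforward: revocation is a field flip plus model deletion, not an email archaeology project.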
What we won't do
- Clone a public figure's voice, even for satire — risk-reward is too lopsided.
- Clone a deceased person's voice without explicit estate authorization.
- Produce content where a real person's cloned voice says things they would object to.
- Build voice clones for impersonation, scams, or manipulation, ever.
The workflow
For an approved cloning project, our standard workflow is:
- Written consent and scope — what the voice will be used for, what it won't, how long, and the revocation process.
- Reference recording session — 5-10 minutes of clean audio in a quiet room, varied content (declarative, conversational, emotional).
- Train the clone — typically takes minutes with current tools.
- A/B test — generate 5-10 sample lines and have the voice owner approve the quality and accuracy.
- Production use — generate content with the clone, with the owner reviewing key deliverables.
- Audit log — every generated audio file is logged so the owner can review what was made on their behalf.
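The audit-log step above is simple to implement in practice. This is a sketch of one way to do it, assuming an append-only JSON Lines file; the `log_generation` function and its field names are hypothetical, not a description of any specific tool:

```python
import json
from datetime import datetime, timezone

def log_generation(log_path, voice_id, script_excerpt, output_file):
    """Append one audit entry per generated audio file (hypothetical schema)."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "voice_id": voice_id,
        "script_excerpt": script_excerpt[:80],  # enough context for owner review
        "output_file": output_file,
    }
    # JSON Lines: one self-contained entry per line, trivially appendable
    with open(log_path, "a") as f:
        f.write(json.dumps(entry) + "\n")
```

An append-only text format means the voice owner (or their lawyer) can review everything ever generated with a plain text editor, with no vendor tooling required.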
The honest tradeoff
Voice cloning saves enormous amounts of time. It also creates a real artifact — a model that exists, that could be misused, that someone might want to delete later. We treat that artifact with the same seriousness as any other identifiable data.
If your AI vendor doesn't have clear answers about consent, deletion, and audit, walk away from the engagement. The tools are widely available; integrity isn't.
Gen Art Studios
AI-powered creative studio building apps, videos, music, and marketing assets.