Real Models, Fake Models, Same Old Lie
Advertising has always lied politely.
We cast people who don’t use the product, shoot them “living” in staged homes or aspirational settings, and pay them for their complicity in our deception.
Brands hand cash to influencers to perform enthusiasm on cue, all under a “make it feel authentic” brief.
But swap the human for a synthetic person and suddenly we’re talking about ethics. AI isn’t where the lie starts. It’s where we finally notice it.
The Fiction We Were Already Buying
When you hire a model, you’re not buying their reality.
You’re renting their surface.
You don’t care if they actually drive the car, sleep on the mattress, or use the banking app. You care that they can convincingly inhabit the brand’s little fiction for 1/125th of a second.
AI doesn’t change that transaction; it just makes it more obvious.
Where the Line Actually Is
The honest starting point is admitting that the image was never a documentary in the first place. So the ethics can’t live in whether a pixel was captured by a sensor or baked by a GPU.
The questions that matter:
Consent – Are we using any real person’s likeness without permission? This is where AI can be genuinely violent: deepfakes, stolen faces, forced performances that never happened.
Harm – Does the image materially hurt someone in the real world? Fake testimonials, fabricated “doctors”, false safety or health claims, body ideals dialled past human. If it changes what someone does with their money, health, or safety, it can’t be a casual fiction.
Truthfulness about facts, not pixels – Did this thing happen? Is that a real endorsement? Are we implying a real-world guarantee? Those answers need to be true regardless of how the picture was made.
Do We Have To Say “This Was AI”?
Sometimes, yes. Often, no.
If an image is used as evidence – news, politics, testimonials, scientific claims – viewers deserve to know they’re looking at a synthetic scene. In those contexts, hiding AI is fraud, not art.
Most brand imagery isn’t evidence. It’s theatre.
We don’t publish which walls were painted versus comped, which skies were replaced, or how many layers of retouching sit on the talent’s skin. Expecting “AI-generated” labels on every composite image while ignoring the rest of the illusion is a strangely selective morality.
A more honest position:
We disclose when the synthetic nature changes the stakes for how the audience should interpret what they’re seeing.
We don’t pretend commercial images are neutral documentary records.
We treat AI as part of the same family as VFX, illustration, and 3D – not a separate moral category.
The Real Threat Isn’t Pretty Fake People
The anxiety around AI models is really about something else: the sense that our grip on “real” has slipped.
It’s unsettling to realise the guy in the car ad never existed, because it forces you to remember that the guy who did exist was probably paid to look like he cared – and you accepted that without blinking. The unease isn’t new; the tool just stripped away the last fig leaf.
If we’re going to be upset, let’s be upset about the right things:
Exploited labour behind cheap, endless content.
Data and likenesses scraped without consent.
Synthetic endorsements dressed up as human conviction.
Those harms land whether the face in the frame is AI, a freelancer, or an influencer doing a “collab”.
What “Ethical” Looks Like When Nothing Is Real
For Nothing Is Real, the working rules are simple:
Fiction is allowed; harm isn’t – Invent as many uniforms, faces and cities as you like. Just don’t lie about material facts.
No stolen selves – No deepfakes of real people, no copying specific faces, no resurrecting the dead for a fee.
Own the manipulation – Internally, everyone in the chain knows what’s built and why. Externally, we’re clear when the synthetic nature of an image would change how someone should trust or act on it.
Use the machine to break, not reinforce, the usual biases – If you can conjure anyone, you lose the excuse for why everyone still ends up looking the same.
The point isn’t to drag our work back to some lost reality. It’s to accept that images were always engineered – and then take responsibility for what that engineering does to people outside the frame.
In the end, AI models aren’t a corruption of an otherwise honest industry.
They’re just a sharper mirror.