
All it takes is one photo from Stories. How AI porn is changing sex, consent and feelings of safety in the online world


Until a few years ago, “porn with your face” belonged in the same box as the dystopia of Black Mirror. Something that’s meant to scare but stays safely on screen. Today, it’s reality. Generative artificial intelligence can make explicit images and videos out of virtually anything – an Instagram photo, a LinkedIn profile picture, a snap a friend sent you in a group chat.

And with that, something fundamental starts to crumble. The line between fantasy and abuse. Between eroticism and digital violence. Between what’s “just on the internet” and what has very real consequences.

AI porn is not just another content category, but a technological shift that affects relationships, dating, work and public space.

What falls under “AI porn” today

Three different worlds meet under one umbrella. It’s good to distinguish them, because each brings different problems – and different victims.

The first level is synthetic erotica without real people. Pictures, videos or “AI models” created completely from scratch. The debate here revolves mainly around stereotypes, addictive behaviour and how personalised erotica changes expectations of sex and intimacy. This is where the “harmlessness” argument often comes up – it’s not about a specific person.

The second level is much darker: deepfake porn, fake pornography in which artificial intelligence takes the face, and often the voice, of a real person and embeds them in explicit content to make it believable. Motivations vary from humiliation and bullying to blackmail and “normal” non-consensual pornographic consumption. But the impact on the victim is similar: loss of control over their own identity.

The third category is so-called “nudify” or “undressing” apps, tools that turn an ordinary photo into a fake nude. Importantly, these are not fringe tools from the dark corners of the internet. Investigative reporting has repeatedly shown that these services spread through advertisements, commission systems and sometimes through convenient logins using accounts on major platforms. This is what gives them a dangerously familiar, almost legitimate veneer – as if they were a regular application, not a tool for abuse.

But in all three cases, the same thing repeats itself: the more accessible the tools, the less “specialised” evil is needed. It is no longer the domain of hackers and technically skilled insiders. Sometimes all it takes is time, frustration and basic digital literacy.

Why is this happening now?

Deepfakes are not new. But generative AI has done two key things at the same time: it has made production dramatically cheaper – both in time and money – and it has made distribution easier. Social networks, dedicated websites, closed online groups or encrypted chats. Content spreads faster than before and disappears more slowly.

Earlier analyses already pointed out that most deepfake videos online are pornographic and that the vast majority of them are made without consent, primarily at the expense of women. In recent years, another layer has been added. “Nudify” services have ceased to be an experiment and have begun to function as a full-fledged product. And a product, as anyone who has ever worked in marketing knows, tries to reach as wide an audience as possible – even at the cost of circumventing the rules of advertising systems.

There’s one more detail that sounds trivial, but in practice is crucial. AI porn is extremely easy to copy. Deleting one link is often not the end, but the beginning of an endless game of cat and mouse.

Consent 2.0: when it’s no longer just about what happened

Classic revenge porn – the publication of real intimate material without consent – had a terrifying but simple logic: something intimate had to exist before it could be weaponised. With deepfake porn, that logic is reversed. The intimate material doesn’t exist, but it looks like it does. “Proof” can be manufactured after the fact, which puts the victim in the absurd position of having to prove that something never happened. With that comes a new form of vulnerability. It’s not just about reputation, but about feeling safe in one’s own body. The knowledge that someone can steal your identity and insert it into situations you didn’t choose.

When AI porn interferes with work, family and dating

What’s most frightening from a privacy perspective is how easily AI porn can bleed into everyday life. And how quickly a “private thing” becomes a social weapon.

In the era of dating apps, the photo is the basic currency. But it’s also a commodity. The more a person shares – selfies, stories, short videos – the more material someone else has to create a digital double. And when a deepfake emerges, a third player enters the relationship: the algorithm. And with it, doubt about whether you can trust what you see, and sometimes what your partner tells you.

For some professions, a single deepfake can mean years of explaining. Teachers, health professionals, people in the media, public faces, but also anyone working in an environment that runs on gossip. Ironically, the more “serious” an image you have, the more effective the deepfake is – precisely because it shocks and shatters expectations.

Where sexuality is still punished with shame, deepfake porn is the perfect tool. Not because it proves anything, but because it triggers an emotional avalanche. The stares, the talking, the ostracism. And the victim, rather than the attacker, often ends up being treated as the problem.

Blackmail and bullying

False intimacy is the perfect fuel for blackmail. “Pay up or I’ll send it to your family. To your work. To your school.” In this game, what matters is not whether the material is real, but whether it is believable – and whether the victim fears the social consequences.

It is especially painful in the school environment, where the pressure is brutal and the rate of spread is relentless. Analyses from Stanford describe specific cases of students misusing “nudify” apps, and show how unprepared schools and institutions often are.

When it comes to child protection, one thing needs to be said clearly: this is not a fringe issue. The Internet Watch Foundation has long warned of the rapid increase in AI-generated child sexual abuse material and that these outputs are becoming increasingly realistic. The UK, for example, has responded by tightening legislation to penalise not only the distribution of such content but also the very use of AI tools to create it.

Platforms that “merely mediate”

One of the most uncomfortable truths is that AI porn often operates on an infrastructure that we think of as neutral. Login via Google or Apple. Social network advertising systems. Hosting, payment gateways, commission networks.

Investigative journalists have shown that some malicious “nudify” sites used regular logins through major platforms, and that corporate responses came only once the story reached the media.

Meta itself has repeatedly admitted that ads for nudify apps return after deletion. Advertisers are quick and inventive in circumventing the rules.

If AI porn is a business, then its supply chain is not made up of pornographic websites alone. It’s an entire ecosystem.


When AI starts stealing from erotica itself

Perhaps the biggest irony of the whole situation: technology that sells itself as the “new erotic creativity” is often based on old exploitation – just in a more modern form.

Performers and creators in the porn industry point out that AI is being used to create scenes and voices without their consent. Yet consent is not a moral bonus in porn, but a basic working condition. And AI can get around it.

Alongside this, a grey area of digital “doubles” is emerging. AI influencers and models based on stolen photos of real people or their visual style. No permission, no profit share, no way to defend themselves.

The topic of digital copies of humans is also addressed outside of porn – with actors, voices, cultural production. But in erotica the problem is more acute because it touches on intimacy and social shame.

Laws are getting tougher, technology is pulling ahead

In the United States, the TAKE IT DOWN Act was passed in 2025, targeting the non-consensual distribution of intimate images, including those created by artificial intelligence. The law requires platforms to respond to reports and remove content quickly.

Meanwhile, Britain has announced that the creation of sexually explicit deepfakes is to become a separate offence, while tightening protection for children and oversight of pornographic sites.

The European Union is taking a different route, introducing a requirement under the AI Act to label synthetic content as artificially created. It is currently working out how these rules will work in practice and how the platforms themselves can enforce them.

So far, the laws are best able to deal with the dissemination and deletion of content. The actual production and re-uploading is much harder to catch. At the same time, the pressure to delete quickly risks leading to excessive blocking of legitimate content.

The Czech context: we are not off the map

The Czech Republic is not an isolated island. The tools and platforms are global and the ‘Czech problem’ is often just the local impact of a global system. Synthetic intimate images fit into a broader framework of image abuse and cyberbullying, which states are addressing in different ways.

To put it plainly: when deepfake porn is created through an English-language website, being Czech will not protect the victim. What will is a quick response from platforms, clear legal pathways and a culture that stops punishing victims with shame.

What’s next

One way forward is quiet normalisation. Jokes, memes, ironic remarks, a dismissive wave of the hand. For those around us, perhaps a relief; for victims, a second blow – this time a social one. When violence becomes fun, empathy disappears.

Another scenario is the transformation of “evidence” into a weapon. Deepfake porn fits wherever doubt needs to be sown. In breakups, in the workplace, in stalking, in small and large revenges. Not because it proves anything, but because it creates shame and chaos.

The third direction leads to technical safeguards. The pressure for authentication will grow: tagging synthetic content, digital signatures, tools to spot fake videos. The direction is clear, but the practice will be painful, slow and full of blind spots.

And then there’s the reality we know from the history of technology. Erotica as an accelerator. From video to the internet to online payments. Even today, it’s often erotic products that show how new technologies are really used – faster and without illusions.

Whether all this leads to a further erosion of consent and a sense of security is no longer a question of technology. It’s a decision for society, which needs to be clear about where the fun ends and the abuse begins.

Source: WIRED – Deepfake Porn Is Out of Control, The Guardian – Inside the deepfake porn community, The Verge – San Francisco sues websites that use AI to undress women, 404 Media – AI ‘Nudify’ Apps Are Advertising on Meta Platforms, Associated Press – Trump signs Take It Down Act.




