Technology has transformed how we connect, work, and express ourselves. But it has also created new spaces for abuse. 

One growing concern is the creation of “deepfakes” using AI. These are manipulated images, videos, or audio that make it appear a person has said or done something they haven’t. 

For women and girls, this is more than just a technological problem. Deepfakes are increasingly being used to humiliate and intimidate. They can be made from photos or videos, often taken from social media, and turned into sexualised or otherwise non-consensual material. The result is usually someone’s face, voice, or body used in a way that is extremely violating. 

Perpetrators are using deepfakes as a form of abuse against women and girls. A recent study found that 98% of online deepfake videos were pornographic, and 99% of those targeted women (Security Hero, 2023). Those numbers speak for themselves. 

People who have been subjected to this kind of abuse have described feeling shocked, helpless, and unsure where to turn. Many worry about who has seen the material and how it might change the way others view them. Having something like this happen can be frightening, isolating, and difficult to talk about. 

Earlier this year, Channel 4 journalist Cathy Newman spoke publicly about her experience of seeing a deepfake pornographic video that had been made using her image. She described it as “haunting” and said she felt “utterly dehumanised”. Her response highlights how emotionally harmful this type of abuse can be. 

For many, the harm doesn’t stop online. Deepfakes can affect relationships, employment, mental health, and feelings of safety. Survivors can feel blamed or disbelieved, and may struggle to find support. It’s important to remember the responsibility always lies with the person who created or shared the content, not the person targeted by it. 

As the law works to catch up with the ways perpetrators misuse technology, it’s vital that trauma-informed support is available. Survivors need to be listened to without judgment, believed, and offered clear information about their options. Digital evidence can feel overwhelming, so advocacy and guidance can make a real difference. 

If you are being subjected to intimate image abuse, help is available.

The Cyber Helpline is a free, confidential helpline for anyone who has been a victim of cybercrime. It helps individuals contain, recover, and learn from cyber attacks by linking them with cyber security experts who provide relevant advice and guidance. Their chatbot and team of volunteer cyber security experts will talk in a language you understand and can advise you across all cyber security scenarios.

The Revenge Porn Helpline provides information and support on getting online images removed.

#16days #NoExcuse and #ACTtoEndViolence