Healing from Image-Based Sexual Abuse: A Therapist’s Perspective
Therapist Francesca Rossi works with clients whose real images of themselves have been turned into sexually explicit content without their consent. All of her current clients identify as women. This type of image-based abuse, which involves explicit deepfakes generated by artificial intelligence, is frequently perpetrated as a form of harassment and stalking by a current or former intimate partner, or by a known friend, coworker, or neighbor.
Why Safety Planning is Critical for Healing
Rossi says that people need to feel safe in order to restore their sense of reality, and that safety is built through measures both big and small. In the beginning, when the deepfakes are first discovered, Rossi says it’s important to gather trusted loved ones who can offer emotional support, help locate where the deepfakes appear and try to remove them, or help develop a strategy for navigating this complex process, possibly in partnership with law enforcement or attorneys.
Getting and Staying Grounded
While practical steps, like issuing takedown notices, are key to safety planning, both Keisel and Rossi say survivors also benefit from grounding and mindfulness practices that reduce psychological distress and anxiety. Survivors suffer in particular because their nervous systems perceive a constant threat: deepfakes, after all, can re-emerge at any time, or may still exist online or in someone else’s possession.
There is Hope
Rossi and Keisel are among the handful of therapists and professionals in the U.S. who specialize in treating survivors of image-based sexual abuse; such expertise remains uncommon. Rossi says she receives more consultation requests than she can handle, and those requests have increased markedly since AI software and apps capable of producing explicit deepfakes became more widespread late last year.
Conclusion
The journey to healing from image-based sexual abuse is long and rarely predictable, but it is possible. Safety planning, grounding and mindfulness practices, and a trauma-sensitive approach to therapy can help survivors feel more in control and reduce their psychological distress. With the right support, survivors can move past this trauma and live fulfilling lives.
FAQs
Q: What is image-based sexual abuse?
A: Image-based sexual abuse is a form of harassment and stalking in which someone’s intimate images are created or shared without their consent, often using technology such as deepfakes.
Q: What can I do if someone has made a deepfake of me?
A: If you believe someone has made a deepfake of you, contact the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.
Q: How can I stay safe after discovering a deepfake?
A: Staying safe after discovering a deepfake requires a combination of practical steps, such as issuing takedown notices, and grounding and mindfulness practices to reduce psychological distress and anxiety.