These laws, no matter how commonsense they may seem, will still face an uphill battle. In May, Elon Musk's platform X sued the state of Minnesota over a law prohibiting the creation of deepfakes to influence elections, arguing that it violates freedom of speech. In August, X won a similar lawsuit against the state of California over its own deepfake ban.
For Histoliese, the event caused immeasurable pain. She has lost trust in others. She is afraid of how this might affect her future and career. She does have a “next,” though. She will continue to be the sister and friend she has always been. She is looking forward to going to work tomorrow. She is teaching her pit bull puppy, aptly named Olivia Benson, how to give a high five. And despite it all, “I love people,” she says, before pausing. “I think I still do.”
Some practical steps to take if you are a victim of deepfakes
The damage and reach of deepfakes are impossible to measure, in part because apps allow users to create them in seconds. But according to the experts we spoke to for this article, there are practical steps you can take if you find yourself a victim.
Call a loved one.
“I recommend calling a friend and having them help you through the process,” Martone says, noting that it can be difficult to look at the images over and over again alone.
If you are not comfortable talking to a friend or loved one, you can call RAINN's National Sexual Assault Hotline at 800-656-HOPE (4673) or chat with RAINN online.
If you do want to pursue a potential lawsuit, Martone recommends asking a friend or loved one to document every deepfake incident so the records are easy to hand over to lawyers or police. (One caveat: if the victim is underage, do not screenshot any of the content. Instead, take your phone directly to your local police station to have it cataloged.)
Alert the platform.
The next step is to alert every social platform where the deepfake may have been distributed. On Instagram, tap the image, then “Report,” then “False Information,” then “Altered or Digitally Created.”
Tell your school or workplace.
If you're comfortable doing so, or if you think the deepfake might be spreading at work or school, it's a good idea to talk to HR or an administrator (especially if you're the parent of a minor).
Deepen your media literacy.
Not all AI is bad. Take BitMind, which specializes in the “detection of artificial and synthetic media.” Its founder, Ken John Miyachi, tells Glamour that the program can identify both synthetic and semi-synthetic media that may be difficult for humans to detect, which can help people understand whether what they're seeing is real. You can find more expert advice on spotting disinformation here.
Talk about it.
Megan Cutter, head of victims' services at RAINN, expects change not just from politicians, but from all of us. “How can we as a society discuss sexual violence and different forms of sexual violence, and how can we make communities safer places for survivors to speak out, ask for help, and identify that they may need support?” she says. “[We want to] create that awareness so that when it happens, people know it's not okay and there's something I can do.”
Know that it is not your fault, but that it is violence.
As Cutter says, “Just because someone didn't actually physically touch you, or maybe the image is of your face and not your body, doesn't mean it's not a form of abuse. It's a form of assault. It's a form of sexual abuse. I think it's really important to talk about it openly to help survivors understand that they have options and to find words to describe their experiences.”