Imagine this: Your daughter is distraught because explicit photos of her are circulating around school. She remembers posing for the photo, but swears she was fully clothed. It’s likely she’s the victim of a deepfake nude, an alarming new trend among teens. But what exactly are deepfake nudes? Here’s what parents need to know about the implications for kids and what they can do about it.
Deepfakes are images or recordings that have been digitally manipulated using artificial intelligence (AI) to make it appear as if someone is doing or saying something they didn’t. The rise of “nudify” apps, AI-powered tools that can transform a clothed photo into a nude, has made it easier than ever to create these misleading and harmful images at the press of a button.
Nudify apps have soared in popularity in recent years, and teens are jumping on the bandwagon. They might ask a peer to pose for a clothed photo or pull images from Instagram. Armed with the innocent image, they can go to any number of free sites and quickly and easily generate nude photos — and even pornographic videos. These altered images are then distributed at school or through group chats.
In some cases, they might even make their way onto the broader internet. While children of any gender could be involved, the kids making the explicit AI-generated images are predominantly boys, and 99% of the victims are girls.
Deepfake nudes are a recent phenomenon, and the full extent of their impact is still being understood. Here’s what we do know about the implications:
Explicit AI-generated photos and videos can have long-lasting negative effects on victims. Many report feeling violated, traumatized, and unsafe at school. A school’s ineptitude or inaction in dealing with the incident can make things worse. And once the images are out there, they become part of the victim’s digital footprint and could impact their future college and job prospects.
The consequences for kids who make or share deepfake nudes are serious. Child sexual abuse material generated with AI is illegal under federal law. The picture at the state level is murkier, especially when both the perpetrator and the victim are minors.
In some cases, kids who make and distribute explicit deepfakes have been charged with misdemeanors — and even felonies. As schools scramble to figure out how to address this type of content when it shows up at school, some kids have faced suspensions or expulsions.
As the saying goes, prevention is the best medicine. Here are some things parents can do now to address explicit deepfakes with their child before they become a problem:
Open communication with your child about the risks of deepfake nudes is one of the best things you can do. Here are some talking points:
Talk to your child (early and often) about consent. Make it clear this is a non-negotiable in your family: creating or sharing explicit images of someone without their consent is never okay.
Teach your children digital literacy by emphasizing the importance of thinking before they share. If they suspect an image or video of a classmate might be a deepfake, or could negatively impact the person, they shouldn’t share it.
Because their moral reasoning skills are still developing, adolescents may get involved with explicit deepfakes without fully understanding why what they did was wrong. They might even think it’s a funny prank. Explain that the situation can quickly spiral out of control and have serious consequences for them and for the person in the picture or video.
Being aware of what your child is doing online is also important to prevent and address the risks of deepfakes. Make a point to do regular tech check-ins where you sit with your child, look at their social feeds and messaging platforms, and talk about what you see.
A parental monitoring app like BrightCanary scans your child’s online activity and alerts you to any issues. That way, if your child searches for a nudify site or if altered images show up on their phone or in their DMs, you’ll find out — and can hopefully head it off before the situation gets worse.
Deepfake nudes are a growing problem for teens, and parents must take proactive steps to prevent their child from being involved. It’s important to talk to your child about the risks of deepfakes and teach them about consent and digital literacy. Parents should also stay involved in their child’s online activity to help guide them and address any issues that arise. Regular tech check-ins and a child safety app like BrightCanary are useful monitoring tools.