Fake Images Cause Real Harm: AI and Deepfakes

With most youth having more time to spend online this summer, one concerning trend to be aware of online is the use of artificial intelligence (AI) to create “deepfakes.” Deepfakes are images, videos, or audio recordings that look or sound completely realistic but have been altered.

In the past year, Cybertip.ca®, Canada’s tipline for reporting the online sexual exploitation of children, has processed close to 4,000 sexually explicit deepfake images and videos of youth.[1]

We encourage parents to learn more about deepfakes and have discussions with their children.

What are the risks with sexually explicit deepfakes?

  1. Deepfake imagery is increasingly being used to sexually exploit and harass people. Although the images or videos are fake, the harm to those victimized is very real. Sexually explicit material is often made without consent. If the content is also made publicly available online, the embarrassment and distress for the person depicted in the fake content is amplified.
  2. Sexually explicit deepfakes are also used to trick youth into sending nudes or livestreaming sexual acts, a scheme known as sextortion. Boys are overwhelmingly extorted for money, while girls are more often extorted for additional sexual images or videos. In either case, it is common for sextorters to threaten to use sexually explicit deepfakes to ruin their victims’ lives, saying they will share the deepfake with the youth’s social media followers if the youth does not comply with their demands for money or additional images.
  3. In Canada, making, sharing, or having sexual material of a minor (under 18 years of age) with a focus or emphasis on the sexual organs or where they are engaged in explicit sexual activity is a criminal offence. This includes the depiction of a minor where the image has been created or altered using AI.

How are youth affected by sexually explicit deepfakes?

Victimization can have a serious impact on youths’ mental health and well-being. They may suffer ongoing anxiety and depression and become preoccupied with searching for their images. There may also be future or ongoing impacts if the material continues to be shared long after it was created.[2]

What can parents do?

  1. Have open conversations about deepfakes with your children to introduce the risks that come with some AI tools. Ask them what they know about deepfakes and the harms that misusing them can cause, then build on their answers.

    Youth benefit from opportunities to solve problems, practice decision making, and apply what they have learned to real challenges. Create “what if” scenarios and discuss how to deal with difficult situations online. For example, you can ask, “What if someone at school showed you a deepfake of a classmate? What would you do? What if someone made one of you?”

  2. Explain youths’ right to safety, privacy, bodily autonomy, and sexual integrity. If someone violates their rights, emphasize that you want to know, so you can help. Remind them you are on their side and will be there to walk with them through tough situations.
  3. Share services that can help if a sexually explicit deepfake has been created or shared.

Next Steps

Visit Cybertip.ca for more information about sexually explicit deepfakes, for guidance on how to support your child if they have been victimized, and to report instances of online sexual exploitation.

Youth who have been sexually victimized online can access support, resources, and help with next steps at NeedHelpNow.ca.

  1. June 1, 2023 – March 31, 2024; n=2,678.
  2. Canadian Centre for Child Protection Inc. (2021). Online Sexual Victimization: Picking up the Pieces. Winnipeg, MB.