The Alberta Law Enforcement Response Team (ALERT) has charged a Calgary teen who allegedly used artificial intelligence to create materials related to the sexual abuse and exploitation of children.
Investigators say artificial intelligence technology was used to sexualize photos of teenage girls who attended several high schools in the Calgary area.
However, ALERT is not naming the schools to protect the identities of the victims.
Staff Sgt. Mark Auger of ALERT's Internet Child Exploitation Unit (ICE) says the investigation began in October 2025 after ICE received information about child sexual abuse material being uploaded to a social media platform.
On November 13, ICE officers, with the assistance of Calgary Police, executed a search warrant at a home in Calgary. During the search, two mobile phones, a tablet and a laptop were seized as possible evidence.
The 17-year-old, who cannot be identified under the Youth Criminal Justice Act (YCJA), now faces charges of producing, possessing and distributing child sexual abuse and exploitation material, as well as criminal harassment.
When asked how the images were altered or “sexualized,” as he described them, Auger said: “If I were an offender, I would take whatever photo I wanted, whether from TikTok, Instagram or any other website, and then use software to strip the clothing from the image. The AI produces a very accurate estimate of your body type and skin colour, making it almost impossible to tell it is not a real nude image, with your face attached.”
Staff Sgt. Auger described the alleged crimes as “the most extreme version of weaponized bullying” against a young, developing child.
“Our biggest takeaway is that we need people to understand that this is not a joke, this is not a prank, this is the most extreme form of bullying and a criminal offense,” Auger added. “We will take steps to stop this behavior.”
He said such actions could have a “horrible impact” on victims.
“Teenagers are going through probably the biggest changes of their lives in terms of self-esteem, body image and social media, and this is, as I said, the most extreme version of weaponized bullying against a young, developing person. That's why we provide a lot of support from the beginning and afterwards. Our investigators are now in contact with all identified individuals and their families to offer that support.”
The defendant, who appeared in court on Wednesday morning, was released under numerous court-imposed conditions, including no contact with anyone under 16 unless related to work or school, and no electronic devices capable of accessing the Internet except for work or school purposes.
His next court appearance is scheduled for Jan. 8.
Police are also asking community members to help support victims by not sharing such images, refusing to condone such behavior, and reporting these types of images or this type of behavior to police.

Provinces including British Columbia, Manitoba and Quebec have laws that criminalize the non-consensual posting or distribution online of fake pornographic or intimate images generated by artificial intelligence.
Alberta legislation that bans the non-consensual publishing and sharing of intimate images, passed in 2017, makes no mention of images created or altered by AI.
AI crime alerts on the rise
The Alberta case comes amid growing warnings from law enforcement about the dangers posed by artificial intelligence.
The RCMP said last year that a “wave” of AI-generated child sexual abuse material is emerging as the technology rapidly improves and criminals gain access to generative artificial intelligence tools.
After a 12-year-old British Columbia boy who was a victim of online sextortion died by suicide in 2023, experts told Global News that AI is further exacerbating the mental health “epidemic” caused by similar cases involving minors.
That same year, the U.K.-based Internet Watch Foundation warned that AI-generated deepfake images could overwhelm child exploitation investigators without government action.
In 2023, a 61-year-old Quebec man was jailed for using artificial intelligence to create deepfake child pornography videos. No real children were depicted, but Steven Larouche violated a law that prohibits any visual depiction of a person under 18 engaged in explicit sexual activity.
Provincial Court Judge Benoit Gagnon wrote in his ruling that he believes this is the first case in the country involving deepfakes of child sexual exploitation.
Last summer, a Lethbridge youth soccer coach was charged, prompting warnings about the use of AI to create child pornography.

Both the RCMP's National Cybercrime Coordination Centre and the Canadian Centre for Cyber Security, in its latest national threat assessment, have reported a sharp rise in AI-assisted crimes that have caused harm or “near harm” since the technology entered the mainstream in 2022.
Evan Solomon, Canada's first minister in charge of artificial intelligence, is expected to introduce new legislation partly addressing online harms.
In late October, Solomon said his upcoming privacy bill could include age restrictions on access to artificial intelligence chatbots to protect children. His spokesman said the bill would be introduced in early December.
U.S. lawmakers are also seeking to crack down on the harm artificial intelligence causes to children, after cases involving minors who allegedly died by suicide after being encouraged by chatbots, or after engaging in sexually charged conversations with so-called “companion” apps such as Character.AI.
U.S. President Donald Trump signed a law this spring that criminalizes fake, non-consensual pornography and requires online platforms to remove such material within 48 hours of it being reported. Several states have passed similar laws.
— With files from The Canadian Press.
© 2025 Global News, a division of Corus Entertainment Inc.






