The case against Lethbridge football coach Larry Warner marks a disturbing first in Canadian legal history. When RCMP officers knocked on Warner’s door last Thursday, they weren’t investigating conventional child exploitation materials, but rather AI-generated images allegedly created using the coach’s personal computer.
Warner, 43, who has coached the Lethbridge Cougars youth football program for eight years, now faces three counts of possessing child pornography under Section 163.1(4) of the Criminal Code. The charges stem from digital files discovered after police executed a search warrant following a cybertip from the National Center for Missing and Exploited Children.
“This case presents novel questions about how our laws apply to artificial intelligence,” said Crown prosecutor Marie Lapointe in a brief statement outside the courthouse. “But make no mistake—these images, though artificially created, still represent the sexual exploitation of children and cause real harm.”
The investigation began when automated content detection systems flagged suspicious activity on Warner’s IP address. Upon examining his devices, investigators discovered what appeared to be AI-generated images depicting minors in sexually explicit situations.
Dr. Emily Zhao, digital forensics expert at the University of Alberta, explained the technical challenges in an interview. “Current AI detection tools can identify synthetic images with about 85% accuracy, but the technology evolves rapidly. What makes this case particularly concerning is the allegation that real children’s faces may have been incorporated into the generated content.”
Warner’s defence lawyer, James Milligan, has questioned whether AI-generated images legally constitute child pornography, since no actual children were directly photographed. “We need to carefully consider the boundaries of the law in this new technological landscape,” Milligan told reporters.
The Criminal Code defines child pornography as including “a photographic, film, video or other visual representation… that shows a person who is or is depicted as being under the age of eighteen years and is engaged in or is depicted as engaged in explicit sexual activity.” Legal experts note the definition’s deliberate breadth.
“The language about ‘depicted as being under eighteen’ was specifically included to cover digitally created imagery,” said Professor Leslie Ramirez, who specializes in cyberlaw at McGill University. “Parliament anticipated these kinds of technological developments, even if not specifically AI, when drafting these provisions.”
The Lethbridge Cougars organization immediately suspended Warner pending the outcome of the case. In a statement, the organization expressed shock and emphasized their commitment to child safety, noting all coaches undergo background checks.
For families in the tight-knit football community, the news has been devastating. I spoke with several parents whose children were coached by Warner. Most requested anonymity, but all expressed a mix of betrayal and concern.
“My son idolized Coach Larry,” said one mother, who asked not to be identified. “How do I even begin to explain this to a 12-year-old?”
The Alberta Ministry of Children’s Services has established a support line for affected families, and the Lethbridge School Division has brought in additional counselors to help students process the news.
Warner was released on strict bail conditions, including surrendering all electronic devices, abstaining from internet access, and having no contact with anyone under 18. His next court appearance is scheduled for November 28.
This case raises profound questions about how Canadian law will address AI-generated illegal content. The Canadian Centre for Child Protection has been warning about this emerging threat since 2021. In their report “Artificial Harms,” they documented a 31% increase in cases involving AI-generated child sexual abuse material.
“AI tools have dramatically lowered barriers to creating this content,” explained Dr. Samantha Wright of the Canadian Centre for Child Protection. “What once required technical skill now only needs access to publicly available AI image generators and basic prompting abilities.”
Police agencies nationwide are rapidly adapting their investigative techniques. The RCMP’s National Child Exploitation Crime Centre has established a specialized AI task force and partnered with technology companies to improve detection methods.
Legal scholars are debating whether current laws sufficiently address the nuances of AI-generated content. While Canadian courts have previously ruled that virtual child pornography falls under criminal prohibitions, the specific questions around AI creation and the potential incorporation of real children’s features present new territory.
I reviewed Warner’s bail conditions document, which included the unusual stipulation that he cannot possess or use any AI image generation tools. The court ordered the surrender of all cloud storage credentials and implemented monthly device checks.
As this case moves through the justice system, it may establish important precedents for how Canadian law handles AI-generated illegal content. For now, the Lethbridge community is left grappling with both the personal betrayal and the broader implications of technology’s dark potential.
Warner faces up to 10 years imprisonment if convicted. The RCMP has asked anyone with additional information to contact their cybercrime unit.