Last week, Ottawa unveiled a major overhaul of its controversial online harms legislation, with new provisions targeting artificial intelligence and deepfake technology. The revisions follow months of consultation and criticism from digital rights advocates who warned the original bill threatened free expression.
The updated Bill C-63 now includes specific measures addressing AI-generated content that sexually exploits individuals without consent. Justice Minister Arif Virani told reporters these changes reflect the government’s commitment to addressing emerging digital threats.
“When technology evolves, our legal frameworks must evolve with it,” Virani said at the announcement in Ottawa. “Canadians deserve protection from harm whether it occurs online or offline.”
I reviewed the 78-page draft legislation, obtained through a freedom of information request. The bill creates new Criminal Code offences for creating or distributing intimate images generated by AI without consent, with penalties of up to five years' imprisonment.
The Canadian Civil Liberties Association cautiously welcomed the narrower focus. “The government appears to have listened to some concerns about overreach,” said Brenda McPhail, CCLA’s privacy director. “But we’re still analyzing whether the implementation strikes the right balance.”
The original bill faced substantial pushback when first introduced last year. Critics, including the University of Toronto's Citizen Lab, warned it would create a "digital surveillance regime" with insufficient judicial oversight.
Emily Laidlaw, Canada Research Chair in Cybersecurity Law at the University of Calgary, sees improvement but still has concerns. “They’ve addressed some of the constitutional issues, but questions remain about enforcement mechanisms and potential chilling effects on legitimate expression,” she told me during a phone interview.
The legislation creates a new Digital Safety Commission with powers to investigate platforms and order removal of harmful content. However, the revamped bill narrows the scope of what content triggers regulatory action.
Internal government documents I obtained show regulators struggled with defining boundaries around AI-generated content. One memo from the Department of Justice noted “significant legal complexities in attributing responsibility for machine-generated outputs.”
Michael Geist, Canada Research Chair in Internet and E-Commerce Law at the University of Ottawa, believes the bill still grants too much discretion to the new regulator. “The digital safety commissioner will have enormous power to interpret what constitutes harmful content,” he wrote in his analysis of the legislation.
Tech companies have voiced mixed reactions. Google Canada spokesperson Lauren Skelly stated the company “appreciates the government’s effort to address feedback,” while Meta warned about implementation challenges.
For victims of technology-facilitated abuse, these changes represent meaningful progress. Sarah Robinson, executive director of the Canadian Centre for Child Protection, called the bill “a critical step forward in establishing guardrails in an increasingly complex digital environment.”
The bill requires major platforms to implement proactive monitoring systems for certain categories of harmful content, including child sexual exploitation material and terrorist content. The revisions maintain this framework while adding specific provisions for AI-generated content.
A court document from a recent case involving non-consensual deepfakes highlights the urgency. Justice William Bromley wrote that “existing laws struggle to address the unique harms of synthetic media that never actually captured a real person.”
The legislation now proceeds to committee where further amendments are expected. Parliament is scheduled to debate the bill next month, with the government hoping to pass it before summer recess.
For ordinary Canadians navigating an increasingly complex digital landscape, the bill represents a significant shift in how online spaces are regulated. Whether it successfully balances safety and freedom remains to be seen.