A man accused of distributing a fake nude image of his wife via social media did not commit a crime, a Burlington, Ont., judge has ruled, pointing to what experts say is a gap in the Criminal Code that does not account for deepfakes.
In his decision, released last month, Ontario Court Justice Brian Puddington said that while that conduct may be “morally reprehensible and, frankly, obscene,” it does not amount to a crime due to the wording of the offence in the Criminal Code.
The accused in this case was charged with publishing, distributing or transmitting an intimate image of a person after photos of his wife were shared with an unknown man through Snapchat, the judge said.
Two of the photos showed the victim in “a state of undress,” Puddington noted, including one in which she was wearing only a bra and another that appeared to show her completely naked with her breasts exposed.
During the trial, it was established that in the nude photo, her head had been digitally placed on a naked body that was not her own.
Neither photo met the definition of intimate images under the Criminal Code, the judge said.
There was not a sufficient level of nudity in the first picture, Puddington said, and the digitally manipulated image did not actually show the victim’s body.
“None of this is to say that creating and distributing these fake images is not morally reprehensible and, frankly, obscene. It may be that Parliament will turn its mind to criminalizing this conduct in the future,” the judge wrote.
Puddington also referenced updated language that had been proposed as part of the former Liberal government’s Online Harms Act, a bill that died on the order paper when Parliament was prorogued ahead of this year’s federal election.
Included in the definition of “intimate content communicated without consent” was a provision for fake images.
The drafters of the bill said this would cover any image “that falsely presents in a reasonably convincing manner a person as being nude,” explicitly referencing deepfakes.
“This proposed language appears to be, at least in part, an acknowledgment by Parliament that the legislation as currently contained in section 162.1(2) does not address fake images,” the judge added.
Legislative change needed: experts
Andrea Slane, a legal studies professor at Ontario Tech University, said this case highlights the urgent need to update the Criminal Code.
“I think it points to the problem that a lot of us already knew was there, with the way that the criminal offence is phrased, the exact words. People pointed it out as soon as deepfakes or any sort of AI-generated imagery became much more accessible,” she told CP24.
“I think it’s really obvious that there needs to be legislative change in order to clarify that... If anything, this should be a real kind of kick in the pants to make the government actually move forward with reintroducing some of the things that they suggested.”
For its part, the current federal government has promised to move forward with legislation addressing issues that were included in the previous bill, including making the distribution of sexual deepfakes a crime.
“Our government is fulfilling our campaign promise by moving forward with legislation to protect children from online sexual exploitation and extortion, tighten child-luring laws, and increase penalties for the distribution of intimate images without consent,” Lola Dandybaeva, a spokesperson for the Justice Department, said in an email to CP24.
“It will also make the non-consensual distribution of sexual deepfakes a criminal offence.”
While no timeline was provided, she added that the work is “ongoing” and “a priority” for the federal government, which is undertaking consultations to ensure they “get it right.”
“We will have more to say as we finalize the legislation and move to introduce it, and we expect all parties to work together to keep our children safe,” she said.
Slane said a bill specifically addressing the Criminal Code change would be relatively easy and would likely pass quickly with widespread support.
“I think the problem has been that the things like the Online Harms Act, those sorts of things are bigger bills… and so oftentimes, when there’s opposition to legislative change like that, it’s because there are other things in the bill that people are not as comfortable with,” she said.
Federal bill would have closed loophole
Emily Laidlaw, a Canada Research Chair in Cybersecurity Law and an associate law professor at the University of Calgary, told CP24 that a larger bill would need to address the many issues that exist outside the scope of the Criminal Code.
“If the goal isn’t to hold that individual accountable, as in he might face a charge and be incarcerated potentially, then what are the other options? That’s where the Online Harms Act comes in,” said Laidlaw, who previously served as a co-chair of the government’s advisory group on the legislation back in 2022.
“It kind of has a different goal. Its goal is to require that social media companies have to put in place risk mitigation measures when it comes to certain types of content.”
She said the proposal, before the bill died, had been that social media companies would have to take down intimate images shared without consent.
“That would have included deepfakes and AI generated images,” Laidlaw noted.
She added that the country needs a digital safety commission tasked with protecting the best interests of Canadians.
“A lot of people don’t want to go the criminal route to deal with a lot of this. They just want it quickly taken down,” she said.
“A regulator is a really effective way to do that.”