‘Perfect environment’ for exploitation, expert says as luring cases rise

Online child luring cases reported to police increased nearly 20 per cent in 2025 compared to 2024, according to new data from Statistics Canada.

The report highlighted that the total number of cyber-related violations reported to police remained fairly consistent at just over 85,000 cases in both 2024 and 2025 — but “luring a child via a computer” increased from 2,882 in 2024 to 3,456 by the end of 2025.

“When it comes to these types of offences, it is grossly underreported,” said Ritesh Kotak, a cybersecurity and tech analyst, in an interview with CTV News via Zoom on Saturday.

“I think those numbers are significantly higher, and one of the big reasons for that is just the access to the technology, access to platforms, the lack of supervision,” he added. “The lack of safeguards that have been put in place has created a perfect environment for individuals to exploit young children online and that is reflected in the numbers.”

Jacques Marcoux, director of research and analytics for the Canadian Centre for Child Protection, which also runs Cybertip.ca — Canada’s national tipline for reporting online cases of child exploitation — says the number of calls they’ve been receiving has also been increasing year over year.

“It tends to happen from what we see in our statistics, based on thousands of reports, primarily on Snapchat (and) Instagram,” said Marcoux in a Zoom interview with CTV News Saturday.

“Increasingly we’re seeing other platforms like TikTok and also Telegram, and it’s not a surprise, because people who lure children develop these relationships with them over a long period of time and they usually do it cloaked in secrecy, because that’s part of how the grooming tactic works.”

More chat options available for child exploitation

Marcoux says that, according to their reports, the numbers are rising because children are getting smartphones at younger ages and because there are more ways to access online chat groups outside of traditional social media platforms.

“There’s just more platforms that have more functions,” said Marcoux. “Just a couple of months ago, Spotify — a platform that we all think about and use for music — introduced private messaging functions ...

“So a lot of platforms are just rolling out these kinds of seemingly convenient chat functions because there’s just a desire to get more engagement, and because engagement is how they make money,” he added. “So as you introduce these higher-risk functions that allow connections to be made with potentially strangers, that’s also leading to more general risk exposure.”

While some countries have rolled out new legislation to protect children from online predators, some child advocacy groups and law firms have begun targeting social media giants by taking them to court.

“They need to be held liable, in court. If it’s purely regulation, they can kind of work around regulation,” said Matthew Bergman, who has represented thousands of families who claim they were harmed by social media.

“They actually, in our belief, have to bear the financial consequences of their outrageous misconduct, by compensating the victims in court, and we believe that once they have to start doing that, then hopefully their economic incentives will change and they will be designing safer platforms that protect kids, unlike the ones that exist today.”

Bergman has also led a landmark legal effort to classify social media platforms as “defective products.”

Earlier this year, he secured a major win when a Los Angeles jury found YouTube and Meta were negligent and had created addictive products that caused harm, resulting in $6 million being awarded to his client.

“We think that moral outrage would have worked by now if it was going to work,” said Bergman. “Public humiliation would have worked if it was going to work, given the amount of press (reports there has been) on how these platforms exploit kids. We think the only thing that’s going to make a change is changing their economic calculus. If you have them by their pocketbooks, their hearts and minds will follow.”

Push for new legislation

Recently, Australia, France and Türkiye have passed legislation imposing minimum-age requirements for social media use, with other countries eyeing similar laws.

Canada introduced a bill in 2024, called the Online Harms Act, that would regulate harmful online content and protect children, but that bill died in early 2025 when then-Prime Minister Justin Trudeau announced his resignation and prorogued Parliament.

“The government has publicly expressed, (and) has signaled strongly that they will soon table a new bill,” said Marcoux. “Right now, there are consultations happening over what this new bill should look like in light of things like the advent of (artificial intelligence) and all these other tools, and whether or not the original bill, as it was proposed, was sufficient.”

Marcoux added that his group would like to see a broader scope in the new bill that would cover a range of companies, instead of just traditional social media companies like Meta — since a lot of harm is happening on platforms that Canadians may not consider or may not have ever heard of.

“Another one is ensuring that private messaging functions or services also have obligations,” he said. “So for example, if you offer a private messaging service like within Snapchat or WhatsApp, there should be some design requirements like having reporting tools, parental controls, potentially the ability to block a user, basic common sense things we can all get on board with.”

He said there is also support for potentially moving forward with a minimum age for social media use, following the lead of Australia, France and Türkiye.

On Monday, a rally led by Children First Canada will call on federal leaders to reintroduce online safety legislation.