OTTAWA — The Liberal government has laid out a blueprint for cracking down on harmful online materials posted to platforms such as Facebook and YouTube, with experts welcoming the move while casting a critical eye on the framework’s effectiveness.
Under the proposed rules, a digital safety commissioner would enforce a new regime that targets child pornography, terrorist content, hate speech and other harmful posts on social media platforms.
Penalties for violating the proposed laws would range up to five per cent of a platform’s gross global revenue or $25 million, whichever is higher. For Facebook, that percentage would translate to as much as $5.4 billion, based on its total revenue last year.
The new regulations would cover all “online communication service providers,” which includes social media and user-generated content sites such as Instagram, Gab and Pornhub but not telecommunications companies like Bell and Rogers or email and text messages sent via WhatsApp or Parler.
The Trudeau government announced in April it would introduce legislation to create a regulator that will ensure online platforms remove harmful content, and now says it plans to present the final framework this fall after public consultation.
“This is really a proposal that is focused on regulating content that is for the most part illegal under Canadian law already,” said Vivek Krishnamurthy, a law professor at the University of Ottawa.
The regime defines five categories of harmful content: child pornography, terrorist content, hate speech, material that “incites violence,” and intimate images shared without consent. The first two carry particularly harsh penalties.
Most large platforms already have policies that claim to meet or exceed these requirements, with some seeking to highlight or remove misleading information — about COVID-19 vaccines, for example.
“I don’t think it’s going to change much for Facebook,” Krishnamurthy said.
Whether the new measures can “clean up” smaller or fringe platforms remains an open question, he added, pointing to smaller pornography sites or Gab — a social media site adopted by the far right — whose appeal lies partly in their laissez-faire approach.
The measures would require flagging mechanisms for harmful material, a 24-hour time frame to respond to flagged posts and avenues of appeal for companies’ decisions on them, as well as regular reports to the commissioner about the volume and type of harmful content.
A new “digital recourse council” would act as a tribunal to handle cases where a platform has turned down both an initial request to remove content and an appeal of the company’s decision. The council could also issue administrative penalties of up to three per cent of a firm’s global revenues or $10 million, whichever is higher. (The five per cent fine kicks in with indictable offences.)
If a platform shows repeated non-compliance with obligations to flag and remove posts related to child pornography or terrorist content, the regulator could recommend to the Federal Court that the site be blocked.
Sam Andrey, director of policy and research at the Ryerson Leadership Lab, welcomed the new regime, saying that more than one in three Canadians report coming across harmful content at least once per week.
“The unique reach and speed of online platforms to facilitate harm call for unique regulatory solutions, and this package puts forward many important elements to meaningfully tackle this challenge,” he said in an email.
However, he said Charter questions of privacy and free expression may come into play when the government considers whether the regime includes platforms that “blur the line between public and private,” whether to expand its scope to other harmful activity such as impersonation, and how proactive the new digital safety commissioner and digital recourse council will be.
New Democrats and Conservatives have questioned why a new regulator is needed to crack down on exploitive material when the Criminal Code already bars child pornography, hate speech and the knowing distribution of illicit images.
“I will await with interest to see what role this online regulator will take and how much teeth this position will be given,” NDP ethics critic Charlie Angus said in an email.
“However, this announcement is cold comfort to the survivors of brutal online sexual exploitation who begged the Liberal government to apply Canadian laws in the allegations regarding Pornhub. An internet umpire for good behaviour is no substitute for enforcing the laws and standing up for the rights of abuse victims.”
Rampant child pornography and exploitive sexual material on platforms such as Montreal-based Pornhub — the world’s largest pornography site — came under increasing scrutiny after a New York Times opinion piece highlighted the problem in December, prompting the company to delete millions of explicit videos.
“Canadians are increasingly concerned social media is being used to spread potentially illegal and abusive content such as hate speech and child sexual exploitative content. We need consistent and transparent rules for how online platforms address hate, incitement of violence and harmful online content,” Heritage Minister Steven Guilbeault said in a statement.
Under the new blueprint, Canada’s spy agency would play a more active role in investigating online harms by obtaining information more quickly with “greater flexibility” following legislative amendments, according to a technical paper from the Heritage Department.
Sites would also have to provide relevant data to police, but only “where it was clearly evident that the material related to the offence was child pornography,” the paper states.
“Those are going to require some careful scrutiny, because they could be a significant expansion of law enforcement,” Krishnamurthy said.
This report by The Canadian Press was first published July 29, 2021.
Christopher Reynolds, The Canadian Press