Canada's digital safety balancing act

There's much to digest in the federal government's long-awaited online harms legislation. 

Bill C-63 is a significant departure from an earlier effort to pass online harms legislation, which died on the order paper when the 2021 election was called.

The new bill has narrowed its scope to focus on the protection of children. It outlines seven categories of online content it seeks to police, including intimate content communicated without consent, defined to include "deepfakes," and content that sexually victimizes a child or revictimizes a survivor. It also updates the hate speech provisions of the Criminal Code and reintroduces a revised version of Section 13(1) of the Canadian Human Rights Act.

The bill would also establish a five-member Digital Safety Commission tasked with drafting and enforcing regulations, and contemplates the appointment of an ombudsperson to support victims and offer guidance to companies captured by the legislation.

David Fraser, a partner with McInnes Cooper in Halifax and a member of the CBA's online harms working group, which is studying the proposed bill, has concerns about the broad powers granted to the Commission to order the removal of harmful content. Speaking on his own behalf, Fraser says the Commission needs only reasonable grounds to suspect that content falls into one of those categories, which is a low bar.

"The consequences for such a breach are astronomical—up to eight percent of a company's global revenue if it gets to the point of charges," Fraser says. "That's over the top." Fraser suggests the Commissioner should be held to a balance of probabilities standard at the very minimum.

He is also concerned that the Commission would have unbridled audit and inspection powers, including the ability to enter premises without a warrant and commandeer computer equipment and employees. "That's ridiculously broad and very intrusive," Fraser says.

The Canadian Civil Liberties Association has also expressed concern that the Commission is granted too much power, from writing up the rules to enforcement, interpretation, judgment and execution.

"There's a lot that is unknown about what they're going to do," says Executive Director and General Counsel Noa Mendelsohn Aviv says. "The one thing we can say is that hate speech online will no longer have the additional eyes of the Attorney General, which is what you have under the Criminal Code."

Emily Laidlaw, the Canada Research Chair in Cyber Security Law at the University of Calgary, was the co-chair of the government's expert advisory group on online safety. She calls C-63 a "huge improvement" over what was proposed in 2021, noting that it shifts to a risk-management approach.

Bill C-63 places a broad obligation on platforms to act responsibly by implementing measures to reduce the risk of users being exposed to harmful content. But while companies' digital safety plans will have to focus on risks and harms, Laidlaw believes platforms should also be required to consider freedom of expression and privacy.

Laidlaw also notes the bill's exclusion of private messaging and gaming companies. She would have at least required such services "to think through these categories of risk and file an additional safety plan."

"But they're not otherwise brought within the mix," Laidlaw says. "Why wouldn't they have to be responsible corporate actors?" 

According to Laidlaw, there was a strong consensus among members of her advisory group on the need for an independent ombudsperson to advocate for victims and support them.

"The idea that victims need someone to go to is incredibly important," she says, noting her approval of placing the Digital Safety Commission "front and centre" to promote fundamental rights. 

Under the bill, a user can flag content they consider to be either sexually victimizing a child or intimate content communicated without consent. The platform then has 24 hours to review the complaint and either dismiss it or block the content, a decision that can be appealed.

Fraser says he's concerned that those reviewing the complaints won't have access to the context around the images. Moreover, some flagged images may not depict a real person at all, as websites can use AI to generate images of people who don't exist.

"Someone using a computer could make the equivalent of a painting, which is not illegal in any shape or form, but if there is any ground to suspect [it was without consent], it gets taken down," Fraser says. "Deep-fake revenge porn is incredibly harmful when it depicts an actual person, and it needs to be gone after, but you need to do it right. I think this is a missed opportunity to come up with a good definition of synthetic intimate images and put it in the Criminal Code."

Fraser also worries about the volume of information companies must file with the Commission and make publicly available to demonstrate their compliance with the new law.

"That's going to be relatively onerous, and it's going to depend on what the cut-off is for the size of the company," Fraser says.

For her part, Laidlaw welcomes the inclusion of hate crime provisions in the bill and says that including the Supreme Court's Whatcott standard for hate speech—to ensure the legislation targets the effects of hate speech, not its content—helps to keep complaints from overwhelming the Canadian Human Rights Tribunal.

Richard Marceau, the vice-president and general counsel for the Centre for Israel and Jewish Affairs, also emphasized the importance of relying on Whatcott's definition of hatred. "The key for us in this aspect is to make sure that nobody can say that it's over-reaching and that by sticking close to Supreme Court jurisprudence, the bar is high enough to protect freedom of expression and at the same time catches the hatred that we want to be caught by this legislation."

Marceau adds, however, that the law could do more to capture "the smaller platforms that people go to when they are banned from the big ones." 

As for reviving a version of the Canadian Human Rights Act's old hate speech provision, Marceau acknowledges the risk of recreating a chilling effect on freedom of expression.

"We need to make sure that this not the case," Marceau says. "I think this is a good start. The balance of freedom of expression while at the same time protecting Canadians from online hate is not an easy one to strike." Marceau adds that the Centre is comparing wording in the new bill to that proposed by Irwin Cotler in his private member's bill in 2015, which it views favourably.

According to Laidlaw, the government is right to bring back an adapted version of Section 13, provided "the standard is high."

"Most provincial human rights bodies will not hear complaints about online hate because they say it is federal jurisdiction because it's on the internet," says Laidlaw. "That means that, as it stands, Canadians have zero avenues of recourse for hate speech beyond going to the police."

"It doesn't take away anything from the hate speech provisions of the Criminal Code, but law enforcement has finite resources," Marceau says. "To have another tool in the toolbox to combat online hate is a good thing, because what happens online doesn't stay online. To have a civil remedy as opposed to a criminal remedy to combat online hate at a time where levels of antisemitism are reaching terrifying levels is, for us, a good thing. That is why we've been advocating for better tools to combat online hate since 2019."

But the reintroduction of the hate speech provision in the CHRA is getting pushback from the CCLA.

Says Mendelsohn Aviv: "It's not clear to us at CCLA that the Whatcott standard is comprehensible, and if it will make it possible to distinguish between what our lawmakers want to criticize and strong political advocacy and emotional debate."