Social media’s standard of care
Recent lawsuits in the U.S. have held powerful tech companies accountable for harm. Observers say it’s a harbinger of where things could go here
In the wake of two landmark decisions on social media negligence in the United States, Canadian legal experts say a tide may be turning toward holding some of the world’s most powerful companies accountable for harm—with implications for legal action in this country.
Late last month, a Los Angeles County Superior Court jury found Meta and YouTube harmed a young user by designing systems that were addictive and caused mental distress. The companies were ordered to pay $4.2 million and $1.8 million in damages, respectively.
That same week, a New Mexico jury found Meta had misled users about the risks associated with its platforms and allowed sexual exploitation of minors through lax safety practices. The jury found Meta violated state consumer protection laws and ordered it to pay $375 million in damages.
A broad shift is underway
Some Canadian lawyers say that’s a harbinger of where things could go here.
“These are very important cases,” says Reidar Mogerman, a partner at CFM Lawyers in Vancouver who is co-counsel on a class action alleging Instagram and Facebook’s design features caused mental harm to children, which was filed on March 31.
“We will be restructuring the landscape through regulation and litigation, and … it's a really interesting pivot point in something that is having a massive impact across society.”
He says that while the underlying facts are the same on both sides of the border, Canadian plaintiffs will still need to establish their own facts and apply them to Canada’s legal system. The Canadian approach of aggregating cases into a class, rather than the U.S. method of running thousands of individual cases, also means action here is delayed compared to cases in the U.S.
Either way, however, a broad shift is underway.
“Public knowledge is much higher than it was, say, five years or 10 years ago,” Mogerman says.
“Regulatory scrutiny is much higher than it was five years or 10 years ago. And as you can see from these decisions, juries and hopefully judges will take that evidence and turn it into decisions.”
The proposed B.C. class action is awaiting certification, which he says is likely 18 months away. It’s just one of several Canadian legal cases underway that aim to take social media companies to task for the impact of their products.
In 2024, several Ontario school boards filed a lawsuit against the companies responsible for Facebook, Instagram, Snapchat, and TikTok, accusing them of designing products that are addictive and harm children’s mental health and ability to learn.
Another Ontario class action is in development for young people who’ve suffered mental health effects from addiction to social media.
U.S. decisions long overdue
Darryl Singer, a partner at Diamond and Diamond, which is putting together the class action, says the U.S. decisions confirm what many people working in this area already know. Still, the results could be influential.
“What the decisions there have done is given us some sort of credence. I think it's going to make it very, very difficult for the social media companies to defend the class actions and the mass torts that have arisen across Canada and the U.S., including ours.”
He says while it’s unusual for a Canadian case to rely on a U.S. precedent, he’ll be leaning on the Los Angeles County Superior Court decision. That includes employing some of the same evidence, as the algorithms — and the effects — are the same on both sides of the border.
Some legal experts say that, given the breadth of evidence, the U.S. decisions are long overdue.
“It's about time,” says Maanit Zemel, a partner at Zemel van Kampen LLP in Toronto and an internet law expert.
“Experts in the field know that this has been going on for a very long time. It’s good to see … it being recognized by courts.”
Canadian courts have long recognized issues with social media companies, and have found liability in contexts ranging from defamation to intellectual property. But she says the hurdle has been enforcement and jurisdiction: when social media companies lose in Canadian courts, they can simply ignore the judgment.
“They say, ‘well, good luck enforcing it against us because we are based in the U.S. and in the U.S. it's not going to be enforceable.’”
In the U.S., companies have been shielded from enforcement by section 230 of the Communications Decency Act, which protects platforms from liability for the content their users post.
The recent U.S. cases, however, took a different approach, addressing not the content but how the system is designed to relay content, removing the shield of the Act. That could have implications for cases here, Zemel says, as it’ll be easier for a Canadian judge to find liability, without worrying about those findings being unenforceable.
“That makes it easier to proceed here both with existing claims as well as future claims.”
The need for legislation
Zemel says the U.S. judgments may also encourage plaintiffs in Canadian cases to push forward, or social media companies to settle. But for systemic change, she says she’s watching for a regulatory framework.
At the federal Liberal Party’s recent national convention, members voted in favour of following Australia’s lead and passing legislation restricting young people’s access to social media platforms. The motion calls for setting a minimum age of 16 for creating social media accounts and placing an obligation on platforms to prevent under-age users from holding accounts.
Dr. Emily Laidlaw, the Canada Research Chair in cybersecurity law at the University of Calgary, says legislation is needed to set out duties for social media companies, including requirements for transparency about their practices.
She notes that Canada’s Online Harms Act, which died with the dissolution of Parliament before the last federal election, explicitly addressed the impact of design features.
“The role of online harms legislation is supposed to be that you have this body that is acting in the public interest to ensure that these companies are meeting certain standards.”
Laidlaw, a member of the recently reconvened federal expert advisory group on online harm, says that while litigation is “the last resort solution,” it’s an important avenue for victims to seek redress for their losses, and a significant part of a global shift that could reshape what the standard of care looks like.
“This is almost the natural course of things: the tech matured and became embedded in society. Now, [we] start looking at putting in place certain legal standards.”