Privacy in the balance

The pandemic is accelerating the spread of Big Data. Canadian law and regulation will have to find a way to keep up.

Amid the pandemic, data is having a heyday. In many countries, governments have used aggregated cell phone location data to monitor social distancing. In South Korea, a trail of credit card purchases reveals past infections. And in countries from Israel to Taiwan, real-time cell phone geolocation can track COVID-19 patients and alert those who have been in close physical proximity to an infected person.

The use of this personal data may be well-intentioned and is probably effective. But privacy advocates are counselling caution. According to Sinziana Gutiu, a data protection lawyer at Telus, we’re at a pivotal moment in history that will likely shape the future of privacy laws and norms in Canada and around the world. “Decisions made during a crisis like COVID-19 about how people must behave could pose a significant threat to privacy rights,” she says.

The pandemic has supercharged the spread of Big Data into the public and private spheres, but Canadian law and regulation are not keeping up. "One of the most effective ways to ensure the protection of privacy, during and after a crisis, is through privacy law reform," says Gutiu.

The trouble is we’ve been playing fast and loose with our personal data for the past decade. Our mobile phones are a treasure trove of valuable information for Big Tech. It knows where we are, our likes, interests and shopping habits, as well as our age, ethnicity, political allegiances and financial health. It keeps track of our family, friends and colleagues, our education and work histories, our biometric and medical data, and our photos and faces. Petabytes of data are bought and sold at extraordinary speed around the world to vast numbers of data brokers, adtech companies and internet giants. The Canadian data industry alone is estimated to be worth up to C$200bn — more than 10 per cent of our GDP.

Much of this data is used, often in relatively transparent ways, to make money. So-called "free" internet services like Facebook and Google are now widely accepted as an exchange of service-for-data. The information generates revenue by allowing advertisers and adtech companies to target audiences more efficiently.

Some data uses are ingenious and genuinely helpful. Netflix algorithms suggest TV shows based on past viewings, a welcome feature in self-isolation. And when the lockdown ends, real-time geolocation data will help us avoid rush-hour jams using Waze or Google Maps. Crowdsourced data can track deforestation in the Amazon and help livestock herders predict the weather in Mongolia. DNA datasets follow the progression of inheritable medical conditions. Smart cities can reduce their carbon footprint and streamline mass transit. Data scraps from individuals, when pooled together and scaled up, can be transformed into a collective good.

Data has a dark side, of course. The Cambridge Analytica scandal shed light on how political propagandists target social media users in an attempt to tip the balance of an election. Millions of North American faces are searchable in a massive facial recognition database created by Clearview AI, which sells the information to law enforcement agencies. Loan sharks use demographic data to find and exploit new customers. Every time a creepily "relevant" ad shows up on a website, it's been placed there by the highest bidder in a real-time ad auction, using adtech datasets that have us classified and categorized to the nth degree. Even Apple's CEO, Tim Cook, has decried how the "data industrial complex… is being weaponized against us with military efficiency."

So how can we avoid the pitfalls, and balance the demands of innovation, privacy and the social good?

Those concepts are not mutually exclusive, according to Gutiu. "I don't see privacy and innovation to be at odds," she says. "Privacy has to be at the forefront of whatever's happening with this data to get the full value. Privacy needs innovation in order to operationalize and become practical." Gutiu takes the view that being irresponsible with data is not sustainable: "If your consumers have trust, and the company is doing the right thing… that's when you get the best value out of the data."

But it's not always clear what the right thing actually is. We need clarity and guidance – and Canadian law does not always provide it.

It's why Navdeep Bains, the Minister of Innovation, Science and Economic Development, launched a consultation process in 2018 and, the following year, published a Digital Charter to serve as a roadmap for data protection reform in Canada. According to Dr. Teresa Scassa, Canada Research Chair in Information Law and Policy at the University of Ottawa, the Personal Information Protection and Electronic Documents Act, or PIPEDA, is showing its age. "The legal framework that we have is full of loopholes and question marks and uncertainties," she says.

Lawmakers designed PIPEDA to be technology-neutral. It sets out the main principles that govern data protection in the private sector. Companies must be accountable to their users. They must identify the purposes for which data is collected. They must limit that collection to what those purposes require. They must strive to correct inaccuracies, and they must grant individuals access to their own data. And generally speaking, companies must seek consent.

But consent has its limits. When users sign up for a shiny new (and often free) product or service – whether it's Facebook's photo tagging or Amazon's Ring doorbell app – most don't bother to read the fine print, and consent is rarely informed. As Michael Geist, Professor of Internet and E-commerce Law at the University of Ottawa, points out, "User agreements are not negotiated […] You click on 'agree', but you've got no real ability to influence or have an impact on what the agreement actually says."

Legislation needs to redress that imbalance, and strengthening the principles underlying the law is becoming vital, especially as Big Data is increasingly taken out of human hands. According to Tamir Israel of the Canadian Internet Policy and Public Interest Clinic, automated decision-making and AI are one area where PIPEDA falls short. When a bank refuses a customer a credit card based on an opaque, little-understood algorithm, and ends up discriminating against them on the basis of race, for example, can the customer be protected? "Companies are not required to subject their AI to rigorous testing to see if there's systemic bias," says Israel. And what if the raw data being fed to the algorithm is not accurate? "PIPEDA does require accuracy, but it hasn't been applied rigorously in this context. There's not a lot of scrutiny in place."

We also need to do better at data minimization, says Israel, making sure that companies collect only the data they need for a given purpose. PIPEDA could also do better at defining what constitutes "sensitive data" (biometric data, sexual preference, medical history, for example) and enforcing more stringent safeguards around it. And we need more clarity on how to deal with data that has been anonymized or de-identified, since it's becoming easier to reconstitute anonymous or pseudonymous data and identify individuals. All in all, Israel says, PIPEDA was a good start, but there's work to do.

Europe has already taken a firm position on these issues. The General Data Protection Regulation, the world's strictest data protection law, enshrines privacy, transparency, and accountability. It also imposes severe financial penalties on companies that transgress.

The United States, federally at least, leans the other way. "In the U.S., you'd only restrict the free flow of information if there's a solid justification to do so," says Éloïse Gratton, a privacy and data protection lawyer at BLG, based in Montreal. Companies that deal in data in the U.S. have more freedom in how they collect and use personal information. (Clearview AI is one example: it argues that its business model, which depends on amassing and analyzing publicly available photos, is protected by the First Amendment.)

At present, Ottawa is feeling more European. At the launch of the Digital Charter, Navdeep Bains said Canadians expect more control over and more knowledge of how their data is used. "Canadians will have clear and manageable access to their personal data and should be free to share or transfer it without undue burden." Gratton is concerned that the undue burden may fall on the private sector instead. Quebec regulation, for example, has become restrictive and bureaucratic, she says: research projects that want to use personal data must apply to the Quebec privacy commissioner in advance, and approval can take a long time. "Projects should be disclosed to regulators," says Gratton. "But I'm wondering whether the Privacy Commissioner is the right entity to evaluate them. Their mandate is very much focused on protecting privacy. Will they be objective?"

Gratton admires the GDPR for recognizing that public interest and research needs require flexibility in the way data gets collected. Under the GDPR, she says, "there's a whole ecosystem beyond consent." But she's not keen on the idea of enshrining large automatic fines in the legislation: "This may have an impact on innovation." The way these laws are drafted, she says, there are a lot of grey zones. "What's reasonable? That's something that evolves over time."

Michael Geist lays out the Canadian conundrum. "We're a bit torn. For a long time, we were aligning ourselves more closely with the U.S.... More recently, we've seen a bit of a shift towards more of a European-style approach. Is there scope for a made-in-Canada solution, on an issue that is fundamentally global in scope? I'm not sure we know the answer to that."

PIPEDA clearly needs to protect the collective goods created through smart, privacy-minded applications of Big Data. But the balance is critical. Gutiu quotes Edward Snowden: privacy rights are about "protecting the minority from the majority." Do we really want the freedom to sell that fundamental right in exchange for a product or a service?

Wherever we land on the American-versus-European spectrum of protections (most likely, somewhere in the middle of the Atlantic), the system only works if we can enforce the law.

That's a significant challenge. Right now, the Office of the Privacy Commissioner of Canada can issue findings, and if a company does not abide by them, take the errant company to Federal Court. But each court application is a de novo proceeding, meaning the hearings are lengthy and inefficient. The OPC has no order-making powers, nor can it impose fines. As Michael Geist points out, "the combination of a well-resourced regulator, significant penalties, and investigative powers are essential."

According to Scassa, some structural reform is necessary. She suggests that a privacy tribunal be created under PIPEDA to provide more distance between the Commissioner's broader role and the adjudication of complaints. 

But as we navigate the choppy waters of the international data economy, one lonely nation's law becomes hard to enforce. Clearview AI, for example, is currently under scrutiny by the OPC, and will probably be found to be in breach of PIPEDA. "[Clearview] may feel absolutely no reason to follow the Commissioner's recommendation," says Scassa, "and would have a hard time doing that without completely altering their business model." To pursue the matter, the Commissioner would have to go to Federal Court for an order, which may not be enforceable in the U.S.

"We're not in a strong position," says Scassa, "and we're a long way off from global privacy standards."