Privacy’s unintended consequences

Bill C-11 needs modifications before it is reintroduced.

Bill C-11, the Digital Charter Implementation Act, 2020, was introduced in November 2020 and died on the Order Paper when the 2021 election was called. A new bill should be introduced in Parliament in 2022. The Canadian Bar Association’s Privacy and Access Law Section had already responded to a consultation document on Strengthening Privacy for the Digital Age in 2019. It is now offering its comments on Bill C-11 to help guide the government before it introduces the new bill.

The CBA Section is generally supportive of Bill C-11, but a few matters could result in unintended consequences if the same bill were reintroduced. Below is a summary of the most problematic issues and how they should be addressed.


De-identification

The CBA Section is troubled by the proposed wording of the definition of de-identification. It believes the bar is too high for what constitutes de-identified information, which would restrict too many ordinary business activities. The Section would instead support a definition like that of "pseudonymisation" under the General Data Protection Regulation (GDPR).

De-identification “means to modify personal information — or create information from personal information — by using technical processes to ensure that the information does not identify an individual or could not be used in reasonably foreseeable circumstances, alone or in combination with other information, to identify an individual.”

Such information is no longer personal information, the Section adds; it is anonymized and should not be subject to the proposed Consumer Privacy Protection Act, or CPPA, or any other law governing personal information. "Further," the CBA letter reads, "comingling the concepts of anonymized information and de-identified information will create interpretative difficulty and international legal compliance challenges."

Business transaction exemption

The requirement that information be de-identified before it can be shared during a business transaction is unworkable in practice, the Section says, especially in mergers and acquisitions, investments and other commercial transactions, where it is often necessary to disclose some personal information so the buyer can perform due diligence.

By contrast, s. 7.2 of the Personal Information Protection and Electronic Documents Act, or PIPEDA, has worked well. “The CBA Section is not aware of any Office of the Privacy Commissioner investigations or decisions where abuse of personal information disclosed under these provisions is mentioned,” the letter reads.

Data flows

The CBA Section wants references to interprovincial data flows removed: there should be no barriers to interprovincial trade and commerce, and the proposed CPPA would apply to interprovincial transfers in any event.

The Section also requests that the Office of the Privacy Commissioner's interim order power be removed from the CPPA. "Since OPC inquiries are complaint-driven," it explains, "they can investigate an industry common practice. If the OPC enjoins the action at an early stage and takes several months, or even years, to render a decision, there can be significant implications for an organization's competitiveness."

Automated decision-making

The creation of a regulatory framework for some types of automated decision-making systems is desirable, but the approach must be intersectional and nuanced. The Section recommends revising the definition of automated decision systems "to focus on technology that makes assessments or decisions in lieu of human decision-making." As well, obligations on organizations should be limited to automated decision-making "that has a real material impact on, or poses a risk of significant harm to, the individual," the letter says.