
The Canadian Government Undertakes a Second Effort at Comprehensive Reform to Federal Privacy Law



Privacy & Cybersecurity Bulletin

On June 16, 2022, the Canadian government tabled a second attempt to reform Canadian privacy law in Bill C-27, the Digital Charter Implementation Act, 2022. [1] While 2020’s Bill C-11 [2] sought to enact comprehensive reform of the federal private-sector privacy law, many criticized it as too timid an attempt. The former Privacy Commissioner of Canada and the Ontario government were particularly critical of Bill C-11, with former Commissioner Therrien calling it “a step back overall from our current law and need[ing] significant changes” to restore confidence in the digital economy. Despite this disapproval, Bill C-27 is, apart from its new artificial intelligence framework, derived from Bill C-11 and will likely face some of the same criticism from those who took issue with its predecessor (see a comparison between Bill C-27 and Bill C-11 here).

Like its predecessor, the 2022 Act proposes to

  • Enact the Consumer Privacy Protection Act (CPPA) to replace Part 1 of the Personal Information Protection and Electronic Documents Act (PIPEDA), which is the part of PIPEDA that addresses privacy in the private sector; and
  • Enact the Personal Information and Data Protection Tribunal Act (Tribunal Act) establishing the Personal Information and Data Protection Tribunal (Tribunal), which would hear recommendations of and appeals from decisions of the Privacy Commissioner of Canada (Commissioner).

In addition, the 2022 Act would:

  • Enact the Artificial Intelligence and Data Act (AIDA) to regulate “artificial intelligence systems” and the processing of data in connection with artificial intelligence systems.

Significantly, the preamble to Bill C-27 states that the protection of privacy interests “is essential to individual autonomy and dignity and to the full enjoyment of fundamental rights and freedoms in Canada.”

As in the former reform effort, the new CPPA retains the principles-based approach of PIPEDA but integrates those principles, with additions, directly into the act rather than setting them out in a schedule as PIPEDA does.

Relative to the version set out in Bill C-11, the 2022 iteration of the CPPA also:

  • Deems the personal information of minors to be sensitive personal information and provides additional protections for the personal information of minors;
  • Introduces a “legitimate interest” exception to consent, along with revisions to Bill C-11’s “business activities” exception;
  • Clarifies that the “manner” of collecting, using, and disclosing personal information, in addition to the purpose for doing so, must be appropriate in the circumstances, regardless of whether consent is required under the CPPA;
  • Introduces a definition of “anonymize” and clarifies that de-identified information is personal information subject to the CPPA (with exceptions) and that anonymized information is not;
  • Provides that retention periods must consider the sensitivity of personal information and that security measures include reasonable authentication measures;
  • Limits the requirement to provide explanations of automated decision-making to cases where it “could have a significant impact on individuals”;
  • Expands the cases where de-identified information may be re-identified;
  • Modifies the right of disposal to apply to the personal information an organization “controls” rather than the personal information it “has collected from individuals”; and
  • Expands the circumstances under which the Commissioner may recommend that a penalty be imposed.

The CPPA in Bill C-27 retains the regulatory tools to address compliance and the much more severe remedies for non-compliance introduced in 2020’s Bill C-11:

  • New powers for the Commissioner, including audit and order-making powers;
  • The ability for the Commissioner to recommend, and for the Tribunal to impose, penalties up to the greater of $10 million or 3% of an organization’s annual global revenues;
  • Significantly expanded offences with fines up to the greater of $25 million or 5% of annual global revenues; and
  • A private right of action to permit recourse to the courts in certain circumstances.

Bill C-27’s most notable divergence from the 2020 reform effort is the new AIDA, which regulates the design, development, and use of AI systems. AIDA sets out positive requirements for AI systems as well as monetary penalties and criminal prohibitions on certain unlawful or fraudulent conduct in respect of AI systems.

Like the European Union’s proposed Artificial Intelligence Act, AIDA is risk-based and focuses on mitigating the risks of harm and bias in the use of “high-impact” AI systems. However, AIDA is not as prescriptive as the EU’s proposed law, which sets out a more detailed methodology for classifying “high-risk” AI systems and expressly prohibits a broader range of harmful AI practices, such as certain uses of biometric identification systems by law enforcement. Still, both proposed laws aim to regulate AI in a balanced manner that protects against individual harm without being overly restrictive of technological development.

Bill C-27 is part of a worldwide trend towards the strengthening of privacy rules, the best-known example being the adoption of the EU’s General Data Protection Regulation (Regulation (EU) 2016/679) (GDPR); the trend reached Canada more recently with Québec’s revised privacy law (click here to read more about those changes). That said, there is no guarantee that Bill C-27 will pass in its current form, and it is very likely that amendments will be introduced when the bill is considered by the applicable committee of the House of Commons. Indeed, Bill C-27 introduces an entirely new artificial intelligence framework and is susceptible to many of the same criticisms as Bill C-11.

Like Bill C-11 before it, the Act is certain to attract very strong attention from domestic and foreign organizations that collect information about Canadians and are subject to Canadian privacy law. It will also concern organizations that operate artificial intelligence systems or process data used in those systems, particularly in light of the impacts of the COVID pandemic and the additional compliance costs and risks of material liability that Bill C-27 represents. Organizations and trade associations should consider the impact of the Act and its evolution as it progresses through Parliament and be prepared to propose improvements and to address any unintended consequences of its reforms. Our Fasken privacy team will keep you updated.

A high-level summary of key features of AIDA is set out below. We have also updated our summary of the CPPA and the role of the Tribunal based on the revisions in Bill C-27.

The Revised Consumer Privacy Protection Act (CPPA)


Like PIPEDA, the CPPA is consent-based, but expands the requirements for obtaining consent and applicable exceptions to consent.

Under the CPPA, to obtain valid consent, organizations must notify individuals, in plain language, of the type of personal information they collect, use, and disclose, and of the purposes, manner, and consequences of such collection, use, and disclosure before or at the time of collection. [3] Bill C-27 clarifies the plain language requirement: organizations must provide information in a language that an individual would reasonably be expected to understand. [4] Organizations must also identify any third parties to whom personal information will be disclosed.

The expanded exceptions to consent include:

  • The collection or use of personal information for certain business activities, including an activity required to provide a product or service to an individual, an activity necessary for the organization’s information system or network security or for the safety of a product or service that the organization provides, or any other prescribed activity, in each case provided the individual would expect the collection or use and it is not for the purpose of influencing the individual’s behaviour or decisions.
  • The collection and use of personal information for a legitimate interest “that outweighs any potential adverse effect on the individual,” provided the individual would expect the collection or use and it is not for the purposes of influencing the behaviours or decisions of the individual. The use of the legitimate interest exception is subject to conditions, including that the organization identify any potential adverse effect on the individual and take reasonable measures to reduce or mitigate those effects. Organizations must also keep records with respect to the foregoing. [5]
  • Public interest purposes as set out in the CPPA.
  • Transfers of personal information to service providers.
  • De-identifying personal information.

Finally, the CPPA requires express consent for certain activities, specifically for the business activities listed above and activities related to legitimate interests to the extent that those activities cannot benefit from the exceptions to consent for those activities (e.g., the activities are for the purposes of influencing behaviour or would not be expected by a reasonable person).

Policies and Practices

The CPPA will require organizations, as part of their governance framework, to consider the sensitivity of personal information when determining its retention period. [6] Physical, organizational, and technological security safeguards must include reasonable measures to authenticate the identity of the individual. [7] The CPPA requires organizations to outline in plain language their policies and practices regarding the protection of personal information, which must be readily available and indicate:

  • What type of personal information the organization controls;
  • How an organization uses it and how it applies the exceptions to the requirement to obtain an individual’s consent, including where it invokes a “legitimate interest”;
  • Whether it uses automated decision making about individuals that could have a significant impact on them;
  • Whether it transfers personal information outside Canada or interprovincially in a way that may have foreseeable privacy implications;
  • The retention periods applicable to sensitive personal information;
  • How an individual can make a request for access or disposal; and
  • Who to contact to file a complaint. [8]

Transfers of Personal Information and Service Providers

Where an organization transfers personal information to a service provider, it must ensure, by contract or otherwise, that the personal information receives a level of protection equivalent to that which the organization is required to provide under the CPPA. Service providers must safeguard personal information and provide notice of any breach of security safeguards to the organization that controls the personal information. Otherwise, provided that a service provider only uses the transferred personal information for the purposes for which it was transferred, service providers are exempt from the obligations of the CPPA with respect to the transferred personal information.

An organization’s readily available policies must include a description of interprovincial and international transfers of personal information and the privacy implications of those transfers.

Enhanced Individual Rights: Disposal of Personal Information and Mobility

The CPPA integrates PIPEDA’s second privacy principle, Identifying Purposes, directly into the body of the legislation. The CPPA in Bill C-27 reinforces this principle by clarifying that an organization may collect, use, or disclose personal information only in a manner and for purposes that a reasonable person would consider appropriate in the circumstances, whether or not consent is required. [9]

In addition, the CPPA requires an organization, on written request, to dispose of personal information under its control as soon as feasible if:

  • The information was collected, used, or disclosed in contravention of the Act;
  • The individual has withdrawn their consent to the collection, use, or disclosure of their information; or
  • The information is no longer necessary to provide a product or service requested by the individual. [10]

Relative to Bill C-11, Bill C-27’s CPPA provides for an expanded list of exceptions where an organization may refuse disposal. [11] Where it disposes of personal information, an organization must also inform any service provider to which it has transferred the information and ensure that the service provider disposes of the personal information as well. [12]

The CPPA also allows an individual to request that an organization disclose their personal information to another designated organization where both organizations are subject to a data mobility framework, [13] similar to the European Union’s data portability right.


Personal Information of Minors

The CPPA treats the personal information of minors as sensitive information and imposes heightened protection for the handling of such personal information. In addition, the new CPPA now enables parents to act on behalf of their children to protect their rights.

Automated Decision Systems

Independently of the AIDA, the CPPA also addresses the impacts on privacy rights and personal information protection in relation to automated decision systems that replace the judgement of a human decision maker. Compared to Bill C-11, the scope of the automated decision system provisions has been limited to systems that may have a significant impact on individuals, a change that organizations will certainly welcome. Organizations that use automated decision systems to make a decision that could have a significant impact on an individual must, on request by the individual, explain:

  • The type of personal information that was used to make the prediction, recommendation or decision;
  • The source of the information; and
  • The reasons or principal factors that led to the prediction, recommendation or decision. [14]

De-Identification and Anonymization

Like Québec’s new privacy law, the CPPA distinguishes between anonymized and de-identified personal information. It defines “anonymize” as irreversibly and permanently modifying personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means. [15] The CPPA does not apply to anonymized information. [16]

The CPPA defines “de-identify” as modifying personal information so that an individual cannot be directly identified from it, though a risk of the individual being identified indirectly remains. The CPPA clarifies that de-identified information is personal information except in certain cases, most notably in connection with research, business transactions, and certain rights of individuals.
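The de-identification/anonymization distinction can be illustrated in data-handling terms. The following is a purely hypothetical sketch (the record fields, function names, and aggregation approach are invented for illustration and are not prescribed by the CPPA): de-identification removes direct identifiers but leaves a residual risk of indirect identification, while anonymization irreversibly reduces the data to a form from which no individual can be identified.

```python
from collections import Counter

# Hypothetical record; field names are invented for illustration.
record = {"name": "A. Tremblay", "postal_code": "H3B 2Y3", "age": 34}

def de_identify(rec):
    """Remove direct identifiers. Indirect re-identification risk remains
    (e.g. postal code plus age may still single a person out), so the
    result would still be personal information under the CPPA."""
    return {k: v for k, v in rec.items() if k != "name"}

def anonymize_counts(records, min_group=5):
    """Irreversible aggregation: keep only coarse region-level counts and
    suppress small groups, so no individual can be identified directly
    or indirectly from the output."""
    counts = Counter(r["postal_code"][:3] for r in records)
    return {region: n for region, n in counts.items() if n >= min_group}
```

Note the design difference: `de_identify` returns a transformed copy of each record (reversible in principle if the removed identifiers survive elsewhere), whereas `anonymize_counts` discards record-level data entirely, which is what makes the transformation irreversible.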

Under Bill C-27, the CPPA expands the situations in which organizations may re-identify an individual using de-identified information, [17] including by granting the Commissioner the power to authorize a re-identification if it is clearly in the interests of the concerned individual. [18]

New Commissioner Powers and the Personal Information and Data Protection Tribunal

The Act represents a significant departure from the current enforcement model in Canada and would present materially greater legal and reputational risks to organizations in relation to non-compliance with federal privacy law. Currently under PIPEDA, the Commissioner has no power to make orders or to recommend monetary penalties. The CPPA provides for both, and organizations subject to Canadian privacy law will face a range of potential sanctions, including significant monetary penalties and awards, and a private right of action.

The Tribunal Act establishes the Personal Information and Data Protection Tribunal and grants the Tribunal jurisdiction over the penalties that may be imposed under the CPPA. The Tribunal must provide a decision, with written reasons, to all parties to a proceeding, and must make its decisions and the reasons for them publicly available. Bill C-27 makes relatively few changes to the Tribunal Act, but those changes go to the core of the Tribunal’s authority: under Bill C-27, the Tribunal has the powers of a superior court of record rather than the powers of a commissioner under the Inquiries Act.

In addition, the CPPA expands the Commissioner’s powers to conduct inquiries and impose penalties through recommendations to the Tribunal. [19]

Where the Commissioner finds it appropriate to make an order instead of recommending a penalty, it may do so directly. It can order an organization to:

  • Take measures to comply with the CPPA;
  • Stop doing something that is in contravention of the CPPA;
  • Comply with the terms of a compliance agreement entered into by the organization; or
  • Make public any measures taken or proposed to be taken to correct the policies, practices, or procedures that the organization has put in place to fulfill its obligations under the CPPA. [20]

Penalties and Fines

The CPPA provides for penalties and expanded fines for non-compliance with certain of its provisions. Where the Commissioner has conducted an inquiry [21] and has determined that an organization has contravened certain sections of the CPPA (which are expanded under Bill C-27), including those related to maintaining a privacy management program, transfers of personal information to service providers, consent, limiting collection, use, and disclosure of personal information, the retention and disposal of personal information, and security safeguards, the Commissioner may make a recommendation to the Tribunal to impose a penalty.

We have summarized the possible penalty and fine amounts below:

  • Monetary penalties: On the Commissioner’s recommendation, the Tribunal may impose a penalty of up to the greater of $10 million or 3% of the organization’s annual gross global revenue.
  • Fines: An organization that knowingly contravenes provisions relating to:
      ◦ reporting of breaches of security safeguards;
      ◦ maintaining records of breaches of security safeguards;
      ◦ retaining information subject to an access request;
      ◦ using de-identified information to identify an individual;
      ◦ whistleblower protections; or
      ◦ obstruction of a Commissioner investigation or inquiry,
    is liable, on indictment, to a fine of up to the greater of $25 million or 5% of its annual gross global revenue and, on summary conviction, to a fine of up to the greater of $20 million or 4% of its annual gross global revenue.
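The “greater of” caps can be made concrete with a short worked example. The thresholds below are those stated in the bill; the revenue figure is hypothetical:

```python
def cap(fixed_cap, revenue_pct, annual_gross_global_revenue):
    """Maximum amount under a 'greater of' provision: the larger of a
    fixed dollar cap or a percentage of annual gross global revenue."""
    return max(fixed_cap, revenue_pct * annual_gross_global_revenue)

revenue = 2_000_000_000  # hypothetical $2B annual gross global revenue

admin_penalty = cap(10_000_000, 0.03, revenue)  # Tribunal penalty cap: 3% governs
indictment = cap(25_000_000, 0.05, revenue)     # fine on indictment: 5% governs
summary = cap(20_000_000, 0.04, revenue)        # fine on summary conviction: 4% governs
```

For a smaller organization the fixed cap governs instead: `cap(10_000_000, 0.03, 100_000_000)` yields the $10 million floor, since 3% of $100 million is only $3 million.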


Private Right of Action

The CPPA introduces a private right of action against organizations for damages for loss or injury where the Commissioner (where the finding is no longer subject to appeal to the Tribunal) or the Tribunal has found that an organization has contravened the CPPA. The private right of action would allow individuals to seek financial relief from the Federal Court or a provincial superior court for various violations of the CPPA. [22]

The Artificial Intelligence and Data Act (AIDA)

AIDA defines “artificial intelligence system” as a technological system that, autonomously or partly autonomously, processes data related to human activities through the use of a genetic algorithm, a neural network, machine learning or another technique in order to generate content or make decisions, recommendations, or predictions. AIDA contemplates future regulations that will set out criteria to define “high-impact” AI systems.

AIDA regulates the following activities:
a) processing or making available for use any data relating to human activities for the purpose of designing, developing, or using an AI system; and
b) designing, developing, or making available for use an AI system or managing its operations.

Requirements of AI Systems

AIDA imposes the following requirements on persons who manage or are responsible for high-impact AI systems, or who otherwise engage in the regulated activities listed above:

  • Anonymized Data: Persons who make anonymized data available for use in AI systems must establish measures with respect to the manner in which data is anonymized and the use or management of anonymized data. 
  • Assessments: Persons responsible for an AI system must assess whether it is a high-impact system (in accordance with future regulations).
  • Risk Mitigation Measures: Persons responsible for high-impact systems must establish measures to identify, assess, and mitigate the risks of harm or biased output that could result from the use of the system and must monitor such measures to ensure their effectiveness.
  • Record Keeping: Records must be kept which describe the risk mitigation measures established and the reasons supporting the assessment of whether or not an AI system is a high-impact system. The Minister of Innovation, Science and Industry has broad powers to compel disclosure of such records.
  • Public AI Statement: Persons who manage or make available high-impact AI systems must publish on a publicly-available website a plain-language description of the system that explains (a) how the system is used or intended to be used, (b) the types of content that it generates and the decisions, recommendations or predictions that it makes (or is intended to generate or make), (c) the risk mitigation measures established in respect of it, and (d) any other information required by regulation.
  • Reporting Obligations: Persons who are responsible for high-impact AI systems must notify the Minister if use of the system results or is likely to result in material harm (meaning physical or psychological harm to an individual, damage to an individual’s property or economic loss to an individual). [23]

The Minister has broad audit rights and order-making powers with respect to these requirements. A person who contravenes any of the requirements is liable to a maximum fine of the greater of $10 million and 3% of the person’s gross global revenues in the preceding year or, in the case of an individual, a fine in the discretion of the court.

Monetary Penalties & Criminal Prohibitions

AIDA establishes an administrative monetary penalty regime for violations of the Act, the specifics of which will be set out in future regulations.

AIDA also makes certain prohibited activities criminal offences:

  • Unlawful use of personal information in AI systems: Personal information used in connection with AI systems must be lawfully created or obtained, including in accordance with the provisions of the CPPA. AIDA makes it an offence for a person to possess or use personal information for the purpose of designing, developing, using, or making available for use an AI system if the person knows or believes that the personal information was obtained or derived as a result of the commission of an offence under Canadian law (or of an act or omission committed outside Canada that, had it occurred in Canada, would constitute an offence under Canadian law).
  • AI systems which cause harm or economic loss: Under AIDA, it is an offence to make an AI system available for use if it is likely to cause serious physical or psychological harm to an individual or substantial damage to an individual’s property, and the system causes that harm or damage. It is also an offence to make an AI system available for use with the intent to defraud the public and to cause substantial economic loss to an individual, and the system causes that loss.

A person who commits either of these offences is liable to a maximum fine of the greater of $25 million and 5% of the person’s gross global revenues in the preceding year or, in the case of an individual, a fine at the discretion of the court and/or a term of imprisonment of up to five years less a day.

Administration of AIDA under the new Artificial Intelligence and Data Commissioner

The Minister may designate a senior official within the Ministry as the Artificial Intelligence and Data Commissioner. The Commissioner will assist in the administration and enforcement of AIDA and may be delegated any of the powers and duties conferred on the Minister, including the following:

  • Promote public awareness of AIDA and provide education;
  • Make recommendations and prepare reports on measures to facilitate compliance with the requirements of AIDA;
  • Establish guidelines with respect to compliance with the requirements of AIDA; and
  • Establish an advisory committee to provide advice and support on any matters related to the requirements of AIDA and publish such advice online.


[1] Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts, 44th Parl., 1st Sess., 70-71 Elizabeth II, 2021-2022 (First Reading).

[2] Bill C-11, An Act to enact the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act and to make consequential and related amendments to other Acts, 43rd Parl., 2nd Sess., 69 Elizabeth II, 2020 (First Reading).

[3] CPPA, s. 15(3).

[4] CPPA, s. 15(4).

[5] CPPA, ss. 18(2) and (3).

[6] CPPA, s. 53(2).

[7] CPPA, s. 57(3).

[8] CPPA, s. 62(2).

[9] CPPA, s. 12(1).

[10] CPPA, s. 55(1).

[11] CPPA, s. 55(2).

[12] CPPA, s. 55(4).

[13] CPPA, s. 72.

[14] CPPA, s. 63(4).

[15] CPPA, s. 2(1).

[16] CPPA, s. 6(5).

[17] CPPA, s. 75.

[18] CPPA, s. 116.

[19] CPPA, s. 94.

[20] CPPA, s. 93.

[21] CPPA, s. 89 or 90.

[22] CPPA, s. 107(1).

[23] AIDA, ss. 6-12.

Contact the Authors

For more information or to discuss a particular matter please contact us.
