Bill C-27, the Digital Charter Implementation Act (or DCIA), has been brought back. In the last session, this was Bill C-11.
Contrary to what many might assume, this is not about gun control. Instead, it concerns digital privacy, and the ways and means by which personal information will be shared.
In fact, a lot of the Bills in this current session are recycled versions of legislation that died in previous sessions. This is no exception.
One major difference here is an entirely new piece of legislation:
The Artificial Intelligence and Data Act
[Section 2: definitions]
artificial intelligence system means a technological system that, autonomously or partly autonomously, processes data related to human activities through the use of a genetic algorithm, a neural network, machine learning or another technique in order to generate content or make decisions, recommendations or predictions.
[Section 3] Interestingly, this Act, and the limitations, do not apply to:
(a) the Minister of National Defence;
(b) the Director of the Canadian Security Intelligence Service;
(c) the Chief of the Communications Security Establishment; or
(d) any other person who is responsible for a federal or provincial department or agency and who is prescribed by regulation.
The legislation then gets into how the Act would be applied, and what the limitations would be. There’s a provision to prevent “biased output” from being generated by artificial intelligence.
[Section 5(1)]
biased output means content that is generated, or a decision, recommendation or prediction that is made, by an artificial intelligence system and that adversely differentiates, directly or indirectly and without justification, in relation to an individual on one or more of the prohibited grounds of discrimination set out in section 3 of the Canadian Human Rights Act, or on a combination of such prohibited grounds. It does not include content, or a decision, recommendation or prediction, the purpose and effect of which are to prevent disadvantages that are likely to be suffered by, or to eliminate or reduce disadvantages that are suffered by, any group of individuals when those disadvantages would be based on or related to the prohibited grounds.
For reference, the Canadian Human Rights Act lists: “race, national or ethnic origin, colour, religion, age, sex, sexual orientation, gender identity or expression, marital status, family status, genetic characteristics, disability and conviction for an offence for which a pardon has been granted or in respect of which a record suspension has been ordered” as protected grounds.
In other words, AI can be used to pander to specific groups of people. However, “noticing” things would presumably violate the law.
[Section 6] lays out a requirement to add safeguards to anonymized data, which is actually a really good idea. Guess we’ll have to see what those protections are later.
[Section 11] states that any person or group involved in running a high-impact system must publish, in plain language, information about how the system works and what safety protocols are in place.
(from the Bill) High-impact system means an artificial intelligence system that meets the criteria for a high-impact system that are established in regulations. However, the regulations haven’t been established yet.
[Sections 13, 14] allow Cabinet Ministers to require the disclosure of certain records, particularly if there is a risk of “biased output” in what the AI is generating.
[Section 26] lists others who may be able to access confidential information, including:
(a) the Privacy Commissioner;
(b) the Canadian Human Rights Commission;
(c) the Commissioner of Competition;
(d) the Canadian Radio-television and Telecommunications Commission;
(e) any person appointed by the government of a province, or any provincial entity, with powers, duties and functions that are similar to those of the Privacy Commissioner or the Canadian Human Rights Commission;
(f) any other person or entity prescribed by regulation.
[Section 28] gives the Minister the authority to publish information about a person or group (without their consent), if there are “reasonable grounds” to believe that doing so will prevent harm from coming to them. However, the Act doesn’t state what “reasonable grounds” actually means.
[Section 29] gets into Administrative Monetary Penalties, with the stated goal of ensuring compliance with the Artificial Intelligence and Data Act.
[Section 30] makes it an offence to violate Sections 6-12, as well as to provide misleading information to the Minister, or to anyone acting on the Minister’s behalf.
[Section 36] is a backdoor provision, which exists in many pieces of legislation. It allows the Governor in Council to make regulations without the need for Parliamentary oversight.
[Sections 38-40] lay out penalties, both monetary and potential prison time, for violations of this Act. Fines can run up to the greater of $25,000,000 and 5% of gross global revenues. Prison time can be up to 5 years (if prosecuted by indictment), or 2 years less a day (if prosecuted summarily).
Aside from the Artificial Intelligence and Data Act being included, this legislation is essentially just Bill C-11 from the last session of Parliament.
Consumer Privacy Protection Act
The Consumer Privacy Protection Act was the bulk of the last version of this Bill, and is the bulk of this one as well. While the name suggests that privacy is being taken seriously, it’s worth noting that Section 4 states that it doesn’t apply to:
(a) any government institution to which the Privacy Act applies;
(b) any individual in respect of personal information that the individual collects, uses or discloses solely for personal or domestic purposes;
(c) any organization in respect of personal information that the organization collects, uses or discloses solely for journalistic, artistic or literary purposes;
(d) any organization in respect of an individual’s personal information that the organization collects, uses or discloses solely for the purpose of communicating or facilitating communication with the individual in relation to their employment, business or profession; or
(e) any organization that is, under an order made under paragraph 122(2)(b), exempt from the application of this Act in respect of the collection, use or disclosure of personal information that occurs within a province in respect of which the order was made.
In other words, personal information can be shared with just about anyone.
[Section 8(1)] requires that organizations designate someone to be responsible for the security of this information, and that their contact information be furnished if requested.
[Sections 9-11] outline how a privacy management program must be established, and some considerations in setting it up.
[Section 18] lists how and when businesses can collect personal information, or disclose it, and when consent isn’t required to go through with it.
[Section 19] says that no consent or knowledge is required from the individual to share personal information with a service provider in the course of business.
[Sections 20-22] permit research to be done using customer information as data, although it’s expected that it would be anonymized. It’s also okay to do this for prospective business transactions that haven’t yet been approved.
[Sections 23-24] are about disclosure during the course of employment. This has been the norm for a long time, as companies routinely share data for things like payroll.
[Sections 25-28] say information can be shared without knowledge or consent for the purposes of disclosure to a notary, obtaining witness statements, suspected fraud, and debt collection.
[Section 35] allows information to be disclosed without the person’s knowledge or consent if it’s being done for statistical purposes, study or research, if obtaining consent is impractical.
[Section 36] gets into the disclosure of “records of historic or archival importance”, which again, can be done without knowledge or consent.
[Section 38] allows those acting for journalistic, artistic or literary purposes to disclose information without the knowledge or consent of the other parties involved.
[Sections 43, 44] mean that Government employees would be able to access personal records without the knowledge or consent of others, if done for the purpose of administering laws.
The Act then goes on at length about procedures that would be in place if these other rules were violated.
Bill C-27 would make various changes to other acts such as: the Canada Evidence Act; the Access to Information Act; the Aeronautics Act; the Competition Act; the Telecommunications Act; and the Public Servants Disclosure Protection Act.
While it sounds great to enshrine digital privacy, there are so many exceptions written in that one reasonably has to wonder what protections are really offered.
Of course, there is a bit of a conflict of interest here. Reporters and journalists require access to information in order to do their jobs. While doxing isn’t acceptable, the ability to dig deep is essential in order to properly prepare a broadcast or a newspaper article.
Bill C-11 (the last version of this) didn’t get far in the last session, and this one doesn’t appear to be urgent now. Who knows if it will actually pass?
(1) https://www.parl.ca/legisinfo/en/bill/43-2/c-11
(2) https://www.parl.ca/DocumentViewer/en/43-2/bill/C-11/first-reading
(3) https://www.parl.ca/legisinfo/en/bill/44-1/c-27
(4) https://www.parl.ca/DocumentViewer/en/44-1/bill/C-27/first-reading
(5) https://laws-lois.justice.gc.ca/eng/acts/h-6/page-1.html