House Hansard - 136

44th Parl. 1st Sess.
November 28, 2022 11:00AM
  • Nov/28/22 12:30:44 p.m.
  • Re: Bill C-27 
Madam Speaker, I will be sharing my time with the member for York Centre. I am pleased to rise in the House today to speak to the digital charter implementation act, 2022, in particular the part on the consumer privacy protection act. If I have time, I will also discuss the artificial intelligence and data act. I am very proud to speak to these two pieces of legislation, which introduce a regime that seeks not only to support the technological transformation but also to help Canadians safely navigate this new digital world with confidence.

These past few years, Canadians have witnessed these technological shifts take place. They have taken advantage of new technologies like never before. In 2021, more than 72.5% of Canadians used e-commerce services, a trend that is expected to grow to 77.6% by 2025. According to TECHNATION, a 10% increase in digitalization can create close to a 1% drop in the unemployment rate. What is more, every 1% increase in digitalization can add $8.7 billion to Canada's GDP. In order to take advantage of those major benefits for our economy, we must ensure that consumers continue to have confidence in the digital marketplace. Technology is clearly an intrinsic part of our lives, and Canadians have growing expectations regarding the digital economy. It is absolutely essential that the Government of Canada be able to meet those expectations. With this bill, the government is putting forward a regime that gives Canadians the protection they deserve.

First, as stated in the preamble of the digital charter implementation act, 2022, Canada recognizes the importance of protecting Canadians' privacy rights. Similarly, the 2022 consumer privacy protection act also provides important protections for Canadians. That said, our government has listened to the input of various stakeholders, and we have made changes to improve this bill. I was on the committee in the last Parliament, and there was a lot of discussion about the previous bill, Bill C-11. I am very pleased to be able to speak to Bill C-27 so that we can get all that work done in this Parliament.

One of the most important changes we have made is enhancing protection for minors. Some stakeholders felt that the previous legislation did not go far enough to protect children's privacy. I agree. Consequently, the bill was amended to define minors' information as sensitive by default. This means that organizations subject to the law will have to adhere to higher standards of protection for that information. The legislation also provides minors with a more direct route to delete their personal information. This will make it easier for them to manage their online reputation. I think this is a really important change, because we know that young people are very aware and very capable of using all types of digital platforms, but at the same time, we need to make sure that they are able to protect their reputation.

In addition to protections for minors, we also made changes to the concept of de-identification of personal information. According to many stakeholders, the definitions in the old bill were confusing. We recognize that having well-defined terms helps ensure compliance with the act and provides more effective protection of consumers' information. In that regard, I understand that, because we are talking about new technologies and an evolving industry, it is important for all members to share their expertise, since that will help us develop a better piece of legislation.
The difference, then, between anonymous information and de-identified information needs to be clarified because, clearly, if information is de-identified but an organization or company is able to reidentify it, that does not serve the purpose of having anonymous information. Data-based innovation offers many benefits for Canadians. These changes contribute to appropriate safeguards against unauthorized reidentification of this information, while offering greater flexibility in the use of de-identified information.

The new law also maintains the emphasis on individuals' control over the use of their personal information. That remains a foundation of the law, namely that individuals must be able to fully understand the purpose for which information will be used and consent to that purpose in the most important circumstances. However, the modern economy must also have flexible tools to accommodate situations that are beneficial but that may not require consent if the organization respects certain limits and takes steps to protect individuals. The approach advocated here continues to be based on the concept of individual control, but proposes a new exception to consent to resolve these gaps as a tool for safeguarding privacy. The new provisions propose a general exception to cover situations in which organizations could use personal information without obtaining consent, provided that they can justify their legitimate interest in its use for circumstances in which the individual would expect the information to be used. In addition, to prevent abuse, the exception is subject to a requirement that the organization mitigate the risk.

For example, digital mapping applications that photograph every street so that we can view those images, particularly to help with navigation, are widely accepted as being beneficial. However, obtaining individual consent from every resident of the city is impossible. I believe that everyone in the House will agree that it is hard to imagine how we managed before we had access to those navigation applications. Last evening, I visited a family member in Ottawa and was very happy to have my mapping application to find my destination. The presence of an exception, combined with a mitigation requirement, therefore allows individuals to take advantage of a beneficial service while safeguarding their personal information.

This example shows another key aspect of building trust, namely transparency. Digital mapping technology offers a certain level of transparency. The vehicles equipped with cameras can be seen on our streets, and the results can be seen posted and available online. However, there are some technologies, or aspects of them, that are more difficult to see and understand. That is why the bill continues to grant individuals the right to ask organizations for an explanation of any prediction, recommendation or decision made about them by an automated decision-making system. What is more, these explanations must be provided in plain language that the individual can understand. These provisions also support the proposed new artificial intelligence act. However, I do not think that I have time to get into that, so I will end there.
  • Nov/28/22 1:26:38 p.m.
  • Re: Bill C-27 
Madam Speaker, it is an honour today to rise to speak to Bill C-27, the digital charter implementation act. I think it is important to reflect on how long it has been since we last updated the privacy laws that exist around data. The last time was over 20 years ago. Twenty years might not seem like a long time, but when we think about it, 20 years ago Facebook was probably just a program Mark Zuckerberg was working on in his dorm room. If we think of iPhones, they were pretty much non-existent 20 years ago. Smartphones were out, but they certainly did not have anywhere near the capabilities they do today.

So many other technologies we have come to rely on now have been getting smarter over the years. They are acting in different manners and are able to do the work they do because of the data being collected from individual users. Another great example would be Google. Twenty years ago it was nothing more than literally a search engine. One had to type into the Google form what one was looking for. Sometimes one had to put weird characters or a plus symbol between words in the search terms. It literally was just a table of contents accessing information for people. However, now it is so much more than that.

How many of us have, at some point, said to somebody that we would love to get a new air fryer, and then suddenly, the next day or later that day, advertisements for air fryers keep popping up on Google, on Facebook or whatever it might be? I am sure that sometimes it is a coincidence, but in my experience it seems to happen way too often to be a coincidence. These are the results of new technologies that are coming along, and in particular AI, which can run algorithms and build new ones based on the information being fed into the system. Of course, the more information that gets fed in, the smarter the technologies get and the more they look to feed off new data that can give them even further precision with respect to the advertising and tools targeted at people.

This is not just about selling advertising. AI can also lead to incredible advancements in technology that we otherwise would not have been able to get to, such as advancements in health and the automotive industry. If we think of our vehicles, the big thing now in new cars is the lane-assist feature, which uses technology such as lidar to read signals in the road. There is the technology that, when we enter our passwords and have to confirm we are human beings, sometimes requires us to pick certain things out of pictures. When we do that, we are feeding information back that helps those images be properly identified. We are not just confirming that we are human beings; an incredible amount of data is being used to give better evaluations to various formulas and equations based on the things we do.

When we think of things like intelligent and autonomous vehicles, which basically drive themselves, 20 years ago would we ever have thought a car could actually drive itself? We are pretty much halfway there. We are at a point where vehicles are able to see and identify roads and know where they need to be, what the hazards are and what the possible threats are with respect to that drive.
What is more important is that, when I get into my vehicle, drive it around and engage with other vehicles, it is analyzing all of this data and sending that information back to help develop the AI system for intelligent vehicles, to make it even better and more predictive. It is not just the data that goes into the AI, but also the data that it can generate and then further feed to the algorithms to make it even better. It is very obvious that things have changed quite a bit in 20 years. We are nowhere near where we were 20 years ago. We are so much further ahead, but we have to be conscious of what is happening to the data we are submitting. Sometimes, as I mentioned in a previous question, it can be data that is submitted anonymously for the purpose of helping algorithms around lidar and self-driving vehicles, for example. At other times it can be data that is used for commercial, marketing and advertising purposes.

I think of my children. My six-year-old, who is in grade one, is developing his reading quite quickly. Two years ago, at the age of four, when he was playing a video game and could not figure out how to get past a certain level, he would walk up to my wife's iPad and basically say, "Hey, Siri, how do I do this?" Just by saying that, I probably set off a bunch of phones to listen to what I am saying, but the point is that we have children who, already at such a young age, are using this technology. I did not grow up being able to say, "Hey, Siri, how do I do this or that?" What we have to be really concerned about is the development of children and minors, what they are doing and how that can impact them and their privacy.

I am very relieved to see there is a big component of this bill that, in my opinion, aims to ensure the privacy of minors is maintained, even though I have heard the concern, or the criticism, from some members today that the definition of "minor" needs to be better reflected in the legislation. If it is not clear what a minor is for the purposes of this legislation, that is something that can be worked out in committee. The governing members would be more than open to listening to the discussion around that and to hearing why further clarifying the definition is or is not important.

I would like to back up a second and talk more specifically about the three parts of this bill and what they would do. The summary reads as follows:

Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities.

A consequence of this first part would be to repeal other, older pieces of legislation. I think this is absolutely critical, because it goes back to what I have been talking about in terms of how things have changed over the last 20 years. We are now at a place where we really do not know what information we are giving or how it is being used. I realize, as some other colleagues have indicated, that 99.9% of the time we click "yes, I accept the terms" without reading the terms and conditions, not knowing exactly how our information is being used and what is actually being linked directly back to us.
Through the consumer privacy protection act, there would be protections in place for the personal information of individuals while, at the same time, respecting the need to ensure companies can still innovate, because it is important to innovate. It is important to see these technologies do better. Quite frankly, it is important to me personally, and this will be very selfish of me, that when I am watching a show on Netflix that I really like, I get recommendations of other shows I might really like. As the member for South Shore—St. Margarets mentioned earlier, when it comes to Spotify, it is also important to me that, when I start listening to certain music, other music gets suggested to me based on what other people who share interests similar to mine have liked, and how these algorithms end up generating that content for me.

It is important to ensure that companies, if we want them to continue to innovate on these incredible technologies we have, can have access to data. However, it is even more important that they be responsible with that innovation. There has to be a proper balance between privacy and innovation, between how people are innovating and how that data is being used. We have seen examples in recent years, whether in the United States or in Canada, where data that was collected was used in a manner not in keeping with how it was supposed to be used. There has to be a comprehensive act in place that properly identifies how that data is going to be used, because, quite frankly, the last time this legislation was updated, 20 years ago, we had no idea how that data would be used today.

By encouraging responsible innovation and ensuring we have the proper terminology in the legislation, companies would know exactly what they should and should not be doing, how they should be engaging with that data, what they need to do with it at various times, how to keep it secure and safe and, most importantly, how to maintain the privacy of individuals. It is to the benefit not just of individuals in 2022, or almost 2023, to have data that is properly secured. It is also very much to the benefit of businesses, so that they know what the rules are and what the playing field is like when it comes to accessing that data.

The second part of this bill, as has been mentioned:

...enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act.

This is absolutely critical, because there has to be somewhere people can go to ensure that, if they have a concern from a consumer perspective over the way their data is used and they are not happy with the result from the commissioner, they have an avenue to appeal that decision. If we put too much power in the hands of a few individuals, in this case the Privacy Commissioner under the consumer privacy protection act, and do not provide an appeal mechanism, then we will certainly run into problems down the road. This legislation would help ensure that the commissioner is kept in check, and it would also help consumers have the faith they need in terms of accountability when it comes to their data and whether it is being used and maintained in a safe way.
The third part of the bill is the more controversial one, in terms of whether it should be part of this particular legislation or voted on separately. The summary reads:

Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate the risks of harm and biased output related to high-impact artificial intelligence systems.

That act would provide for public reporting and would authorize the minister to order the production of records related to artificial intelligence systems. The act would also establish prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system in an intentional or reckless way that causes material harm to individuals.

One of the consequences of artificial intelligence, quite frankly, is that if we allow biased information to be fed into artificial intelligence systems and be used to create and produce results for important algorithms, then we run the risk of those results being biased as well. Therefore, making sure that there are proper measures in place so that individuals are not treated in a biased manner is going to require true accountability. The reality is that artificial intelligence, even in its current form, is very hard to predict. It is very hard to understand exactly when a person is being impacted by something generated by an artificial intelligence system. Quite often, a lot of the interactions we already have on a day-to-day basis are based on artificial intelligence features that use various inputs to determine what we should be doing or how we should be engaging with something. The reality is that if this is done in a biased manner or in a manner that is intentionally reckless, people might not be aware of it until it is well past the point, so it is important to ensure that we have all of the proper measures in place to protect individuals against those who would try to use artificial intelligence in a manner that would intentionally harm them.

As I come to the conclusion of my remarks, I will go back to what I said at the beginning: artificial intelligence, quite frankly, has a lot of benefits. It is going to transform just about everything in our lives: how we interact with individuals, how we interact with technologies, how we are cared for, how we move around by transportation and how we make decisions, as we already know, on what to listen to or what to watch. It is incredibly important that, as this technology develops and artificial intelligence becomes more and more common, we ensure that we are in the driver's seat in terms of understanding what is going into it and making sure we are fully aware of anybody who might be breaking the rules as they relate to the use of artificial intelligence. It will become more difficult, quite frankly, as artificial intelligence systems take on new responsibilities and meanings to create new decisions and outputs, and we must ensure that we are always in a position to be in the driver's seat and have the proper oversight that is required.

I recognize that some concerns have been brought forward today by different members.
At first glance, the definition of a "minor" was not something I thought about when I originally looked at this bill, but when the member for South Shore—St. Margarets and others brought forward that concern, I came to appreciate, especially after hearing his response to my question, why it is necessary to put a proper definition in there. I hope the bill gets to committee and the committee can study some of those important questions so we can keep moving this along. I certainly do not feel as though we should abandon this bill altogether because we might have concerns about one thing or another.

The reality, and what we know for certain, is that things have changed quite a bit in the 20 years since the legislation was last updated. We need to start working on this now. We need to get it to committee, and the proper studies need to occur at this point so we can properly ensure that individuals' privacy and protection are taken care of as they relate to the three particular parts I talked about today.
  • Nov/28/22 5:49:48 p.m.
  • Re: Bill C-27 
Mr. Speaker, would the member agree that the creation of two new categories of data exempt from privacy measures is a worrisome gesture by the Liberals and could be a gift to the very technology giants to which they have such close ties?