
House Hansard - 327

44th Parl. 1st Sess.
June 7, 2024 10:00AM
  • Jun/7/24 10:22:27 a.m.
Mr. Speaker, the bill has received widespread condemnation from groups of all political stripes because it forces Canadians to make unnecessary trade-offs between their security and their charter rights. As well, the bill would force much-needed reforms into a long, onerous regulatory process with no clear end in sight. There are people watching this today who will fear deepfaked intimate images being used to harass and bully them in their high schools. The government could have made a small amendment to the Criminal Code to update existing laws to protect Canadians in the digital age, but it has chosen this onerous, widely panned approach instead of protecting Canadians' rights. Why?
  • Jun/7/24 10:23:25 a.m.
Mr. Speaker, I would say categorically that this is a misconstruction of the legislation and what it would do. This legislation would uphold freedom of expression. Freedom of speech in this country, as of right now, does not include hateful speech. That is protected against in the physical world. We are transposing that protection into the online world to directly address the needs of the very people that she just mentioned in those schools in Alberta. With respect to deepfakes, we are taking an additional step by entrenching that language in the legislation. That was done intentionally because deepfakes are being used against children, adolescents and adults to silence them. I know the member is a strong advocate for women's empowerment and women's voices in civic discourse. Deepfakes are being used right now against Alexandria Ocasio-Cortez and Prime Minister Meloni in Italy. Regardless of one's views of their political positions, etc., the point is that when the leader of a G7 country is being limited in terms of their ability to participate in civic and political discourse via deepfakes, we need to take action. We are taking that action in a comprehensive bill and a comprehensive measure that would address and empower freedom of expression rather than limiting it.
  • Jun/7/24 10:24:40 a.m.
  • Re: Bill C-63 
Mr. Speaker, the Bloc Québécois believes that Bill C-63 tackles two major online scourges and that it is time for us, as legislators, to take action to stamp them out. The Bloc Québécois strongly supports part 1 of the bill, in other words, all provisions related to addressing child pornography and the communication of pornographic content without consent. As we see it, this part is self-evident. It has garnered such strong consensus that we told the minister, through our critic, the member for Rivière-du-Nord, that we not only support it, but we were also prepared to accept and pass part 1 quickly and facilitate its passage. As for part 2, however, we have some reservations. We consider it reasonable to debate this part in committee. The minister can accuse other political parties of playing politics with part 2, but not the Bloc Québécois. We sincerely believe that part 2 needs to be debated. We have questions. We have doubts. I think our role calls on us to get to the bottom of things. That is why we have asked the minister—and why we are asking him again today—to split Bill C-63 in two, so that we can pass part 1 quickly and implement it, and set part 2 aside for legislative and debate-related purposes.
  • Jun/7/24 10:26:18 a.m.
Mr. Speaker, I thank my colleague opposite for her question, and I appreciate the position of the Bloc Québécois. I want to emphasize three points. First, the aspect that affects children also affects teens and adults. In other words, hatred is a problem for children, teenagers and adults. Hatred is not exclusive to any particular age. That is the first thing. Second, the member is suggesting that a comprehensive study is needed, with witnesses and consultations, to see if we can improve the bill. I could not agree more, but it is not just part 2 that needs to be thoroughly studied. We need a comprehensive study of all aspects of this bill. We need to examine the bill in its entirety. Third, as I mentioned at the outset, Canada is not the first country to move in this direction. Australia took its first steps in 2015, beginning with protecting children only. Nine years later, in 2024, Australia is addressing the issue more broadly. In 2024, Canada needs to address all aspects. Harmful content is by no means limited to content directed at children.
  • Jun/7/24 10:27:41 a.m.
Mr. Speaker, the NDP finds that the government delayed introduction of this bill for far too long. We want it to be referred to committee for a comprehensive study. There are some parts that we fully support. There are others that deal with the Criminal Code, for example, that will truly require a comprehensive study in committee. We have to make sure we take the time that is needed. That being said, the bill is missing certain aspects, which is a bit surprising. I am talking about transparency with respect to algorithms. As the minister knows, hate and other such things are often amplified by algorithms that promote the kind of content that adversely affects people. This is not being addressed in the bill. I would like the minister to tell us why this important aspect of algorithms and transparency is not being addressed so that we can determine precisely why some hateful content or harmful content is promoted on certain platforms.
  • Jun/7/24 10:29:18 a.m.
Mr. Speaker, I want to note that the time it took to develop this bill and bring it here before the House for debate was directly related to the consultations we held around the world. That is why it took four years to prepare this bill. Also, with respect to the transparency of social media and platforms, I would like to note three specific points. First, the bill specifically seeks to enable the digital safety commissioner to authorize academic researchers to access data anonymously to verify what is happening on platforms with their own algorithms. Second, the digital safety commissioner will be responsible for ensuring that the platforms actually follow the digital safety plan. Third, every user can run their own algorithm to inform platforms that some content is harmful and to prevent content from a specific author from appearing on their feed. We are therefore broadening many aspects related to algorithm transparency. If other measures should be taken, I am quite willing to consider amendments that are presented in good faith in committee on how to improve transparency on this front.
  • Jun/7/24 10:30:57 a.m.
Mr. Speaker, a few weeks ago I had the opportunity to visit a school in my riding in response to letters that some nine-year-olds and 10-year-olds had written to me. In their classroom, I asked the kids whether they knew about cyber-bullying, and all of them raised their hands because all of them had experienced it or knew to some degree what cyber-bullying is like. While I was talking about the topic of cyber-bullying, there was a young boy the age of my daughter, nine years old. He raised his hand and shared with me that on his birthday, he had received new VR glasses to use with his video game. He shared that while he was in his online space and was minding his own business, someone approached him online and did things to him repeatedly that were not nice. Needless to say, when I asked him what he did after this happened to him, the young man said he did not do anything and that he decided not to play video games ever again. The reason I am sharing his testimony is that I would like to ask the hon. minister what the bill would do to help protect kids just like the one I spoke to at the elementary school.
  • Jun/7/24 10:32:24 a.m.
Mr. Speaker, what I can say is that my heart breaks just listening to that. It is at the heart of the bill. The bill would entrench a duty to protect children, a duty to remove content that would target children. In terms of what the child who was mentioned experienced, one can rest assured that it is not an anomaly in Mississauga. Kids around Canada and around the world are facing this type of situation all the time. We would never tolerate someone's lurking around a schoolyard or contacting our kids by telephone at midnight. That is what is occurring online all the time. The fact that the bill takes a hard look at child sex predators and at those who would spread revenge porn, and that it would entrench a duty to protect children, is in fact the exact step we need to take. That is what Canadian parents are demanding. I hope every parliamentarian in the chamber will get behind the important bill before us.
Mr. Speaker, we must protect Canadians in the digital age, but Bill C-63 is not the way to do it. It would force Canadians to make unnecessary trade-offs between the guarantee of their security and their charter rights. Today I will explain why Bill C-63 is deeply flawed and why it would not protect Canadians' rights sufficiently. More importantly, I will present a comprehensive alternative plan that is more respectful of Canadians' charter rights and would provide immediate protections for Canadians facing online harms.

The core problem with Bill C-63 is how the government has chosen to frame the myriad harms that occur in the digital space as homogeneous and as capable of being solved with one approach or piece of legislation. In reality, harms that occur online are an incredibly heterogeneous set of problems requiring a multitude of tailored solutions. It may sound like the latter might be more difficult to achieve than the former, but this is not the case. It is relatively easy to inventory the multitudes of problems that occur online and cause Canadians harm. From there, it should be easy to sort out how existing laws and regulatory processes that exist for the physical world could be extended to the digital world. There are few, if any, examples of harms that are being caused in digital spaces that do not already have existing, relatable laws or regulatory structures that could be extended or modified to cover them.

Conversely, what the government has done for nearly a decade is try to create new, catch-all regulatory, bureaucratic and extrajudicial processes that would adapt to the needs of actors in the digital space instead of requiring them to adapt to our existing laws. All of these attempts have failed to become law, which is likely going to be the fate of Bill C-63. This is a backward way of looking at things. It has caused nearly a decade of inaction on much-needed modernization of existing systems and has translated into law enforcement's not having the tools it needs to prevent crime, which in turn causes harm to Canadians. It has also led to a balkanization of laws and regulations across Canadian jurisdictions, a loss of investment due to the uncertainty, and a lack of coordination with the international community. Again, ultimately, it all harms Canadians.

Bill C-63 takes the same approach by listing only a few of the harms that happen in online spaces and creates a new, onerous and opaque extrajudicial bureaucracy, while creating deep problems for Canadian charter rights. For example, Bill C-63 would create a new "offence motivated by hatred" provision that could see a life sentence applied to minor infractions under any act of Parliament, a parasitic provision that would be unchecked in the scope of the legislation. This means that words alone could lead to life imprisonment. While the government has attempted to argue that this is not the case, saying that a serious underlying act would have to occur for the provision to apply, that is simply not how the bill is written. I ask colleagues to look at it. The bill seeks to amend section 320 of the Criminal Code, and reads, "Everyone who commits an offence under this Act or any other Act of Parliament...is guilty of an indictable offence and liable to imprisonment for life." At the justice committee earlier this year, the minister stated: ...the new hate crime offence captures any existing offence if it was hate-motivated.
That can run the gamut from a hate-motivated theft all the way to a hate-motivated attempted murder. The sentencing range entrenched in Bill C-63 was designed to mirror the existing...options for all of these potential underlying offences, from the most minor to the most serious offences on the books....

The minister continued, saying, "this does not mean that minor offences will suddenly receive...harsh sentences. However, sentencing judges are required to follow legal principles", and that "hate-motivated murder will result in a life sentence. A minor infraction will...not result in it." In this statement, the minister admitted both that the new provision could be applied to any act of Parliament, as the bill states, and that the government would be relying upon the judiciary to ensure that maximum penalties were not levelled against a minor infraction. Parliament cannot afford to let the government be this lazy, and by that I mean not spelling out exactly what it intends a life sentence to apply to in law, as opposed to handing a highly imperfect judiciary an overbroad law that could have extreme, negative consequences.

Similarly, a massive amount of concern from across the political spectrum has been raised regarding Bill C-63's introduction of a so-called hate crime peace bond, calling it a pre-crime provision for speech. This is highly problematic because it would explicitly extend the power to issue peace bonds to crimes of speech, which the bill does not adequately define, nor does it provide any assurance that it would meet a criminal standard for hate.

Equally concerning is that Bill C-63 would create a new process for individuals and groups to complain to the Canadian Human Rights Commission that online speech directed at them is discriminatory. This process would be extrajudicial, not subject to the same evidentiary standards as a criminal court, and could take years to resolve. Findings would be based on a mere balance of probabilities rather than on the criminal standard of proof beyond a reasonable doubt. The subjectivity of defining hate speech would undoubtedly lead to punishments for protected speech. The mere threat of human rights complaints would chill large amounts of protected speech, and the system would undoubtedly be deluged with a landslide of vexatious complaints. There certainly are no provisions in the bill to prevent any of this from happening. Nearly a decade ago, even the Toronto Star, hardly a bastion of Conservative thought, wrote a scathing opinion piece opposing these types of provisions. The same principle should apply today.

When the highly problematic components of the bill are overlaid upon the fact that we are presently living under a government that unlawfully invoked the Emergencies Act and that routinely gaslights Canadians who legitimately question the efficacy or morality of its policies as spreading misinformation, as the Minister of Justice did in his response to my question, saying that I had mischaracterized the bill, it is not a far leap to surmise that the new provision has great potential for abuse. That could be true for any political stripe that is in government. The government's charter compliance statement, which is long and vague and has only recently been issued, should raise concerns for parliamentarians in this regard, as it relies on this statement: "The effects of the Bill on freedom of expression are outweighed by the benefits of protecting members of vulnerable groups".
The government has already been found to have violated the Charter in the case of Bill C-69 for false presumptions about which benefit outweighs others. I suspect the same would be true for Bill C-63 should it become law, which I hope it does not. I believe in the capacity of Canadians to express themselves within the bounds of protected speech and to maintain the rule of law within our vibrant pluralism. Regardless of political stripe, we must value freedom of speech and due process, because they are what prevents violent conflict. Speech already has clearly defined limitations under Canadian law. The provisions in Bill C-63 that I have just described are anathema to these principles. To be clear, Canadians should not be expected to have their right to protected speech chilled or limited in order to be safe online, which is what Bill C-63 would ask of them.

Bill C-63 would also create a new three-headed, yet-to-exist bureaucracy. It would leave much of the actual rules the bill describes to be created and enforced by said bureaucracy, under regulations that remain undefined, at some much later date. We cannot wait to take action in many circumstances. As one expert described it to me, it is like vaguely creating an outline and expecting bureaucrats, not elected legislators, to colour in the picture behind closed doors without any accountability to the Canadian public. The government should have learned from the costs associated with failing when it attempted the same approach with Bill C-11 and Bill C-18, but alas, here we are. The new bureaucratic process would be slow, onerous and uncertain. If the government proceeds with it, Canadians would be left without protection, and innovators and investors would be left without the regulatory certainty needed to grow their businesses.

It would also be costly. I have asked the Parliamentary Budget Officer to conduct an analysis of the costs associated with the creation of the bureaucracy, and he has agreed to undertake the task. No parliamentarian should even consider supporting the bill without understanding the resources the government intends to allocate to the creation of the new digital safety commission, digital safety ombudsman and digital safety office, particularly since the findings in this week's damning NSICOP report starkly outlined the opportunity cost of the government's failing to allocate much-needed resources to the RCMP. Said differently, if the government cannot fund and maintain the critical operations of the RCMP, which already has the mandate to enforce laws related to public safety, then Parliament should have grave doubts about the efficacy of its setting up three new bureaucracies to address issues that could likely be managed by existing regulatory bodies like the CRTC or through enforcement of the Criminal Code.

Also, Canadians should have major qualms about creating new bureaucracies that would give well-funded and extremely powerful big tech companies the power to lobby and manipulate regulations to their benefit behind the scenes and outside the purview of Parliament. This approach would not necessarily protect Canadians and may create artificial barriers to entry for new, innovative industry players. The far better approach would be to adapt and extend long-existing laws and regulatory systems, properly resource their enforcement arms, and require big tech companies and other actors in the digital space to comply with these laws, not the other way around.
This approach would provide Canadians with real protections, not what amounts to a new, ineffectual complaints department with a high negative opportunity cost to Canadians. In no scenario should Parliament allow the government to entrench in legislation a power for social media companies to be arbiters of speech, which Bill C-63 risks doing. If the government wishes to further impose restrictions on Canadians' rights to speech, that should be a debate for Parliament to consider, not for regulators and tech giants to decide behind closed doors and with limited accountability to the public.

In short, this bill is completely flawed and should be abandoned, particularly given the minister's announcement this morning that he is unwilling to proceed with any sort of change to its scope. However, there is a better way. There is an alternative, which would be a more effective and more quickly implementable plan to protect Canadians' safety in the digital age. It would modernize existing laws and processes to align with digital advancements. It would protect speech not already limited in the Criminal Code, and would foster an environment for innovation and investment in digital technologies. It would propose adequately resourcing agencies with existing responsibilities for enforcing the law, not creating extrajudicial bureaucracies that would amount to a complaints department.

To begin, the RCMP and many law enforcement agencies across the country are under-resourced after certain flavours of politicians have given much more than a wink and a nod to the "defund the police" movement for over a decade. This trend must immediately be reversed. Well-resourced and well-respected law enforcement is critical to a free and just society.

Second, the government must also reform its watered-down bail policies, which allow repeat offenders to commit crimes over and over again. Criminals in the digital space will never face justice, no matter what laws are passed, if the Liberal government's catch-and-release policies are not reversed. I think of a woman in my city of Calgary who was murdered in broad daylight in front of an elementary school because her spouse was subject to the catch-and-release Liberal bail policy, in spite of his online harassment of her for a very long time.

Third, the government must actually enforce—
  • Jun/7/24 10:46:09 a.m.
The hon. member for Drummond is rising on a point of order.
  • Jun/7/24 10:46:13 a.m.
Mr. Speaker, I apologize to my colleague. I hate to interrupt her in the middle of a speech like this, but we can hear a telephone or device vibrating near a microphone and it must be very irritating for the interpreters. Could you ask members to be mindful of that and to keep their devices away from the microphones, please?
  • Jun/7/24 10:46:31 a.m.
I would ask the hon. member to move the cellphone away from the microphone so that it does not vibrate.
  • Jun/7/24 10:46:46 a.m.
  • Re: Bill C-63 
Mr. Speaker, third, the government must actually enforce laws that are already on the books but have not been recently enforced due to an extreme lack of political will and disingenuous politics and leadership, particularly as they relate to hate speech. This is especially true in light of the dangers currently faced by vulnerable Canadian religious communities such as, as the minister mentioned, Canada's Jewish community. This could be done via actions such as ensuring the RCMP, including specialized integrated national security enforcement teams and national security enforcement sections, is providing resources and working directly with appropriate provincial and municipal police forces to share appropriate information and intelligence to provide protection to these communities, as well as making sure the security infrastructure program funding is accessible in an expedited manner so community institutions and centres can enhance security measures at their gathering places.

Fourth, for areas where existing regulations and the Criminal Code need immediate modernization to reflect the digital age, and where there could be cross-partisan consensus, the government should undertake these changes in a manner that would allow for swift and non-partisan passage through Parliament. These items could include some of the provisions discussed in Bill C-63. These include the duty of making content that sexually victimizes a child or revictimizes a survivor, or intimate content communicated without consent, inaccessible to persons in Canada in certain circumstances; imposing certain duties on online providers to keep all records related to sexual victimization; making provisions for persons in Canada to make a complaint to existing enforcement bodies, such as the CRTC or the police, not a new bureaucracy that would take years to potentially materialize and be costly and/or ineffective; ensuring that content on a social media service that sexually victimizes a child or revictimizes a survivor, or is intimate content communicated without consent, is made inaccessible to persons in Canada by authorization of a court making orders to the operators of those services; and enforcing the proposed amendment to an act respecting the mandatory reporting of internet child pornography by persons who provide an Internet service.

Other provisions the government has chosen not to include in Bill C-63, but that should have been and that Parliament should be considering in the context of harms that are being conducted online, must include updating Canada's existing laws on the non-consensual distribution of intimate images to ensure the distribution of intimate deepfakes is also criminalized, likely through a simple update to the Criminal Code. We could have done this by unanimous consent today had the government taken the initiative to do so. This is already a major problem in Canada, with girls in high schools in Winnipeg seeing intimate images of themselves, sometimes, as reports are saying, being sexually violated, without any ability for the law to intervene.

The government also needs to create a new criminal offence of online criminal harassment that would update the existing crime of criminal harassment to address the ease and anonymity of online criminal harassment. Specifically, this would apply to those who repeatedly send threatening and/or explicit messages or content to people across the Internet and social media when they know, or should know, it is not welcome.
This could include aggravating factors for repeatedly sending such material anonymously and be accompanied by a so-called digital restraining order that would allow victims of online criminal harassment to apply to a judge, under strict circumstances, to identify the harasser and end the harassment. This would protect privacy, remove the onus on social media platforms to guess when they should be giving identity information to the police, and prevent the escalation of online harassment into physical violence. This would give police and victims clear and easy-to-understand tools to prevent online harassment and associated escalation. This would address a major issue of intimate partner violence and make it easier to stop coercive control. As well, I will note to the minister that members of the governing Liberal Party agreed to the need for these exact measures at a recent meeting of PROC related to online harassment of elected officials this past week.

Fifth, the government should consider a more effective way to regulate online platforms, likely under the authority of the CRTC and the Minister of Industry, to better protect children online while protecting charter rights. This path could include, through legislation, not backroom regulation, but precisely through law, defining the duty of care required of online platforms. Some of these duties of care have already been mentioned in questions to the ministers today. This is what Parliament should be seized with, not allowing some unnamed future regulatory body to decide this for us while we have big tech companies and their lobbying arms defining that behind closed doors. That is our job, not theirs.

We could provide parents with safeguards, controls and transparency to prevent harm to their kids when they are online, which could be part of the duty of care. We could also require that online platforms put the interests of children first with appropriate safeguards, again, in a legislative duty of care. There could also be measures to prevent and mitigate self-harm, mental health disorders, addictive behaviours, bullying and harassment, sexual violence and exploitation, and the promotion and marketing of products that are unlawful for minors. All of these things are instances of a duty of care. We could improve measures to implement privacy-preserving and trustworthy age verification methods, which many platforms already have the capacity to do, while prohibiting the use of a digital ID in any of these mechanisms. This path could also include measures to ensure that the enforcement of these mechanisms, including a system of administrative penalties and consequences, is done through agencies that already exist. Additionally, we could ensure that there are other remedies, such as the ability to seek a civil remedy for injury, when that duty of care is violated.

This is a non-comprehensive list, but the point is, we could come to consensus in this place on simple modernization issues that would update the laws now. I hope that the government will accept this plan. I send out a shout-out to Sean Phelan and David Murray, two strong and mighty workers. We did not have an army of bureaucrats, but we came up with this. I hope that Parliament considers this alternative plan, instead of Bill C-63, because the safety of Canadians is at risk.
  • Jun/7/24 10:53:57 a.m.
Mr. Speaker, I genuinely thank the member opposite for her contributions to today's debate because it is really important. I will point out four things and then ask her a question. The first is that, with respect to my position on amendments, what I said, and I want to make sure it is crystal clear to Canadians watching, is that I am open to amendments, made in good faith, that would strengthen the bill. The second point is with respect to the free-standing hate crime offence, which is a provision that exists in 47 out of 50 states in the United States. The nature of the penalty that would be applied in a given hate crime context would depend on the underlying offence. Uttering a threat that was motivated by hate would carry a lesser penalty than committing a murder that was motivated by hate. For the member's benefit, section 718.1 of the Criminal Code, which I do trust judges to interpret, specifically says that the penalty "must be proportionate to the gravity of the offence and the degree of responsibility of the offender." With respect to the peace bond, what I would say to the member's point, quite simply, is that I do believe it is necessary to take a tool that is well known to criminal law and apply it to the context of a synagogue, which has already been targeted with vandalism and may be targeted again, where proof would need to be put before a judge and where the safeguard would exist for the attorney general of jurisdiction to give consent before such a peace bond was pursued. The member talked about the fact that Criminal Code tools should be used in the context of ensuring that we can tackle this pernicious information. What I would say to her is that law enforcement has asked us for the same tool that Amanda Todd's mother has asked us for. The victimization of people, even after death, continues when the—
  • Jun/7/24 10:55:35 a.m.
The hon. member for Calgary Nose Hill.
  • Jun/7/24 10:55:41 a.m.
Mr. Speaker, I have outlined in detail why the bill is irremediable. It is not fixable, and members do not have to take my word for it. The Atlantic magazine, hardly a bastion of conservative thought, has a huge exposé this morning on why the bill is so flawed. I suspect it is why the government has only allowed it to come up for debate now. I do not expect to see it in the fall. Given that the bill is so flawed, it is incumbent upon the Minister of Justice to take the suggestions of the opposition seriously. I have outlined several suggestions, which are very easy to pick out of my speech, on how the minister could proceed. He could proceed with them, likely through an expedited process. It sounds like my colleagues from the Bloc and the NDP have similar concerns. The bill cannot proceed in its current state. Frankly, Canadians should not be expected to trade their rights for safety online, and they should not have to expect a government, which has dragged its heels for nearly a decade, to continue with the facade that it actually cares about this issue or has a plan to address it. We have given it one, and the Liberals should take it.
  • Jun/7/24 10:56:48 a.m.
Mr. Speaker, at the end of this parliamentary term, I am pleased to see that more and more school groups are coming to watch the business of the House. I think this is a strategy used by teachers to show that they are not as boring as they seem and that students should pay attention in class. Quite often, what happens here is a lot more interesting than sitting in class. That said, I listened closely to my colleague's speech. I noted several interesting points, particularly the fact that she made proposals. We do not often hear proposals about regulating online content from the Conservatives. I heard proposals and I also detected some desire for consensus. There may well be certain points on which we could agree. Does my colleague agree with the Bloc Québécois, which is proposing that we split the bill, that we should fast-track the study of part 1, given that we generally agree on its principles at least, and that we should take the time to study part 2 in the House and in committee? Part 2 contains aspects that require much more in-depth discussion, in our opinion.
  • Jun/7/24 10:57:48 a.m.
Mr. Speaker, the unfortunate thing is that the government is close to the end of its mandate and does not have a lot of public support across the country. The reality is that even if the government members said that they were going to split the bill, which they just said that they were not going to do, the bill would not likely become law. Certainly, the regulatory process is not going to happen prior to the next election, even if the bill is rammed through. The problem facing Canadians is that the issues requiring solutions need to be addressed today. I would suggest that what is actually needed is a separate, completely different piece of legislation, one that outlines the suggestions I have made here. It is unfortunate that the government, with its army of bureaucrats, was not able to do it and that it is the opposition that has to do it. I am certainly willing to work with my opposition colleagues on another piece of legislation that could address these issues and find areas of commonality so that we can protect Canadians from online harms.
  • Jun/7/24 10:58:51 a.m.
Mr. Speaker, I appreciate the member's hard work in terms of tackling issues like harassment and the distribution of non-consensual images; she is very sincere in this regard. The member has flagged the issue of resources; the bill is unclear as to what the government would actually provide in terms of resources. I do note this has been an ongoing problem over the last 20 years with cutbacks to law enforcement. The member notes as well the impact of big tech. I wanted her to comment on a substantial missing piece in the legislation: algorithm transparency, an issue that is currently before the U.S. Congress and that absolutely needs to be addressed. Big tech companies often promote non-consensual images and hate through their algorithms without any sort of oversight or responsibility. How does the member feel about that missing piece?
  • Jun/7/24 10:59:57 a.m.
Mr. Speaker, with regard to resources, I asked the Parliamentary Budget Officer to conduct an analysis of the resources that the government was anticipating for the creation of its bureaucracy, because I believe that those resources would likely be much better allocated to other places. My colleague can wait for that report and perhaps re-emphasize to the Parliamentary Budget Officer the need to speed that along. The second thing is with regard to algorithmic transparency. This is why we need to have a legislated duty of care. If we proceeded on the principle of a legislated duty of care for social media operators, then we could discuss what needs to be in there. Certainly, algorithmic transparency, and the biases in AI systems that could potentially be injurious in a variety of ways, are something—