
First Session, Forty-fourth Parliament,

70-71 Elizabeth II – 1-2-3 Charles III, 2021-2022-2023-2024

HOUSE OF COMMONS OF CANADA

BILL C-412
An Act to enact the Protection of Minors in the Digital Age Act and to amend the Criminal Code

FIRST READING, September 16, 2024

Ms. Rempel Garner



SUMMARY

Part 1 of this enactment enacts the Protection of Minors in the Digital Age Act, the purpose of which is to provide for a safe online environment for minors by requiring owners and operators of platforms such as online services or applications to put the interests of minors first and to ensure that minors’ personal data is not used in a manner that could compromise their privacy, health or well-being.

Part 2 amends the Criminal Code to, among other things,

(a) prohibit the publication of the image of a person created or edited through the use of computer software that falsely represents the person, in a manner that is intended to make the image appear authentic, as being nude, as exposing their genital organs, anal region or breasts or as being engaged in explicit sexual activity;

(b) create a separate offence of criminal harassment that is conducted by means of the Internet, a social media service or other digital network and require the court imposing a sentence for the offence to consider as an aggravating factor the fact that the offender, in committing the offence, communicated with the victim anonymously or using a false identity; and

(c) provide for the circumstances in which a person who presents a risk of committing an offence of online harassment may be required to enter into a recognizance and, if the person has communicated anonymously or using a false identity, provide for the circumstances in which a court may make a production order for the purpose of identifying the person.

Available on the House of Commons website at the following address:
www.ourcommons.ca


1st Session, 44th Parliament,

70-71 Elizabeth II – 1-2-3 Charles III, 2021-2022-2023-2024

HOUSE OF COMMONS OF CANADA

BILL C-412

An Act to enact the Protection of Minors in the Digital Age Act and to amend the Criminal Code

His Majesty, by and with the advice and consent of the Senate and House of Commons of Canada, enacts as follows:

Short Title

Short title

1 This Act may be cited as the Promotion of Safety in the Digital Age Act.

PART 1
Protection of Minors in the Digital Age Act

Enactment of Act

Enactment

2 The Protection of Minors in the Digital Age Act is enacted as follows:

An Act to provide for the protection of minors in the digital age

Short Title

Short title

1 This Act may be cited as the Protection of Minors in the Digital Age Act.

Interpretation

Definitions

2 The following definitions apply in this Act.

child means an individual who is under the age of 16 years. (enfant)

Commission means the Canadian Radio-television and Telecommunications Commission established by the Canadian Radio-television and Telecommunications Commission Act. (Conseil)

Minister means the Minister of Industry. (ministre)

minor means an individual who is under the age of 18 years. (mineur)

operator means the owner or operator of a platform, such as an online service or application, that connects to the Internet and that is used, or could reasonably be expected to be used, by a minor, including a social media service and an online video gaming service. (exploitant)

parent, in respect of a minor, includes a person who, in law,

  • (a) has custody of the minor or, in Quebec, parental authority over the minor; or

  • (b) is the guardian of the minor or, in Quebec, the tutor or curator to the person of the minor. (parent)

personal data means information that identifies or is linked or may reasonably be linked to a particular minor and includes a mobile device identifier associated with a minor. (données personnelles)

personalized recommendation system means a fully or partially automated system or computer algorithm used to suggest, promote or rank information based on the personal data of users. (système de recommandations personnalisées)

social media service means a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content. (service de média social)

Purpose

Purpose

3 The purpose of this Act is to provide for a safe online environment for minors by requiring operators to take meaningful steps to protect them and address online risks to their health and well-being, including by putting their interests first and by ensuring that their personal data is not used in a manner that could compromise their privacy, health or well-being, such as by leading to the development of a negative self-image, loneliness or the inability to maintain relationships.

Duty of Care

Duty of care

4 (1) Every operator must act in the best interests of a user whom it knows or should reasonably know is a minor by taking reasonable steps in the design and operation of its products and services to prevent or to mitigate the effects of the following:

  • (a) physical harm or incitement of such harm and online bullying and harassment of minors;

  • (b) online sexual violence against minors, including any conduct directed at a minor online that constitutes an offence under the Criminal Code and is committed for a sexual purpose or any unsolicited or unwanted sexual actions or communications directed at a minor online;

  • (c) the creation or dissemination of imagery of a minor, whether altered or not, that is sexually exploitative, humiliates them, is harmful to their dignity or invades their privacy;

  • (d) the promotion and marketing of products or services that are unlawful for minors, such as any controlled substance as defined in subsection 2(1) of the Controlled Drugs and Substances Act, alcohol, cannabis and tobacco and products or services relating to gambling or pornography;

  • (e) mental health disorders including anxiety, depression, loneliness, eating disorders and substance use disorders, and the promotion of self-harm, suicide and suicidal behaviours;

  • (f) patterns of use that indicate or encourage addiction-like behaviours;

  • (g) the operation of an account by a user whom it knows or should reasonably know is a minor without first verifying the contact information for any of the user’s parents through, for example, the appropriate Internet service provider; and

  • (h) predatory, unfair or deceptive marketing practices.

Clarification

(2) Nothing in subsection (1) is to be construed as

  • (a) requiring an operator to prevent any minor from deliberately and independently searching for specific content; or

  • (b) preventing an operator or any user from providing resources for the prevention or mitigation of any harm described in subsection (1), including evidence-based information and clinical resources.

Safeguards

Safety settings

5 (1) Every operator must provide any parent of a user whom the operator knows or should reasonably know is a child, as well as that user, with clear and readily accessible safety settings on its platform, including settings to

  • (a) control the ability of other individuals to communicate with the child;

  • (b) prevent other individuals from consulting personal data of the child that is collected by, used or disclosed on the platform, in particular by restricting public access to personal data;

  • (c) reduce features that increase, encourage or extend the use of the platform by the child, including automatic displaying of content, rewards for time spent on the platform, notifications and other features that could result in addictive use of the platform by the child;

  • (d) control personalized recommendation systems, including the right to

    • (i) opt out of such systems, while still allowing content to be displayed in chronological order, with the latest published content displayed first, or

    • (ii) limit types or categories of recommendations from such systems; and

  • (e) restrict the sharing of the child’s geolocation and notify the child and their parent when their geolocation is being tracked.

Default settings

(2) The operator must ensure that the default setting for the safeguards described in subsection (1) is the option that provides the highest level of protection.
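
For illustration only: the default-to-safest rule in subsection (2) can be read as a constraint on how an operator initializes account settings. The Python sketch below is a hypothetical model (the setting names and values are assumptions, not drawn from the bill) in which every safeguard listed in subsection (1) starts at its most protective option, including the opt-out from personalized recommendations in paragraph (1)(d):

```python
from dataclasses import dataclass

# Hypothetical safety settings for a child's account. Every default is the
# most protective option, as subsection 5(2) requires; a parent (or the
# child, where permitted) may later relax an individual setting.
@dataclass
class ChildSafetySettings:
    who_can_message: str = "no_one"             # 5(1)(a): most restrictive contact option
    profile_visibility: str = "private"         # 5(1)(b): personal data hidden from the public
    autoplay_enabled: bool = False              # 5(1)(c): no automatic displaying of content
    engagement_rewards: bool = False            # 5(1)(c): no rewards for time spent
    notifications_enabled: bool = False         # 5(1)(c): no re-engagement notifications
    personalized_recommendations: bool = False  # 5(1)(d)(i): opted out; feed shown newest first
    geolocation_sharing: bool = False           # 5(1)(e): location sharing off by default

# ChildSafetySettings() with no arguments yields the highest-protection
# configuration, so a weaker default cannot ship by accident.
```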

Additional obligations

(3) The operator must

  • (a) in restricting access to its platform or any of its content that is inappropriate for children, use computer algorithms that ensure reliable age verification and that preserve privacy;

  • (b) implement adequate measures to protect the privacy, health and well-being of children; and

  • (c) take remedial measures when it becomes aware of any issues raised in relation to the privacy, health or well-being of children on the platform.

Additional options

(4) The operator must provide any parent of a user whom it knows or should reasonably know is a child, as well as that user, with clear and readily accessible options on its platform to

  • (a) delete the child’s account;

  • (b) delete any personal data collected from or shared by the child on the platform; and

  • (c) limit the amount of time spent by the child on the platform.

Parental controls

6 (1) Every operator must provide on its platform clear and readily accessible controls for any parent to support a user that the operator knows or should reasonably know is a minor, including the ability to

  • (a) manage the minor’s privacy and account settings;

  • (b) view metrics of time spent by the minor on the platform; and

  • (c) prevent purchases and financial transactions by the minor.

Default parental controls

(2) The parental controls referred to in subsection (1) must be set as a default setting in the case of a user whom the operator knows or should reasonably know is a child.

Opt-out

(3) Every operator must provide any parent with a clear and readily accessible option to opt out of or turn off the default parental controls.

Notice to minor

(4) Every operator must notify a user whom it knows or should reasonably know is a minor when the parental controls are in effect and indicate which settings or controls have been activated.

Notice to parent

(5) If the operator has reasonable grounds to believe that the default parental controls have been turned off by a minor, it must notify the parent.

Accessibility

7 Every operator must make the following readily accessible and provide it in the language, form and manner in which its platform provides the product or service used by minors and their parents:

  • (a) information and control options that take into consideration the differing ages, capacities and developmental needs of the minors most likely to access the platform and that do not encourage minors or parents to weaken or disable safety settings or parental controls; and

  • (b) options to enable or disable safety settings or parental controls, as appropriate.

Reporting Channel

Reporting mechanism

8 (1) Every operator must provide on its platform a dedicated and readily accessible reporting channel that any person may use to alert the operator to online harms and risks to minors.

Internal process

(2) Every operator must establish an internal process to receive and respond to the reports received through the reporting channel and must take any measures necessary to respond to the person who makes a report and to address any issues raised in a reasonable and timely manner.
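
As a sketch of how the reporting channel in section 8 might be received internally, the following hypothetical Python structures record a report and compute a response deadline. The category list is adapted from the harms in subsection 4(1), and the fixed deadline is an assumed internal policy choice; the bill itself requires only that any person can report and that the operator responds in a reasonable and timely manner:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
from enum import Enum

class HarmCategory(Enum):
    # Illustrative categories adapted from the harms listed in subsection 4(1)
    BULLYING_OR_HARASSMENT = "bullying_or_harassment"
    SEXUAL_VIOLENCE = "sexual_violence"
    EXPLOITATIVE_IMAGERY = "exploitative_imagery"
    UNLAWFUL_MARKETING = "unlawful_marketing"
    OTHER = "other"

@dataclass
class HarmReport:
    reporter_contact: str  # any person may report, not only account holders
    category: HarmCategory
    description: str
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def response_due(self, days: int = 7) -> datetime:
        # The bill does not define "reasonable and timely"; a fixed
        # service-level deadline is purely an assumed internal policy.
        return self.received_at + timedelta(days=days)
```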

Prohibitions

Prohibition

9 (1) It is prohibited for an operator to use any platform design features, including personalized recommendation systems, or to use personal data in a manner that facilitates the advertising, marketing, soliciting, offering or selling of products or services that are unlawful for minors, such as any controlled substance as defined in subsection 2(1) of the Controlled Drugs and Substances Act, alcohol, cannabis and tobacco and products or services relating to gambling or pornography.

Prohibition

(2) It is prohibited for an operator to design, modify or manipulate a user interface in a manner that subverts or impairs user autonomy, decision-making or choice in order to weaken or disable the safety settings or parental controls required under this Act.

Prohibition

(3) It is prohibited for an operator to require or request the use of a digital identifier that serves as an electronic representation of an individual’s identity and of their right to access information or services online.

Interpretation

(4) Nothing in this section is to be construed as

  • (a) preventing an operator from taking reasonable measures to

    • (i) block, detect or prevent the distribution of unlawful, obscene or other harmful material, as described in paragraph 4(1)(a), to minors, or

    • (ii) block or filter spam, prevent criminal activity or protect the security of its platform or service; or

  • (b) requiring the disclosure of a minor’s browsing behaviour, search history, messages, contact list or other content or metadata of their communications.

Disclosure

Clear and readily accessible information

10 (1) Every operator must, in a prominent location on its platform, provide clear and readily accessible information regarding the following:

  • (a) its policies, practices and safety settings, including those pertaining to and available for minors and their parents;

  • (b) access to the safety settings and parental controls required under sections 5 and 6, respectively;

  • (c) the type of personal data that the platform collects, uses or discloses and the manner in which it does so;

  • (d) the platform’s use of any personalized recommendation systems to prioritize, assign weight to or rank different categories of personal data and the options available to users to modify or disable these settings; and

  • (e) the platform’s use of any labels or tags to indicate that specific advertisements, information, products or services are directed at minors.

In the case of a child

(2) In the case of a user whom an operator knows or should reasonably know is a child, the operator, in addition to providing a parent of the child with information about the safety settings and parental controls required under sections 5 and 6, respectively, must obtain express consent from the parent for the use of its platform by the child before the child first uses it.

Requirements deemed satisfied

(3) The operator is deemed to have satisfied the requirement under subsection (2) if it has made reasonable efforts — taking into consideration available technology — to ensure that the parent receives the information referred to in subsection (2) and to obtain their express consent.

Advertising and Marketing

Advertising and marketing

11 Every operator must provide, with respect to advertising on its platform, clear and readily accessible information and labels regarding the following:

  • (a) the name of the product, service or brand and the subject matter of each advertisement;

  • (b) if the platform conducts targeted advertising, the reasons for targeting minors regarding any given advertisement and the ways in which minors’ personal data is used to engage in such advertising; and

  • (c) the fact, if applicable, that content displayed to a minor consists of an advertisement or marketing material, including the disclosure of endorsements of products, services or brands, made by other users of the platform for commercial consideration.

Transparency

Keeping records

12 Every operator must keep and maintain audit logs for the collection, processing and use of personal data and relevant records of data and personal data in its possession or control that are necessary to determine whether it has complied with this Act.
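
One way to read the record-keeping duty in section 12 is as an append-only audit trail of every collection, processing and use event involving personal data. The Python sketch below is a minimal illustration, assuming a JSON-lines log file; nothing in the bill prescribes a storage format, and the field names are hypothetical:

```python
import json
from datetime import datetime, timezone

AUDIT_LOG_PATH = "personal_data_audit.jsonl"  # assumed location, not from the bill

def log_data_event(action: str, data_category: str, purpose: str, user_id: str) -> None:
    """Append one record of a collection/processing/use event."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "action": action,                # e.g. "collect", "process", "use", "disclose"
        "data_category": data_category,  # e.g. "geolocation", "device_identifier"
        "purpose": purpose,
        "user_id": user_id,
    }
    # Append-only: existing lines are never rewritten, which is what makes
    # the log usable later to demonstrate compliance with the Act.
    with open(AUDIT_LOG_PATH, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
```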

Independent review

13 Every two years, every operator must cause an independent review of its platform to be conducted, including in respect of the risks and harms it poses to minors and the cumulative effects the use of the platform has on minors. The operator must make the findings publicly available.

Annual report

14 (1) Every operator must, in each year, prepare a report for the previous year on the risks and harms to minors identified in the independent review and the prevention and mitigation measures taken to address them.

Content

(2) The report must also include a systemic risk and impact assessment in relation to the following:

  • (a) the extent to which the operator’s platform is likely to be accessed by minors;

  • (b) if the platform is accessed by minors, data on the number of minors using it and on their daily, weekly and monthly usage;

  • (c) the platform’s safety settings and parental controls, including an assessment of their efficacy and a description of any breaches reported in relation to them;

  • (d) the extent to which the platform’s design features, including its personalized recommendation systems and its use of automatic displaying of content, rewards for time spent and notifications, pose risks to minors, including to their privacy, health or well-being;

  • (e) the collection, use and disclosure by the platform of personal data, such as geolocation or health data, and the purposes for which and the manner in which the data is collected, used or disclosed;

  • (f) the reports the operator has received through its reporting channel, including the number and nature of the reports;

  • (g) the internal process the operator has implemented to receive reports and the timeliness, effectiveness and types of responses provided following each report; and

  • (h) the prevention and mitigation measures taken by the operator to address any issues raised in the independent review.

Publication

(3) The operator must publish the report in a prominent place on its platform.

Market Research Guidelines

Guidelines

15 The Commission must, in consultation with relevant stakeholders, establish guidelines setting out how operators may conduct market research and product-focused research in relation to minors.

Offences and Punishment

Contravention of sections 4 to 9

16 Every operator that contravenes any of sections 4 to 9 is guilty of an offence and liable,

  • (a) on conviction on indictment, to a fine of not more than twenty-five million dollars; and

  • (b) on summary conviction, to a fine of not more than twenty million dollars.

Contravention of sections 10 to 12

17 Every operator that contravenes any of sections 10 to 12 or a provision of the regulations made under section 21 is guilty of an offence and liable on summary conviction to a fine of not more than ten million dollars.

Due diligence

18 An operator is not to be found guilty of an offence under this Act if it establishes that it exercised due diligence to prevent its commission.

Private Right of Action

Private right of action

19 (1) The user of a platform who is a minor, or any of their parents, who alleges that they have suffered serious harm as a result of a failure by its operator to comply with its duty of care under subsection 4(1) may, in any court of competent jurisdiction, bring an action against the operator and, in the action, claim relief by way of one or more of the following:

  • (a) damages for any serious harm, loss or damage suffered;

  • (b) aggravated or punitive damages;

  • (c) an injunction;

  • (d) an order for specific performance; or

  • (e) any other appropriate relief, including the costs of the action.

Limitation period

(2) Unless the court decides otherwise, no action may be brought later than three years after the day on which the minor or their parent becomes aware of the act or omission on which their action is based.

Definition of serious harm

(3) In this section, serious harm includes physical or psychological harm, substantial damage to reputation or relationships and substantial economic loss.

Standards and Conformity Assessment

Standards or codes of practice

20 If an operator has implemented standards or a code of practice that, in the Minister’s opinion, provides for substantially the same or greater protections as those provided for under this Act, the Minister may cause a notice to be published in the Canada Gazette confirming the extent to which this Act applies to the operator’s platform.

Regulations

Regulations

21 The Governor in Council, on the recommendation of the Minister following consultations with the Commission, may make regulations for carrying out the purposes and provisions of this Act, including regulations

  • (a) setting out the form and manner, including the languages, in which information is to be provided to users under section 10; and

  • (b) for the purpose of section 12, providing for the records of data and personal data to be kept and maintained by an operator, the manner in which they are to be kept and maintained and the period during which they are to be kept and maintained.

Coming into Force

18 months after royal assent

3 (1) Sections 1 to 11 and 16 to 21 of the Protection of Minors in the Digital Age Act, as enacted by section 2 of this Act, come into force on the day that, in the eighteenth month after the month in which this Act receives royal assent, has the same calendar number as the day on which this Act receives royal assent or, if that eighteenth month has no day with that number, the last day of that eighteenth month.
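
The day-counting rule in subsection 3(1) is mechanical enough to express in code. Here is a sketch in Python, under the assumption that royal assent is a plain calendar date: move forward eighteen months, keep the same day number, and fall back to the month’s last day when that number does not exist:

```python
import calendar
from datetime import date

def coming_into_force(royal_assent: date, months: int = 18) -> date:
    """Same calendar day `months` months later, clamped to the target
    month's length, as described in subsection 3(1)."""
    # Month arithmetic: count months from year zero, then split back out.
    total = royal_assent.year * 12 + (royal_assent.month - 1) + months
    year, month = divmod(total, 12)
    month += 1
    # If the target month is shorter (e.g. assent on the 31st), use its last day.
    day = min(royal_assent.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

# Example: assent on August 31 lands on the last day of the following February.
assert coming_into_force(date(2025, 8, 31)) == date(2027, 2, 28)
```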

Second anniversary

(2) Sections 12 to 15 of the Protection of Minors in the Digital Age Act, as enacted by section 2 of this Act, come into force on the second anniversary of the day on which this Act receives royal assent.

PART 2

R.S., c. C-46

Criminal Code

Amendments to the Act

4 (1) Paragraph 162.1(1)(a) of the Criminal Code is replaced by the following:

  • (a) of an indictable offence and liable to imprisonment

    • (i) for a term of not more than five years,

    • (ii) for a term of not more than 10 years, if the person depicted in the image is engaged in explicit sexual activity, or

    • (iii) for a term of not more than 14 years, if the accused knew or ought to have known that, at the time the intimate image was created, aggravated sexual assault was being, or had just been, committed against the person depicted in the image; or
(2) Section 162.1 of the Act is amended by adding the following after subsection (1):

Publication, etc., of a false intimate image without consent

(1.1) Everyone who knowingly publishes, distributes, transmits, sells, makes available or advertises a false intimate image of a person knowing that the person depicted in the image did not give their consent to that conduct, or being reckless as to whether or not that person gave their consent to that conduct, is guilty

  • (a) of an indictable offence and liable to imprisonment

    • (i) for a term of not more than five years,

    • (ii) for a term of not more than 10 years, if the image depicts the person as being engaged in explicit sexual activity, or

    • (iii) for a term of not more than 14 years, if the accused knew or ought to have known that, at the time the false intimate image was created or edited, aggravated sexual assault was being, or had just been, committed against the person depicted in the image; or

  • (b) of an offence punishable on summary conviction.
(3) Section 162.1 of the Act is amended by adding the following after subsection (2):

Definition of false intimate image

(2.1) In this section, false intimate image means a visual recording made by any means, including a photographic, film or video recording, that is created or edited through the use of computer software, including artificial intelligence software, and that falsely represents a person, in a manner that is intended to make the recording appear authentic, as being nude, as exposing their genital organs or anal region or their breasts or as being engaged in explicit sexual activity.

5 Subsection 162.2(1) of the Act is replaced by the following:

Prohibition order

162.2 (1) When an offender is convicted, or is discharged on the conditions prescribed in a probation order under section 730, of an offence referred to in subsection 162.1(1) or (1.1), the court that sentences or discharges the offender, in addition to any other punishment that may be imposed for that offence or any other condition prescribed in the order of discharge, may make, subject to the conditions or exemptions that the court directs, an order prohibiting the offender from using the Internet or other digital network, unless the offender does so in accordance with conditions set by the court.

6 (1) Paragraph 164(1)(b) of the Act is replaced by the following:

  • (b) the recording, copies of which are kept for sale or distribution in premises within the jurisdiction of the court, is an intimate image or a false intimate image;

(2) Subsections 164(3) to (5) of the Act are replaced by the following:

Owner and maker may appear

(3) The owner and the maker of the matter seized under subsection (1), and alleged to be obscene, child pornography, a voyeuristic recording, an intimate image, a false intimate image, an advertisement of sexual services or an advertisement for conversion therapy, may appear and be represented in the proceedings to oppose the making of an order for the forfeiture of the matter.

Order of forfeiture

(4) If the court is satisfied, on a balance of probabilities, that the publication, representation, written material or recording referred to in subsection (1) is obscene, child pornography, a voyeuristic recording, an intimate image, a false intimate image, an advertisement of sexual services or an advertisement for conversion therapy, it may make an order declaring the matter forfeited to His Majesty in right of the province in which the proceedings take place, for disposal as the Attorney General may direct.

Disposal of matter

(5) If the court is not satisfied that the publication, representation, written material or recording referred to in subsection (1) is obscene, child pornography, a voyeuristic recording, an intimate image, a false intimate image, an advertisement of sexual services or an advertisement for conversion therapy, it shall order that the matter be restored to the person from whom it was seized without delay after the time for final appeal has expired.

(3) Subsection 164(8) of the Act is amended by adding the following in alphabetical order:

false intimate image has the same meaning as in subsection 162.1(2.1); (fausse image intime)
7 (1) The portion of subsection 164.1(1) of the Act before paragraph (a) is replaced by the following:

Warrant of seizure

164.1 (1) If a judge is satisfied by information on oath that there are reasonable grounds to believe that there is material — namely, child pornography as defined in section 163.1, a voyeuristic recording, an intimate image, a false intimate image, an advertisement of sexual services or an advertisement for conversion therapy, or computer data as defined in subsection 342.1(2) that makes child pornography, a voyeuristic recording, an intimate image, a false intimate image, an advertisement of sexual services or an advertisement for conversion therapy available — that is stored on and made available through a computer system as defined in subsection 342.1(2) that is within the jurisdiction of the court, the judge may order the custodian of the computer system to

(2) Subsection 164.1(5) of the Act is replaced by the following:

Order

(5) If the court is satisfied, on a balance of probabilities, that the material is child pornography as defined in section 163.1, a voyeuristic recording, an intimate image, a false intimate image, an advertisement of sexual services or an advertisement for conversion therapy, or computer data as defined in subsection 342.1(2) that makes child pornography, the voyeuristic recording, the intimate image, the false intimate image, the advertisement of sexual services or the advertisement for conversion therapy available, it may order the custodian of the computer system to delete the material.

8 Subparagraph (a)(xxvii.2) of the definition offence in section 183 of the Act is replaced by the following:

  • (xxvii.2) subsection 162.1(1) (intimate image),

  • (xxvii.3) subsection 162.1(1.1) (false intimate image),
9 (1) Subsection 264(2) of the Act is amended by adding the following after paragraph (b):

  • (b.1) repeatedly communicating with, either directly or indirectly, the other person or anyone known to them through the Internet, a social media service or other digital network;

(2) Section 264 of the Act is amended by adding the following after subsection (2):

Definition of social media service

(2.1) In paragraph (2)(b.1), social media service means a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content.

(3) Subsection 264(4) of the Act is replaced by the following:

(4) If a person is convicted of an offence under this section, the court imposing the sentence on the person shall consider as an aggravating factor that, at the time the offence was committed,

  • (a) the person contravened the terms or conditions of an order made under section 161 or a recognizance entered into under section 810, 810.1 or 810.2;

  • (b) the person contravened the terms or conditions of any other order or recognizance, or of an undertaking, made or entered into under the common law, this Act or any other Act of Parliament or of a provincial legislature that is similar in effect to an order or recognizance referred to in paragraph (a); or

  • (c) in the case of conduct referred to in paragraph (2)(b.1), the person communicated anonymously or using a false identity.

10 Subparagraph (a)(x) of the definition primary offence in section 490.011 of the Act is replaced by the following:

  • (x) subsection 162.1(1) (publication, etc., of an intimate image without consent),

  • (x.1) subsection 162.1(1.1) (publication, etc., of a false intimate image without consent),

11 Paragraph 738(1)(e) of the Act is replaced by the following:

  • (e) in the case of an offence under subsection 162.1(1) or (1.1), by paying to a person who, as a result of the offence, incurs expenses to remove the intimate image or the false intimate image, as the case may be, from the Internet or other digital network, an amount that is not more than the amount of those expenses, to the extent that they are reasonable, if the amount is readily ascertainable.

12 (1) Subsection 810(1) of the Act is amended by striking out "or" at the end of paragraph (a), by adding "or" at the end of paragraph (b) and by adding the following after paragraph (b):

  • (c) will continue to engage in conduct referred to in paragraph 264(2)(b.1).

(2) Section 810 of the Act is amended by adding the following after subsection (2):

Production order — identification of party

(2.1) If the information is in respect of the commission of an offence referred to in paragraph (1)(c) and the identity of the person against whom it is to be laid is unknown because that person communicated anonymously or using a false identity, the justice or summary conviction court may make an order under any of sections 487.015 to 487.017 for the purpose of identifying the person who transmitted the communication if, in addition to the conditions required for the issue of the order, the justice or court is satisfied that there is no other way by which any information that would reveal their identity can reasonably be obtained.
(3) Section 810 of the Act is amended by adding the following after subsection (3):

Recognizance — online criminal harassment

(3.001) However, if the information is in respect of the commission of an offence referred to in paragraph (1)(c), the justice or summary conviction court may order that a recognizance be entered into only if the conduct that is the subject of the offence was threatening or obscene and the defendant engaged in a pattern of repetitive behaviour or persistent aggressive behaviour.

Conditions

(3.002) If an order referred to in subsection (3.001) is made, the justice or summary conviction court

  • (a) must order that the recognizance include a condition prohibiting the defendant from communicating by any means — including a means referred to in paragraph 264(2)(b.1) — directly or indirectly, with the person on whose behalf the information was laid; and

  • (b) may order that the recognizance be entered into for any period — definite or indefinite — that the justice or summary conviction court considers necessary to protect the security of the person on whose behalf the information was laid, taking into consideration whether the defendant

    • (i) communicated with the person anonymously or using a false identity, or

    • (ii) created more than one social media service account to prevent their communications with the person from being blocked.

Definition of social media service

(3.003) In subparagraph (3.002)(b)(ii), social media service means a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content.

Coordinating Amendment

Bill C-63

13 If Bill C-63, introduced in the 1st session of the 44th Parliament and entitled An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts, receives royal assent, then, on the first day on which both section 12 of that Act and subsection 9(3) of this Act are in force, paragraph 264(4)(a) of the Criminal Code is replaced by the following:

  • (a) the person contravened the terms or conditions of an order made under section 161 or a recognizance entered into under section 810, 810.012, 810.1 or 810.2;

Published under authority of the Speaker of the House of Commons