Part 1 of this enactment enacts the Protection of Minors in the Digital Age Act, the purpose of which is to provide for a safe online environment for minors by requiring owners and operators of platforms such as online services or applications to put the interests of minors first and to ensure that minors’ personal data is not used in a manner that could compromise their privacy, health or well-being.
Part 2 amends the Criminal Code to, among other things,
(a) prohibit the publication of the image of a person created or edited through the use of computer software that falsely represents the person, in a manner that is intended to make the image appear authentic, as being nude, as exposing their genital organs, anal region or breasts or as being engaged in explicit sexual activity;
(b) create a separate offence of criminal harassment that is conducted by means of the Internet, a social media service or other digital network and require the court imposing a sentence for the offence to consider as an aggravating factor the fact that the offender, in committing the offence, communicated with the victim anonymously or using a false identity; and
(c) provide for the circumstances in which a person who presents a risk of committing an offence of online harassment may be required to enter into a recognizance and, if the person has communicated anonymously or using a false identity, provide for the circumstances in which a court may make a production order for the purpose of identifying the person.
First Session, Forty-fourth Parliament, 70-71 Elizabeth II – 1-2-3 Charles III, 2021-2022-2023-2024
HOUSE OF COMMONS OF CANADA
An Act to enact the Protection of Minors in the Digital Age Act and to amend the Criminal Code
FIRST READING, September 16, 2024
Ms. Rempel Garner
Available on the House of Commons website at the following address:
www.ourcommons.ca
1st Session, 44th Parliament, 70-71 Elizabeth II – 1-2-3 Charles III, 2021-2022-2023-2024
HOUSE OF COMMONS OF CANADA
BILL C-412
An Act to enact the Protection of Minors in the Digital Age Act and to amend the Criminal Code
His Majesty, by and with the advice and consent of the Senate and House of Commons of Canada, enacts as follows:
1 This Act may be cited as the Promotion of Safety in the Digital Age Act.
1 This Act may be cited as the Protection of Minors in the Digital Age Act.
2 The following definitions apply in this Act.
child means an individual who is under the age of 16 years. (enfant)
Commission means the Canadian Radio-television and Telecommunications Commission established by the Canadian Radio-television and Telecommunications Commission Act. (Conseil)
Minister means the Minister of Industry. (ministre)
minor means an individual who is under the age of 18 years. (mineur)
operator means the owner or operator of a platform, such as an online service or application, that connects to the Internet and that is used, or could reasonably be expected to be used, by a minor, including a social media service and an online video gaming service. (exploitant)
parent, in respect of a minor, includes a person who, in law,
(a) has custody of the minor or, in Quebec, parental authority over the minor; or
(b) is the guardian of the minor or, in Quebec, the tutor or curator to the person of the minor. (parent)
personal data means information that identifies or is linked or may reasonably be linked to a particular minor and includes a mobile device identifier associated with a minor. (données personnelles)
personalized recommendation system means a fully or partially automated system or computer algorithm used to suggest, promote or rank information based on the personal data of users. (système de recommandations personnalisées)
social media service means a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content. (service de média social)
3 The purpose of this Act is to provide for a safe online environment for minors by requiring operators to take meaningful steps to protect them and address online risks to their health and well-being, including by putting their interests first and by ensuring that their personal data is not used in a manner that could compromise their privacy, health or well-being, such as by leading to the development of a negative self-image, loneliness or the inability to maintain relationships.
4 (1) Every operator must act in the best interests of a user whom it knows or should reasonably know is a minor by taking reasonable steps in the design and operation of its products and services to prevent or to mitigate the effects of the following:
(a) physical harm or incitement of such harm and online bullying and harassment of minors;
(b) online sexual violence against minors, including any conduct directed at a minor online that constitutes an offence under the Criminal Code and is committed for a sexual purpose or any unsolicited or unwanted sexual actions or communications directed at a minor online;
(c) the creation or dissemination of imagery of a minor, whether altered or not, that is sexually exploitative, humiliates them, is harmful to their dignity or invades their privacy;
(d) the promotion and marketing of products or services that are unlawful for minors, such as any controlled substance as defined in subsection 2(1) of the Controlled Drugs and Substances Act, alcohol, cannabis and tobacco and products or services relating to gambling or pornography;
(e) mental health disorders including anxiety, depression, loneliness, eating disorders and substance use disorders, and the promotion of self-harm, suicide and suicidal behaviours;
(f) patterns of use that indicate or encourage addiction-like behaviours;
(g) the operation of an account by a user whom it knows or should reasonably know is a minor without first verifying the contact information for any of the user’s parents through, for example, the appropriate Internet service provider; and
(h) predatory, unfair or deceptive marketing practices.
(2) Nothing in subsection (1) is to be construed as
(a) requiring an operator to prevent any minor from deliberately and independently searching for specific content; or
(b) preventing an operator or any user from providing resources for the prevention or mitigation of any harm described in subsection (1), including evidence-based information and clinical resources.
5 (1) Every operator must provide any parent of a user whom the operator knows or should reasonably know is a child, as well as that user, with clear and readily accessible safety settings on its platform, including settings to
(a) control the ability of other individuals to communicate with the child;
(b) prevent other individuals from consulting personal data of the child that is collected by, used or disclosed on the platform, in particular by restricting public access to personal data;
(c) reduce features that increase, encourage or extend the use of the platform by the child, including automatic displaying of content, rewards for time spent on the platform, notifications and other features that could result in addictive use of the platform by the child;
(d) control personalized recommendation systems, including the right to
(i) opt out of such systems, while still allowing content to be displayed in chronological order, with the latest published content displayed first, or
(ii) limit types or categories of recommendations from such systems; and
(e) restrict the sharing of the child’s geolocation and notify the child and their parent when their geolocation is being tracked.
(2) The operator must ensure that the default setting for the safeguards described in subsection (1) is the option that provides the highest level of protection.
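The default-setting rule in subsection (2) can be illustrated with a purely hypothetical sketch (not part of the bill); the setting names and values below are invented for illustration, and subsection 5(1) does not prescribe any particular data model:

```python
from dataclasses import dataclass

# Hypothetical model of the subsection 5(2) rule: every safety setting for a
# child user defaults to the option providing the highest level of protection.
@dataclass
class ChildSafetySettings:
    allow_messages_from: str = "no_one"   # 5(1)(a): most restrictive default
    profile_visibility: str = "private"   # 5(1)(b): personal data not public
    autoplay_enabled: bool = False        # 5(1)(c): no automatic content display
    personalized_feed: bool = False       # 5(1)(d)(i): opted out by default
    geolocation_sharing: bool = False     # 5(1)(e): sharing restricted

settings = ChildSafetySettings()
assert settings.personalized_feed is False
assert settings.geolocation_sharing is False
```

An operator would then treat any departure from these defaults as a deliberate choice by the child or parent, never as an initial state.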
(3) The operator must
(a) in restricting access to its platform or any of its content that is inappropriate for children, use computer algorithms that ensure reliable age verification and that preserve privacy;
(b) implement adequate measures to protect the privacy, health and well-being of children; and
(c) take remedial measures when it becomes aware of any issues raised in relation to the privacy, health or well-being of children on the platform.
(4) The operator must provide any parent of a user whom it knows or should reasonably know is a child, as well as that user, with clear and readily accessible options on its platform to
(a) delete the child’s account;
(b) delete any personal data collected from or shared by the child on the platform; and
(c) limit the amount of time spent by the child on the platform.
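Paragraph 5(1)(d)(i) requires that opting out of a personalized recommendation system still leave content displayed in chronological order, latest first. A minimal sketch of that fallback, with invented post data, might look like this (the bill does not prescribe any implementation):

```python
from datetime import datetime, timezone

# Hypothetical posts; only the publication timestamp matters for the
# paragraph 5(1)(d)(i) fallback ordering.
posts = [
    {"id": "a", "published": datetime(2024, 9, 1, tzinfo=timezone.utc)},
    {"id": "b", "published": datetime(2024, 9, 16, tzinfo=timezone.utc)},
    {"id": "c", "published": datetime(2024, 9, 8, tzinfo=timezone.utc)},
]

def chronological_feed(items):
    """Return items newest-first, ignoring any personalization signal."""
    return sorted(items, key=lambda p: p["published"], reverse=True)

assert [p["id"] for p in chronological_feed(posts)] == ["b", "c", "a"]
```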
6 (1) Every operator must provide on its platform clear and readily accessible controls for any parent to support a user that the operator knows or should reasonably know is a minor, including the ability to
(a) manage the minor’s privacy and account settings;
(b) view metrics of time spent by the minor on the platform; and
(c) prevent purchases and financial transactions by the minor.
(2) The parental controls referred to in subsection (1) must be set as a default setting in the case of a user whom the operator knows or should reasonably know is a child.
(3) Every operator must provide any parent with a clear and readily accessible option to opt out of or turn off the default parental controls.
(4) Every operator must notify a user whom it knows or should reasonably know is a minor when the parental controls are in effect and which settings or controls have been activated.
(5) If the operator has reasonable grounds to believe that the default parental controls have been turned off by a minor, it must notify the parent.
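As an illustration only, the interplay of subsections 6(2), (4) and (5) — controls on by default for a child, the minor notified when they are in effect, and the parent notified if the minor turns them off — could be sketched as follows (all class and message names are invented):

```python
# Hypothetical sketch of section 6 parental-control behaviour.
class ParentalControls:
    def __init__(self, user_is_child: bool):
        # Subsection 6(2): defaults on for a known or presumed child.
        self.enabled = user_is_child
        self.notifications: list[str] = []

    def notify_minor(self) -> None:
        # Subsection 6(4): tell the minor the controls are in effect.
        if self.enabled:
            self.notifications.append("minor: parental controls are active")

    def turn_off(self, actor: str) -> None:
        self.enabled = False
        if actor == "minor":
            # Subsection 6(5): notify the parent.
            self.notifications.append("parent: controls turned off by the minor")

controls = ParentalControls(user_is_child=True)
controls.notify_minor()
controls.turn_off(actor="minor")
assert "parent: controls turned off by the minor" in controls.notifications
```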
7 Every operator must make the following readily accessible and provide it in the language, form and manner in which its platform provides the product or service used by minors and their parents:
(a) information and control options that take into consideration the differing ages, capacities and developmental needs of the minors most likely to access the platform and that do not encourage minors or parents to weaken or disable safety settings or parental controls; and
(b) options to enable or disable safety settings or parental controls, as appropriate.
8 (1) Every operator must provide on its platform a dedicated and readily accessible reporting channel that any person may use to alert the operator to online harms and risks to minors.
(2) Every operator must establish an internal process to receive and respond to the reports received through the reporting channel and must take any measures necessary to respond to the person who makes a report and to address any issues raised in a reasonable and timely manner.
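One conceivable shape for the section 8 internal process — a report logged on receipt, tracked until a response is given, with open reports listable so they can be addressed in a timely manner — is sketched below. The bill prescribes no data model; every name here is hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical record of a report received through the section 8 channel.
@dataclass
class HarmReport:
    reporter: str
    description: str
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    responded: bool = False

class ReportingChannel:
    def __init__(self):
        self._reports: list[HarmReport] = []

    def submit(self, reporter: str, description: str) -> HarmReport:
        report = HarmReport(reporter, description)
        self._reports.append(report)
        return report

    def respond(self, report: HarmReport, message: str) -> None:
        # In practice the response would be delivered to the reporter.
        report.responded = True

    def open_reports(self) -> list[HarmReport]:
        return [r for r in self._reports if not r.responded]

channel = ReportingChannel()
r = channel.submit("parent@example.com", "Harassing messages sent to my child")
assert len(channel.open_reports()) == 1
channel.respond(r, "Account restricted pending review")
assert channel.open_reports() == []
```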
9 (1) It is prohibited for an operator to use any platform design features, including personalized recommendation systems, or use personal data in a manner that facilitates the advertising, marketing, soliciting, offering or selling of products or services that are unlawful for minors, such as any controlled substance as defined in subsection 2(1) of the Controlled Drugs and Substances Act, alcohol, cannabis and tobacco and products or services relating to gambling or pornography.
(2) It is prohibited for an operator to design, modify or manipulate a user interface in a manner that subverts or impairs user autonomy, decision-making or choice in order to weaken or disable the safety settings or parental controls required under this Act.
(3) It is prohibited for an operator to require or request the use of a digital identifier that serves as an electronic representation of an individual’s identity and of their right to access information or services online.
(4) Nothing in this section is to be construed as
(a) preventing an operator from taking reasonable measures to
(i) block, detect or prevent the distribution of unlawful, obscene or other harmful material, as described in paragraph 4(1)(a), to minors, or
(ii) block or filter spam, prevent criminal activity or protect the security of its platform or service; or
(b) requiring the disclosure of a minor’s browsing behaviour, search history, messages, contact list or other content or metadata of their communications.
10 (1) Every operator must, in a prominent location on its platform, provide clear and readily accessible information regarding the following:
(a) its policies, practices and safety settings, including those pertaining to and available for minors and their parents;
(b) access to the safety settings and parental controls required under sections 5 and 6, respectively;
(c) the type of personal data that the platform collects, uses or discloses and the manner in which it does so;
(d) the platform’s use of any personalized recommendation systems to prioritize, assign weight to or rank different categories of personal data and of the options available to users to modify or disable these settings; and
(e) the platform’s use of any labels or tags to indicate that specific advertisements, information, products or services are directed at minors.
(2) In the case of a user whom an operator knows or should reasonably know is a child, the operator, in addition to providing a parent of the child with information about the safety settings and parental controls required under sections 5 and 6, respectively, must obtain express consent from the parent for the use of its platform by the child before the child first uses it.
(3) The operator is deemed to have satisfied the requirement under subsection (2) if it has made reasonable efforts — taking into consideration available technology — to ensure that the parent receives the information referred to in subsection (2) and to obtain their express consent.
11 Every operator must provide, with respect to advertising on its platform, clear and readily accessible information and labels regarding the following:
(a) the name of the product, service or brand and the subject matter of each advertisement;
(b) if the platform conducts targeted advertising, the reasons for targeting minors regarding any given advertisement and the ways in which minors’ personal data is used to engage in such advertising; and
(c) the fact, if applicable, that content displayed to a minor consists of an advertisement or marketing material, including the disclosure of endorsements of products, services or brands, made by other users of the platform for commercial consideration.
12 Every operator must keep and maintain audit logs for the collection, processing and use of personal data and relevant records of data and personal data in its possession or control that are necessary to determine whether it has complied with this Act.
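The audit logs required by section 12 could take many forms; one minimal, hypothetical sketch is an append-only log in which each collection, processing or use of personal data is serialized on write and never mutated, so compliance can later be determined from the records (field names are illustrative, not drawn from the bill):

```python
import json
from datetime import datetime, timezone

# Hypothetical append-only audit log for the section 12 record-keeping duty.
class AuditLog:
    def __init__(self):
        self._entries: list[str] = []

    def record(self, action: str, data_category: str, purpose: str) -> None:
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "action": action,            # e.g. "collect", "process", "use"
            "data_category": data_category,
            "purpose": purpose,
        }
        # Serialized immediately; entries are never edited afterwards.
        self._entries.append(json.dumps(entry))

    def export(self) -> list[dict]:
        return [json.loads(e) for e in self._entries]

log = AuditLog()
log.record("collect", "geolocation", "parental notification under 5(1)(e)")
assert log.export()[0]["action"] == "collect"
```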
13 Every two years, every operator must cause an independent review of its platform to be conducted, including in respect of the risks and harms it poses to minors and the cumulative effects the use of the platform has on minors. The operator must make the findings publicly available.
14 (1) Every operator must, in each year, prepare a report for the previous year on the risks and harms to minors identified in the independent review and the prevention and mitigation measures taken to address them.
(2) The report must also include a systemic risk and impact assessment in relation to the following:
(a) the extent to which the operator’s platform is likely to be accessed by minors;
(b) if the platform is accessed by minors, data on the number of minors using it and on their daily, weekly and monthly usage;
(c) the platform’s safety settings and parental controls, including an assessment of their efficacy and a description of any breaches reported in relation to them;
(d) the extent to which the platform’s design features, including its personalized recommendation systems and its use of automatic displaying of content, rewards for time spent and notifications, pose risks to minors, including to their privacy, health or well-being;
(e) the collection, use and disclosure by the platform of personal data, such as geolocation or health data, and the purposes for which and the manner in which the data is collected, used or disclosed;
(f) the reports the operator has received through its reporting channel, including the number and nature of the reports;
(g) the internal process the operator has implemented to receive reports and the timeliness, effectiveness and types of responses provided following each report; and
(h) the prevention and mitigation measures taken by the operator to address any issues raised in the independent review.
(3) The operator must publish the report in a prominent place on its platform.
15 The Commission must, in consultation with relevant stakeholders, establish guidelines setting out how operators may conduct market research and product-focused research in relation to minors.
16 Every operator that contravenes any of sections 4 to 9 is guilty of an offence and liable,
(a) on conviction on indictment, to a fine of not more than twenty-five million dollars; and
(b) on summary conviction, to a fine of not more than twenty million dollars.
17 Every operator that contravenes any of sections 10 to 12 or a provision of the regulations made under section 21 is guilty of an offence and liable on summary conviction to a fine of not more than ten million dollars.
18 An operator is not to be found guilty of an offence under this Act if it establishes that it exercised due diligence to prevent its commission.
19 (1) The user of a platform who is a minor, or any of their parents, who alleges that they have suffered serious harm as a result of a failure by its operator to comply with its duty of care under subsection 4(1) may, in any court of competent jurisdiction, bring an action against the operator and, in the action, claim relief by way of one or more of the following:
(a) damages for any serious harm, loss or damage suffered;
(b) aggravated or punitive damages;
(c) an injunction;
(d) an order for specific performance; or
(e) any other appropriate relief, including the costs of the action.
(2) Unless the court decides otherwise, no action may be brought later than three years after the day on which the minor or their parent becomes aware of the act or omission on which their action is based.
(3) In this section, serious harm includes physical or psychological harm, substantial damage to reputation or relationships and substantial economic loss.
20 If an operator has implemented standards or a code of practice that, in the Minister’s opinion, provides for substantially the same or greater protections as those provided for under this Act, the Minister may cause a notice to be published in the Canada Gazette confirming the extent to which this Act applies to the operator’s platform.
21 The Governor in Council, on the recommendation of the Minister following consultations with the Commission, may make regulations for carrying out the purposes and provisions of this Act, including regulations
(a) setting out the form and manner, including the languages, in which information is to be provided to users under section 10; and
(b) for the purpose of section 12, providing for the records of data and personal data to be kept and maintained by an operator, the manner in which they are to be kept and maintained and the period during which they are to be kept and maintained.
R.S., c. C-46
(a) of an indictable offence and liable to imprisonment
(i) for a term of not more than five years,
(ii) for a term of not more than 10 years, if the person depicted in the image is engaged in explicit sexual activity, or
(iii) for a term of not more than 14 years, if the accused knew or ought to have known that, at the time the intimate image was created, aggravated sexual assault was being, or had just been, committed against the person depicted in the image; or
(a) of an indictable offence and liable to imprisonment
(i) for a term of not more than five years,
(ii) for a term of not more than 10 years, if the image depicts the person as being engaged in explicit sexual activity, or
(iii) for a term of not more than 14 years if the accused knew or ought to have known that, at the time the false intimate image was created or edited, aggravated sexual assault was being, or had just been, committed against the person depicted in the image; or
(b) of an offence punishable on summary conviction.
(b) the recording, copies of which are kept for sale or distribution in premises within the jurisdiction of the court, is an intimate image or a false intimate image;
false intimate image has the same meaning as in subsection 162.1(2.1); (fausse image intime)
(xxvii.2) subsection 162.1(1) (intimate image),
(xxvii.3) subsection 162.1(1.1) (false intimate image),
(b.1) repeatedly communicating with, either directly or indirectly, the other person or anyone known to them through the Internet, a social media service or other digital network;
(a) the person contravened the terms or conditions of an order made under section 161 or a recognizance entered into under section 810, 810.1 or 810.2;
(b) the person contravened the terms or conditions of any other order or recognizance, or of an undertaking, made or entered into under the common law, this Act or any other Act of Parliament or of a provincial legislature that is similar in effect to an order or recognizance referred to in paragraph (a); or
(c) in the case of conduct referred to in paragraph (2)(b.1), the person communicated anonymously or using a false identity.
(x) subsection 162.1(1) (publication, etc., of an intimate image without consent),
(x.1) subsection 162.1(1.1) (publication, etc., of a false intimate image without consent),
(e) in the case of an offence under subsection 162.1(1) or (1.1), by paying to a person who, as a result of the offence, incurs expenses to remove the intimate image or the false intimate image, as the case may be, from the Internet or other digital network, an amount that is not more than the amount of those expenses, to the extent that they are reasonable, if the amount is readily ascertainable.
(c) will continue to engage in conduct referred to in paragraph 264(2)(b.1).
(a) must order that the recognizance include a condition prohibiting the defendant from communicating by any means — including a means referred to in paragraph 264(2)(b.1) — directly or indirectly, with the person on whose behalf the information was laid; and
(b) may order that the recognizance be entered into for any period — definite or indefinite — that the justice or summary conviction court considers necessary to protect the security of the person on whose behalf the information was laid, taking into consideration whether the defendant
(i) communicated with the person anonymously or using a false identity, or
(ii) created more than one social media service account to prevent their communications with the person from being blocked.
(a) the person contravened the terms or conditions of an order made under section 161 or a recognizance entered into under section 810, 810.012, 810.1 or 810.2;
Published under authority of the Speaker of the House of Commons