On Monday, February 26, 2024, Minister of Justice Arif Virani introduced Bill C-63, which seeks to enact the Online Harms Act (the Act) and amend existing legislation to better address hate and harms to children in the rapidly evolving online landscape.
The Act seeks to promote online safety1. It will be administered and enforced by a new Digital Safety Commission of Canada (the Commission), and operators of social media services will be subject to new monitoring, disclosure and record-keeping obligations, backed by sizable monetary penalties.
The Commission will administer and enforce the Act, with the support of the Digital Safety Ombudsperson and the Digital Safety Office of Canada.
The Act will apply to operators of social media services. The definition of “social media service” goes well beyond typical players like Facebook, Instagram and Twitter/X: it broadly captures “a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content”. Although the Act contemplates user-number thresholds, a service with a smaller user base may also be subject to the Act if there is a significant risk that harmful content is accessible on the service2. The definition also extends to adult content and live-streaming services3.
The Act is concerned with seven types of “harmful content”:
While most of these categories are straightforward, “content that foments hatred” is more controversial. Content foments hatred if it “expresses detestation or vilification”, but not “solely because it expresses disdain or dislike or it discredits, humiliates, hurts or offends”. This definition aims to protect freedom of expression online, but the threshold at which content engages the Act’s provisions is unclear and will likely pose challenges in application.
Operators of social media services will be subject to a wide range of new duties and obligations, which include the following:
People will be able to make complaints to the Commission regarding content on a social media service7. The Commission may hold a hearing in respect of a complaint or any other issue relating to an operator’s compliance with its duties and obligations8. Complaints are, however, subject to a three-year limitation period9.
In the event of non-compliance, operators may be subject to compliance orders or may enter into undertakings with the Commission. The Commission may also require operators to pay administrative monetary penalties of up to 6% of gross global revenue or $10 million, whichever is greater10. If an operator contravenes a Commission order or an undertaking, it may face higher penalties of up to 8% of gross global revenue or $25 million, whichever is greater11.
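For illustration only, each penalty cap works out to “the greater of a percentage of gross global revenue and a fixed dollar floor”. The short sketch below applies that formula; the revenue figure in the example is hypothetical, and the percentages and floors are simply those described above.

```python
def max_penalty(gross_global_revenue: float, contravened_order: bool = False) -> float:
    """Illustrative calculation of the maximum administrative monetary penalty
    described above: the greater of 6% of gross global revenue and $10 million,
    rising to the greater of 8% and $25 million where a Commission order or
    undertaking has been contravened. Sketch only."""
    if contravened_order:
        return max(0.08 * gross_global_revenue, 25_000_000)
    return max(0.06 * gross_global_revenue, 10_000_000)

# Hypothetical operator with $500 million in gross global revenue:
print(max_penalty(500_000_000))        # 30,000,000.0 -> 6% exceeds the $10M floor
print(max_penalty(500_000_000, True))  # 40,000,000.0 -> 8% exceeds the $25M floor
```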
The duty to act responsibly is fairly expansive, and its implementation raises some questions regarding the scope of an operator’s legal obligations, as well as users’ expression and privacy rights.
At a basic level, operators must establish public guidelines and standards of conduct12, and develop and use tools and processes to flag harmful content13.
Operators are expected to implement measures to mitigate the risk of users being exposed to harmful content14. However, they are not required to proactively search for and identify harmful content15. Regulations may also be enacted in the future to require operators to “use technological means to prevent content that sexually victimizes a child or revictimizes a survivor from being uploaded to the service”16.
The Act also appears to be alive to concerns regarding users’ freedom of expression and privacy, which are addressed in part through limits on the scope of operators’ obligations. Measures implemented by operators to mitigate online harm must not “unreasonably or disproportionately limit users’ expression”17. In addition to stipulating that operators are not required to proactively search for harmful content, the Act provides that operators’ duties do not apply to private messaging features18.
Nevertheless, without further guidance from the Commission, operators may find it difficult to balance their legal obligations against the rights of individuals in a way that will satisfy the Commission. Operators will also need to consider privacy law requirements, which are likely to shape how they meet their moderation and safety obligations under the Act.
The Online Harms Act has only completed first reading, and further work will be needed to make it practicable, including (ideally) consultation with major social media platforms. However, businesses should consider the following in anticipation of the coming changes:
To discuss these issues, please contact the author(s).
This publication is a general discussion of certain legal and related developments and should not be relied upon as legal advice. If you require legal advice, we would be pleased to discuss the issues in this publication with you, in the context of your particular circumstances.
For permission to republish this or any other publication, contact Janelle Weed.
© 2024 by Torys LLP.
All rights reserved.