Online Harms Act

In March 2019, following the Christchurch mosque shooting, Minister of Public Safety and Emergency Preparedness Ralph Goodale stated that the government was planning to carefully evaluate whether social media platforms should be required to censor hate speech and extremist content.

The Act would primarily target public posts on social networks and platforms that "pose significant risk in terms of proliferating harmful content", and would establish a Digital Safety Commission.

Platforms would also be required to uphold a duty of care in protecting children by implementing age-appropriate "design features" to be determined by the Digital Safety Commission.[10][11]

The bill amends the Criminal Code to add a definition of "hatred" as "the emotion that involves detestation or vilification and that is stronger than disdain or dislike".[12]

Emily Laidlaw, a research chair in cybersecurity at the University of Calgary, writing for The Globe and Mail, opined that the bill achieves a successful balance "between free expression and protection from harm".

However, she felt that it still contained several "red flags", including definitions that could be interpreted in an overly broad manner, the "remarkable" powers that would be held by the proposed Digital Safety Commission, and that "the provisions involving the Criminal Code and Canadian Human Rights Act require careful study as they feature penalties that go as high as life in prison and open the door to a tidal wave of hate speech related complaint".[14]

"[14] Marcus Gee writing for The Globe and Mail, opined that discourse surrounding Israel and Palestine in light of the Israel–Hamas war, such as chants including "From the river to the sea", could be subject to the bill's life imprisonment penalty for advocating genocide.