
TORONTO — A new research report released today by industry watchdog Friends of Canadian Broadcasting argues that social media companies such as Facebook and YouTube are liable for harmful or illegal content appearing on their platforms, just as any other publisher would be.
The report, Platform for Harm, builds on a legal analysis by libel defence lawyer and free speech advocate Mark Donald, Friends says. According to the report, social media companies are already publishers in the eyes of Canadian law, and are therefore legally liable for user-generated content if they know content is harmful yet publish it anyway, or if they fail to remove it after being notified.
“Our elected officials don’t need to create new laws to deal with this problem. They don’t need to define harmful content, police social media, or constrain free expression in any new way. All government needs to do is apply existing laws. But if a judge decides that content circulated on social media breaks the law, the platform which publishes and recommends that illegal content must be held liable for it,” said Friends’ executive director Daniel Bernhard in the press release announcing the report.
Although social media platforms often present themselves as simple bulletin boards that display user-generated content without editorial control, the report finds otherwise. “Platforms like Facebook routinely exercise editorial control by promoting content users have never asked to see, including extreme content that would land any other publisher in court: for example the promotion of illegal acts such as the Christchurch, NZ massacre. They also conceal content from users without consulting them, another form of editorial control,” reads the Friends’ release.
The press release also notes that Facebook and other social media platforms tell advertisers they have the technology to recognize user-posted content before it is published and pushed out to others.
“Facebook and other social media platforms have complaints processes where they are alerted to potentially illegal or otherwise objectionable content. Yet it is their own community standards, not the law, which dictates whether they will remove a post. Even then Facebook employees say that the company does not apply its own standards when prominent right-wing groups are involved,” said George Carothers, Friends’ director of research, in the press release.
The Platform for Harm report was the subject of an expert panel discussion on Monday, co-sponsored by Friends and the Centre for International Governance Innovation (CIGI). The discussion was moderated by Rita Trichur, senior business writer and columnist at The Globe and Mail, and featured Bernhard; Catherine McKenna, Ottawa Centre MP and federal minister of infrastructure; CIGI senior fellow Taylor Owen; and Heidi Tworek, assistant professor of international history at the University of British Columbia.
A recording of the one-hour discussion is available online.