Social media platforms' efforts are inconsistent and insufficient, according to a technical briefing presented to the media on Thursday.
The framework presented by Canadian Heritage targets five types of harmful content: hate speech, content inciting violence, child sexual exploitation content, terrorist content, and the non-consensual sharing of intimate images.
If you can't do it in the physical world, you shouldn't be able to do it behind a screen.
Platforms like Facebook, Twitter, YouTube, TikTok, Instagram and Pornhub would be required to remove such content within 24 hours of it being reported.
"If after 24 hours they have not removed it, or have not justified why they did not, they could be subject to very severe fines," Canadian Heritage Minister Steven Guilbeault said in an interview.
These financial penalties could reach up to 3% of a platform's gross revenue or up to $10 million, whichever is greater. For reference, Facebook posted gross revenues of more than $85 billion in 2020, according to a report produced for its shareholders. The maximum financial penalty the web giant could face would therefore border on $2.6 billion.
As a last resort, it would be possible to ask Internet service providers to block access to sites that persist in not removing child pornography or terrorist content. However, court authorization would be required before this final sanction could be imposed.
Internet service providers themselves, as well as private messaging platforms and popular review sites like Trip Advisor, would not be covered by this legislative framework.
It is not ruled out that, after consultations and further analysis, hybrid platforms like Telegram, which offer both private messaging and more public distribution of content, will be subject to regulation.
Consultations will be held until September 25, with a view to tabling a possible bill this fall. The public can participate through the Canadian Heritage website.
Ottawa is also proposing the creation of a new digital security commission to oversee the implementation of its regulations.
The digital security commissioner would ensure enforcement of the new rules under a possible bill. This person would also be responsible for reviewing complaints against the platforms and recommending sanctions, including blocking access or administrative monetary penalties.
However, it would be a tribunal created under Bill C11 that would determine the penalties to be imposed.
The legislative framework proposed by Canadian Heritage also includes the creation of a recourse council. This body would act as a fully online tribunal to settle disputes over the removal of online content, among other things.
Finally, the commission would have an advisory board to provide expert guidance and advice to the commissioner and the recourse council. This body would draw on advice from experts and groups representing civil society, particularly those most targeted by online hate, including women, people of colour and the LGBTQ+ community.
The role of law enforcement and CSIS
The government is also pushing for changes to the Canadian Security Intelligence Service (CSIS) Act.
In particular, it is proposed that a new, simplified process be created for obtaining judicial authorization to collect identifying information on users.
This proposal could allow for timely investigations and greater flexibility, according to the public documents presenting the legislative framework.
Canadian Heritage is examining various options for sharing harmful content with police forces and intelligence services. It could be possible to notify police when content suggests an imminent risk of serious harm, or to share certain content with the security authorities best equipped to handle it.
Under this model, illegal content would be reported to police, while material of national security concern would be placed in the hands of CSIS. Thresholds for determining the severity of the threat posed by each piece of content remain to be set.
Ottawa also wants to adapt the act respecting the mandatory reporting of child pornography to new digital realities.
Platforms are already required to report child pornography, but this update could require them to systematically include the metadata of the illegal content (IP address, source, destination, date and time of sharing) and information on the subscribers who publish it, including the content's transmission data as well as the name, contact details and billing information linked to the user's IP address.
With information from Julie-Anne Lapointe