The UK is set to introduce laws to limit illegal and harmful online content, such as child sexual abuse and terrorist material. Social media giants face having their sites blocked and massive fines if they fail to comply with the new regulations. Also in the firing line are senior executives, who could be held personally liable for their firm’s non-compliance.
The New Online Harms Bill
The UK is set to introduce a new Online Harms Bill, the reasons for which are explained in the UK government’s Online Harms White Paper. The bill will bring in new regulations that will apply to any company hosting user-generated content online that is accessible by individuals in the UK. It will also apply to any company enabling individuals in the UK to interact with others online, whether privately or publicly. Consequently, social media giants, messaging platforms, dating apps and video games with chat services will all be affected.
Furthermore, the legislation will apply to companies providing search engines, because search engines enable access to harmful content online. They provide access to gaming sites, messaging apps, forums and commercial pornographic sites, for example.
Nonetheless, the bill states that “The legislation will include safeguards for freedom of expression and pluralism online — protecting people’s rights to participate in society and engage in robust debate. The laws will not affect the articles and comments sections on news websites.”
The Online Harms Bill will be tabled before Parliament in 2021 and is expected to become law by 2022, once it has been voted through by the UK Parliament.
Extra Provisions for Social Media Giants
The Online Harms Bill’s proposals contain extra provisions for the social media giants with high-risk features. This is expected to include such companies as Facebook, TikTok, Instagram and Twitter.
These companies will be required to have policies in place regarding content that, although not illegal, could cause harm — for example, disseminating misinformation about Covid-19 vaccines. They will have to assess whether there is a “reasonably foreseeable risk” that any content they host will cause “significant physical or psychological harm to adults”. They will also have to clarify what content will be allowed and how they will deal with non-conforming content.
The government also expects tech platforms to better protect children from harmful online content like bullying, grooming and pornography. “We are entering a new age of accountability for tech to protect children and vulnerable users, to restore trust in this industry, and to enshrine in law safeguards for free speech,” Digital and Culture Secretary Oliver Dowden, who presented the proposals to parliament, said.
Consequences of Non-Compliance
The bill’s proposals are a means of forcing tech giants to rid their platforms of illegal and harmful online content. The bill requires that all companies falling within the new regulation’s scope remove illegal content and limit its spread. This involves removing content such as child sexual abuse material, content promoting suicide, and terrorist material from their platforms. The government is still deciding whether to make material promoting self-harm illegal as well.
British media watchdog Ofcom is slated as the authority that will administer the new regulations. Ofcom will be given the power to fine any company failing to comply with the new online safety rules. Such companies face fines of up to £18 million ($24 million) or 10% of annual turnover, whichever is higher.
Ofcom could also block non-compliant services from being accessible from the UK. Furthermore, the government is reserving the right to make senior executives personally liable for non‑compliance if companies don’t take the new legislation seriously. This includes, for example, not responding swiftly to regulators’ requests for information.
The government expects less than 3% of UK companies to fall within the scope of the new online safety rules. Dowden explained: “This proportionate new framework will ensure we don’t put unnecessary burdens on small businesses but give large digital businesses robust rules of the road to follow so we can seize the brilliance of modern technology to improve our lives.”
Social Media Giants’ Response
In response to the Online Harms Bill, Facebook’s head of UK public policy, Rebecca Stimson, stated that Facebook already has policies in place against harmful content. However, the company welcomed the introduction of regulations.
“Protecting people from harm without undermining freedom of expression or the incredible benefits the internet has brought is a complex challenge,” she added. “We look forward to continuing the discussion with government, Parliament and the rest of the industry as this process continues.”
Twitter responded by saying that it was “deeply committed to keeping people safe online.” A Twitter spokesperson added: “We support regulation that is forward thinking, understanding that a one-size-fits-all approach fails to consider the diversity of our online environment.”
Chinese-owned video-sharing platform TikTok said it was looking forward to reviewing the proposals and working with the government to strengthen online safety. “At TikTok, safety isn’t a bolt-on or a nice-to-have, it’s our starting point to building a creative, diverse community,” a TikTok spokesman said.
Google’s YouTube UK managing director, Ben McOwen Wilson, also responded, stating that the firm took the safety of its online communities very seriously and had not waited for legislation to act. “We have worked with industry, community groups and the government to tackle harmful content,” he said.
EU Set to Unveil Similar Legislation
On Tuesday the EU announced its own plans for new digital legislation. The EU’s reforms are also aimed at ensuring that tech giants take more responsibility for the content on their platforms.
Under the EU’s proposals, companies could face fines like those proposed in the UK’s Online Harms Bill. Breaches of the new EU rules could cost companies up to 10% of their annual global turnover.
“Though the regimes differ in many ways, both impose new and potentially onerous duties on online platforms to protect their users from harmful content. They also both propose heavy sanctions for those who fail to comply, including fines of up to 10% of a company’s global turnover,” said Ben Packer, partner at global law firm Linklaters.