THIS CONSULTATION ACTION IS NOW CLOSED.

In an effort to clean up the Internet, the Government is planning to regulate the online platforms and services we use to communicate. 

Some form of regulation is needed to ensure that social media companies deal with material on their platforms appropriately. However, the Department for Digital, Culture, Media and Sport’s (DCMS) broad, risk-based proposal is likely to lead to widespread removal of legal content across the Internet.

A free society can never eliminate all unacceptable speech. Open Rights Group favours a rights-based approach to Internet regulation that focuses on increasing online companies’ transparency and accountability to an independent authority, separate from the Government.

DCMS is accepting public feedback on their plan until 1 July. Please tell them to make a new plan that protects the right to free expression online. You may use the bullet points below to write your own unique message.

  • Regulation addressing online content must protect human rights and increase online companies’ transparency and accountability. It should focus on ensuring that companies (a) remove illegal or unlawful content, and (b) set human-rights-compliant terms & conditions around lawful content and adhere to them.
  • Users need to be able to defend their right to publish legal content.
  • The scheme pushes towards automated takedowns and speed, both of which are likely to come at the cost of accuracy. Accuracy is as important as removal for a scheme to be legitimate.
  • The UK does not allow state regulation of the press, so why should it permit state regulation of millions of citizens’ lawful online speech? It is hard to imagine that this separation can be maintained in the medium term, as services ‘converge’.
  • The duty of care and harms-based approach are not appropriate for this regulatory scheme. They risk expansive restrictions on free speech, let unelected regulators define the limits of protected expression, and do not promote fair and accurate decisions.
  • The government has not explained how harm and risk are to be defined. This is the central question for the ‘duty of care’. Left undefined, the duty could be incredibly wide and affect many kinds of lawful speech.
  • Blocking powers are incredibly blunt. Blocks always need to be limited to the very worst cases and judicially authorised, not imposed by a regulator through administrative orders.
  • The scope of online services included in the proposal is unrealistically vast. Regulation needs to be targeted at social media platforms that handle the publication of very large volumes of user-generated material.

Read more about Open Rights Group’s position here:
https://www.openrightsgroup.org/about/reports/org-policy-responses-to-online-harms-white-paper

View the full DCMS consultation:
https://www.gov.uk/government/consultations/online-harms-white-paper