UK Government Considers Tightening Social Media Regulations After Riots
Recent far-right riots fuelled by disinformation have prompted the UK government to consider firmer rules for social media companies. The Online Safety Act, passed in October but not yet fully in force, is under fresh scrutiny as the government weighs broadening its scope to combat harmful content and false information.
The riots, which followed the spread of false information about the identity of a suspected killer, have exposed gaps in the Act. As it stands, the law allows companies to be fined up to 10% of their global turnover for the previous financial year if they fail to police unlawful content. Proposed changes could give Ofcom, the United Kingdom's communications regulator, the power to fine companies for permitting "legal but harmful" content on their platforms.
Cabinet Office minister Nick Thomas-Symonds acknowledged that changes to the Act may be needed. He noted in a recent interview that some parts of the Online Safety Act have evidently not yet come into effect, and said the government is prepared to act if changes are required. The Mayor of London, Sadiq Khan, shared this view, warning that the Act in its current form may not be fit for purpose.
The decision to revisit social media legislation follows the wave of fake news spread during the unrest. Posts containing false information about the suspect in the July 29 knife attack on three young girls went viral on social media, stoking anger and ultimately escalating into violence against Muslims and migrants.
Elon Musk, owner of X (formerly Twitter), also spread false claims to his many followers, going so far as to suggest that civil war in Britain was imminent. Such high-profile cases of disinformation have intensified pressure to strengthen oversight of social media platforms.
Public opinion appears to favour greater regulation. In a YouGov poll of more than 2,000 adults, two-thirds of respondents said social media companies should be held liable for posts inciting criminal behaviour. In addition, 70% said social media platforms were not regulated strictly enough, and 71% said the companies had not done enough to counter fake news during the riots.
The question for the government now is how to balance free speech against the incitement of violence. Labour, the current UK government, inherited the Online Safety Act from the Conservatives and faces the difficult task of fine-tuning a law that took months to design and develop, largely because of these tensions.
Possible changes to the Act would have serious consequences for social media platforms operating in the UK. Facing damaging penalties for allowing misinformation to persist, platforms would likely need to block more posts and comments proactively, strengthen their content moderation measures, and invest in better technology to that end.
As deliberations continue, voices from across the industry have begun to weigh in. Technology companies, civil liberties organisations, and police forces are all likely to help shape the future of UK online safety regulation.
The outcome of these discussions could inform other nations facing similar challenges in managing online harm and misinformation. As technology advances, other countries may look to the UK for guidance on how to regulate social media while keeping platforms open to everyone.