
Social Media Platform Content to be Policed by Regulator Watchdog Ofcom

It was revealed this week that the regulator Ofcom is to expand its duties to cover UK social media. Until now, Ofcom has held a wide range of powers across the digital and communications landscape, including the television, radio, telecoms and postal sectors, but only this week did the government confirm that the watchdog has also been handed responsibility for policing social media platforms.

Under this expansion of its responsibilities, Ofcom will now be responsible for ensuring that social media platforms such as Twitter, Facebook and Instagram comply with their legal “duty of care” and protect users from illegal material uploaded to their services. Such content can include violence, terrorism, cyber-bullying and child abuse, and platforms will need to ensure it is removed quickly once flagged. According to the government, it will be at Ofcom’s discretion what constitutes a breach of the “duty of care”, and it will also be Ofcom’s responsibility to decide on the consequences of any such breach.

Until now, platforms such as Facebook, Twitter and TikTok have been largely self-regulating, and they have defended their internal processes for taking down inappropriate and harmful content. Despite this, pressure to regulate these platforms has mounted since last April, with critics arguing that independent adjudicators are clearly needed to ensure nothing slips through the net.

This increasing pressure also stems from the growing number of tragic deaths of young teenagers following exposure to potentially harmful content on social media platforms. Molly Russell tragically took her own life in 2017 after viewing content on Instagram discussing depression and suicide. Molly’s parents believe Instagram is largely to blame for the death of their daughter, and despite Instagram’s statement that it “does not allow content that promotes or glorifies self-harm or suicide and will remove content of this kind”, there was clearly a lapse in regulation, perhaps demonstrating the need for independent regulators such as Ofcom.

There are still plenty of questions about how these regulations will affect particular platforms and how they will deal with workarounds, such as VPNs, which allow internet users to obscure the origin of their traffic. These questions are expected to become clearer in the coming months as Ofcom builds relationships with the platforms and sets its standards and regulations.
