Monday, June 30


Tech platforms could be forced to prevent illegal content from going viral and to limit people's ability to send virtual gifts to, or record, a child's livestream, under more online safety measures proposed by Ofcom.

The UK regulator published a consultation on Monday seeking views on further protections to keep citizens, particularly children, safer online.

These could also include requiring some larger platforms to assess whether they need to proactively detect terrorist material.

Oliver Griffiths, online safety group director at Ofcom, said its proposed measures seek to build on existing UK online safety rules but keep up with “constantly evolving” risks.

“We’re holding platforms to account and launching swift enforcement action where we have concerns,” he said.

“But technology and harms are constantly evolving, and we’re always looking at how we can make life safer online.”

The consultation highlighted three main areas in which Ofcom thinks more could be done:

  • stopping illegal content going viral
  • tackling harms at source
  • giving further protections to children

The BBC has approached TikTok, livestreaming platform Twitch and Meta – which owns Instagram, Facebook and Threads – for comment.

Ofcom’s proposals target a number of issues – from intimate image abuse to the danger of people witnessing physical harm on livestreams – and vary in the type or size of platform they could apply to.

For example, proposals that providers have a mechanism to let users report a livestream if its content “depicts the risk of imminent physical harm” would apply to all user-to-user sites that allow a single user to livestream to many, where there may be a risk of showing illegal activity.

Meanwhile, potential requirements for platforms to use proactive technology to detect content deemed harmful to children would only apply to the largest tech firms, which present higher risks of relevant harms.

“Further measures are always welcome but they will not address the systemic weaknesses in the Online Safety Act,” said Ian Russell, chair of the Molly Rose Foundation – an organisation set up in memory of his 14-year-old daughter Molly Russell, who took her own life after viewing thousands of images promoting suicide and self-harm.

He added that Ofcom showed a “lack of ambition” in its approach to regulation.

“As long as the focus is on sticking plasters not comprehensive solutions, regulation will fail to keep up with current levels of harm and major new suicide and self-harm threats,” Mr Russell said.

“It’s time for the prime minister to intervene and introduce a strengthened Online Safety Act that can tackle preventable harm head on by fully compelling companies to identify and fix all the risks posed by their platforms.”

The consultation is open until 20 October 2025 and Ofcom hopes to get feedback from service providers, civil society, law enforcement and members of the public.

It comes as tech platforms look to bring their services in line with the UK’s sweeping online safety rules that Ofcom has been tasked with enforcing.

Some have already taken steps to try to clamp down on features that experts have warned may expose children to grooming, such as livestreaming.

In 2022, TikTok raised the minimum age for going live on the platform from 16 to 18 – shortly after a BBC investigation found hundreds of accounts going live from Syrian refugee camps, with children begging for donations.

YouTube recently said it would raise its minimum age for livestreaming to 16, from 22 July.


