

Young people who did not take part in the children's commissioner's survey say how long they spend on their phones

One in five children spend at least seven hours a day using phones and tablets, according to the initial findings of a survey.

Two children aged 10 and 11 said they spent at least nine hours a day using screens during the weekend, according to the survey by the children’s commissioner for Wales.

Thirteen-year-old Kiishi is part of a digital guardians project to help protect children online and said some technologies were “almost controlling”.

New rules under the Online Safety Act, including age verification on certain sites and apps, will be enforced from Friday.

The survey asked children and young people aged between seven and 18 in Wales about their use of devices such as phones, tablets and computers.

More than half of the 340 respondents to date said they had rules at home to limit screen time and what apps they use, amid concerns time spent online will increase during the school holidays.

A third of respondents said they had to leave their device downstairs at bedtime, and 47% said they were only allowed on certain apps.

About three quarters of those who said they used TikTok admitted to switching off its one-hour limit function for under-18s.

The children’s commissioner for Wales, Rocio Cifuentes, said: “The [Online Safety] Act must deliver on its promise of protecting children and improving their online experiences. And in such a quickly developing space, this means keeping pace with new challenges and responding to them effectively.

“Mechanisms protecting children from too much time on screens must be stronger.”

Kiishi, 13, says she’s particularly concerned about the rapid advancements in artificial intelligence and how technology can seem “controlling”

The UK government is reportedly looking at how it might be able to limit how much time children spend on social media.

Rufus, 15, from Llantwit Major, Vale of Glamorgan, is part of a digital guardians project run by Platfform, a mental health charity, and the NSPCC to give young people a voice in the debate around online safety.

“I think there needs to be more restrictions,” he said.

“Not in the way of restricting time because most young people believe adults that restrict time are restricting their fun and enjoyment.”

Kiishi, 13, from Swansea, said she wanted to share her experiences of being online as a young person to help improve protections for other children.

She said: “Technologies are advancing and becoming more complex and almost controlling. Some people could be brainwashed into thinking some things that are not real.”

Ada, 12, from Cardiff, said: “I wanted to become a digital guardian so I can help keep children like me, older or younger, safe given the rise of things like AI to steal data and to spread misinformation, because it’s so common these days to take information from things like AI that may be inaccurate, which may be dangerous to our physical and mental health.

“While the internet can be bad, there are also a lot of positives that help you grow and understand things.

“It can be educational but there needs to be more restrictions on the negatives like on social media and disinformation.”

According to research by Ofcom, nearly every child over 12 has a mobile phone and almost all of them watch videos on platforms such as YouTube or TikTok

The UK’s communications regulator, Ofcom, will enforce new rules requiring social media platforms to check users’ ages and to change the algorithms that determine what content is shown, in order to filter out certain types of material.

Under the Online Safety Act, firms are also required to remove illegal content and new laws have been introduced around sending unsolicited sexual imagery online.

Matthew Sowemimo, NSPCC associate head of public affairs for child safety online, said: “Young people bring unique perspectives that help us understand the true impact of online harm, enabling us to identify the support needed to keep them safe.

“That’s why it is crucial that children’s voices are included in conversations about child safety online.

“But the onus for protecting children from the harm they face online, including on social media platforms, should not be put on young people themselves, but rather tech companies need to design and put in place safety features on their sites to tackle the risks.”
