One in six secondary school pupils have been sent naked photos or asked to share them, Ofcom finds
- 60 per cent aged between 11 and 18 had received unwelcome messages
- Nearly one in three said they had received an unwanted friend request
- 13 per cent said they were sent images or videos of naked or half-dressed people
One in six teenagers at secondary school have been sent naked photos or asked to share them, Ofcom has found.
A survey revealed 60 per cent of pupils aged between 11 and 18 had received unwelcome messages on social media and messaging apps.
The watchdog has today published draft measures under the Online Safety Act that it believes will better protect children from illegal content on the web.
Among them is a requirement for platforms to ensure their youngest users are invisible to random strangers who might try to contact them.
Ofcom said ‘scattergun’ friend requests were frequently used by adults looking to groom children in order to sexually abuse them.
The NSPCC said grooming crimes against kids on social media had surged by a ‘staggering’ 82 per cent in the five years it took for the act to go through parliament.
Ofcom chief executive Dame Melanie Dawes said: ‘Our figures show that most secondary-school children have been contacted online in a way that potentially makes them feel uncomfortable. For many, it happens repeatedly.
‘If these unwanted approaches occurred so often in the outside world, most parents would hardly want their children to leave the house.
‘Yet somehow, in the online space, they have become almost routine. That cannot continue.’
Once the draft codes are finalised, Ofcom will have the power to fine tech giants up to 10 per cent of their global turnover and even block them from being used in the UK if they are found to have breached them.
Ofcom’s survey of 2,031 secondary school pupils found three in five respondents said they had been in a situation where the contact had made them feel ‘uncomfortable’.
Nearly one in three said they had received an unwanted friend request, and around a fifth were asked where they lived or how old they were.
A further 13 per cent admitted they had been sent pictures or videos of naked or half-dressed people, while ten per cent had been asked to share such content.
Teenagers reported this happened most often on Snapchat, a social media app on which photos and videos automatically disappear seconds after they have been viewed.
Under the draft codes published today, the biggest tech platforms will have to ensure by default that children are not visible to random users.
This includes removing them from suggested friends lists and not allowing accounts outside a child’s own friends or followers to send them private messages.
Ofcom said it would not make decisions about individual posts but would instead focus on making the platforms themselves ‘fundamentally safer’.
The draft code requires the biggest tech companies to ensure they have ‘well-resourced and trained’ teams to tackle illegal content.
The codes also include proposals to combat online scams affecting all users, including new ‘keyword detection’ technology to find and remove listings of stolen data, such as credit card details.
They will also require all services to block accounts run by proscribed terrorist organisations.
Dame Melanie said: ‘Regulation is here, and we’re wasting no time in setting out how we expect tech firms to protect people from illegal harm online, while upholding freedom of expression.
‘Children have told us about the dangers they face, and we’re determined to create a safer life online for young people in particular.’
Later this year, Ofcom will also propose guidance on how porn sites will need to show they are stopping children from accessing them.
In spring, it will publish details on how social media giants will need to protect children from cyberbullying and from suicide, self-harm and eating disorder content.