While most social media platforms have a minimum age for users to sign up, it’s not hard for underage users to lie about their age.
One in three children lie about their age to access adult content on social media, according to research commissioned by the regulator, Ofcom.
Social media platforms like Instagram, TikTok and Facebook do not permit under-13s to sign up, but researchers have found that children fake their ages to skirt the rules.
Ofcom says this increases the risk of children seeing content which may be inappropriate or harmful.
Anna-Sophie Harling, from Ofcom, told BBC News the way social media platforms categorised users by age had a ‘huge impact’ on the content they were shown.
‘When we talk about potentially harmful content to under-18s, it’s content that might have more significant negative consequences for under-18s because they’re still developing,’ said Harling.
‘When children are repeatedly exposed to images and videos that contain certain images, they’re then essentially led to act in different ways or to think differently about themselves or their friends.’
The Ofcom-commissioned research found that 32% of children have an account intended for adults, while 47% of children aged eight to 15 have a user age of 16 and over.
It also found that 60% of under-13s who use social media have their own profile, despite being below the minimum age to sign up.
Age categorisation is one of the key ways in which platforms protect underage users.
‘If we want to get serious about protecting children online, we need to make sure that platforms have a way to find out exactly how old those users are,’ said Harling.
‘We need to work both with parents and young people, but also platforms, to make sure that the ages at which those accounts are set are done in an accurate way.’
The report follows the recent inquest into the death of Molly Russell, a schoolgirl who took her own life, which concluded that harmful online content contributed to her death.
Russell died in November 2017 after viewing ‘graphic’ content on platforms including Instagram and Pinterest.
Online material viewed by the 14-year-old in the run-up to her death ‘was not safe’, concluded senior coroner Andrew Walker.
In June, Instagram announced that it was testing new options for users to verify their age on the platform, including video selfies and ‘social vouching’, where other users confirm a person’s age.
What is Ofcom and what does it cover?
Ofcom is the regulator for the communications services that we use and rely on each day.
The watchdog makes sure people get the best from their broadband, home phone and mobile services, as well as keeping an eye on TV and radio.
Ofcom deals with most content on television, radio and video-on-demand services, including the BBC. However, if your complaint is about something you saw or heard in a BBC programme, you may need to complain to the BBC first.
Its rules for television and radio programmes are set out in the Broadcasting Code.
The rules in the Broadcasting Code also apply to the BBC iPlayer.
The Broadcasting Code is the rule book that broadcasters have to follow. It covers a number of areas, including protecting under-18s, protecting audiences from harmful and/or offensive material, and ensuring that news, in whatever form, is reported with due accuracy and presented with due impartiality.
Audiences can complain to Ofcom if they believe the Broadcasting Code has been breached.
When Ofcom receives a complaint from a viewer or listener, it assesses it to see whether it needs further investigation.
If Ofcom decides to investigate, it will include the case in a list of new investigations, published in the Broadcast and On Demand Bulletin.
An investigation is a formal process which can take some time depending on the complexity of the issues involved.
Ofcom can also launch investigations in the absence of a complaint from a viewer or listener.