Research has shown that children are targeted with graphic content within 24 hours of creating a social media account. Even when they provide their true age, some children are offered content and experiences that in almost any other context would be illegal, the report revealed.
Digital Design Puts Children at Risk
On behalf of 5Rights Foundation, research agency Revealing Reality conducted a study to understand how the digital design of products and services puts children at risk. The researchers interviewed designers and children, and ran innovative tests using avatars modelled on real children, registering them as users aged 13 and over.
The digital services named in the report, such as TikTok, Snapchat and Instagram, are not deliberately designed to put children at risk. The same is probably true of most products and services likely to be accessed by children. However, the researchers make clear that the risks these design choices create are not accidental.
“The unavoidable conclusion is that the choices designers make are harming children. Children are trapped in highly automated systems that maximize attention, maximize spread and maximize interaction at any cost. Even when the user is a child”, said Baroness Beeban Kidron, Chair of 5Rights Foundation.
5Rights Foundation’s study lays bare how commercial objectives shape children’s experiences and behavior. Some of the key findings are very upsetting:
- Within hours, adult users targeted the 13+ avatar accounts with direct messages, asking to connect and even offering pornography.
- Automated pathways, optimized for commercial goals, show suicide and self-harm content alongside age-appropriate adverts for tampons and games consoles.
- A child who clicks on dieting tips is shown images of unachievable body types within a week.
- Young children are offered content and experiences that in almost any other context would be illegal.
“Perhaps the most startling image of the report (found on page 85) is a screenshot in which it is clearly visible that a ‘child’ avatar is being ‘targeted’ with adverts for Nintendo Switch, a sweet shop and teen tampons – and at the same time pro-suicide material (“It is easy to end it all”). How on earth is that right? How on earth is that legal?”, questions Baroness Beeban Kidron.
Children Also Concerned
The concerns of the children interviewed for this study reflect the design strategies in question. Children felt that they spent too much time online and often found it hard to stop. They felt pressured to gain attention and seek validation online. Many also enhanced their appearance through image alteration and worried about being left out if they "switch off".
Some children were embarrassed or worried about revealing what happens online. Others felt guilty. Alarmingly, most children believed that their parents would “punish” them by taking away their phones. Consequently, they were reluctant to talk about their online experiences with their parents.
Note as well that age verification is currently poor, and many children younger than 13 use social media. "According to the latest statistics of Ofcom, the UK's communications regulator, 42% of 5 to 12-year-olds use social media apps or sites", warns the report.
Call for Mandatory Rules
5Rights Foundation commissioned the research before the pandemic. Therefore, it predates the draft Online Safety Bill, presented in May 2021. But its findings “could not be more relevant” emphasizes the report. “An increasing number of children are spending an increasing amount of time online at an increasingly young age. And the recent Ofsted review of sexual abuse in schools and colleges reveals the eye-watering scale of sexual harassment among pupils.”
The Age Appropriate Design Code of Practice, or Children's Code, came into force in the UK in September 2020, with a 12-month transition period to give organizations time to prepare. Once the Children's Code is fully in effect, the Information Commissioner's Office (ICO) will be able to impose fines and other penalties on services that fail to build in, by design, a set of 15 safety standards to protect underage users.
The report concludes with recommendations addressing the Online Safety Bill and other actions that the government could take immediately. “It is unlikely that the Online Safety Bill will be introduced in Parliament before 2022, and the Codes of Practice setting out how services are to comply with the Bill will be produced only after it receives Royal Assent. This means that the effects of the Bill are unlikely to be delivered to children for another three or four years. A child, whether aged 8, 12 or 15, does not have three or four years to wait. They need protection now.”