Standards that would force tech giants to make children’s privacy online a primary consideration have been published by the UK’s data regulator.

The Information Commissioner’s Office (ICO) has published the final Age Appropriate Design Code, which it hopes will come into effect by the autumn of 2021, pending approval from parliament.

Everything from apps to connected toys, social media platforms to online games, and even educational websites and streaming services, will be expected to make the data protection of young people a priority by design.

The 15 provisions have been “clarified and simplified” since a draft was first revealed in April last year; the ICO consulted with the industry before submitting the code to the Government in November.

Privacy settings should be set to high by default and nudge techniques should not be used to encourage children to weaken their settings, the code states.

Location settings that allow the world to see where a child is should also be switched off by default.

Data collection and sharing should be minimised and profiling that can allow children to be served up targeted content should be switched off by default too.

“I believe that it will be transformational,” Information Commissioner Elizabeth Denham told the PA news agency.

“I think in a generation from now when my grandchildren have children they will be astonished to think that we ever didn’t protect kids online. I think it will be as ordinary as keeping children safe by putting on a seat belt.”

Ms Denham said the gaming industry and some other tech companies expressed concern about the impact on their business models, but overall the move was widely supported.

(Image: The father of Molly Russell has campaigned for social networks to do more to protect children – Family handout/PA)

“We have an existing law, GDPR, that requires special treatment of children and I think these 15 standards will bring about greater consistency and a base level of protection in the design and implementation of games and apps and websites and social media.”

The code comes at a time of increased pressure on the tech industry to address its possible impact on people’s mental health.

Ian Russell, who believes access to suicide content on social media helped his teenage daughter Molly take her life in 2017, has welcomed the code.

“It is shocking that in failing to make the necessary changes quickly enough, the tech companies have allowed unnecessary suffering to continue,” he said.

“Although small steps have been taken by some social media platforms, there seems little significant investment and a lack of commitment to a meaningful change, both essential steps required to create a safer world wide web.

“The Age Appropriate Design Code demonstrates how the technology companies might have responded effectively and immediately.”

Andy Burrows, head of child safety online policy at the NSPCC, said the code would force social networks to “finally take online harm seriously and they will suffer tough consequences if they fail to do so”.

He said: “For the first time, tech firms will be legally required to assess their sites for sexual abuse risks, and can no longer serve up harmful self-harm and pro-suicide content.

“It is now key that these measures are enforced in a proportionate and targeted way.”

Facebook, which has been under the spotlight for its approach to the safety of its users, said: “We welcome the considerations raised by the UK Government and Information Commissioner on how to protect young people online.

“The safety of young people is central to our decision-making, and we’ve spent over a decade introducing new features and tools to help everyone have a positive and safe experience on our platforms, including recent updates such as increased Direct Message privacy settings on Instagram.

“We are actively working on developing more features in this space and are committed to working with governments and the tech industry on appropriate solutions around topics such as preventing underage use of our platforms.”