Warning to businesses over collecting data from children

Pictured (L-R): Bethany Buchanan and Kirk Dailly

Lawyers have underscored the escalating risks that companies face for failing to adequately safeguard children’s data.

Video games companies, social media platforms and companies using facial recognition technology have all recently come under fire from data protection regulators across Europe for not doing enough to protect data gleaned from young people using their services.

Kirk Dailly, head of corporate and commercial at Blackadders and a certified data protection specialist, said that as data has become more commercially valuable, the power of regulators to issue fines and investigate companies has increased.

Mr Dailly said: “Data is one of the most valuable commodities there is and its protection is now absolutely integral to businesses.

“Think about tech firms developing apps for banks or life sciences companies conducting clinical trials. Or video game developers deciding what data should be collected from children and how.

“The law in this area never used to have teeth but it increasingly does now. There are also huge reputational and commercial consequences for companies of getting it wrong.

“Buyers and investors are increasingly focusing on data protection issues as part of their due diligence. Even if you think you’re getting away with it, there might be a sting in the tail when it comes to selling or receiving investment for your business, as any issues will be flushed out in the transaction process.”

In February, the UK regulator, the Information Commissioner’s Office, released new guidance on children’s data aimed at games developers.

In the UK, 93 per cent of children play video games, with younger children devoting two to three hours a day and older children three or more.

The Information Commissioner said games providers should identify whether their players are under 18 and ensure the games themselves are not detrimental to children’s well-being. The recommendations are aimed at ensuring compliance with the Children’s Code, which was developed to cover online services likely to be accessed by young people.

Developers should also:

  • Include checkpoints and age-appropriate prompts to encourage players to take breaks or to disengage from extended sessions without feeling pressurised to continue playing or becoming fearful of missing out.
  • Turn off behavioural profiling for marketing to children.
  • Avoid “nudge techniques” that encourage children to make poor privacy decisions for fear of missing out on rewards.

Bethany Buchanan, a solicitor in the firm’s corporate and commercial team who was recently awarded the same specialist certification, said: “We are seeing a number of games companies taking a more ethical stance on game design, and data protection is a part of that.

“The younger generations value privacy and they’re the consumers of the future, so being robust on data protection should be an offensive commercial consideration and not simply a defensive compliance issue.”

Artificial intelligence is also emerging as a data protection compliance issue, with suppliers having to verify the age of users and avoid harvesting personal data. Facial recognition technology in particular can fall foul of the child protection measures built into data compliance rules.

Mr Dailly added: “The days of looking at data protection a couple of times a year and then forgetting about it have passed.

“It needs to be ingrained in the culture, systems and design processes of the organisation. The risks associated with failing to get it right are too high.”
