Just how will the GDPR help deal with this?

Tuesday of this week was Safer Internet Day. It was also the day the BBC chose to publish the results of some interesting research into young people’s compliance with the age requirements of social media sites. Bear in mind that for every site examined the minimum age is meant to be 13. Here are the headlines:

  • More than 75% of children in the UK aged between 10 and 12 have social media accounts
  • Facebook was the most popular with under-13s in the survey: 49% of the 10 to 12-year-olds claimed to be users
  • Instagram clocked an impressive 41%

I don’t imagine the picture will be very different in many other EU Member States or indeed in many other OECD countries.

So the European institutions spent over 4 years discussing a new data protection regime. At the last minute – without any prior indication that they were even thinking along these lines – they introduced a proposal to make 16 the new default age, with an option for a particular country to adopt 15, 14 or 13 instead. Yet they did not introduce a requirement for any of the sites to verify that their users complied with any of the stated age limits. In other words they did nothing to address one of the most glaring challenges in the space – the complete divorce between formal policy and the actualité. How will their decision help protect children? It won’t.

Rumour has it that the UK will not go with the new default of 16 but will, instead, adopt 13. In other words it will preserve the status quo – although it won’t be doing quite that because by making 13 a legal baseline we will in fact be changing our law.

Thus huge numbers of children will continue to be users of sites that are not meant for them. They will also learn that telling lies about your age online is easy and has no downside or consequences. Adults make stupid rules. Who knew? And in countries that stick with 16 as the minimum age? The number of children being drawn towards lying will simply get bigger. What a wonderful result. Not.

This situation is not sustainable in the longer run. The only question for me is what will it take, finally, to trigger the inevitable change? Companies that proclaim an age limit should be required to have the capacity to enforce it, or else they should withdraw it and construct their sites and operational policies on the basis that people of all ages will be users.

I do not think that would be a particularly good outcome. The better alternative is to embrace age verification.

About John Carr

John Carr is a member of the Executive Board of the UK Council on Child Internet Safety, the British Government's principal advisory body for online safety and security for children and young people. In the summer of 2013 he was appointed as an adviser to Bangkok-based ECPAT International. Amongst other things John is or has been a Senior Expert Adviser to the United Nations, the ITU and the European Union; a member of the Executive Board of the European NGO Alliance for Child Safety Online; and Secretary of the UK's Children's Charities' Coalition on Internet Safety. John has advised many of the world's largest internet companies on online child safety. In June 2012, John was appointed a Visiting Senior Fellow at the London School of Economics and Political Science. More: http://johncarrcv.blogspot.com