More on the GDPR – a draft

This is my first stab at trying to identify all the key headings within the GDPR which are likely to have a significant impact on children. If I have missed anything important or got something wrong please let me know and I will publish a revised version. Think of this as my first draft.

Having hunted around online I can tell you that lots of lawyers have produced general briefings on the GDPR, but nobody seems to have prepared one covering all the children's and young people's angles, much less brought them together in a single document.

The overarching principles

Here is the key part of Recital 38

Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child…

Related to this is Article 5(1)(a), which specifies that all data processing shall be done … transparently and fairly. If the data subject is a child this is a highly relevant consideration in determining whether a particular course of action is transparent or fair, and if it is not both of those things it is not lawful.

Article 6 gives more detail about the conditions which need to be met for data processing to be considered lawful. 6(1)(f) reads as follows

… (the processing must be necessary for business reasons… except where such reasons….) are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child. (emphasis added).

Article 35 refers specifically to new technologies and mandates risk assessments. Under this heading age verification may be compulsory in certain circumstances.

Article 36 is interesting. Section 1 says

The (business) shall consult the supervisory authority prior to processing where a data protection impact assessment under Article 35 indicates that the processing would result in a high risk in the absence of measures taken by the controller to mitigate the risk.

I imagine here is where the supervisory authority might suggest or require that age verification be adopted.

Definition of a child

There is no definition of a child within the GDPR. However, since every Member State is a signatory to the UN Convention on the Rights of the Child, 18 must be the relevant standard. This has a number of potentially interesting consequences: everywhere in the GDPR where the word child or children appears (that's 25 and 8 times respectively) we must assume it refers to all persons aged 17 and below. However, because the GDPR is also peppered with references to risk assessments and proportionality – context is everything – this clearly does not mean the GDPR anticipates that everyone under the age of 18 needs to be treated identically.

The age of consent for data purposes

Article 8 allows Member States to make an explicit decision about the lowest age at which a child may decide for themselves whether or not to hand over personal data to an online service provider without the provider having to obtain the consent of the child’s parents. The lowest age a Member State may choose is 13 (the current de facto status quo in most EU Member States) and the highest available option is 16. If a Member State makes no decision then by default the age becomes 16 automatically in May, 2018.
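Purely by way of illustration, the Article 8 mechanism can be sketched as a simple rule. This is my own sketch, not anything from the Regulation itself, and all the names in it are made up:

```python
# Hypothetical sketch of the Article 8 rule as described above:
# a Member State may set the age of consent for data purposes
# anywhere from 13 to 16; if it makes no choice, 16 applies by
# default from May 2018. All identifiers here are illustrative.

DEFAULT_AGE_OF_CONSENT = 16  # applies where a Member State makes no choice
MIN_ALLOWED = 13             # the lowest age a Member State may pick
MAX_ALLOWED = 16             # the highest age a Member State may pick

def age_of_consent(member_state_choice=None):
    """Return the applicable age of consent for data purposes."""
    if member_state_choice is None:
        return DEFAULT_AGE_OF_CONSENT
    if not MIN_ALLOWED <= member_state_choice <= MAX_ALLOWED:
        raise ValueError("Article 8 only permits ages between 13 and 16")
    return member_state_choice

def parental_consent_required(child_age, member_state_choice=None):
    """True if an online provider must obtain parental consent."""
    return child_age < age_of_consent(member_state_choice)

print(parental_consent_required(14))      # no national choice, default 16 -> True
print(parental_consent_required(14, 13))  # Member State chose 13 -> False
```

The point the sketch makes is that the same 14-year-old needs parental consent in one Member State and not in another, which is exactly the fragmentation the next paragraphs discuss.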

Article 8 does not qualify, modify or change the definition of a child. It only sets a parameter which affects an online provider's obligation to obtain parental consent. Thus, if a country opts for 15 it is not saying that, for privacy or any other purposes, 15 is the age at which someone may be judged to be an adult or to have the same capacities as an adult. However, it would be odd if a jurisdiction designated, say, under-16s, under-15s, under-14s or under-13s for special treatment without that implying they should also receive extra, special or at any rate different attention. I can see that here there could be important tensions with a young person's legal rights.

Thus, in effect, we have two classes of children: those below the age of consent for data purposes and those above it who are nevertheless still below the age of majority (18). I guess that has always been the case but introducing a spread of ages, including one as high as 16, does rather highlight or emphasise the matter.

Recital 38 makes an exception with regard to online preventative or counselling services offered directly to a child. In such cases the provider is not obliged to seek the consent of the young person’s parents.

Accessibility and transparency

Recital 58 reads as follows:

The principle of transparency requires that any information addressed to the public or to the data subject be concise, easily accessible and easy to understand, and that clear and plain language and, additionally, where appropriate, visualisation be used….

and…

Given that children merit specific protection, any information and communication, where processing is addressed to a child, should be in such a clear and plain language that the child can easily understand.

Article 12 gives effect to Recital 58

The controller shall take appropriate measures to provide any information… to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child. (emphasis added)

Expect to see icons and pictograms being deployed on a much larger scale, especially but not solely in areas known to be frequented by children.

A new kid on the block

The Article 29 Working Party and national data protection authorities as a group have not been conspicuously involved in the campaign to make the online space a safer and better environment for children. That is about to change.

Article 57 reads as follows

  1. Without prejudice to other tasks set out under this Regulation, each supervisory authority shall on its territory:

(a)    monitor and enforce the application of this Regulation;

(b)    promote public awareness and understanding of the risks, rules, safeguards and rights in relation to processing. Activities addressed specifically to children shall receive specific attention. (emphasis added)

Will national Data Protection Authorities start to attend meetings of the Safer Internet Centres? How will that link work? In the UK will the Information Commissioner’s Office join the Board of UKCCIS? How will it relate to Internet Matters?

Codes of conduct

This is going to be an extremely important area of activity.

Article 40 reads as follows

  1. The Member States, the supervisory authorities, the (European Data Protection) Board and the Commission shall encourage the drawing up of codes of conduct intended to contribute to the proper application of this Regulation, taking account of the specific features of the various processing sectors and the specific needs of micro, small and medium-sized enterprises.
  2. Associations and other bodies representing categories of controllers or processors may prepare codes of conduct, or amend or extend such codes, for the purpose of specifying the application of this Regulation, such as with regard to:…

(g)   the information provided to, and the protection of, children, and the manner in which the consent of the holders of parental responsibility over children is to be obtained… (emphasis added)

Consistency and implementation

Article 57 (1) (g) imposes an obligation on national data protection authorities to

cooperate with, including sharing information, and provide mutual assistance to, other supervisory authorities with a view to ensuring the consistency of application and enforcement of this Regulation.

Article 41 refers to arrangements for monitoring approved codes of conduct and Article 97 makes clear the Commission has an obligation to keep the operation of the Regulation under review and to report accordingly to the European Parliament and the Council of Ministers.

The right to be forgotten

This makes its appearance in Recital 65 and in Article 17 (where it is also described as the “right to erasure”).

In the Recital it says

(the right of erasure)……is relevant in particular where the data subject has given his or her consent as a child and is not fully aware of the risks involved by the processing, and later wants to remove such personal data, especially on the internet. The data subject should be able to exercise that right notwithstanding the fact that he or she is no longer a child. (emphasis added)

Profiling

In Recital 24 profiling is alluded to as being a form of data processing which allows the data controller (typically an online business) …to take decisions concerning…or for analysing or predicting…personal preferences, behaviours and attitudes.

I guess that includes making decisions about what sort of advertisements to serve up to someone, but it could go a lot further than that. However, as we saw in the earlier reference to Recital 38:

... specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. (emphasis added)

And Recital 71 makes it clear children should not be the subject of profiling. How that will work out in practice remains to be seen.

About John Carr

John Carr is a member of the Executive Board of the UK Council on Child Internet Safety, the British Government's principal advisory body for online safety and security for children and young people. In the summer of 2013 he was appointed as an adviser to Bangkok-based ECPAT International. Amongst other things John is or has been a Senior Expert Adviser to the United Nations, the ITU and the European Union, a member of the Executive Board of the European NGO Alliance for Child Safety Online, and Secretary of the UK's Children's Charities' Coalition on Internet Safety. John has advised many of the world's largest internet companies on online child safety. In June, 2012, John was appointed a Visiting Senior Fellow at the London School of Economics and Political Science. More: http://johncarrcv.blogspot.com