ICANN refuses to explain

Regular readers will know about the application made by the .Kids Foundation to ICANN to be allowed to run the proposed new .kids gTLD.  ICANN gave a contract to the Economist Intelligence Unit  (EIU) to help them assess the bid.

I have been around the child protection, children’s rights  and child welfare space for several years. I had never heard the EIU’s name mentioned as an authority in connection with anything to do with children. Had I missed something? I contacted the EIU. They refused to discuss it. The EIU referred me to ICANN.

In their reply to my questions ICANN told me

…the EIU was chosen because it offers premier global business intelligence services.

Not a convincing opening line given the nature of my enquiry but ICANN went on to quote from something called the Panel Process document, in particular the following:

The EIU is the business information arm of The Economist Group, publisher of The Economist. Through a global network of more than 500 analysts and contributors, the EIU continuously assesses political, economic, and business conditions in more than 200 countries. As the world’s leading provider of country intelligence, the EIU helps executives, governments, and institutions by providing timely, reliable, and impartial analysis.

The words child and children have yet to make an appearance. In fact they never do.

Then comes this

The evaluation process respects the principles of fairness, transparency, avoidance of potential conflicts of interest, and non-discrimination. Consistency of approach in scoring applications is of particular importance. In this regard, the Economist Intelligence Unit has more than six decades of experience building evaluative frameworks and benchmarking models for its clients, including governments, corporations, academic institutions and NGOs. Applying scoring systems to complex questions is a core competence.

I added the bold to that word transparency since it is clear it is singularly lacking.

ICANN then gave me some more cut-and-pasted quotes

  • All EIU evaluators undergo regular training to ensure full understanding of all CPE requirements as listed in the Applicant Guidebook, as well as to ensure consistent judgment. This process included a pilot training process, which has been followed by regular training sessions to ensure that all evaluators have the same understanding of the evaluation process and procedures.
  • EIU evaluators are highly qualified, they speak several languages and have expertise in applying criteria and standardized methodologies across a broad variety of issues in a consistent and systematic manner.
  • Language skills and knowledge of specific regions are also considered in the selection of evaluators and the assignment of specific applications.

So I wrote back with only one further question

Did you satisfy yourself that the EIU had (the necessary expertise) or did you simply rely on the EIU’s general assurances (that they had)…..?

Answer came there none.

I doubt the EIU has much of a clue about children and the online space. To be clear, I think they were wrong to accept a contract to work in an area that is outwith their competence, but equally ICANN should not have offered them the work without satisfying itself that the EIU could do it properly.

Children’s interests are marginalized or overlooked once again.

Posted in Default settings, E-commerce, ICANN, Internet governance, Regulation, Self-regulation

Anonymity and porn

Section 1(1) of the Obscene Publications Act 1959 defines what constitutes a criminally obscene article. In 2002, in the case of R v Perrin, the Court of Appeal (Criminal Division) held, in the context of material published on the internet, that failure to take steps to prevent children from viewing such material was a key ingredient in determining whether or not an offence under the Act had been committed.

However, to the best of my knowledge, since Perrin there have been no prosecutions of web site owners who publish pornography without an age verification mechanism in place. Why? Because of the practical and jurisdictional issues of enforcement. The relevant sections of the  Digital Economy Bill, 2016, can therefore be viewed as a pragmatic attempt to find a solution to fit modern circumstances. Yet some people are raising concerns. I understand why, but their fears are unfounded.

In the recent debate on the  Digital Economy Bill Liberal Democrat MP Tom Brake made the following point

I agree that denying children access to online pornography is essential, as is ensuring the privacy of adult users of legal adult sites….

In a similar vein the former Secretary of State, John Whittingdale, said

The other big issue covered by the Bill is pornography….and age verification. The Bill does not specify how age can be verified, and I must say that I am not entirely sure how the providers will do that. It will not be sufficient to include the question “Are you 18?”, along with a box to be ticked……

We must bear it in mind that the content that is being accessed is perfectly legal. Of course it is right for children to be prevented from accessing it, because that can be harmful, but it is legal content for adults……

There are a number of things to be said about this.

First of all, at the moment, unless you take special measures to prevent your browser from handing over any information about you or your machine, it is extremely likely that in the mere act of visiting a porn site, even if you do not buy or actively download anything, you are in truth handing over data about yourself. This compromises your anonymity. I leave aside the possibility of any third-party surveillance that might be taking place on top of that.
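To make the point concrete, here is a minimal Python sketch of the data points a site receives passively from an ordinary browser request. The header values below are invented for illustration, not captured from any real visit:

```python
# Illustrative (invented) headers of the kind a browser attaches to every
# page request, before the visitor clicks or downloads anything.
sample_request_headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
    "Accept-Language": "en-GB,en;q=0.9",
    "Referer": "https://search.example.com/",
}
client_ip = "203.0.113.7"  # the server always sees the connecting IP address


def visitor_fingerprint(ip, headers):
    """Combine the data points a site receives passively on each visit."""
    return {
        "ip_address": ip,                        # approximate location, ISP
        "browser_and_os": headers.get("User-Agent"),
        "language": headers.get("Accept-Language"),
        "arrived_from": headers.get("Referer"),  # the page visited beforehand
    }


print(visitor_fingerprint(client_ip, sample_request_headers))
```

None of this requires the visitor's cooperation; it is simply what the HTTP protocol hands over by default.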

Moreover, unless Messrs Whittingdale and Brake want to argue we should get rid of the existing real-world laws altogether, they need to convince us why, if it came down to a straight choice, they might think it more important to protect adults’ rights to access porn anonymously than to keep harmful content away from kids.

Technology to the rescue

However, the good news for Brake and Whittingdale is that nobody has to make such a choice. Remember, unlike a gambling site, a porn company does not need your bank details, your credit or debit card details or your home address. Porn sites only have to establish that you are 18 or above.

Technologies already exist which can prove someone is over 18 without revealing anything else at all.  In this sense the technology could be seen as being privacy enhancing.
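As a toy sketch of the principle (not any real provider’s scheme; it uses a shared HMAC secret for brevity where a real system would use public-key signatures, and all the names are invented), the provider checks your age once and then issues a signed token asserting a single attribute:

```python
import base64
import hashlib
import hmac
import json

# Hypothetical signing key held by the age-verification provider. A real
# scheme would use public-key signatures, so the site would hold only the
# provider's public key; the privacy principle is the same.
PROVIDER_KEY = b"demo-signing-key"


def issue_token():
    """The provider verifies the user's age once, then issues a token
    asserting ONLY the over-18 attribute: no name, address or card details."""
    claim = json.dumps({"over_18": True}).encode()
    sig = hmac.new(PROVIDER_KEY, claim, hashlib.sha256).hexdigest()
    return base64.b64encode(claim).decode(), sig


def site_accepts(claim_b64, sig):
    """The site checks the provider's signature. It learns that the visitor
    is over 18 and nothing else about them."""
    claim = base64.b64decode(claim_b64)
    expected = hmac.new(PROVIDER_KEY, claim, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and json.loads(claim)["over_18"]


token, signature = issue_token()
print(site_accepts(token, signature))  # True
```

The point of the design is that the attribute travels without the identity: the site never needs to know who you are, only that a trusted party has vouched for the one fact that matters.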

Of course the companies that own the age verification systems have to do an initial verification of your age so it is vital that they are trustworthy and secure to the nth degree. There are several ways in which it can be established whether or not that is the case e.g. the Information Commissioner’s Office could tell us.

Posted in Age verification, E-commerce, Pornography, Privacy, Regulation, Self-regulation

Age verification takes another step forward in the UK

The Digital Economy Bill 2016 has just completed its 2nd Reading in Parliament. This is the first major step in the British legislative system. The 1st Reading is largely a formality. The 2nd is where the broad principles and policies behind a measure are debated.

The Bill passed with no votes against. It now goes to Committee where it will undergo line by line scrutiny. However, it is clear from the discussion that at least in respect of the child protection aspects of the Bill, most notably the clauses introducing compulsory age verification for pornography sites, there is an extremely wide  and supportive consensus.

The Government did not indicate that it had modified its position from that set out in the original draft. In other words they did not say they intended to introduce any legal compulsion  to withdraw payments and other ancillary support services  from non-compliant sites. Neither did they say they intended to give the proposed new regulator a power to compel access providers to block persistently non-compliant sites.

Yet many MPs from all parties urged the Government to consider doing just that and in his excellent summing up for the Government  the Minister, Matt Hancock, indicated that it would be necessary to revisit these matters in Committee and that he was keen to reflect on the views which had been expressed by the House.

It’s a long road ahead. Good points were made about privacy concerns that could arise if age verification becomes compulsory but I am certain they can all be answered openly and in a way that will satisfy even the most hardened sceptic.



Posted in Age verification, Default settings, E-commerce, Pornography, Privacy, Regulation

Hillary Clinton declares support for online child safety

There is a human dynamo in Washington DC called Donna Rice Hughes. She is the woman who has persuaded a number of big American corporations to restrict access to porn on the WiFi they provide in public places. Donna was gracious enough to acknowledge the UK origins of this idea, but now she has gone a huge step further.

Earlier this week Donna announced that Hillary Clinton has given her support to the Child Internet Safety Pledge. Much of what is contained in the Pledge will be very familiar to UK, European and some other national audiences, but within the US this is verging on revolutionary, so Donna’s achievement is all the more remarkable for that.

It has long been my view that, in terms of online child safety, the single most important jurisdiction in the world is the USA, but it is also one of the toughest because of its highly specific (and, for us, very unusual) prevailing political and legal context.

Almost any steps forward for online child safety in the USA will have a huge multiplier effect and redound to the benefit of youngsters across the globe. Clinton signing up may be a harbinger of just such a development. Fingers crossed.

I am not going to dwell on the fact that Donald Trump has also signed the same pledge. I am just going to shrug my shoulders and say it’s not my fault he got this right. However, part of Donna’s point is that for the first time ever the Presidential candidates of both major parties have signed up to the same agenda in relation to online child safety.  That’s quite a coup and perhaps only a DC insider such as Donna could have pulled it off.

Watch this space.


Posted in Child abuse images, Pornography, Regulation, Self-regulation

Value-free technology? I don’t think so

The idea that technology is or can be value free has always struck me as being absurd. Whoever invents a particular application, piece of equipment or a platform has certain objectives in mind and these, in turn,  must have been shaped by their personal attitudes or beliefs or their business aims, often both.

A classic and very un-Olympian example of the latter variety was presented to me several years ago when I attended an IETF workshop. The participants were concerned with developing the protocols to allow browsers to collect and transmit geo-location data from connected devices. I pointed out there would be a number of social consequences attaching to such a development, both good and potentially bad, but almost to a man (repeat, man) the assembled technicians declared they had been sent to the workshop by their employers (mainly big technology companies) to reach an agreement, not to debate social policy. They didn’t quite say “we’re only following orders” but it was close.

A recent article in New Scientist (“Digital Discrimination”, 30th July; behind a paywall) shows what can happen when otherwise or supposedly neutral technologies are allowed to do their thing.

Take the case of Gregory Seldon, a 25 year old black man who lives in the USA. He wanted to make a trip to Philadelphia. Using AirBNB he spotted a particular place, tried to book but was informed it had already gone. Seldon carried on looking and saw that the same location was, in fact, still being advertised for the same dates. Suspicious, he created several new profiles which indicated the applicant was a white man. They were all told the apartment was available.

Seldon tweeted about his experience using the hashtag #airbnbwhileblack. The floodgates opened, with more or less identical accounts streaming in from all across the country. It emerged that three Harvard academics (Edelman, Luca and Svirsky) had found that people with names primarily associated with African Americans, e.g. Tanisha and Tyrone, were 16% less likely to be accepted as guests on AirBNB than people with names like Brad and Kirsten.

The good news is AirBNB accept they have a problem and are actively seeking a solution but there seems little doubt this goes a lot wider and deeper than social media platforms.

Anupam Chander, a Professor of Law at UC Davis, believes discrimination can be “baked into” the data that form the basis of algorithms, and thus technology could become a “mechanism for discrimination to amplify and spread like a virus”.

Stands to reason I suppose. Typically algorithms are based on observed patterns of pre-existing behaviour. If that behaviour has a racist (or other) bias then, absent any countervailing measures, the algorithm will simply replicate and thereby, at the very least, sustain it in the future. That would be bad enough but the network effect is likely to give new legs and a new scale to the phenomenon thereby making it worse. In such circumstances it is just not acceptable to say (something like) “it’s not our fault society is riddled with racists (or with sexism)….all we are doing is devising systems which (hand-wringing) unfortunately racists  and sexists are using”.   The logic of this argument is that society needs to deal with racism  and sexism and  technologists are merely hapless, helpless victims of sad circumstance.  Baloney is the least offensive word I can come up with to describe what I think about that argument.

In this connection I was pleasantly surprised to discover that (our old friend) the GDPR has specific provisions which require technology companies to take steps to prevent discrimination based on personal characteristics such as race or religious beliefs. It also creates a “right of explanation”, enabling citizens to question the logic of an algorithmic decision. How easy it will be to enforce this provision is debatable, but it’s not a bad start.



Posted in Default settings, Internet governance, Regulation, Self-regulation

ICANN gets it wrong – again

In previous blogs I have lamented ICANN’s apparent lack of interest in the position of children and young people, despite them being such a substantial constituency of internet users in all parts of the world. Neither does ICANN appear to be seized of the  significance of minors having distinct and important legal rights not enjoyed by or shared with adults.

It is no comfort to know ICANN is not alone in this regard in the constellation of bodies which form what might loosely be called the firmament of internet governance.

New Top Level Domains

In  January 2012 ICANN announced they were going to encourage the domain name system to expand and break free from the “tyranny” of .com, .net and .org. A thousand flowers would be allowed to bloom. An open invitation was issued to anyone and everyone to put forward ideas for new generic top level domains.

The problem is ICANN did not even consider whether this bold step might  have implications for children and young people. Yet they knew or ought to have known that applications were going to be made  for domains which, eventually, would result in web sites being created that were specifically intended to appeal directly to children and young people or would be likely to attract them in substantial numbers.

Anyone who was minded to submit a proposal for a new domain name was pointed towards an Applicant Guidebook.  There is nothing in it which even remotely suggests ICANN accepts it has a duty of care to children. On the contrary ICANN confirmed to me that all applications for all  new domains would be treated identically. Proposals that were obviously child related would be appraised in exactly the same way as applications that were obviously related to such things as pharmacies, banks, insurance and so on.

In England if a general administrative rule is applied inflexibly in such a way as to produce an unjust result for particular groups or classes of individuals the rule itself can often be challenged. I think the present facts constitute just such a case but, sadly, English jurisprudence is not relevant.

Here come the .kids

In the end three entities applied  to establish a new domain that would be called “.kids”. These were Google, Amazon and the .kids Foundation.

Under ICANN’s rules, at the end of the  appraisal process if you have two or more applicants still standing the matter is resolved by an auction. In other words money decides the outcome. Google and Amazon have plenty. The .kids Foundation doesn’t.

However, there is a provision within ICANN’s rules to allow for community-based applications to be made and, if successful, these would “trump” (pardon my use of that word  just now) regular commercial applications. The .kids Foundation was accepted as being eligible to make a community-based  application but in the end the application itself was turned down because it was decided they had not met the criteria. This means an auction will follow and the .kids Foundation will lose.

Having looked at them I am firmly of the view that the criteria the .kids application was judged against showed no understanding whatsoever of the particularities of children’s or young people’s engagement with the internet or of children’s rights but maybe that’s the subject of another blog.

What interested me, though, was this: given ICANN’s historic lack of interest in children and young people, on what basis, or how, would they make any kind of determination on the .kids Foundation application anyway?

Enter the Economist Intelligence Unit

The short answer is, they didn’t. ICANN sub-contracted the adjudication process to the Economist Intelligence Unit (EIU).

Knowing what a smart bunch you are I’m guessing your next question is going to be: so what expertise does the EIU have to adjudicate on a matter of this type? I mean when you think about children’s rights, children’s organizations, children online or children anything the EIU is not going to be the first outfit to pop into your well-informed mind. They might be the last but they definitely won’t be the first.

The basis on which ICANN contracted with the EIU to  carry out the evaluation process is set out in commendable detail. Here is an extract from an EIU document.

The evaluation process respects the principles of fairness, transparency, avoidance of potential conflicts of interest…

Emboldened by this assurance I asked the EIU the following question:

Are you able to say if any of the EIU staff who assisted with the decision-making process (in the matter of .kids) had any background in matters connected with children’s and young people’s use of the internet or in working with children’s organizations?

The answer was

We generally don’t respond to this type of question. Please contact ICANN.

I also asked if Google or Amazon were customers of the EIU (see above).

Here the answer was

We are not allowed to discuss our clientele.

Answer came there none

I followed the EIU’s advice and contacted ICANN. After a bit of to-ing and fro-ing over a couple of weeks answer came there none, hence this blog.

EIU and ICANN are both at fault

So here’s where I’m at right now: ICANN behaved according to form  – no excuse but no surprise either – but the EIU should have declined to adjudicate in the matter of .kids, either because the criteria they were being asked to use were palpably inappropriate and needed changing or because they (the EIU) lacked the expertise, or both.

Mind you, without the expertise in the first place how could the EIU have known the criteria  they were being asked to use were rubbish? In which case the EIU should simply have refused to enter the space at all saying it was outwith their operational competence. Alternatively ICANN should have satisfied itself that the EIU had the necessary expertise before entrusting it with the task and if they were not satisfied the EIU had the expertise they should have insisted they acquired it. My guess is both organizations are therefore equally and grievously at fault.

We are not the only ones with a beef

Having read a recent article in The Register, I learn I am not alone in criticising the way both ICANN and the EIU have handled community applications. According to the recently departed ICANN Ombudsman, the application for .gay has been similarly banjaxed. Interestingly, the Ombudsman also says the ICANN Board is not obliged to follow the advice proffered by the EIU. I can imagine why the Board might be nervous about departing from the EIU’s advice, but if the whole process was messed up to begin with that should be the least of their concerns. They ought to be courageous and focus on making the right decision.

A postscript:  Doubtless there are a great many already established websites  in pre-existing domains which appeal directly to children and  young people. Perhaps there are questions to be asked about them also, but that ought not to deflect or detract from raising concerns about  the ownership, management and operation of a whole domain with such an exclusive, predominant or overwhelming focus on minors.


Posted in E-commerce, ICANN, Internet governance, Regulation, Self-regulation

More on the GDPR – a draft

This is my first stab at trying to identify all the key headings within the GDPR which are likely to have a significant impact on children. If I have missed anything important or got something wrong please let me know and I will publish a revised version. Think of this as my first draft.

Having hunted around online I can tell you lots of lawyers have produced general briefings on the GDPR but nobody seems to have prepared one on all the children’s and young people’s angles, much less have they brought them together in a single document.

The overarching principles

Here is the key part of Recital 38

Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child…

Related to this is Article 5(1)(a), which specifies that all data processing shall be done transparently and fairly. If the data subject is a child this is a highly relevant consideration which will determine whether a particular course of action is transparent or fair, and if it is not both those things it is not lawful.

Article 6 gives more detail about the conditions which need to be met for data processing to be considered lawful. 6(1)(f) reads as follows

… (the processing must be necessary for business reasons… except where such reasons…) are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child. (emphasis added)

Article 35 refers specifically to new technologies and mandates risk assessments. Under this heading  age verification may be compulsory in certain circumstances.

Article 36 is interesting.  Section 1 says

The (business) shall consult the supervisory authority prior to processing where a data protection impact assessment under Article 35 indicates that the processing would result in a high risk in the absence of measures taken by the controller to mitigate the risk.

I imagine here is where the supervisory authority might suggest or require that age verification be adopted.

Definition of a child

There is no definition of a child within the GDPR. However, since every Member State is a signatory to the UN Convention on the Rights of the Child, 18 must be the relevant standard. This has a number of potentially interesting consequences, i.e. everywhere in the GDPR where the words child or children appear (25 and 8 times respectively) we must assume they refer to all persons aged 17 and below. However, because the GDPR is also peppered with references to risk assessments and proportionality (context is everything) this clearly does not mean the GDPR anticipates that everyone under the age of 18 needs to be treated identically.

The age of consent for data purposes

Article 8 allows Member States to make an explicit decision about the lowest age at which a child may decide for themselves whether or not to hand over personal data to an online service provider without the provider having to obtain the consent of the child’s parents. The lowest age a Member State may choose is 13 (the current de facto status quo in most EU Member States) and the highest available option is 16. If a Member State makes no decision then by default the age becomes 16 automatically in May, 2018.
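The Article 8 mechanics can be sketched as a simple rule. This is my own reading expressed as illustrative Python, not legal advice, and the function name is invented:

```python
def parental_consent_required(user_age, member_state_age=16):
    """Sketch of GDPR Article 8: an online provider must obtain parental
    consent before processing the data of a child below the Member State's
    chosen threshold. States may pick any age from 13 to 16; if a State
    makes no choice, 16 applies by default from May 2018."""
    if not 13 <= member_state_age <= 16:
        raise ValueError("Article 8 only permits thresholds from 13 to 16")
    return user_age < member_state_age


# A 14 year old needs parental consent under the default threshold (16),
# but not in a country that has opted for 13:
print(parental_consent_required(14))      # True
print(parental_consent_required(14, 13))  # False
```

Note that the rule says nothing about who counts as a child; it only switches the consent obligation on or off, which is the point made below.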

Article 8 does not qualify, modify or change the definition of a child. It only sets a parameter which affects an online provider’s obligation to obtain parental consent. Thus, if a country opts for 15 it is not saying that for privacy or any other purposes 15 is the age at which someone may be judged to be an adult or to have the same capacities as an adult. However, I guess it would be odd if designating, say, sub-16s, 15s, 14s or 13s for special treatment within a jurisdiction did not imply that they should also receive extra or special or at any rate different attention. I can see that here there could be important tensions with a young person’s legal rights.

Thus, in effect, we have two classes of children. Those below the age of consent for data purposes  and those above it who are nevertheless still below the age of majority (18). I guess that has always been the case but introducing a spread of ages, including one as high as 16, does rather highlight or emphasise the matter.

Recital 38 makes an exception with regard to online preventative or counselling services offered directly to a child. In such cases the provider is not obliged to seek the consent of the young person’s parents.

Accessibility and transparency

Recital 58 reads as follows:

The principle of transparency requires that any information addressed to the public or to the data subject be concise, easily accessible and easy to understand, and that clear and plain language and, additionally, where appropriate, visualisation be used…

Given that children merit specific protection, any information and communication, where processing is addressed to a child, should be in such a clear and plain language that the child can easily understand.

Article 12 gives effect to Recital 58

The controller shall take appropriate measures to provide any information… to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child. (emphasis added)

Expect to see icons and pictograms being deployed on a much larger scale, especially but not solely in areas known to be frequented by children.

A new kid on the block

The Article 29 Working Party and national data protection authorities as a group have not been conspicuously involved in the campaign to make the online space a safer and better environment for children. That is about to change.

Article 57 reads as follows

  1. Without prejudice to other tasks set out under this Regulation, each supervisory authority shall on its territory:

(a)    monitor and enforce the application of this Regulation;

(b)    promote public awareness and understanding of the risks, rules, safeguards and rights in relation to processing. Activities addressed specifically to children shall receive specific attention. (emphasis added)

Will national Data Protection Authorities start to attend meetings of the Safer Internet Centres? How will that link work? In the UK will the Information Commissioner’s Office join the Board of UKCCIS? How will it relate to Internet Matters?

Codes of conduct

This is going to be an extremely important area of activity.

Article 40 reads as follows

  1. The Member States, the supervisory authorities, the (European Data Protection Supervisory) Board and the Commission shall encourage the drawing up of codes of conduct intended to contribute to the proper application of this Regulation, taking account of the specific features of the various processing sectors and the specific needs of micro, small and medium-sized enterprises.
  2. Associations and other bodies representing categories of controllers or processors may prepare codes of conduct, or amend or extend such codes, for the purpose of specifying the application of this Regulation, such as with regard to:…

(g)   the information provided to, and the protection of, children, and the manner in which the consent of the holders of parental responsibility over children is to be obtained… (emphasis added)

Consistency and implementation

Article 57 (1) (g) imposes an obligation  on national data protection authorities to

cooperate with, including sharing information, and provide mutual assistance to, other supervisory authorities with a view to ensuring the consistency of application and enforcement of this Regulation.

Article 41 refers to arrangements for monitoring approved codes of conduct and Article 97 makes clear the Commission has an obligation to keep the operation of the Regulation under review and to report accordingly to the European Parliament and the Council of Ministers.

The right to be forgotten

This makes its appearance in Recital 65 and in Article 17 (where it is also described as the “right to erasure”).

In the Recital it says

(the right of erasure)… is relevant in particular where the data subject has given his or her consent as a child and is not fully aware of the risks involved by the processing, and later wants to remove such personal data, especially on the internet. The data subject should be able to exercise that right notwithstanding the fact that he or she is no longer a child. (emphasis added)


In Recital 24 profiling is alluded to as being a form of data processing which allows the data controller (typically an online business) …to take decisions concerning…or for analysing or predicting…personal preferences, behaviours and attitudes.

I guess that includes making decisions about what sort of advertisements to serve up to someone but it could go a lot further than that. However, as we saw in the earlier reference to Recital 38

… specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. (emphasis added)

And Recital 71 makes it clear children should not be the subject of profiling. How that will work out in practice remains to be seen.

Posted in Age verification, Default settings, Internet governance, Regulation, Self-regulation