Election 2017 – a Digital Manifesto

Age verification for pornography sites was a key measure advocated in the Digital Manifesto published by the UK children’s charities ahead of the 2015 General Election. Earlier this week that proposal was adopted in the Digital Economy Act, 2017. This shows that raising these matters with the political parties can bring about change.

However, further improvements in business practices and legislation are still needed. The main improvements identified so far are summarized in “Election 2017 – a Digital Manifesto” on the CHIS website, and the text also appears below.

Election 2017 is drawn from the original, larger 2015 document, which can be downloaded from here.

Election 2017

Creation of a new independent statutory authority

  • With powers to ensure businesses and other organizations are transparent and accountable in respect of child safety, child welfare and children’s rights in the online environment.
  • The authority to publish codes of practice and make legally binding orders.
  • The resources to further initiatives which help parents support their children’s use of the internet and associated technologies, promote children’s rights and fund research.

Improved business practices

  • Anyone supplying legally age restricted goods or services over the internet must install a robust age verification mechanism and providers of online payments services must ensure they are not providing facilities to an entity in breach of this provision.
  • Online businesses and other organizations must be able to demonstrate they have taken all reasonable and proportionate steps to ensure their services are not being misused to store or distribute child abuse images.
  • Online businesses and other organizations must be able to demonstrate they are taking all reasonable and proportionate steps to enforce their Terms and Conditions of Service.


  • A new law should establish an unequivocal duty of care to children on the part of every business or other organization providing online services or producing devices which can connect to the internet.
  • Regardless of the child’s country of residence, victims of child sex abuse should be able to obtain compensation from persons in the UK found in possession of images of their abuse.


Every UK territorial police force should have a dedicated unit with appropriately trained officers to deal specifically with sexual and other online offences against children.

Privacy and the age of digital consent

Before May 2018, Parliament must decide on the age of consent for data purposes for UK children. This will be at the heart of a range of children’s rights, particularly their right to privacy. The decision must be preceded by the fullest consultation and discussion with parents and with children themselves.

Posted in Advertising, Age verification, CEOP, Child abuse images, Consent, Default settings, E-commerce, Facebook, Google, ICANN, Internet governance, Microsoft, Pornography, Privacy, Regulation, Self-regulation

Ring the bells!

At 3.15 p.m. today the Digital Economy Bill 2016–17 completed the penultimate formal stage of the Parliamentary process. The House of Lords agreed to all of the amendments to its amendments that had been made when the Bill went back to the Commons yesterday. For people reading this from outside the UK – don’t worry. Very few Brits understand it either. All you need to know is that, for all practical purposes, it’s done. At some point this evening, when Prorogation is completed, we will need to start calling it the Digital Economy Act 2017.

The Bill addressed many different issues, but readers of this blog will want to know that age verification is now a requirement for all significant commercial publishers of pornography, wherever in the world they are based and whether or not they style themselves as being “free”. If they are about making money and they want access to the UK market, they must have age verification. If they don’t, they can be fined and ultimately ISPs will be required to block access to them, in roughly the same way that they block access to child abuse images. The exact date of commencement is unknown to me right now. There is bound to be a run-in period to give everyone time to prepare for the new regime.

Thus end many years of campaigning by the British children’s groups and others to get to this point, but great credit must go to David Cameron and the last Conservative Administration (a) for picking up the matter and, uniquely, putting it in their Manifesto for the 2015 General Election and (b) for sticking with that promise when they won.

The sterling efforts of Theresa May’s Administration in carrying on what Cameron began must also be acknowledged. However, once again, what was extremely gratifying was the very broad level of support for “our bits” of the Bill across all the political parties. Aside from Karen Bradley and Matt Hancock, as the hands-on Government Ministers, particular tributes need to be paid to the continued leadership of Claire Perry from the Government’s backbenches, to Fiona Mactaggart, Thangam Debbonaire and Helen Goodman from the Labour Party, to Baroness Floella Benjamin from the Liberal Democrats and to Baroness Howe from the Crossbenches. Several Scottish Nationalists and members from Northern Ireland were similarly star players. Truly this was an all-Party affair in the end.

Because of the unexpected calling of a General Election the Bill suddenly had to go into the emergency procedures known as the “wash-up” and “ping pong” (again, don’t ask) where the Labour Front bench, in the shape of Louise Haigh and Kevin Brennan, had to reach compromise agreements with the Government.  In the past, at this point entire Bills have been lost but that didn’t happen here. Phew!

Even if the Bill hadn’t had to go into wash-up I doubt we would have ended up in a very different place before it reached its otherwise “natural” conclusion.

Everyone appears to agree that the definitions of the sorts of material which the Regulator may require to be removed before a site can be age verified are not satisfactory. We ended up with them because of ill-judged, inaccurate scaremongering. Anyway, the Government accepted a Labour amendment which means that all the definitions have to be looked at again. We can take a deep breath and pause, although I think the children’s groups are unlikely to depart from their past practice. If there is an apparent threat to children associated with the availability of certain types of material we will speak out, but otherwise we will not engage. We have no general view on the desirability or otherwise of porn. We do care about children accessing it.

A code of practice will also be developed for social media platform providers to deal with a range of concerns, not least in respect of the way different types of bullying are being handled.

Nothing stands still. Very soon we will publish Digital Manifesto 2017 addressed to the Parties contesting the upcoming General Election. The manifesto reflects several bits of unfinished business.

Let’s see where this ends up.

Posted in Age verification, Child abuse images, Default settings, E-commerce, Internet governance, Pornography, Regulation, Self-regulation

Sex crimes in the USA

The UK is not the only country where child sex abuse offences are or have been rising rapidly. See New stats on US Federal sex crimes. Reading the text, a similar picture seems to emerge in other jurisdictions within the USA; Federal arrests in fact accounted for only 4% of all arrests.

There is no indication of whether or how the internet was part of the offending behaviour, and child sex offences are not separated out, but the publication lists some of the initiatives and laws which lie behind the stats, and it is quite clear that cyber offences and children are part of the picture.


Posted in Child abuse images, Regulation, Self-regulation

An unsatisfactory state of affairs

Thanks to Baroness Jones of Whitchurch for tabling a Parliamentary Question about what happens to non-photographic child sexual abuse images, in particular manga and CGI (computer generated imagery) based material. Lady Jones asked about such materials hosted on machines physically located in the UK and on machines not physically located within the UK.

Here are the relevant parts of the answer, which became available today.

The IWF addresses reports concerning non-photographic images when they are hosted on UK websites. Where such images are believed to be criminal and are depicted on a website hosted in the UK, (the IWF) will work in partnership with the hosting provider and NCA-CEOP to remove the content and provide information to assist investigations into its distribution.

The limitation to websites hosted in the UK makes no sense at all from a child protection point of view. So why does that limitation exist? I have asked but so far have received no answer. I appreciate the volumes may not yet be large, but having recently seen an online game that used a lot of CGI material, I have no doubt it will become more common within the grimy world of CSAM. I thought the characters in the game I saw were actors. They were that lifelike.

So what happens to identical material published on sites hosted on machines physically based outside the UK? Here is the reply:

If the site is outside the UK, it is reported by the NCA to the host country via Interpol channels to take appropriate action.

I would wager that is another way of saying nothing happens.  The stuff remains completely accessible to everyone in the UK without limitation unless it happens to be caught by some other defensive measure.

That is not good enough. I doubt this will loom large in the forthcoming General Election in the UK but it might register somewhere.

Posted in Child abuse images, Regulation, Self-regulation

ICANN lets down child abuse victims, again

There is a major story in today’s Sunday Times. It has a front page splash and a half page inside. It concerns two remnants of Empire, technically known as British Overseas Territories, the Chagos Islands and St Helena, the former in the Indian Ocean, the latter in the South Atlantic. Both have country code Top Level Domains: .io and .sh respectively. As the Chagos Islands are, theoretically, uninhabited, you might wonder why they need a country code at all, but let’s leave that on one side for now. Actually there are people living there, but they are all in the US military. We evicted the former inhabitants, the Chagossians, in order to make way for a naval base.

As with all country codes, these were awarded to a Registry, presumably after someone or other within the British Government gave it the thumbs up. The Registry in question, responsible for both .sh and .io, is owned by a company called ICB, which has an office in Bournemouth on England’s south coast.

So far, so unremarkable. However, two weeks ago the IWF published its Annual Report. It turns out the IWF found one URL with child abuse images on a .sh domain, but the Chagos Islands’ .io had nearly 1,500. It has gone from almost zero to being the world’s 4th largest source of child abuse images. .io “only” had 3% of the global total, with .com and .net way ahead on a combined 70% (.com and .net, by the way, are owned by the same Registry, Verisign, based in Reston, Virginia). Even so, for .io to become global number 4 overnight takes some doing, or rather it takes some indolence.

This highlights once again the uselessness of ICANN’s system of granting Registry Agreements without insisting on and policing basic security measures and checks. It is hard to believe anyone would register a domain then use it for criminal purposes if they knew their real world identities and home addresses were easily discoverable by anyone with a legitimate reason to know.

Part of the story in the Sunday Times was that the Foreign Office was conducting an urgent enquiry into how all this came about. I’ll bet they are. It is clearly an oversight: a decision almost certainly taken by a low-level official who probably had little or no expert knowledge of the issues associated with domain names, Registries and the like.

The logical and obvious answer is for the Government to say that, in future, anyone who needs its permission to run a Registry for any of the former colonies, dominions or dependencies must agree to follow policies identical to those of Nominet, which is responsible for .uk. If they cannot do that, perhaps Nominet could be prevailed upon to do it for them; otherwise the country code will have to wait, or be withdrawn, until it can be run in an acceptable manner.

Posted in ICANN, Internet governance, Regulation, Self-regulation

New data on prosecutions for images related offences

On 28th March, Parliamentary Under Secretary of State at the Ministry of Justice Sam Gyimah MP answered a Parliamentary Question about proceedings in the Magistrates’ Courts in England and Wales, and convictions in all courts in England and Wales, under s1 Protection of Children Act 1978 (child abuse images), s2 Obscene Publications Act 1959 (er, obscene publications), s63 Criminal Justice and Immigration Act 2008 (extreme pornography) and s62 Coroners and Justice Act 2009 (prohibited images).

The police now say they are arresting around 3,000 people per year for child abuse image related offences – record numbers – and this seems to be borne out by the 2015 figures for proceedings brought in the Magistrates’ Courts (2,374). The number of convictions obtained that year in all courts lags behind (1,994), but that is to be expected. 2016’s numbers will start to show the bulge.

The table also underlines that the Obscene Publications Act is pretty much a dead letter: no new proceedings were initiated in 2015, and only two convictions arose from proceedings started earlier.

The scale of activity in relation to extreme pornography seems low. It is hard to judge exactly what this means, but given that the Age Verification Regulator has a power to act in relation to such material under the Digital Economy Bill, we may get some further insights when it begins to report on how it is implementing what will by then be an Act.

Posted in Age verification, Regulation, Self-regulation

Brussels Bulletin – report back on GDPR matters

Earlier this week in Brussels the Article 29 Working Party and the Commission independently organized two separate sets of meetings on the GDPR. They were arranged sequentially. This meant I was able to go to both. Article 29 came first.

At the Article 29 meeting there was a workshop on consent, one on profiling and one on data breach notifications. I attended the workshop on consent, but there was a plenary with a report back from the other two.

Both sets of meetings were dominated by a wide range of industries and interests far removed from the usual child protection and child welfare circuits we normally inhabit.

Gillick Principles?

Having said that, a chap from the British National Health Service (NHS) had some pressing points about children being given access to their own medical records.

Where those records are accessible online – seemingly a great many already are, and more will be in future – would a child’s doctor need to obtain parental consent before allowing the child access? His view seemed to be (a) that the NHS is not an Information Society Service provider within the meaning of the GDPR (because it is not commercial) and is therefore not bound by its terms, but (b) that even if it were, the Gillick Principles should apply in any event: irrespective of a child’s age and the statutory minimum age defined for wider purposes, a child’s doctor could grant access exclusively to the child, without having to obtain parental consent and without even having to inform the parents, provided the doctor judged the child to have sufficient understanding of what the records were and the sensitivities surrounding them. Definitive answers or guidance came there none, but in that respect the NHS was not alone. Not by a long chalk.

Everybody has lots of questions

Which brings me to point number one. There is still a great deal of uncertainty hovering over many bits of the GDPR.  We are not alone. Not much comfort, but maybe a little.

Several of these areas of uncertainty have potentially enormous economic, legal, technical and operational implications for businesses and consumers alike. Everyone was impatient to hear when draft guidance documents would start to appear for comments. The answer was generally the same – “soon”.

Quite strong feelings were expressed that there was no way everything could be done satisfactorily in time for the May 2018 deadline. There was talk of possible “transitional arrangements”, but I thought we were already in transition.

Need for a specific focus on children

The wide range of interests clamouring to make their points or ask their questions meant there was simply no way I could air all of mine, so I hammered home one simple point in the consent workshop, in the plenary and then at the Commission’s meeting: the need for a specific discussion around the position of children under the GDPR. That seemed to be accepted by everyone, and both PEGI and the toy industry trade association spoke up to agree with me.

I mentioned that the UK’s data protection authority – the ICO – had indicated it would organize a consultation on children and the GDPR, but nobody from Article 29 or the Commission appeared to be aware of any other authority in any other Member State that was planning to do the same. That was a surprise.

A quick summary of a number of the other points I thought were important:

No update was available about the age limits that different countries were going to adopt.

It was acknowledged that having children on the same site or App at the same time but in different jurisdictions with different age limits was throwing up “interesting challenges”.

Article 35 impact assessments seemingly will not influence or alter a company’s obligation to carry out, or not carry out, age verification; neither will they determine the lengths to which a company might need to go to ensure any parental consents it obtains are authentic. I am not sure a court would ultimately see things that way, but that was the view of most people in the room.

A single consent cannot be used for multiple purposes. But when you join Facebook or YouTube you are potentially engaging with a number of different types of activity, each of which can generate data that is processed. That being the case…..?

What happens when the GDPR kicks in? Are all previously obtained consents vitiated?

It seems it would be unwise for any company to leave consent buried solely within its Ts&Cs. Consent provisions should be pulled out and presented as a standalone item. This stems from the need for the giving of consent to be unambiguous and clear.

Officials were repeatedly at pains to point out that consent is only one basis on which data can lawfully be collected and processed, although how or whether any of the other grounds will apply in the case of children must be open to doubt.


Last but by no means least, it is apparent that the issue of “profiling” is going to be huge. By coincidence, the UK’s ICO issued a “request for feedback” on profiling the day after the workshop and there is a specific section on children. For ease of reference I will quote from bits of it.

We are reminded that Recital 38 of the GDPR says

“….as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles……” 

The ICO interprets this as meaning

Controllers must not carry out solely automated processing, including profiling, that produces legal or similar significant effects (as defined in Article 22(1)) in respect of a child.

A child for these purposes is someone under 18; the age limit for giving consent to data transactions without parental consent is seemingly irrelevant under this heading. Where does this leave social media sites that rely on automated processing to serve advertisements to persons under 18, and will they be required to determine whether or not you are 18 or above before your profile can be assigned any advertisements?

Before we get too carried away, there is an issue about whether or not advertising counts as the product of a decision based solely on an automated process. I think it must, especially given the way the modern system of bidding works. And is being exposed to an advertisement a “legal or significant effect” anyway? Some obviously thought it wasn’t. Others disagreed.

I also had an interesting side discussion about whether making it easy for children to lie about their age with impunity, or with very little possibility of discovery, and to gain benefits from doing so – e.g. by becoming and remaining a member of a social media service – could in and of itself, and without more, constitute a harm. At least one person other than me thought it could. Watch this space.

You can see why many people from industry are having sleepless nights. So am I.


Posted in Age verification, Consent, Default settings, E-commerce, Regulation, Self-regulation