Towards a new internet strategy for the UK

On 27th February Karen Bradley MP, Secretary of State at DCMS, announced the Government’s intention to develop a new internet strategy. As a member of the Executive Board of UKCCIS I was asked for my views. I reproduce them below.

I pointed out that in the time available it had not been possible for me to consult the children’s organizations with whom I normally work, through CHIS. The note I sent in therefore expressed only my personal views.

End of the Bill

In his speech on the Digital Economy Bill last Monday night in the House of Lords, Lord Ashton referred to the Secretary of State’s announcement in the context of there being a need for a wider discussion about the effects of pornography on society as a whole, not solely in respect of children. I would hope there will be an opportunity to contribute to that aspect of the review. I accept it was never envisaged that the Digital Economy Bill would be a trigger for a wider debate about what sorts of pornography are more or less acceptable, whether viewed by children or not. However, just because children cannot view certain material that has been put behind an age verification wall, it does not follow that its continued availability to adults poses no threat to children. Such material might encourage, promote or appear to legitimise or condone harmful behaviours which directly or indirectly put children at risk.

Of the issues mentioned in the Secretary of State’s statement of 27th February, I will focus on:

  1. Future proofing and technology solutions for online safety
  2. The role of industry in online safety

Future proofing

“Future proofing” is a great idea but it sits ill with the reality of how Silicon Valley and the high-tech industries generally operate, and how they think they ought to operate.

Speed to market and first-mover advantage are often dominant or enormously important concerns in the product development cycle. Novelty, “newness” and built-in or planned obsolescence are major drivers. “Permissionless innovation” is held out as a defining principle. All this can sometimes get in the way of prudence. Nowhere is this better exemplified than in the case of Facebook. It was founded around the idea that the company would “move fast and break things”. Apparently this has recently been changed to “the fast shall inherit the Earth”. It would be interesting for someone with the time and the inclination to count how many times Facebook has apologised for “getting it wrong”, said it is in “listening mode”, “learning” or something of that kind.

As long as companies believe it is easier and, from a business point of view, better and comparatively risk-free to apologise after the event if a problem is discovered than it is to get it right beforehand, we will forever be playing catch-up. To some degree there will always be an element of catch-up, but it feels as if there is currently a gaping chasm. It needs to be narrowed, even if it cannot be entirely and permanently bridged.

Duty of care

Under our general laws of negligence everyone is under an obligation to take reasonable care in everything they do, but in the technology space a great many factors militate against individuals in the UK being able to bring enforcement actions, mount claims for damages or require companies to improve their safety performance. For this reason there may be some value in explicitly establishing that all businesses operating in UK markets have a duty of care to children. Prior to any product launch, or any revision of terms and conditions of service which will apply or be available in the UK, they would be obliged to consider the child safety and child welfare aspects of their proposed action and to maintain documentary evidence demonstrating that this has happened. In appropriate circumstances such evidence could be called upon by a new regulator (see below). A breach could be the subject of a substantial fine or other penalty.

A whole new world is approaching rapidly

Right now we stand on the edge of a major step change in the technological space. It builds on what has gone before but appears to promise a qualitative shift of substantial proportions. Advances in Artificial Intelligence (AI), the emergence of an algorithmic world, the growth of the “Internet of Things” (including toys) and, in particular, the development of Augmented Reality and Virtual Reality applications and associated hardware are taking us towards a tremendously exciting future. But it is completely beyond the scope of a body like UKCCIS even remotely to contemplate being able to maintain any sort of credible or useful public interest oversight of it all. I am not saying I am against us trying, but we shouldn’t misrepresent what we could reasonably hope to achieve.

The question of age

In respect of a hugely important set of issues which arise in relation to the implementation of the GDPR, it is acknowledged that the Office of the Information Commissioner will be in the driving seat. There is nevertheless a key question for Parliament to resolve, presumably on a motion to be tabled by the Government: what is the lower age limit to be, i.e. at what age can a child decide for themselves whether or not to hand over personal data to commercial and other third parties providing Information Society Services? If the question is not decided by Parliament ahead of May 2018, we will default to age 16.

Thus, if the UK is to depart from the current de facto standard of 13, a great deal of rewriting and reconsideration of all sorts of things will need to happen. Clearly, the sooner everyone knows the longer-term position, the better it will be for all concerned.

A related question is how, and on what basis, the Government will decide what recommendation to place before Parliament about the age limit. What consultation will there be, and with whom? Will children themselves be able to have a say? Is there any research evidence which strongly points in a particular direction in terms of the optimal lower age limit? I know of none. Back in the 20th century, when the Americans came up with 13, social media sites did not exist, or at least not in anything even remotely like their present form.

Has any consideration been given to the online safety and other implications of different countries having different lower age limits?

The role of industry – an obligation to act

If we are not already there, I think we are coming close to the end of the road with the way online child safety policy has hitherto been formulated, implemented and monitored in the UK. The major social media platforms play a pivotal role in the lives of our children. In that regard they are in a monopolistic or near-monopoly position. Yet there is an almost complete lack of transparency about the way in which they operate in relation to children’s safety. Quick to say sorry, slow to reveal. It is true we know they do some excellent work in relation to child abuse images and detecting paedophile behaviour, but there is a lot more to online child safety than that. Bullying and children’s concerns about inappropriate content also loom large.

How much do these businesses spend on child safety as opposed to, say, lobbying and corporate hospitality?  How many people do they employ in carrying out moderation as opposed to government relations? Wouldn’t knowing that give us some idea of the real priority attached to child safety?

Late last year the Children’s Commissioner for England asked Facebook and Google a number of questions about children’s use of their services, and both declined to reply. This has happened before, at least in respect of Facebook. A few years ago, at the end of several months of discussion and negotiation with the UKCCIS Evidence Group, Facebook finally said (paraphrasing): “We will not give you any information we are not legally obliged to publish. It could be commercially sensitive and we have to take account of things like the rules of the New York Stock Exchange.”

A new regulator

We need a body with legal powers to compel online businesses to answer questions which are relevant to determining how they are addressing online child safety issues. In the face of evidence that things continue to go wrong we cannot forever be asked to take everything on trust.

Such a body could also play a role in developing and keeping under review the broader, emerging online space in respect of the child safety agenda. It could perhaps take over some or all of the functions of the UKCCIS Executive Board and become a new, well-resourced, independent focal point for online child safety, with both the capacity and the obligation to keep tabs on new and emerging technologies in that specific context.

This regulator could issue codes of practice which would have legal force, establishing standards against which businesses are held accountable. In lots of ways the internet industry has matured. It has become central to the way modern societies work. It cannot expect to remain outside the sort of oversight rules that apply in practically every other area of importance in our communal life. “Internet exceptionalism” is an idea whose time has not come.

When the last Government asked Ofcom to enquire into the operation of the “Big Four’s” filtering policies, Ofcom merely asked the ISPs to provide it with information. Ofcom did not seek to verify the information it was given, nor did it try to determine whether it told the whole story. Ofcom did not seek to explain why there were differences in outcomes between the “Big Four”. This is not satisfactory, but it was all Ofcom felt it could do given it had no legal powers to do more.

Preserve the immunity but narrow its scope

Microsoft recently confirmed that it was aware of only around 100 businesses and other organizations worldwide (including law enforcement bodies) availing themselves of its free PhotoDNA technology to fight the distribution or storage of child abuse images on their networks or services. A depressingly low number.

To combat this degree of lethargy or apparent indifference we need to narrow the scope within which the principle of immunity under the eCommerce Directive can be maintained. In other words, while leaving the overriding principle of immunity intact (it would be wrong for anyone to be held liable for something of which they had no knowledge), we should nevertheless institute a new law or rule which says that, having regard to available technologies, online businesses must take all reasonable and proportionate steps to ensure their networks or services are not being used for unlawful purposes, and must also show that they are taking reasonable and proportionate steps to enforce their own Terms and Conditions. If Ts&Cs can be published with zero serious effort to ensure they are honoured, that is tantamount to a deceptive practice. Relying so heavily on users reporting breaches isn’t working well enough. I understand that AI is seen as a possible route to salvation. Let’s hope that works, and quickly.

The current principle of immunity was first introduced in the USA in 1996 and taken up in Europe in the eCommerce Directive of 2000 in order to protect fledgling businesses from being sued while they innovated in good faith. It has, however, become an alibi for inaction, a refuge for scoundrels. Far from being fledglings, some online businesses are now fully grown condors.

Even new start-ups are acting in an environment which is completely different from that which prevailed in the mid-to-late 1990s and the early part of the “noughties”, when several of today’s giants began their lives. That said, it should still be recognised that we expect more of larger companies than we do of smaller ones. Proportionality matters. Small businesses deserve more protection from liability than large ones, provided such latitude is not abused or interpreted as giving them a licence to be reckless or not to care at all.

The international dimension

The UK Government is, rightly, widely acknowledged to be a global leader in the online child protection space, as evidenced, for example, by its role in establishing and funding the WePROTECT Global Alliance. I think there would be some value in having, say, an annual report on the activities undertaken by the Alliance, detailing progress.

HMG also participates in major internet governance institutions such as ICANN and the IGF where, again, an annual report may be both valuable and interesting. I think this is especially true of ICANN, whose performance in respect of online child safety has been somewhere between woeful and outright neglectful, or positively dangerous.