The Facebook files

There are astonishing and detailed revelations in today’s Guardian about the scale of abuse on the Facebook platform.  They are based on a major leak from someone who was obviously once on the inside track – and maybe still is.

Seemingly, in January of this year Facebook had to assess 54,000 reports of alleged revenge porn and “sextortion”.  14,000 accounts related to these complaints were disabled. Apparently, 33 cases involved children. If those numbers are not bad enough, just think how many abuse reports Facebook might have received in the same month that were categorised under other headings.

Let’s not leap to any conclusions. Leaks are not always the most reliable way of learning the truth, the whole truth and nothing but the truth, and little things like context also matter. But set against Facebook’s historic tradition of near-total opacity about its inner workings, the Guardian piece could quickly become the received wisdom.

Less than three weeks ago Facebook “revealed” they were going to recruit a further 3,000 moderators to add to the 4,500 we are told were already in place. There will now be the faintest suspicion that the company knew about the leak and was trying to get its retaliation in first, because up to that moment Facebook had steadfastly refused to disclose how many people it employed as moderators.

Of course, we do not know when Facebook will reach 7,500 moderators – recruiting and training them will take time – and as one of their senior staff commented “there is not necessarily a linear equation between the number of moderators employed and the effectiveness of the moderation function.” Quite so.

In other words, until we know a great deal more, 7,500 may turn out to be a wholly politically determined number that bears no relationship whatsoever to what is needed.

As soon as the UK elections are over I would say social media platforms are in for a tough time.  And once that political ball starts rolling there is no knowing where it might end up.

My guess is there’s more to come from The Guardian. Watch this space.

Posted in Facebook, Google, Internet governance, Regulation, Self-regulation

GDPR – Chapter 724

The UK’s children’s organizations have submitted a detailed note to our data protection authority – the Office of the Information Commissioner.  I won’t try to summarise it here. You can download it from the CHIS website. If you have any comments or suggestions please let me know.

In Brussels earlier this week at the ICT Coalition a few other matters arose that I had missed. One is important and obvious, the others were just important!

Obvious and important: we are all fixated on May 2018, but that is not a once-and-for-all date. In May 2018 the GDPR comes into effect. This means that, unless a Member State has already “derogated” from a provision where derogation is allowed, e.g. Article 8, all of the defaults will apply automatically. A country could, of course, still revisit a particular item afterwards and change its domestic law. I guess the only point is that between now and May 2018 everybody with an interest will, or ought to, have their heads down as they try to get ready for the big bang. Thus I am not sure there will be any real appetite to reopen anything to do with the GDPR anytime soon afterwards. This means May 2018 remains an incredibly important target date.

Other points

If a company sets itself up in a jurisdiction within the EU where the age of consent is, say, 13, can it apply the (traditional) “country of origin principle” and, in effect, make 13 the standard in every jurisdiction where it operates?  I doubt it, otherwise what would have been the point of allowing Member States to choose in the first place?

Similarly, where the service is provided from outside the EU, what latitude will there be to ignore the rules, or to have terms and conditions or operational principles which would not be permitted if the business were inside the EU?  I think here the answer is “none”, because the GDPR’s provisions on international transfers say personal data can only be transferred across EU borders to a third party whose data regime is consistent with the one the GDPR prescribes. An age limit of 11 or 12, for example, would therefore not be allowed unless verifiable parental consent was obtained.

Within a classroom, if a teacher wishes to use a particular online resource in the course of a lesson, is it necessary for every child in the class to have given permission for that specific site to be displayed, or would a general consent be sufficient? What happens if such consent has not been given? Is the teacher allowed to proceed, or must the child be excluded for the duration?

And if the children are below the age of data consent and their parents refuse to give permission?

The GDPR does not change or affect any rights children may have which are not linked to an “information society service”, and it has been suggested that, for example, the “right to be forgotten” encompasses both online and offline environments.

Posted in Age verification, Consent, Regulation, Self-regulation

A very disturbing message from Oz

José Mourinho (Manchester United) once described rival manager Arsène Wenger (Arsenal) as a “specialist in failure”. I think we should start referring to Facebook as a “specialist in apologising”.

Seriously. Is there anyone out there who has recorded the number of times the company acknowledged that they “got this wrong” or that they had “taken their eye off the ball”, insisted they were “listening”, or offered some variation on one of these themes? I wish I had had the foresight to keep notes over the years.

Perhaps this readiness to apologise is rooted in the confessional culture which seems to be so pervasive in the USA. Everything can be made ok again as long as you admit it and throw a cloak of earnest humility over your multi-billion dollar, near-monopoly business. Do it with a bit of wit and self-deprecation and you can even come out ahead.

I agree that admitting stuff is better than covering things up or denying them, but when it becomes a persistent habit you have to wonder if it hasn’t been elevated into a strategy. “Move fast and break things, but when you are completely cornered…”

At first Facebook sought to be excused their lapses because, contrary to appearances even then, they were still a “small but growing company”, whereas today the fact that they are so big is offered as an explanation for error. Heads I win. Tails you lose.

I mention all this by way of a preface to the revelations which have emerged courtesy of The Australian newspaper. It seems senior Antipodean Facebook executives authorised research which looked at how advertisers might benefit from being able to present their wares to children as young as 14 who might be feeling a bit low, depressed or similar. Mood manipulation for money. Worse and worse: the Facebook users whose data the company looked at as part of the research were not aware this was happening, so, obviously, they could not consent to it. It is reassuring to know that no individual was identifiable from the research. Reassuring, but not nearly good enough.

Somehow the documents exposing what had been going on within Facebook found their way into the public domain.  Thank goodness for whistleblowers. Immediately the project was disowned by the big bosses and apparently some sort of enquiry is now underway internally. I doubt that a purely internal process will be acceptable in a matter of this sort. That is precisely the sentiment expressed in an excellent letter just published by around 25 different privacy and consumer groups from different parts of the world.

But I guess at the end of the day the key question is this: what is it about attitudes at senior levels within Facebook that allowed the guys responsible for the Aussie experiment to think that this was or could ever be in any way acceptable? Are any similar activities taking place right now in other jurisdictions or have there been in, say, the past three years or so?

Once again, where was Facebook’s Safety Advisory Board when all this was going on?

It is true Facebook do a lot of good work in other areas, particularly in detecting paedophile behaviour and dealing with child sex abuse material, but when things like this occur it just makes everyone wonder what truly makes the company tick. The cultivated bonhomie starts to wear thin. The other face of Facebook is staring right at you.

I imagine what happened Down Under will have mortified many good people within Facebook who are completely and genuinely committed to what they do in the child protection and child welfare space. They will be horrified at the idea that their employer could be linked in any way whatsoever with the possibility of exploiting youngsters in such an underhand way. They need to start a rebellion. In particular they need to focus on what I can only suppose are the structural failings within the company that allow things like this to keep on happening.

Poor editorial and operational decisions or oversights continue to mount, making an unanswerable case for at least some degree of external regulation or scrutiny of Facebook’s affairs. I think that is now more or less inevitable but even so one last act of redemptive contrition might soften the blow.

Posted in Child abuse images, E-commerce, Facebook, Internet governance, Privacy, Regulation, Self-regulation

What’s really going on with the new law

The number of sites that will be affected in any way at all by the relevant provisions of the Digital Economy Act, 2017, is comparatively small although, of course, they are the biggest sites – the ones that have been shown to account for the overwhelming majority of visits to porn sites by under-18s in the UK. The threat to smaller sites, whether commercial or non-commercial, that might cater for particular or indeed any and every sort of taste or sexual preference is and always has been zero. This conforms with the principle of proportionality that is threaded throughout the legislation.

Definitions. No new categories of illegal material have been created

The Digital Economy Act changes absolutely nothing in terms of defining what is and what is not legal by way of depictions of sexual activity. If a particular type of image was legal before the Act it still is, and if it was illegal it still is.

All that has happened as a result of the Act is the Regulator can, in effect, secure the removal of “extreme pornography” from qualifying sites. Extreme pornography is defined in the Criminal Justice and Immigration Act 2008. It would be very odd indeed, would it not, if the Regulator knowingly gave a green light to a site that contained material which, on the face of it, was illegal?

Whether the existing definitions of what is legal and what isn’t are, in and of themselves, (a) good enough and (b) appropriately congruent in online and offline environments is a different matter, but that will be separately addressed in the upcoming review of the UK’s overall internet safety strategy.

Would we all have preferred this important dimension to have been resolved as the Bill went through Parliament? Of course and, but for the early General Election, it might have been although I agree the odds were running against it. The question of definitions was acknowledged to be a shortcoming by everyone involved, both on the Government side and among the opposition parties, so I think we can all be confident the matter will be resolved asap.

Should this acknowledged deficiency have brought the whole proceedings to a full stop? Those who did not like the Bill anyway thought so. Should everything have been put on ice until it was cleared up? Absolutely not. What we have ended up with is definitely a great beginning and it will get better when the definitions have been looked at again with less immediate pressure on everyone to navigate a Bill through all manner of Parliamentary exotica and hurdles.

Privacy. Things will only get better

In relation to privacy, some of the solutions we know are emerging to meet the needs of the new regulatory environment mean that “Ashley Madison” scenarios will be impossible, or at any rate far less likely than they are right now.

This is because – and again this has been apparent for some time – the porn sites that will be regulated are going to have to use privacy-friendly solutions, and here the Information Commissioner has an independent role to ensure that they do. According to the basic legal principle of data minimization, the only thing a porn site needs to know in order to allow access is whether or not you are over the age of 18. Not your name, not your address, IP or otherwise, not your actual age or gender. Nothing but the simple fact of being or not being over 18.

Most of the affected pornography sites are going to use independent third-party age verification providers who will not retain any kind of record that can be traced back to a named individual, account, device or profile. Moreover, even if such a record existed, the mere fact that someone has been age verified would not prove anything about their interest, or lack of it, in pornography. There is a wide range of products and services that require someone to be able to prove their age, e.g. in relation to the purchase of alcohol, tobacco or knives, gambling and so on.
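To make the data minimization point concrete, here is a minimal sketch of how such a check might work. It is entirely hypothetical – the function name and token format are my own inventions, not any actual provider’s system – but it illustrates the principle: derive the single over-18 fact and discard everything else.

```python
# Hypothetical sketch of a data-minimising age check. Nothing here is
# any real age verification provider's API; it only illustrates the
# principle of data minimisation described above.
import secrets
from datetime import date
from typing import Optional

def issue_age_token(date_of_birth: date, today: date) -> Optional[dict]:
    """Derive the one fact a regulated site needs - over 18 or not -
    and return a one-time token carrying only that fact.
    No name, address, gender or actual age is stored or returned."""
    years = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    if years < 18:
        return None  # no token, and nothing retained about the attempt
    # A random token id lets the site accept the token once, but it
    # cannot be traced back to a named individual, account or device.
    return {"token_id": secrets.token_urlsafe(16), "over_18": True}

adult = issue_age_token(date(1990, 5, 1), date(2017, 5, 1))
minor = issue_age_token(date(2005, 1, 1), date(2017, 5, 1))
```

The design point is that the site only ever sees the token: it learns that some adult passed the check, and nothing in the token links back to a person, their actual age or what evidence they presented.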

Of course if, as now, someone buys something directly from a pornography site and they give the site their credit card details then that is completely different and a matter of choice for every user.

Some of the age verification solutions are likely to incorporate measures which will allow their users to prevent any of their data from being collected once they have gained access to a porn site. Thus, no doubt paradoxically to some, the era of age verification heralded by the Digital Economy Act, 2017, could turn into the beginning of an era where privacy is greatly enhanced, not threatened even to the smallest degree.

Posted in Age verification, Default settings, E-commerce, Pornography, Privacy, Regulation, Self-regulation

The news today

Anyone reading the British newspapers today who works for or owns one of the major social media platforms will quickly realise that the end of the self-regulatory road is upon them.  Coming on the heels of last week’s extensive coverage of major online child protection stories generated by the NSPCC and CHIS, today the Parliamentary Home Affairs Select Committee released its report on what it suggests is the wholly inadequate way in which social media companies are dealing with online hate crimes. Yet if you were to put the Home Affairs report side by side with the documents issued by the NSPCC and CHIS, what would be striking is the similarity of the analysis, the conclusions and the recommendations.

The Times made the Select Committee Report their lead front page story. The Daily Telegraph did likewise. The Guardian, The Daily Mail, The Daily Mirror and The Independent also carried major pieces. I stopped counting at that point.

Last week the NSPCC called for the social media companies to be made subject to fines if they didn’t up their game in terms of child protection. CHIS called for a new independent regulator with the power to make legally binding orders.  In their report today the Select Committee made an analogy with how we police football.  In the UK the clubs have to pay for the police time involved in keeping public order on match days. The MPs argued that if the social media companies do not improve their performance, the police will start policing the sites and charging the companies for services rendered. Stick that in your pipe and smoke it.

Then there’s the fake news stuff and in case you had forgotten we are in the middle of a General Election campaign.

General Election or not, this day has been a long time coming and its inevitability has never been in doubt. The guys in California are super smart. They will doubtless have anticipated the current scenarios. They knew it couldn’t last forever, but they have had a good run, perhaps a better run than they ever expected. In the meantime the cash balances have grown, and now they might just have to take a small hit to avoid some of the worst threats and possibilities that are definitely on the near horizon. In twelve months some things will be very, very different.

Posted in Default settings, Facebook, Google, Internet governance, Regulation, Self-regulation

Election 2017 – a Digital Manifesto

Age verification for pornography sites was a key measure advocated in the Digital Manifesto published by the UK children’s charities ahead of the 2015 General Election. Earlier this week that proposal was adopted in the Digital Economy Act, 2017. This shows that raising these matters with the political parties can bring about change.

However, further improvements in business practices and legislation are still needed. Some of the main improvements that have been identified are summarised in “Election 2017 – a Digital Manifesto” on the CHIS website, and the text also appears below.

Election 2017 is drawn from the original, larger 2015 document which can be downloaded from here.

Election 2017

Creation of a new independent statutory authority

  • With powers to ensure businesses and other organizations are transparent and accountable in respect of child safety, child welfare and children’s rights in the online environment.
  • The authority to publish codes of practice and make legally binding orders.
  • The resources to further initiatives which help parents support their children’s use of the internet and associated technologies, promote children’s rights and fund research.

Improved business practices

  • Anyone supplying legally age restricted goods or services over the internet must install a robust age verification mechanism and providers of online payments services must ensure they are not providing facilities to an entity in breach of this provision.
  • Online businesses and other organizations must be able to demonstrate they have taken all reasonable and proportionate steps to ensure their services are not being misused to store or distribute child abuse images.
  • Online businesses and other organizations must be able to demonstrate they are taking all reasonable and proportionate steps to enforce their Terms and Conditions of Service.


  • A new law should establish an unequivocal duty of care to children on the part of every business or other organization providing online services or producing devices which can connect to the internet.
  • Regardless of the child’s country of residence, victims of child sex abuse should be able to obtain compensation from persons in the UK found in possession of images of their abuse.


Every UK territorial police force should have a dedicated unit with appropriately trained officers to deal specifically with sexual and other online offences against children.

Privacy and the age of digital consent

Before May 2018 Parliament must decide on the age of consent for data purposes for UK children. This will be at the heart of a range of children’s rights, particularly their right to privacy. The decision must be preceded by the fullest consultation and discussion with parents and children themselves.

Posted in Advertising, Age verification, CEOP, Child abuse images, Consent, Default settings, E-commerce, Facebook, Google, ICANN, Internet governance, Microsoft, Pornography, Privacy, Regulation, Self-regulation

Ring the bells!

At 3.15 p.m. today the Digital Economy Bill 2016-2017 completed the penultimate formal stage of the Parliamentary process. The House of Lords agreed to all of the amendments to its amendments that had been made when the Bill went back to the Commons yesterday. For people reading this from outside the UK – don’t worry. Very few Brits understand it either. All you need to know is that, for all practical purposes, it’s done. At some point this evening, when the Prorogation is completed, we will need to start calling it the Digital Economy Act, 2017.

The Bill addressed many different issues, but readers of this blog will want to know that age verification is now a requirement for all significant commercial publishers of pornography, wherever in the world they are based and whether or not they style themselves as being “free”.  If they are about making money and they want access to the UK market, they must have age verification.  If they don’t, they can be fined, and ultimately ISPs will be required to block access to them in roughly the same way that they block access to child abuse images. I do not yet know the exact commencement date, but there is bound to be a run-in period to give everyone time to prepare for the new regime.

So ends many years of campaigning by the British children’s groups and others to get to this point, but great credit must go to David Cameron and the last Conservative Administration (a) for picking up the matter and, uniquely, putting it in their Manifesto for the 2015 General Election and (b) for sticking with their promise when they won.

The sterling efforts of Theresa May’s Administration in carrying on what Cameron began must also be acknowledged. However, once again, what was extremely gratifying was the very broad level of support there was for “our bits” of the Bill across all the political parties. Aside from Karen Bradley and Matt Hancock, as the hands-on Government Ministers, particular tributes need to be made to the continued leadership of Claire Perry from the Government’s backbenches, to Fiona Mactaggart, Thangam Debbonaire and Helen Goodman from the Labour Party, Baroness Floella Benjamin from the Liberal Democrats and Baroness Howe from the Crossbenches. Several Scottish Nationalists and members from Northern Ireland similarly were star players. Truly this was an all-Party affair in the end.

Because of the unexpected calling of a General Election the Bill suddenly had to go into the emergency procedures known as the “wash-up” and “ping pong” (again, don’t ask) where the Labour Front bench, in the shape of Louise Haigh and Kevin Brennan, had to reach compromise agreements with the Government.  In the past, at this point entire Bills have been lost but that didn’t happen here. Phew!

Even if the Bill hadn’t had to go into wash-up I doubt we would have ended up in a very different place before it reached its otherwise “natural” conclusion.

Everyone appears to agree that the definitions of the sorts of material which the Regulator may require to be removed before a site can be age verified are not satisfactory. We ended up with them because of ill-judged, inaccurate scaremongering. Anyway, the Government accepted a Labour amendment which means all the definitions have to be looked at again. We can take a deep breath and pause, although I think the children’s groups are unlikely to depart from their past practice: if there is an apparent threat to children associated with the availability of certain types of material we will speak out, but otherwise we will not engage. We have no general view on the desirability or otherwise of porn. We do care about children accessing it.

A code of practice will also be developed for social media platform providers to deal with a range of concerns, not least in respect of the way different types of bullying are being handled.

Nothing stands still. Very soon we will publish Digital Manifesto 2017 addressed to the Parties contesting the upcoming General Election. The manifesto reflects several bits of unfinished business.

Let’s see where this ends up.

Posted in Age verification, Child abuse images, Default settings, E-commerce, Internet governance, Pornography, Regulation, Self-regulation | 2 Comments