A very disturbing message from Oz

José Mourinho (Manchester United) once described rival manager Arsène Wenger (Arsenal) as a “specialist in failure”. I think we should start referring to Facebook as a “specialist in apologising”.

Seriously. Has anyone out there kept count of the number of times the company acknowledged that they “got this wrong”, admitted they had “taken their eye off the ball”, insisted they were “listening”, or offered some variation on one of these themes? I wish I had had the foresight to keep notes over the years.

Perhaps this readiness to apologise is rooted in the confessional culture which seems to be so pervasive in the USA. Everything can be made ok again as long as you admit it and throw a cloak of earnest humility over your multi-billion dollar, near-monopoly business. Do it with a bit of wit and self-deprecation and you can even come out ahead.

I agree that admitting stuff is better than covering things up or denying them, but when it becomes a persistent habit you have to wonder if it hasn’t been elevated to a strategy. “Move fast and break things, but when you are completely cornered…”

At first Facebook sought to have their lapses excused because, contrary to appearances at the time, they were still a “small but growing company”, whereas today the fact that they are so big is offered as an explanation for error. Heads I win. Tails you lose.

I mention all this by way of a preface to the revelations which have emerged courtesy of The Australian newspaper. It seems senior Antipodean Facebook executives authorised research which looked at how advertisers might benefit from being able to present their wares to children as young as 14 who might be feeling a bit low, depressed or similar. Mood manipulation for money. Worse still, the Facebook users whose data the company looked at as part of the research were not aware that this was happening, so, obviously, they could not consent to it. It is reassuring to know that no individual was identifiable from the research. Reassuring, but not nearly good enough.

Somehow the documents exposing what had been going on within Facebook found their way into the public domain. Thank goodness for whistleblowers. Immediately the project was disowned by the big bosses and apparently some sort of enquiry is now underway internally. I doubt that a purely internal process will be acceptable in a matter of this sort. That is precisely the sentiment expressed in an excellent letter just published by around 25 privacy and consumer groups from different parts of the world.

But I guess at the end of the day the key question is this: what is it about attitudes at senior levels within Facebook that allowed the guys responsible for the Aussie experiment to think that this was or could ever be in any way acceptable? Are any similar activities taking place right now in other jurisdictions or have there been in, say, the past three years or so?

Once again, where was Facebook’s Safety Advisory Board when all this was going on?

It is true Facebook do a lot of good work in other areas, particularly with regard to detecting paedophile behaviour and dealing with child sex abuse material, but when things like this occur it just makes everyone wonder what truly makes the company tick. The cultivated bonhomie starts to wear thin. The other face of Facebook is staring right at you.

I imagine what happened Down Under will have mortified many good people within Facebook who are completely and genuinely committed  to what they do in the child protection and child welfare space. They will be horrified at the idea that their employer could be linked in any way whatsoever with the possibility of exploiting youngsters in such an underhand way. They need to start a rebellion. In particular they need to focus on what I can only suppose are the  structural failings within the company that allow things like this to keep on happening.

Poor editorial and operational decisions or oversights continue to mount, making an unanswerable case for at least some degree of external regulation or scrutiny of Facebook’s affairs. I think that is now more or less inevitable, but even so one last act of redemptive contrition might soften the blow.

Posted in Child abuse images, E-commerce, Facebook, Internet governance, Privacy, Regulation, Self-regulation

What’s really going on with the new law

The number of sites that are going to be affected in any way at all by the relevant provisions of the Digital Economy Act, 2017, is comparatively small, although, of course, they are the biggest sites – the ones that have been shown to account for the overwhelming majority of visits to porn sites by under-18s in the UK. The threat to smaller sites, whether commercial or non-commercial, that might cater for particular or indeed any and every sort of taste or sexual preference is and always has been zero. This conforms with the principle of proportionality that is threaded throughout the legislation.

Definitions. No new categories of illegal material have been created

The Digital Economy Act changes absolutely nothing in terms of defining what is and what is not legal by way of depictions of sexual activity. If a particular type of image was legal before the Act, it still is, and if it was illegal it still is.

All that has happened as a result of the Act is that the Regulator can, in effect, secure the removal of “extreme pornography” from qualifying sites. Extreme pornography is defined in the Criminal Justice and Immigration Act 2008. It would be very odd indeed, would it not, if the Regulator knowingly gave a green light to a site that contained material which, on the face of it, was illegal?

Whether the existing definitions of what is legal and what isn’t are, in and of themselves, (a) good enough and (b) appropriately congruent in online and offline environments is a different matter, but that will be separately addressed in the upcoming review of the UK’s overall internet safety strategy.

Would we all have preferred this important dimension to have been resolved as the Bill went through Parliament? Of course, and, but for the early General Election, it might have been, although I agree the odds were running against it. The question of definitions was acknowledged to be a shortcoming by everyone involved, both on the Government side and among the opposition parties, so I think we can all be confident the matter will be resolved as soon as possible.

Should this acknowledged deficiency have brought the whole proceedings to a full stop? Those who did not like the Bill anyway thought so. Should everything have been put on ice until it was cleared up? Absolutely not. What we have ended up with is definitely a great beginning and it will get better when the definitions have been looked at again with less immediate pressure on everyone to navigate a Bill through all manner of Parliamentary exotica and hurdles.

Privacy. Things will only get better

In relation to privacy, some of the solutions we know are emerging to meet the needs of the new regulatory environment will make “Ashley Madison” scenarios impossible, or at any rate hugely less likely than they are right now.

This is because – and again this has been apparent for some time – the porn sites that will be regulated are going to have to use privacy-friendly solutions, and here the Information Commissioner has an independent role to ensure that they do. According to the basic legal principle of data minimization, the only thing a porn site needs to know in order to allow access is whether or not you are over the age of 18. Not your name, not your address, IP or otherwise, not your actual age or gender. Nothing but the simple fact of being or not being over 18.

Most of the affected pornography sites are going to use independent third-party age verification providers who will not retain any kind of record that can be traced back to a named individual, account, device or profile. Moreover, even if such a record existed, the mere fact that someone had been age verified would not prove anything about their interest, or lack of it, in pornography. There is a wide range of products and services that require someone to be able to prove their age, e.g. in relation to the purchase of alcohol, tobacco, knives, gambling and so on.
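
By way of illustration only, here is a minimal sketch of what a data-minimised age check could look like. Every name in it is hypothetical rather than taken from any real provider’s system: the third-party verifier hands back a signed claim that says nothing beyond “over 18”, and that is all the site ever sees.

import base64
import hashlib
import hmac
import json

# Hypothetical shared secret between the site and its age verification provider.
SHARED_SECRET = b"placeholder-secret"

def make_token(over_18: bool) -> str:
    # What a verifier might hand back: a signed claim carrying only the
    # yes/no answer. No name, no address, no date of birth.
    claim = json.dumps({"over_18": over_18}).encode()
    sig = hmac.new(SHARED_SECRET, claim, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(claim).decode() + "." + sig

def allow_access(token: str) -> bool:
    # What the site checks before letting a visitor in. Nothing in the
    # token identifies the visitor, so there is nothing worth leaking.
    encoded_claim, sig = token.split(".")
    claim = base64.urlsafe_b64decode(encoded_claim)
    expected = hmac.new(SHARED_SECRET, claim, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    return bool(json.loads(claim).get("over_18", False))

print(allow_access(make_token(True)))   # True  -> access allowed
print(allow_access(make_token(False)))  # False -> access refused

The precise mechanics will differ from scheme to scheme; the point is simply that proving the single fact “over 18” need not involve handing over anything that identifies you.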

Of course, if, as now, someone buys something directly from a pornography site and gives the site their credit card details, then that is completely different and a matter of choice for every user.

Some of the age verification solutions are likely to incorporate measures which will allow their users to prevent any of their data from being collected once they have gained access to a porn site. Thus, no doubt paradoxically to some, the era of age verification heralded by the Digital Economy Act, 2017, could turn into the beginning of an era where privacy is greatly enhanced, not threatened even to the smallest degree.

Posted in Age verification, Default settings, E-commerce, Pornography, Privacy, Regulation, Self-regulation

The news today

Anyone reading the British newspapers today who works for or owns one of the major social media platforms will quickly realise that the end of the self-regulatory road is upon them. Coming on the heels of extensive coverage last week of major online child protection stories, generated by the NSPCC and CHIS, today the Parliamentary Home Affairs Select Committee released their report about what they suggest is the wholly inadequate way in which social media companies are dealing with online hate crimes. However, if you were to put the Home Affairs report side by side with the documents issued by the NSPCC and CHIS, what would be striking is the similarity of the analysis, the conclusions and the recommendations.

The Times made the Select Committee Report their lead front page story. The Daily Telegraph did likewise. The Guardian, The Daily Mail, The Daily Mirror and The Independent also carried major pieces. I stopped counting at that point.

Last week the NSPCC called for the social media companies to be made subject to fines if they didn’t up their game in terms of child protection. CHIS called for a new independent regulator with the power to make legally binding orders. In their report today the Select Committee made an analogy with how we police football. In the UK the clubs have to pay for the police time involved in keeping public order on match days. The MPs argued that if the social media companies do not improve their performance the police will start going on to the sites and charging them for services rendered. Stick that in your pipe and smoke it.

Then there’s the fake news stuff and, in case you had forgotten, we are in the middle of a General Election campaign.

General Election or not, this day has been a long time coming and its inevitability has never been in doubt. The guys in California are super smart. They will doubtless have anticipated the current scenarios. They knew it couldn’t last forever, but they have had a good run, perhaps a better run than they ever expected. In the meantime, the cash balances have grown and now they might just have to take a small hit to avoid some of the worst threats and possibilities that are definitely on the near horizon. In twelve months some things will be very, very different.

Posted in Default settings, Facebook, Google, Internet governance, Regulation, Self-regulation

Election 2017 – a Digital Manifesto

Age verification for pornography sites was a key measure advocated in the Digital Manifesto published by the UK children’s charities ahead of the 2015 General Election. Earlier this week that proposal was adopted in the Digital Economy Act, 2017. This shows that raising these matters with the political parties can bring about change.

However, further improvements in business practices and legislation are still needed. Summaries of some of the main improvements that have been identified are set out in “Election 2017 – a Digital Manifesto” on the CHIS website; the text also appears below.

Election 2017 is drawn from the original, larger 2015 document, which can be downloaded from here.

Election 2017

Creation of a new independent statutory authority

  • With powers to ensure businesses and other organizations are transparent and accountable in respect of child safety, child welfare and children’s rights in the online environment.
  • The authority to publish codes of practice and make legally binding orders.
  • The resources to further initiatives which help parents support their children’s use of the internet and associated technologies, promote children’s rights and fund research.

Improved business practices

  • Anyone supplying legally age-restricted goods or services over the internet must install a robust age verification mechanism, and providers of online payment services must ensure they are not providing facilities to an entity in breach of this provision.
  • Online businesses and other organizations must be able to demonstrate they have taken all reasonable and proportionate steps to ensure their services are not being misused to store or distribute child abuse images.
  • Online businesses and other organizations must be able to demonstrate they are taking all reasonable and proportionate steps to enforce their Terms and Conditions of Service.

Legislation

  • A new law should establish an unequivocal duty of care to children on the part of every business or other organization providing online services or producing devices which can connect to the internet.
  • Regardless of the child’s country of residence, victims of child sex abuse should be able to obtain compensation from persons in the UK found in possession of images of their abuse.

Policing

Every UK territorial police force should have a dedicated unit with appropriately trained officers to deal specifically with sexual and other online offences against children.

Privacy and the age of digital consent

Before May 2018 Parliament must decide on the age of consent for data purposes for UK children. This will be at the heart of a range of children’s rights, particularly their right to privacy. The decision must be preceded by the fullest consultation and discussion with parents and children themselves.

Posted in Advertising, Age verification, CEOP, Child abuse images, Consent, Default settings, E-commerce, Facebook, Google, ICANN, Internet governance, Microsoft, Pornography, Privacy, Regulation, Self-regulation

Ring the bells!

At 3.15 p.m. today the Digital Economy Bill 2016-2017 completed the penultimate formal stages of the Parliamentary processes. The House of Lords agreed to all of the amendments to its amendments that had been made when the Bill went back to the Commons yesterday. For people reading this from outside the UK – don’t worry. Very few Brits understand it either. All you need to know is that for all practical purposes it’s done. At some point this evening, when the Prorogation is completed, we will need to start calling it the Digital Economy Act, 2017.

The Bill addressed many different issues, but readers of this blog will want to know that age verification is now a requirement for all significant commercial publishers of pornography, wherever in the world they are based and whether or not they style themselves as being “free”. If they are about making money and they want to have access to the UK market, they must have age verification. If they don’t, they can be fined and ultimately ISPs will be required to block access to them in roughly the same way that they block access to child abuse images. The exact date of commencement is unknown to me right now. There is bound to be a run-in period to give everyone time to prepare for the new regime.

That brings to an end many years of campaigning by the British children’s groups and others to get to this point, but great credit must go to David Cameron and the last Conservative Administration (a) for picking up the matter and, uniquely, putting it in their Manifesto for the 2015 General Election and (b) for sticking with their promise when they won.

The sterling efforts of Theresa May’s Administration in carrying on what Cameron began must also be acknowledged. However, once again, what was extremely gratifying was the very broad level of support there was for “our bits” of the Bill across all the political parties. Aside from Karen Bradley and Matt Hancock, as the hands-on Government Ministers, particular tribute must be paid to the continued leadership of Claire Perry from the Government’s backbenches, to Fiona Mactaggart, Thangam Debbonaire and Helen Goodman from the Labour Party, to Baroness Floella Benjamin from the Liberal Democrats and to Baroness Howe from the Crossbenches. Several Scottish Nationalists and members from Northern Ireland were similarly star players. Truly this was an all-Party affair in the end.

Because of the unexpected calling of a General Election the Bill suddenly had to go into the emergency procedures known as the “wash-up” and “ping pong” (again, don’t ask), where the Labour front bench, in the shape of Louise Haigh and Kevin Brennan, had to reach compromise agreements with the Government. In the past, at this point, entire Bills have been lost, but that didn’t happen here. Phew!

Even if the Bill hadn’t had to go into wash-up I doubt we would have ended up in a very different place before it reached its otherwise “natural” conclusion.

Everyone appears to be agreed that the definitions of the sorts of materials which the Regulator may require to be removed before a site can be age verified are not satisfactory. We have ended up with them because of ill-judged, inaccurate scaremongering. Anyway, the Government accepted a Labour amendment which means that all the definitions have to be looked at again. We can take a deep breath and pause, although I think the children’s groups are unlikely to depart from their past practice. If there is an apparent threat to children associated with the availability of certain types of material we will speak out, but otherwise we will not engage. We have no general view on the desirability or otherwise of porn. We do care about children accessing it.

A code of practice will also be developed for social media platform providers to deal with a range of concerns, not least in respect of the way different types of bullying are being handled.

Nothing stands still. Very soon we will publish Digital Manifesto 2017 addressed to the Parties contesting the upcoming General Election. The manifesto reflects several bits of unfinished business.

Let’s see where this ends up.

Posted in Age verification, Child abuse images, Default settings, E-commerce, Internet governance, Pornography, Regulation, Self-regulation

Sex crimes in the USA

The UK is not the only country where child sex abuse offences are or have been rising rapidly. See New stats on US Federal sex crimes. If you read the text, a similar picture seems to emerge in respect of other jurisdictions within the USA. Federal arrests in fact accounted for only 4% of all arrests.

There is no indication of whether or how the internet was part of the offending behaviour. Neither are child sex offences separated out, but the publication lists some of the initiatives and laws which lie behind some of the stats, and it is quite clear that cyber offences and children are part of the picture.

Posted in Child abuse images, Regulation, Self-regulation

An unsatisfactory state of affairs

Thanks to Baroness Jones of Whitchurch for tabling a Parliamentary Question about what happens to non-photographic child sexual abuse images, in particular manga and CGI-based material (computer-generated imagery). Lady Jones asked about such materials hosted on machines physically located in the UK and on machines not physically located within the UK.

Here are the relevant parts of the answer, which became available today.

The IWF addresses reports concerning non-photographic images when they are hosted on UK websites. Where such images are believed to be criminal and are depicted on a website hosted in the UK, (the IWF) will work in partnership with the hosting provider and NCA-CEOP to remove the content and provide information to assist investigations into its distribution.

The limitation to websites hosted in the UK makes no sense at all from a child protection point of view. So why does that limitation exist? I have asked but so far have received no answer. I appreciate the volumes may not yet be large but, having recently seen an online game that used a lot of CGI material, there can be no doubt it will become more common within the grimy world of CSAM. I thought the characters in the game I saw were actors. They were that lifelike.

So what happens to identical material that is published on sites hosted on machines that are physically based outside of the UK? Here is the reply:

If the site is outside the UK, it is reported by the NCA to the host country via Interpol channels to take appropriate action.

I would wager that is another way of saying nothing happens.  The stuff remains completely accessible to everyone in the UK without limitation unless it happens to be caught by some other defensive measure.

That is not good enough. I doubt this will loom large in the forthcoming General Election in the UK but it might register somewhere.

Posted in Child abuse images, Regulation, Self-regulation