News on the e-Privacy Regulation

On 30 November 2018 I wrote to Vice President Ansip of the European Union on behalf of over fifty NGOs from all parts of the world, including from 26 of the 28 Member States of the EU. The letter addressed major deficiencies in the draft e-Privacy Regulation making its way through the legislative process. Right now it is stuck at the level of the Council of Ministers. That is where national Governments are directly involved.

Two days ago I received a reply from Vice President Ansip. It is very disappointing, but I take comfort from the fact that more and more national Governments appear to be convinced of the draft’s shortcomings. Specifically, a growing number seem to be aware of faults that will put children in danger. They are gearing up to address those faults, and children’s groups everywhere must continue to encourage them to look for an acceptable solution.

Because things are moving very fast in Brussels I have today sent a reply to the Vice President’s letter. I won’t repeat the contents here. You can download it.

I have sent this blog to every signatory of the November letter, suggesting they might think about writing again to the relevant people in their national governments, or to others with influence, to continue pressing the case for an explicit carve out to be inserted into an Article of the Regulation in order to maintain current child protection standards.

Such a carve out could be achieved in one of two ways.

A clause could be inserted into an Article of the e-Privacy Regulation making it clear that where providers of electronic communications services, sometimes referred to as “over the top services”, take measures to detect the presence or exchange of child sex abuse materials, with a view to securing their deletion and reporting, or where measures are taken to detect other forms of illegal abusive behaviour towards children, the Regulation does not apply, and such measures therefore continue to be lawful.

Alternatively, steps taken by electronic communication service providers to detect the presence or exchange of child sex abuse materials, with a view to securing their deletion and reporting, or taken to detect other forms of illegal abusive behaviour towards children, could be brought within scope by making clear that businesses are allowed to process metadata and communications data for such purposes without first having to obtain the consent of the end user. In addition, it should be made explicit that any data obtained in this way cannot be used or further processed for any commercial or other reason.

For good measure I am also here providing a link to a letter released by the UK Government just before Christmas. I don’t think it has been picked up elsewhere. With Brexit mayhem approaching its climax it’s hard to get a word in edgeways.

While it discusses a range of issues, in relation to child sex abuse material and other threats to children the UK Government expresses exactly the same sentiments as all the children’s groups did in the November letter. Look in particular at what it says about why derogation is not the answer.

Posted in Child abuse images, Internet governance, Privacy, Regulation, Self-regulation

Blinded by the light

I removed the Open Rights Group (ORG) from my Christmas card list because of its disgraceful behaviour over those parts of the Digital Economy Act 2017 which establish a new regime for reducing the exposure of children to material published on commercial pornography web sites.

That does not mean I think we can or should ignore everything they say or do. In and around the ORG there are some serious people with great knowledge and valuable insights. It’s a shame they don’t always have the upper hand. I know in fast-moving campaigns it is not always possible to get everything right – boy do I know that! – yet in the case of the ORG one might wish their mistakes or errors of judgement were not so monotonously predictable.

Beautiful dreamer

The ORG and similar bodies strike me as romantics who refuse to abandon a dream. Everyone has a soft spot for romantics.

To borrow a phrase from Manfred Mann, they were “blinded by the light”. There was a brief, dazzling moment in the 1990s when it looked like the internet could and would change the world in wholly beneficial ways. Zero downside. We all, me included, fell in love with the buzz and the high of those days of innocence. The romantics believe it is possible to recreate them, or something like them, or at least to hang on to the echoes. I don’t.

As they were tripping to John Perry Barlow, the romantics believed tyrants everywhere would fall and we would all hook up to save the northern white rhinoceros. Disease and poverty would be sent into headlong, urgent retreat as engaged and enraged righteous citizens across the globe rose in a virtual and elevated modern form of Athenian democracy, joining hands to force politicians and big business to make everything right.

Candy Crush, dancing kittens, rigged elections and beheadings

What did we get? Candy Crush, rigged elections, countless videos of cute kittens vying with images of journalists being beheaded, a tsunami of child sex abuse material and an avalanche of images of anti-female violence rebranded as “free speech”. ORG tells us videos of minority sexual behaviour need to remain easily viewable in order that a handful of interested people can hide in plain sight.

Meanwhile surveillance capitalism, looking for ever more efficient ways of targeting advertising, paved the way for, er, surveillance everything.

OK. That’s not all we got, not by a long chalk, and on another day I will happily sing the praises of the technology’s positives, but we have to remove the rose-tinted glasses and the rose-tinted hope. We have to recognise that if we do not find effective ways of addressing some of the extremely serious problems the technology is creating – not just the ones the free speech lobby appears to care about – we will end up with, not nothing exactly – we are way past the point where that can happen – but something very different that is definitely not better.

Techies might think they are super smart and smugly believe they can, in perpetuity, keep outwitting governments and the monopolies, but in the end, without the mass of informed popular opinion on their side, they are bound to lose. And anyway this shouldn’t be reduced to some kind of macho war of attrition (note how I elegantly avoided the more obvious, earthier metaphor).

The internet is many different things, some more important than others

For the vast majority of people for the vast majority of the time, the internet occupies a niche in the consumer space, alongside toasters, vacuum cleaners and so on. Internet access now comes packaged as an add-on to your TV and cable service. It is used to do shopping, book flights and holidays, watch Hollywood blockbusters and send pictures of the kids to Granny in Australia.

Thus, across a broad spread of its operations the internet cannot escape being judged by consumer-style rules and expectations. The difficulty, of course, is that the internet is not just in the consumer space but speech and politics seem to blot out everything else for the ORG crowd. That’s why we have these seemingly interminable debates about how and where we strike the balance.

The smart move

The truly clever thing would be for the techies and free speech/political internauts to make common cause with more broadly-based interests in the fight against widely acknowledged evils. I do not associate with anyone who wants to ignore or suppress, or is careless about, the rights of minorities, but there has to be a point somewhere on the compass where these can be preserved or even enhanced without compromising the rights of others, e.g. in this case a large and vulnerable group such as children.

Many years ago I said something like this to a leading First Amendment lawyer in the USA. He looked at me as if I was nuts. He left me with the impression that he just hangs about waiting for someone, preferably the Government or a large monopoly, to make a move he can get someone to pay him to criticise or, ideally, take to court. He liked the old version of the internet and believes the judges will help everyone go back there, or get near to it. Our conversation was pre-Trump. I wonder if he still thinks that?

The pervasive view within a certain set is that every internet user is, or ought to be, a fully competent, literate, tech-savvy adult, and if they aren’t, that’s their problem and they deserve whatever they get. No one should have to be put to even the smallest inconvenience or suffer the most momentary delay because some idiot doesn’t know the difference between a TCP/IP stack and a bowl of custard or feckless parents can’t keep their kids under control.

The ORG’s proposals

These thoughts are prompted by my reading of the ORG’s latest publication. UK Internet Regulation – Part 1: Internet Censorship in the UK today has much in it with which I can agree. For example I fully support the calls for greater transparency.

I also agree that any organization with formal or informal power to require content to be removed from the internet, or particular addresses to be blocked, should be subject to Freedom of Information requests and public reporting requirements. I absolutely do not agree that blocked addresses or specific details of removed content should be published, but there should always be independent appeals or review mechanisms.

I get that some poor unfortunate, non-English-speaking trader in Kyrgyzstan or Germany might not realise their wares are not accessible to potential customers in the UK because of an error of classification, and if someone can come up with an answer to that which does not also require the addresses of all sites likely to be breaking the law to be made available to the whole world, I’ll vote for it. But not otherwise.

I am attracted to the idea that there might be an over-arching judicial tribunal of some kind which could, as it were, act as a guarantor of the integrity of all of the self-regulatory/co-regulatory mechanisms in the internet space. The IWF, in effect, already has something like that. A retired senior judge runs the slide rule over their operations. Maybe this aspect could be swept up when the Government’s long-awaited White Paper on harms and governance comes out next year.

Censorship? I don’t think so

I am intensely irritated by the loose way in which the word “censorship” is bandied about. Seeking compliance with the law, e.g. by requiring a known illegal image to be removed, cannot possibly be thought of as censorship. You might not like the substantive law, but that is a different point.

For me censorship involves permanently and inflexibly removing something that is otherwise lawful, as in requiring cuts to be made from a book, article or film or directing that a book, article or film be completely withdrawn from circulation.

Not everything that is legal needs to be on unrestricted public display but any actions which reduce access to it have to be grounded in law. But in what sense is it censorship if all you do is impose reasonable, temporary and reversible limitations e.g. removable filters, in order to protect children? Or you impose limitations which can be lifted if anyone can establish that they meet a reasonable requirement e.g. by proving they are 18 or above? When I was a lad in Leeds all the “best dirty books” were kept in a locked room in the main library and only adults were allowed in. I cannot now remember how I came to know that.

We impose restrictions in relation to the sale, advertising, display or promotion of alcohol, tobacco and so on; why not in respect of porn? I appreciate there is more social baggage associated with porn consumption than there is with drinking and smoking, but we have to weigh these things and choose which is more important. Who would say that the protection of children is less important than asking people to comply with privacy-respecting rules to prove their age? It is a tiny irritant designed to deliver a larger social good.

Dancing in the same ballroom

Nevertheless it is obvious that even if the above are not censorship, they are dancing in the same ballroom as censorship, and whenever politicians or political institutions step into or approach that ballroom it is unreasonable to expect citizens to take everything on trust. Transparency and accountability are key. In that regard I agree the status quo is failing.

But enough already with the “Virgin Killer” story. One mistake in 2008? Is that your best shot? It was a mistake that could not have been avoided by any judicial or other process in Britain because nobody in the UK section of Planet Earth knew how Wikipedia’s systems would react. It was an error that was swiftly corrected. Would it have been put right so rapidly if the matter had had to go back to court? And in their document the ORG speak of other (classification?) decisions by the IWF that have been “found wanting”. That’s news to me and I am sure it will be to others.

Laughed out loud

Which brings me to the bit of the ORG report that made me laugh out loud. ORG essentially wants nothing to happen until a court has approved it. Yet they tell us the area where it appears the highest number of mistakes are being made is the one where a judicial order is required, namely in respect of blocking copyright-infringing web sites. It seems 38% of the blocking injunctions examined were wrong in some way. There’s a moral there somewhere.

Perhaps part of the moral revolves around this: while written rules are essential for a great many things, they are rarely sufficient. We need to recognise and give weight to other things too. The political culture and traditions within a jurisdiction matter.

The UK Government, police and security services are nothing like their equivalents in North Korea, East Germany and Saudi Arabia, nor do they wish to emulate them, and if they ever do acquire such an ambition we will all have gone to hell in a handcart anyway. Look for me in the Yorkshire hills with my survivalist backpack.

Posted in Consent, Default settings, E-commerce, Internet governance, Pornography, Regulation, Self-regulation

Age Verification regulations clear their final hurdle

The House of Commons tonight agreed the final regulations necessary to introduce age verification on commercial pornography web sites. Yippee!! All we need now is notice of the commencement date. Expected to be 3 months from today or thereabouts. Then the hard work begins.

Posted in Age verification, Internet governance, Pornography, Regulation, Self-regulation | 1 Comment

More about the power of PhotoDNA and similar tools

Twitter report that in the first 6 months of 2018 they suspended nearly half a million accounts for violations relating to child sexual exploitation. 97% of those accounts were proactively flagged using PhotoDNA and similar tools.

In Dublin on Friday a Google spokesperson informed us that of the 7.8 million videos they removed from YouTube in the three months July–September 2018, 81% had been identified by automated systems, and of these 74.5% appear never to have been viewed by anyone. Not a single person. The latter numbers cover all categories of material, of which CSAM was likely a part, but either way they illustrate the power of the technology.

Posted in Child abuse images, Regulation, Self-regulation, Uncategorized

More progress on tackling child sex abuse material

Things may be going crazy in Brussels but in Alexandria, Virginia, excellence and calm sanity remain the norm. The International Centre for Missing and Exploited Children (ICMEC) has just released the 9th edition of the always eagerly awaited Model Legislation and Global Review.

Over the years the scope of the review has expanded beyond “simply” recording how countries address each of the original “five factors” highlighted in the recommended model legislation. While these remain at the constant core, the 9th edition has evolved to become a rich and authoritative source of information on thirteen

“fundamental topics/provisions that are essential to a comprehensive legislative strategy to combat child sexual abuse material”

In relation to the original five factors, ICMEC tells us that while at the time of the first edition in 2006, only 27 countries in the world had a legislative framework considered sufficient to allow them to tackle child sex abuse material, today it is 118 and between the 8th edition (2016) and the 9th the rate at which countries have been making things right in this department has noticeably quickened. The message is getting through thanks to the work of a multiplicity of agencies, many of which are mentioned.

While in 2006 95 countries had no legislation at all specifically addressing the problem, ICMEC advises today we are down to 16. Let’s hope by the 10th edition we hit zero. Another number that needs to get closer to zero asap concerns simple possession of child sex abuse material irrespective of an intention to distribute. 38 countries still do not explicitly say that is illegal. They should.

ICMEC is at pains to point out

“As always, it is important to note that the legislative review accompanying our model legislation is not a scorecard or a scold, but an effort to assess the current state and awareness of the problem. Realizing the importance of taking into consideration varying cultural, religious, socio-economic, and political norms, our model legislation continues to resemble a menu of concepts that can be applied universally….”

Quite so.

Good laws are rarely in themselves sufficient but they are an essential building block.

Posted in Child abuse images, Internet governance, Regulation, Self-regulation

Mr Ansip’s Terms and Conditions

Is this what Mr Ansip has in mind for all companies’ Ts&Cs in the future?

“Please feel free to distribute illegal content using our facilities. We won’t do anything to try to detect it or stop you. Knock yourself out.”

If that’s what you mean, why not say it?

Or else

“These Ts&Cs don’t mean anything because we don’t do anything to ensure they are being honoured. Have fun. We thought we would state the position honestly because, thanks to the EU, the chances of us finding out you misbehaved are so close to zero that they might as well be zero.”

Law enforcement agencies across the world have candidly admitted they cannot address the volumes of illegal behaviour online, which includes the distribution of child sex abuse material. They have constantly called for companies to step up their efforts to rid cyberspace of this kind of material. So have the victims. What does multistakeholderism mean if it excludes businesses from doing the right thing in a way that jeopardises nobody’s rights?

PhotoDNA and similar tools look at patterns. If I post a picture of my kitten or a recipe for banana cake they will not see either. PhotoDNA only picks up files containing patterns that match images that have already been found to be illegal and are included in a database. In what meaningful sense of the word is that “surveillance”?
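The matching principle described above can be sketched in a few lines. PhotoDNA itself is a proprietary perceptual-hashing algorithm, so the sketch below substitutes an ordinary cryptographic hash (SHA-256) purely to illustrate the lookup: only a file whose hash is already in a database of previously-identified material triggers a flag, and everything else passes through unexamined. All names and the sample hash entry here are hypothetical.

```python
import hashlib

# Hypothetical database of hashes of images already confirmed to be illegal.
# Real deployments use a robust *perceptual* hash so that a match survives
# resizing and re-encoding; SHA-256 here only demonstrates the principle.
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_image(file_bytes: bytes) -> bool:
    """Return True only if this file's hash is already in the database.

    Files that do not match are never inspected further: the content of a
    kitten photo or a banana cake recipe is simply never "seen".
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# An unknown picture produces no match and passes through untouched.
print(matches_known_image(b"picture of my kitten"))  # False
```

The point the sketch makes is that the scanner learns nothing about non-matching files beyond "no match"; a perceptual hash additionally tolerates small alterations to an image, which is why re-circulated known material keeps being caught.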

PhotoDNA has been “out there” for nine years. We constantly hear calls for evidence. We have lots of evidence about PhotoDNA’s successes. Where is the evidence that anything is going wrong? And why did nobody, repeat, nobody, talk to anyone in the child protection world before they came up with this ridiculous idea?

Anybody who is looking for bad stuff must necessarily look at non-bad stuff in order to eliminate it. Thus, PhotoDNA and similar tools can be seen as a highly targeted form of anti-surveillance. It does not go through every file of mine saying “Not another kitten video. Let’s move on. Oh dear, the same crap recipe for banana cake, has this guy no imagination? Ah ha. This is new. I wonder what it is? Nah. Pictures of his parrot. Has he not got a life?” As I said, in what meaningful sense of the word is that “surveillance”?

Posted in Child abuse images, Internet governance, Regulation, Self-regulation

News from the front

Yesterday Ministers from the 28 Member States met to consider progress on the new e-Privacy Regulation. They could still not agree a final text so the matter remains unresolved. I have only been able to listen to those Ministers who spoke in English (quite a few did) but it is clear we have strong support from some countries.

My network will be letting me know what “their” non-English-speaking Ministers said once they have had a chance to listen to and translate the proceedings, but I would say we are moving in the right direction.

Vice President Ansip’s office has emailed me to say he will reply to the NGOs’ letter in due course so watch this space.

If you click here and go to 2 hours, 22 minutes and 26 seconds you will hear the Vice President say that police powers are not affected by the proposed e-Privacy Regulation in respect of investigating and prosecuting child sex abuse material cases. We already knew that. Police powers were never in question.

What is at issue is the continuing ability of private companies to take proactive measures to detect, report and delete known child sex abuse materials e.g. through deploying PhotoDNA, as they have been doing since 2009, with fantastic results and no known or reported downside.

The worrying bit in the Vice President’s remarks comes at the end where he says that “mass surveillance is not allowed”. Where did that come from? Who wants mass surveillance? Not me.

PhotoDNA only matches known illegal images, and it does so to defend a child’s right to privacy and human dignity. No one has a right to exchange images that have already been deemed to be illegal and which harm someone else’s fundamental rights. Do they? Anti-spam, anti-phishing and various other security programmes undertake similar functions.

Several Ministers at the meeting were at pains to point out the proposed Regulation does not impact on measures taken in any country in the interests of their national security. Don’t our most vulnerable citizens, our children, deserve a similar level of consideration?

Thus, we need an explicit carve out in an Article of the Regulation which makes clear that measures to detect child sex abuse material and other forms of abusive behaviour towards children are outside the scope of the Regulation. Alternatively, the text of an Article should state that such measures are legal without the need for derogation.

That is limited, clear and proportionate. It gives no get out for any company to exploit any data thus obtained for commercial or indeed any other purpose. And it absolutely knocks on the head any possible suggestion that the user’s consent is required before such processing can take place.

As I said, I will report again when I have heard from Vice President Ansip. Meanwhile we need to keep our foot on the gas.

Posted in Child abuse images, Consent, Default settings, E-commerce, Internet governance, Privacy, Regulation, Self-regulation