Hot news: on Thursday evening in Brussels officials working on the draft ePrivacy Regulation agreed they were at an impasse. They cannot reach a consensus on the text so the matter has been referred “upstairs” to see if the outstanding issues can be resolved at political level i.e. by national governments.
One of the outstanding issues concerns the use of PhotoDNA and similar tools which can detect child sex abuse material moving across a network or being stored on devices connected to a network. PhotoDNA is currently being used extensively and it works spectacularly well. It represents the single most important advance in eliminating child sex abuse material since the internet began. The draft makes it illegal.
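For readers unfamiliar with how such tools operate: PhotoDNA itself is proprietary, but the general technique is to compute a robust "perceptual" hash of an image and compare it against a curated list of hashes of known, verified abuse material, tolerating small differences so that resized or slightly altered copies still match. The sketch below illustrates that idea with a deliberately simple average-hash — this is not PhotoDNA's actual algorithm, just a minimal stand-in for the hash-and-match pattern.

```python
# Illustrative sketch of hash-list matching, NOT PhotoDNA's real
# (proprietary) algorithm: a toy "average hash" over an 8x8
# grayscale grid, matched against known hashes within a small
# Hamming distance so near-duplicates are still caught.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255) -> 64-bit int."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # One bit per pixel: 1 if brighter than the mean, else 0.
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(h, known_hashes, max_distance=5):
    """True if h is within max_distance bits of any known hash."""
    return any(hamming(h, k) <= max_distance for k in known_hashes)

# Usage: hash an incoming image, check it against the hash list.
grid = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
h = average_hash(grid)
known = {h ^ 0b11}  # a near-duplicate differing in two bits
print(matches_known(h, known))  # True
```

The key design point is that only hashes are compared, never the images themselves: a service can scan for known material without retaining or inspecting anything beyond a fingerprint.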
The only way I can explain the otherwise inexplicable is that privacy people are taking an exceptionally narrow, blinkered view of what privacy means. This is limiting their ability to see the absurd results of their decisions, in this case profoundly damaging consequences for children’s safety and well-being across the 28 Member States.
Children’s advocates across the European Union need to get busy. We must lobby national Governments to include in an Article (not simply a Recital) a specific exemption to allow businesses to deploy or continue deploying tools which are designed to protect children.
At the moment only officials from Ireland, the UK, Portugal, the Czech Republic and Belgium are speaking out directly in a way which is favourable to children’s interests. The rest appear to be opposed or are silent, which amounts to the same thing. The Austrian Presidency is being singularly unhelpful. This matters precisely because they hold the Presidency. Member States look to them for a lead. If you know any Austrian Members of Parliament or Ministers, get on the phone straight away.
The draft allows Member States to opt out of the part of the Regulation that prohibits the use of PhotoDNA and similar tools. This would inevitably lead to a patchwork of laws. That could, effectively, cripple operations. And, to repeat, what the Regulation is proposing is to disrupt systems that are already in place.
What is the EU saying anyway? Children in Britain, Ireland, Portugal, the Czech Republic and Belgium can be protected but kids in other jurisdictions just have to hope their national legislative bodies get their act together? How does that square with other lofty statements about the importance the EU attaches to protecting children?
How does it square with the 2011 Directive which requires every Member State (no discretion or variation allowed) to have machinery to remove discovered child sex abuse material? How does it square with the UNCRC and the Charter of Fundamental Rights? States have obligations to protect children. They do not have a discretion to prevent companies from protecting children.
Meanwhile, in other hot news UK MEP Mary Honeyball is tabling a question in the European Parliament asking officials to document how and when experts on online child safety and officials within the Commission with a brief to look after children’s interests were consulted on the content of the draft Regulation. I think I already know the answer. They weren’t.
Is this how it was for Woodward and Bernstein?
Today practically every law enforcement agency in the world is saying they cannot cope with the volumes of child sex abuse materials circulating on the internet. They desperately want tech companies to help stem the tide. If we hear criticisms these are generally grumbles from governments, the police or civil society about the private sector not doing enough. It is therefore strange beyond words to hear objections to companies doing too much to try to make the internet safer for children.
Almost universally the task of making the internet a better place is seen as a joint endeavour. We even have a word for it: “multistakeholderism”.
The European Union in particular has been an extremely vocal and energetic proponent of multistakeholderism but it is becoming apparent that the message has not been filtering down or taking hold evenly in all parts of the Commission and its associated agencies.
Who gave you permission to do that?
The argument focuses on PhotoDNA but the logic extends way beyond that single product. No one is suggesting Microsoft did anything illegal when, unilaterally, it developed PhotoDNA and decided to give it away to companies or agencies with a legitimate interest in locating, deleting and reporting child sex abuse material. The fact I am even writing a sentence like that tells you something about how ridiculous this whole thing has become.
While nobody is accusing Microsoft of breaking the law, what is being said, in effect, is that because no EU or national law specifically mandates or specifically permits the deployment of PhotoDNA, it has to stop.
In some countries citizens can only do what the law expressly allows. In others, citizens can do anything the law does not prohibit. I know which approach I prefer. Up to now, deploying tools to protect children has never been illegal. Let’s keep it that way.
If an individual company can be shown to be using tools ostensibly designed to protect children as a means of gaining access to metadata, or to anything else they could not otherwise reach, and then exploiting that information for commercial purposes, they should be brought before a court and severely punished. But absent that we should leave things as they are.
Signor Buttarelli has made clear he is worried that if an agreed text for the Regulation is not finalised swiftly then all will be lost when the current Parliament dissolves ahead of the next European elections. Having to start again with a new Parliament and perhaps a new Commission clearly holds little appeal.
Remember what happened when the GDPR was coming through? On the little matter of the age at which children can give consent without online businesses having to obtain verifiable parental consent, the privacy priesthood inside and outside the Commission united behind a single proposition. They said there should be one age, 13, applied without exception in every Member State.
They presented no evidence in support of it, did no research to justify it, simply believing they could steamroller it through, hoping that, after four years of discussions, politicians would say “enough already” because they wanted to move on to something new.
At the very last minute politicians bit back. We ended up with a patchwork of ages. And now, guess what? We are hearing across the EU that data protection authorities are astonished to learn how many companies are abandoning the idea of obtaining consent altogether, including for children, instead using legitimate interests or contracts as the basis of their engagement.
While constantly talking about the need to engage with parents, in effect, businesses have intentionally cut parents out of the loop. By a back door a minimum age of 13 has been created EU-wide. Politicians have been thwarted by a sleight of hand. Article 8 is becoming a dead letter.
Privacy leaders need to start engaging with the children’s agenda
What that story illustrates is that, with notable and honourable exceptions, thinking through how to address children’s interests is not high on the list of priorities of the leadership of the privacy world.
That has to change.