A link between hate speech postings online and on-street violence?

Interesting piece in The Economist suggesting there is a correlation between hate speech postings on Facebook and violent crimes against refugees in Germany. Correlation, not causation. But interesting nevertheless.

Posted in Regulation, Self-regulation

More and better compensation for victims of child sex abuse

Things are moving on in the USA. People are campaigning to get more and better compensation for victims of child sex abuse where images of the abuse are made and distributed over the internet. Well done James Marsh, the brilliant lawyer who will not let this go. We need a similar scheme in the UK. And everywhere else. If guys who engage in this vile trade are no longer worried about being arrested and spending time in gaol, they might worry if they thought it could cost them their pension and other assets as they are compelled to provide monetary compensation to the people they have hurt. #marshlawfirm

Posted in Child abuse images, Regulation, Self-regulation

Toy company fined for online data breach

Following disclosures in 2015, VTech – the toy company – has now been fined US$650,000 by the Federal Trade Commission for disgraceful and inexcusable lapses in security. That is not a huge sum for a company the size of VTech, but I guess it is a shot across their virtual bows.

Here are some key extracts from the story.

“Not only was the website itself not secure, but the data were not encrypted in transit or at rest, contradicting security claims made in VTech’s privacy policy. This is not just poor practice, it’s a violation of COPPA, a rule meant to protect children’s privacy.”

“The number of parents and children affected is hard to estimate, but at the time nearly 5 million parent records and 227,000 child records were shown to be accessible (to hackers). However, the FTC in the summary of its investigation notes:

…about 2.25 million parents had registered and created accounts… for nearly 3 million children. This included about 638,000 Kid Connect accounts for children. In addition, about 134,000 parents in the United States created Planet VTech accounts for 130,000 children by November 2015…

And the Canadian Office of the Privacy Commissioner writes that “more than 500,000 Canadian children and their parents” were affected.”

Posted in Default settings, E-commerce, Regulation, Self-regulation

2018 – transparency will be the key

At last month’s Internet Governance Forum in Geneva, one of the acknowledged creators of the internet, Vint Cerf, was asked what he wished for in 2018. His reply was telling:

No further fragmentation of the internet.

Conforming with a single set of technical rules to allow computer to speak unto computer is one thing. As far as I know there is no immediate or obvious threat to that. What Cerf must have been referring to, therefore, are the politically and commercially determined policies which shape how people actually experience the internet wherever they happen to be on the planet.

If I am right about that, Vint Cerf’s wish has already perished. On Monday Germany’s new laws on hate speech became operative. In May next year the GDPR will kick in. There are countless examples of how different countries do things their own way in cyberspace.

The laws of nation-states will increasingly determine the sort of internet that is accessible to you and me. There is a sort of democratic symmetry about that which people will find easy to understand. I’m OK with that, although I think it may be a while before it will be true for every country in the world. Size matters.

And in those countries where democratic symmetry, or democratic anything, is still a far-off dream? I can see that is an issue, but it is simply absurd to argue that internet policy for the whole world is, in effect, determined by the fluctuating fortunes of the worst dictatorships.

So let’s start the New Year with a statement of the obvious. The honeymoon is well and truly over.

The wonder of the internet and its associated technologies remains as bright as ever but there is a much greater willingness on the part of many to look beyond the glitz.

“Many” here means consumer groups, civil rights and other civil society organizations, small technology firms and, above all, governments. Traditional media outlets have become allies of all the above, and by no means disinterested ones. That is quite an array of adversaries.

This scenario is particularly true in the developed nations, where there has been the longest, deepest and most pervasive exposure to the internet. Yet even in the global south, where the internet is still growing exponentially, there is much less willingness to buy into the marketing hype or the nerdy muteness which appears to accept anything and everything in the name of innovation. The flaws have become all too apparent.

Tony Blair famously said politicians

Campaign in poetry but govern in prose

That is true not just of politicians. It is also true of Silicon Valley and anyone selling anything. If the gap between the poetry and the prose becomes too large cynicism and distrust set in.

Either way, assuming it is not already too late, the only way forward for the internet industry in its broadest sense is going to involve substantial changes and the key to those changes will be transparency.

With their promises to expand the number of moderators, Google and Facebook have certainly been showing they understand the position they are in, but unless and until we see how they propose to deal with the transparency imperative it is too soon to say whether we have an acceptable, and therefore stable, way forward.

However, I will make one small prediction. Google, Facebook, Twitter and others will not be able simply to announce the terms on which they are willing to be transparent. There has to be a strong, properly resourced and credible independent element. The history of Advisory Boards is not a good augury. Hopefully, everybody has realised that.

Posted in Default settings, E-commerce, Facebook, Google, Internet governance, Privacy, Regulation, Self-regulation

Not a happy end to the year

Today’s Guardian carries a front page lead story about men living in the UK who are “potential child abusers”. Simon Bailey, Chief Constable of Norfolk and UK policing’s national lead for combatting child sex abuse, puts the number at around 20,000.

We are told that at the moment there are “hundreds” of police officers working on these sorts of crimes, but even “thousands and thousands” would not be enough to deal with them all. Our justice system does not have that sort of capacity but, in austerity Britain, there seems little prospect of expansion anyway.

Bailey refers to a recent report from the NSPCC showing a 31% annual increase in reports to them of child sex abuse.

Bailey singles out live streaming as an issue of growing concern, and he calls on Twitter – owner of Periscope – and Facebook – owner of Facebook Live – to do more to curb this type of misuse. And he reminds us that the child victims come from every type of background, including some with “very capable, very caring parents… who think they are internet savvy.”

Once more we are reminded that there is no possibility of every adult perpetrator being arrested and brought before the courts. Police are having to prioritise those offenders whom they believe to be the most dangerous.

Let’s hope their prioritisation techniques are accurate.

Around one in five of all new child abuse images being found appear to be self-generated or have been made via the engagement of two or more children. Identifying child abuse images as quickly as possible, getting them deleted at source or having access to them restricted in whatever way that works, has to remain a key objective of public policy. In that connection, technical tools have a vital role to play. We need greater transparency in respect of online businesses’ use, or non-use, of such tools.

Maybe this is something the UK’s IWF and INHOPE can look into.

Posted in Child abuse images, Regulation, Self-regulation

Birds of a feather flock together

A key Enlightenment value is a commitment to a spirit of enquiry, a recognition of the importance of opening oneself to a variety of opinions, experiences and knowledge, accepting that such exposure may cause you to alter your existing views or act differently.

Easier said than done. Even in the 17th Century.

Here is Francis Bacon on the subject:

The human understanding, when it has once adopted an opinion draws all things … to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises or … sets aside and rejects in order that by this great and pernicious predetermination, the authority of its former conclusions may remain inviolate.

This is another way of describing confirmation bias. We are dealing with a tenacious, deeply ingrained aspect of human nature; it therefore becomes all the more important to guard against it.

One-way traffic

Now imagine a world reconfigured to provide you with a constant stream of information designed to support your current beliefs. It plays into your established interests, never seriously questioning anything. It only connects you with people who think like you.

We call these “filter bubbles”. They have always been around but, as with so many things, the internet has put them on steroids.

Obviously, we still have free will but some find it harder to wriggle away than others. Many online environments are designed and constructed to keep your eye on the page. Silicon Valley employs some of the world’s smartest psychologists. They know how our brains work.  Big money bonuses go to anyone devising new and better ways to lock us in.

Then there’s the attention economy, operating as another kind of shackle.

Myth v reality 

Filter bubbles undermine an oft-repeated utopian idea about the internet: that it provides a way to draw people together and allows for dialogue across previously difficult boundaries of nationality, culture, religion and politics. Let’s leave the Tower of Babel on one side for the moment, but it ought to figure somewhere in the calculus of global harmony.

Sure, we know of cases where young Palestinians and young Israelis hang out together (virtually), Catholics speak to Protestants in Northern Ireland (ditto), pro-environmental action flash mobs are put together, people organize and contribute to good causes, but these examples need to be weighed in the balance. It is misleading – nothing more than marketing hype – to camouflage the difficulties by adopting the pose of Mr Pangloss or attacking critics as Luddites.

Building communities or digging deeper trenches?

The suggestion that the internet is only holding up a mirror to society is hardly a point in its favour. There are quite a lot of things going on in society that we must know about but that is not a reason for putting them on parade.

The internet can magnify, amplify, even normalize, encourage or promote some really evil stuff. When it comes to evil we should not be in the amplifying, promoting, encouraging or reflecting business but rather in the reducing business.

Thus, far from the internet drawing people together, as former British Prime Minister Gordon Brown aptly put it:

The internet often functions like a shouting match without an umpire. Trying to persuade people through social media seems to matter less than finding an echo chamber that reinforces one’s own point of view

and

Achieving a consensus in a wilderness of silos is difficult, if not impossible.

Some companies describe what they do as “building communities”. Maybe they should look for a more accurate term? “Digging deeper trenches”. How does that sound?

Children and filter bubbles

The relevance of all this to young people hardly needs explaining. Unless we are on a mission to indoctrinate children, unless we positively want to keep them away from different ways of looking at the world, we start from the premise that every child deserves to be able to embrace the widest possible horizon.

Through education, through teaching critical thinking and media literacy, we strive to give each child the best possible start to that voyage of self-discovery we call growing up. 

If it turns out the internet is fighting, determinedly and effectively,  even if not intentionally, to push children in the exact opposite direction, what are we going to do? Answers, please, on an old-fashioned, non-virtual postcard.

We cannot force anyone to mix, online or offline, with others who hold different views and, as a good friend said to me, it would anyway be bizarre if we insisted that members of anti-fascist groups had to sing the Horst Wessel Lied at least once per year.

But might there be other things that are scalable and effective or which at any rate ameliorate the tendency towards the prison of blind conformity and sameness?

I am limiting this to a plea to ensure we look for ways that don’t force children into straitjackets too early in their lives. I will leave it to others far cleverer than me to work out what we do about poor old adults.

Lastly, and obviously, this is too important an issue to be left solely to the industry to resolve. Much larger interests are at stake.

I think this is going to be important in 2018.

Posted in Facebook, Google, Internet governance, Regulation, Self-regulation

The UK’s draft guidance on children and the GDPR

The British data protection authority (DPA) – the Information Commissioner’s Office (ICO) – kept its promise and released a guidance note on children and the GDPR before the end of the year. In fact, it came out on 21st December. People can comment on it up until 28th February.

I am pretty sure the ICO is the only DPA in the European Union to have tried to do anything as ambitious as this in respect of children. Well done them. My guess is all the other DPAs will be watching what happens next so, vicariously, this might be a sort of EU-wide consultation. I say that only because I think it highly unlikely the staff of the ICO wrote what they did without there having been at least some prior discussions with colleagues from a number of DPAs in other Member States (MS) and the Commission.

The document makes clear it is possible to see elements of continuity between the old data protection regime and the approaching one, but it is also obvious there are some radical new elements. These should definitely enhance the position of children.

The ICO paper is well laid out and presented in commendably simple language. This does not mean I didn’t struggle with bits of it. I did, but that wasn’t down to the language as such. Some of this is unavoidably complicated and, unless you are already steeped in privacy law, hard going. We need more case studies to illuminate and thereby vanquish the darkness.

I have not tried to summarise the whole of the ICO’s draft and I doubt it is necessary to comment on all of it anyway. Instead, I have picked the bits I think are interesting or acknowledged to be unresolved. If I have missed anything important please let me know.

UK only – the Data Protection Bill

Parliament is currently going through the process of adopting the GDPR into UK law. During the debate in the House of Lords earlier this month Baroness Beeban Kidron won Government and cross-party support for an amendment which will require the ICO to devise a code of practice to entrench a number of specific improvements to the broader provisions of the GDPR.  The ICO says the effects of the amendment are not reflected in their paper. Nor could they be. The amendment was only agreed in mid-December and it has yet to go to the Commons. It may be mid-February before we know for sure what the final version will look like.

That said, and assuming the final version remains the same as or similar to that agreed in the Lords, it is still too early to say to what extent such a code might lead to material differences in substantive practices between the UK and other EU jurisdictions. Even if it did, it is exceptionally unlikely the unique UK arrangements would give rise to enforcement action by the Commission before Brexit or, after Brexit, to a finding that the UK had not made “adequate” arrangements.

Risk assessment, “available technology” and co-ordination

There are countless references in the ICO document to the importance of risk assessments. In other words, while there are a number of “hard” rules, context is key.

This means, for example, if a company offers a service which could result in harm to a young person then it will be expected to go to greater lengths to ensure, among other things, that persons below its minimum age are not able to gain access to it.

There will also be an escalating expectation that Information Society Service providers (ISS)  have an accurate idea of to whom they are providing a service, at least insofar as that knowledge may be relevant in relation to possible harms.

There are many mentions of the importance of having regard to “available technology”. In other words, as more and better age verification solutions become available, or more and better ways of obtaining verifiable parental consent come on-stream, so ISS will be expected to avail themselves of them. In a modern-day equivalent of the old adage about living and dying by the sword, you need to keep up.

While the ICO does not allude to it in their note, it should be recalled that, in relation to a number of points concerned with age and parental verification, the Article 29 Working Party (Article 29) referred to the desirability of DPAs across the EU developing a co-ordinated approach. Differences in available infrastructure within jurisdictions may make that difficult, and it is hard to know what the consequences of these differences might be. We shall all just have to hold our breath and see how that works out.

Ignorance will no longer be an excuse

In the past, companies said they did not know the real age of their users. They were not under any obligation to verify anyone’s age, so they didn’t. At most, they would simply kick someone off if they discovered they had lied about it. Zero requirements meant zero incentive. This bred inertia. No longer good enough. The ICO says to every ISS:

If you aren’t sure whether your data subjects are children, or what age range they fall into, then you usually need to adopt a cautious approach.

This may mean:

  • designing your processing so that it provides sufficient protection for children;
  • putting in place proportionate measures to prevent or deter children from providing their personal data;
  • taking appropriate actions to enforce any age restrictions you have set; or
  • implementing up-front age verification systems.

The choice of solutions may vary depending upon the risks inherent in the processing.

Definition of a child

The section headed “About this guidance” opens by unambiguously stating everyone below the age of 18 is a child and it cites the fact that the UK has ratified the UNCRC as authority for that proposition. Every MS has ratified the UNCRC.

This is important for two main reasons:

  • because there has been a common misperception that, for GDPR purposes, a child is simply and only someone who is below the Article 8 age. In the UK this is going to be 13. In other countries, it might be 16, 15 or 14
  • because we are reminded (in Recital 38) that everyone below the age of 18 is entitled to “particular protection”.

The note also makes clear that a child enjoys all the same rights as an adult in relation to data privacy. In essence the GDPR therefore creates extra rights which benefit children i.e. a right to have their data considered and treated differently from that of persons over 18.

Moreover, the ICO reiterates the Article 29 view that counselling services are not required to obtain parental consent in respect of children below the Article 8 age.

The ICO further suggests, as a general rule

It is good practice to consult with children

when companies design their processes, and we are reminded that Article 12 of the UNCRC makes the same point.

 Offered directly to a child

The ICO repeats the position taken by Article 29 when it says

…an [information society service] is offered directly to a child when it is made available to all users without any age restrictions or when any age restrictions in place allow users under the age of 18.

Competence still matters

Hitherto the dominant legal consideration in the UK and many other countries was the individual child’s actual ability to understand the nature of the transaction being put in front of them. This is a subjective test and in the UK it is known as the “Gillick Principle”, following a decision of the House of Lords in 1985. The notion was taken up in 1989 in the UNCRC, where it is framed as a requirement to have regard to the evolving capacities of the child.

To an extent the GDPR cuts across this. Article 8 establishes a “hard” age limit, in the UK’s case 13, and says a child of 13 can give consent; there is no reference to the nature of the proposition to which they are consenting. Yet is there a hint that, actually, in some cases, even the 13 age limit may be a lot less “hard” than we thought?

The ICO tells us the

concept of competence remains valid

because

fairness and compliance with data protection principles remain key concepts

Then these words appear

“In many circumstances, you (a data controller) may wish to continue to allow an individual with parental responsibility for a young child to assert the child’s data protection rights on their behalf, or to consent to the processing of their personal data.

 Likewise, if an older child is not deemed competent to consent to processing or exercise their own data protection rights, you may allow an adult with parental responsibility to do this for them.”

Hmm. A rich vein for lawyers to mine and companies to worry about I fear.

However, part of the significance of this becomes apparent later, because while it is clear that children below the Article 8 age cannot consent, for example to joining an ISS, this seemingly does NOT mean that a child below the Article 8 age has zero data privacy rights independent of their parents, because:

All data subjects, including children, have the right to:

  • be provided with a transparent and clear privacy notice which explains who you are and how their data will be processed;
  • be given a copy of their personal data;
  • have inaccurate personal data rectified and incomplete data completed;
  • exercise the right to be forgotten and have personal data erased. See How does the right to erasure apply to children?
  • restrict the processing in specified circumstances;
  • data portability;
  • object to processing carried out under the lawful bases of public task or legitimate interests, and for the purposes of direct marketing. See What if I want to market to children?
  • not be subject to automated individual decision-making, including profiling which produces legal effects concerning him or her or similarly affects him or her;
  • complain to the ICO or another supervisory authority;
  • appeal against a decision of a supervisory authority;
  • bring legal proceedings against a controller or processor;
  • claim compensation from a controller or processor for any damage suffered as a result of their non-compliance with the GDPR.

And the ICO goes on to say

A child may exercise the above rights on their own behalf as long as they are competent to do so.

Note the absence there of any reference to a specific age.

Right of erasure

The same principle applies in respect of the right of erasure, i.e. the child holds this right on their own behalf and, at least in theory, its exercise does not depend upon the concurrence of a person with parental responsibility. Here is what the ICO says:

Children have the same right to have their personal data erased as adults

 This right is particularly relevant when an individual originally gave their consent to processing when they were a child, without being fully aware of the risks.

And

It should generally be as easy for a child to exercise their right to erasure as it was for them to provide their personal data in the first place. 

In practice a child may need their parents’ help to exercise this right, and they can even authorise their parents to act on their behalf in such matters.

The three main routes to processing

The ICO outlines three principal ways (there are six in total) in which a child’s data may be lawfully processed: consent, legitimate interests and performance of a contract.

Consent

The issue of consent is the one we are all reasonably familiar with, and we know that for children below the Article 8 age this will involve the business obtaining verifiable parental consent.

However, even for children above the Article 8 age, not only will it be important for the young person to understand what it is they are consenting to, their consent may nonetheless still be found to be invalid if it is apparent that what they are agreeing to is against their own best interests. More work for the lawyers.

Legitimate interests

In respect of legitimate interests, I suspect lengthier explanations, with several case studies or illustrations, will be required before many in the child protection world (me included) will fully “get” what is intended. For example, what are we to make of this, addressed to ISS?

Using legitimate interests as your lawful basis for processing a child’s personal data puts the onus on you, rather than the child (or adult acting on their behalf), to make sure that their data protection interests are adequately protected. You need to consider what the child might reasonably expect you to do with their personal data, in the context of your relationship with them.

 In practice, this means that if you intend to process children’s personal data you need to design your processing from the outset with the child, and their increased need for protection, in mind.

Performance of a contract

In the UK and many countries, persons below the age of 18 do not have full contractual capacity and therefore almost all of the agreements they might enter into are either void or “voidable”. The ICO refers to voidable contracts, but it would be useful for concrete examples to be given of how this aspect might work in practice in respect of some of the apps and sites most commonly used by persons below the age of 18.

Applicable law

Here the ICO says they are still not sure what the final view will be on the question of the applicable law, save to say that, regardless of where an ISS is based, if they offer a “UK version” of their service or they “actively target” UK children then UK law and rules will apply. But if a UK business targets children in other MS it will need to be aware of, and comply with, their laws and rules. Apparently:

In practice this may mean that the child needs to select, or confirm, their main country of residence when they give their personal data to an ISS; so that the ISS …. provider knows which age limit to apply.

That’ll be fun.

Profiling and codes

It is clear that individual industry codes of practice and industry standards are going to acquire considerable legal importance, particularly those devised by or through BEREC or made under the auspices of the Privacy and Electronic Communications Regulations.

And as for the rules about profiling… how these will work is going to be of absolutely vital importance, but their true meaning remains elusive. Concrete case studies are needed.

Posted in Age verification, Consent, Default settings, E-commerce, Internet governance, Privacy, Regulation, Self-regulation