Just how will the GDPR help deal with this?

Tuesday of this week was Safer Internet Day. It was also the day the BBC chose to publish the results of some interesting research into young people’s compliance with the age requirements of social media sites. Bear in mind that in every case looked at the minimum age is meant to be 13. Here are the headlines:

  • More than 75% of children in the UK aged 10 to 12 have social media accounts
  • Facebook was the most popular with under-13s: 49% of the 10 to 12 year olds surveyed claimed to be users
  • Instagram clocked an impressive 41%

I don’t imagine the picture will be very different in many other EU Member States or indeed in many other OECD countries.

So the European institutions spent over four years discussing a new data protection regime. At the last minute – without any prior indication that they were even thinking along these lines – they introduced a proposal to make 16 the new default age, with an option for a particular country to adopt 15, 14 or 13 instead. Yet they did not introduce a requirement for any of the sites to verify that their users complied with any of the stated age limits. In other words they did nothing to address one of the most glaring challenges in the space – the complete divorce between formal policy and the actualité. How will their decision help protect children? It won’t.

Rumour has it that the UK will not go with the new default of 16 but will, instead, adopt 13. In other words it will preserve the status quo – although it won’t be doing quite that because by making 13 a legal baseline we will in fact be changing our law.

Thus huge numbers of children will continue to be users of sites that are not meant for them. They will also learn that telling lies about your age online is easy and has no downside or consequences. Adults make stupid rules. Who knew? And in other countries – where they stick with 16 as the minimum age? The number of children being drawn towards lying will simply get bigger. What a wonderful result. Not.

This situation is not sustainable in the longer run. The only question for me is what will it take, finally, to trigger the inevitable change? Companies that proclaim an age limit should be required to have the capacity to enforce it, or else they should withdraw it and construct their sites and operational policies on the basis that people of all ages will be users.

I do not think that would be a particularly good outcome. The better alternative is to embrace age verification.

Posted in Age verification, Facebook, Regulation, Self-regulation

A barrier to understanding

A while ago a rough consensus emerged about the importance of no longer using the term “child pornography.” Part of some people’s reasoning turned on the use of the word “pornography”. It carried with it ambiguous overtones, suggesting the participants depicted were engaging in sexual activity on a voluntary, consensual basis when, of course, that can never be true if a child is involved. The phrases “child abuse material” and “child sex abuse material” emerged as preferred descriptions because they more accurately reflected what was actually going on. The children are victims. The pictures are crime scenes.

Now, to be completely clear, if there were any way we could all be sure that legal pornography would only normally be seen by persons over the age of 18, even though I definitely have “issues” with it (see below), I most probably would not be writing this blog. However, whereas in the past we all know kids could occasionally grab someone’s porn mag and sneak a peek behind the garden shed, by and large their exposure to hard core porn was limited and spasmodic. Not any more. The scale, nature and ease of access to porn have created an entirely new paradigm.

However, to revert to my opening theme, there is a larger difficulty with the word “pornography”. In too many people’s minds it conjures up ideas taken from a time and a world which, for practical purposes, have all but ceased to exist.

Thus when people like me talk about the importance of finding new and better ways of shielding children from porn I can come up against a wall of incomprehension or misplaced liberal sentiment. Am I pushing a religion-inspired agenda, or am I simply, somehow, anti-sex or troubled by sex, reflecting a personal hang-up which probably has its origins in an unfortunate childhood experience? The answer to the first question is an emphatic no. As to the second, what can I say? If there is anything lurking there it is buried so deep I cannot call it forth into my conscious memory.

The porn industry has gone online and it has been industrialised. That is the root of the contemporary problem. Much of the material being published now is a million miles away from anything that used to be thought of as porn. Think Enid Blyton and Stephen King. The gap is much, much wider than that. Yet the vast bulk of porn publishers, based outside the UK, appear to think they have no real responsibility to try to keep their wares away from children, even very young children. That is the parents’ or anyway someone else’s problem, not theirs. Wrong.

In its new online milieu, market forces seem to be pressing porn production inexorably in one direction: towards the more bizarre, violent and degrading end of the spectrum. Moreover there is a world of difference between a static Playboy centrefold and the kind of stuff that is instantly available today on millions of free web sites in high-definition video with high-quality sound.

Recently I asked a well known feminist if she thought there could ever be such a thing as “feminist porn”. She said “Yes in theory but I can’t remember the last time I saw any.”

I also spoke to someone who works in the field of relationship counselling. They acknowledged that there were instances where pornography can help individuals or couples to “get things going again” or even “get things started”, but the circumstances in which such materials are incorporated into therapy are typically carefully choreographed and the images are chosen judiciously. No self-respecting sex therapist would ever just say “have a butcher’s at the internet”. Porn on the internet is causing far more relationship difficulties than it is solving. The idea that young people might learn anything useful from the great bulk of online porn is way beyond a sick joke.

Much of the porn available on the internet looks like nothing more than a series of violent and often bizarre acts perpetrated on women, many of whom, we know from research, have had extremely troubled lives. Very often they are also being “managed” by aggressive pimps who use drug dependency and physical force to secure compliance.

Is it nonetheless possible that some of the women actors are truly choosing and consenting to take part? Are they genuinely OK with being subjected to the sort of acts that now feature regularly and prominently on many online porn sites? The power imbalance that is often so evident suggests otherwise.

Yet maybe there is a tiny minority of women porn actors for whom this is a true career choice or work option, even an artistic or creative one, but the modern context of the internet has completely changed the terms of the debate. We are all free, or should be free, to make choices that don’t harm others, but it is evident that, because of the lack of controls over the output, harm is being done both to the position of other women and to those who might be exposed to it, particularly children.

Time and again we hear young women complain that online porn is shaping boys’ expectations of how they will behave, and many young men also feel extremely uncomfortable with the brutalised roles they are pushed towards. True enough, there may be other aspects of popular culture which play into the “pornification” agenda, but anyone who cannot tell the difference between an advert for perfume and Pornhub needs their head or their eyes examining, probably both.

If there is a contemporary equivalent of someone, or a group of people, seeking to produce ethical or feminist porn, I doubt they will want children to be able to access it. If they are worried about the way the policy environment is moving in relation to their oeuvre, I suggest they pick a fight with the real culprits: the porn industry, not child protection organizations. Not everything the internet has brought us is an unqualified good, and in the case of porn we need to find ways of dealing with it that more closely align with long-established real-world standards.

In the meantime, if anyone has any bright ideas about how we can use language better to convey the reality of modern porn, I am keen to hear them.


Posted in Pornography, Regulation, Self-regulation

A miscellany

Yesterday morning I attended a meeting in Brussels held under the auspices of the Community of Practice for better self- and co-regulation. It was an interesting and very useful event.

Impact assessments

The importance of “impact assessments” was mentioned many times during the meeting. They have been part of the EU canon for many years.

In the discussion following the first presentation I pointed out that, in relation to the recently adopted GDPR, the Commission had not carried out an impact assessment concerning its proposal to make 13 the EU-wide minimum age at which young people could decide for themselves whether or not to join an online service such as Facebook.

Responding to me, a Commission official said I was wrong because in their initial consultation document it had been noted that 13 was already in widespread use. What can I say? If that is an impact assessment I will eat President Juncker’s hat.

A digital single market with four choices

I further remarked that in the several years during which the GDPR was being consulted on and debated, at no stage did anyone mention the possibility that we could end up with what was finally agreed: a menu of minimum ages stretching from 16 (the default) down to 13. If only to state the completely obvious: how can this be reconciled with the EU’s avowed aim of creating a Digital Single Market?

Has the EU broken its own rules?

Thus, in reaching the decision it did in the way that it did, is it possible the EU has abandoned one of its own rules? Might this have legal consequences? In general in the democratic world public bodies are required to abide by their own rules and failure to do so can lead to a decision being invalidated. Could this be the case here? Seemingly that is open to doubt because the “culprits” in this case were the legislative body itself and the Council of Ministers. They did the dastardly deed behind closed doors in the Trialogue.

Even if, on this (important) technicality, the decision cannot be challenged (and I am not certain that is the case), it most definitely highlights a scandalous state of affairs. Any and every consultation of a similar kind in future will have to come with a health warning:

Please be aware that at the end of the day you may be wasting your time taking part in this consultation because we can do whatever we like, up to and including introducing wholly new propositions that you will only learn about after the consultation is over.

We’re in listening mode

More generally, Commission officials emphasised how, in the new arrangements which are emerging post-GDPR, everyone in the relevant European Institutions would be “listening” to what the private sector and civil society organizations had to say about how the self- and co-regulatory environment ought to be constructed and how it ought to operate.

I suggested the idea of “listening” implied some sort of conversation was taking place. This in turn assumed the parties had a shared understanding of what they were meant to be talking about. That does not square with the possibility that something completely different can emerge after the talking is supposed to have finished.

The “S” word makes an appearance

The same Commission official I referred to earlier added a new twist when he went on to say that the 16-13 idea was in line with the principles of subsidiarity, the implication being that it would allow each country to choose an option aligned with its pre-existing laws or norms.

That is the first time I have heard anyone use the “s” word in this context. I will come back to that in a minute.

But anyway the official was not right. None of the options provided for in the GDPR are aligned, for example, with the existing law or practice in the UK and Italy, and they may not be aligned with the law or practice in other EU countries. We just don’t know because no impact assessment was carried out and published. Er… we’ve already been there.

Here is a thought which has occurred to me since: suppose the Commission had carried out and published an impact assessment in respect of 13. Might that have acted as a constraint when, in the Trialogue, other non-impact-assessed alternatives started to be advanced? At the very least it should have been a reminder of how the decision-makers were meant to behave when making such a huge decision.

Boundaries?

I have long thought that the issue of subsidiarity and children is one that we ought to debate in a very direct and explicit way.

When it comes to the care of children and young people, how many governments of nation states will openly acknowledge or willingly agree that decisions about them can be freely taken by foreigners or agencies outwith their own national borders? That probably sounds a bit blunt but that is the underlying reality, and it is what was being hinted at by the official who mentioned subsidiarity.

How are we going to deal with that? Where are the boundaries within which international standards will be easily accepted and outside of which they will not be, or will be only with great difficulty? Where does COPPA stand in relation to this perilous perimeter?

Other stuff

At the same meeting there were two other presentations, one of which looked like a model of how to go about things, while the other didn’t.

Audio Visual Media Directive

The one that impressed concerned the Audio Visual Media Directive, where a research company has been hired to set out, inter alia, the state of play in terms of the protection of minors and media content in all 28 EU Member States. Bravo.

The parcels delivery service

The other presentation came from a Commission official who is charged with removing obstacles to the smooth and economic delivery of parcels across national borders. It was absolutely fascinating and I got the importance of this work straight away.

However, during Q&A I asked what consideration was being given to the issues raised where different countries have different legal standards in terms of what children can and cannot buy on their own account. If every EU Member State had the same rules governing the age at which someone can purchase alcohol, tobacco, weapons, fireworks and such like, things would be simpler, but they don’t. Here is his answer:

“None. Or rather none that I know of.” No one else in the room knew either.


Posted in Default settings, E-commerce, Facebook, Internet governance, Regulation, Self-regulation

The new data protection regulation – possible wrinkles?

Late last week I was talking to a colleague from Sweden who works for a small social media start-up. She was very exercised by the new GDPR. They had already put their service together based on the notion that 13 was the minimum qualifying age in every country in the EU where they planned to operate.

Her point now is this: suppose 10 of the 28 Member States opt to stick with 16 as the new default age and the other 18, between them, go different ways across the other three options (15, 14 or 13), or maybe they split in some other way. Potentially, this is going to mean the Swedish platform has to rework its site, run more complex systems and have new Ts&Cs drawn up. They are going to have to spend money on lawyers and programmers – money they can ill afford. It will probably be much less of a burden for the larger, better-established, richer sites, most of which, she noted wryly, are not European-owned enterprises and have their headquarters thousands of miles away.
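
To make the compliance burden concrete, here is a minimal sketch, in Python, of the kind of per-country age-gating table a service like hers would now have to maintain. The national choices shown are purely hypothetical – no Member State has yet declared its hand – but the 16 default and the 15/14/13 menu come straight from the GDPR.

    # Minimal sketch of per-country age gating under the GDPR menu.
    # The country-to-age mapping is hypothetical, for illustration only.

    GDPR_DEFAULT_AGE = 16  # applies unless a Member State opts for 15, 14 or 13

    NATIONAL_MINIMUM_AGE = {
        "SE": 13,  # assumption: Sweden goes with 13
        "GB": 13,  # assumption: the UK adopts 13, as rumoured
        "DE": 16,  # assumption: Germany keeps the default
    }

    def minimum_age(country: str) -> int:
        """Digital age of consent in a given Member State (default 16)."""
        return NATIONAL_MINIMUM_AGE.get(country, GDPR_DEFAULT_AGE)

    def needs_parental_consent(age: int, country: str) -> bool:
        """Below the national minimum, the service must obtain parental
        consent before collecting or processing the youngster's data."""
        return age < minimum_age(country)

    # A 14 year old could join unaided in a "13" country but not a "16" one.
    assert needs_parental_consent(14, "DE")
    assert not needs_parental_consent(14, "SE")

Every signup flow, every set of Ts&Cs and every consent mechanism has to be threaded through a table like this – trivial for a giant, a real cost for a start-up.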

But then she had a brainwave. Suppose they just wait until they see which country opts for 13 as the minimum age and then simply domicile themselves there or put all their servers there? The Data Protection Authority in that land would have to make a finding that 13 was OK, and wouldn’t that then have to be accepted by each Member State? This would achieve a Digital Single Market for young people by stealth – via the back door.

Otherwise the new law will come into effect in 2018 and any youngster who has told the truth about their age could face being kicked off if the site believes they are now, or will then be, below their national minimum age.

Is there another Mr Schrems waiting to challenge such an expulsion? My guess is that, because of the way the EU adopted this new rule, the chances of such a challenge succeeding would not be negligible.

Thoughts?

Posted in Age verification, Default settings, E-commerce, Facebook, Google, Regulation, Self-regulation

A new right for victims of abuse?

The UK’s Children’s Charities’ Coalition on Internet Safety, of which I am Secretary, today publishes a copy of its letter to Michael Gove MP, the Secretary of State for Justice and Lord Chancellor. We are asking the Government to bring forward legislation to create a new right for children who have been sexually abused to obtain compensation from anyone subsequently found to be in possession of images of that abuse.

This call is based on a principle which has already been established in the USA, recently upheld in the US Supreme Court decision in Paroline.

Apart from delivering justice to victims, it is hoped such a law will, inter alia, act as a deterrent to potential future offenders. We definitely need to find better ways of discouraging this type of offending as traditional methods are falling short of the mark.

Good coverage in today’s Observer

Posted in Child abuse images, Regulation

The children’s privacy debacle – Part 3

We all know that in January 2012 the European Commission began a process of consultation and discussion in relation to the formulation of a General Data Protection Regulation (GDPR). In the almost four years that followed, wide-ranging debates and public consultations were held to try to hammer out the all-important detail.

I am sure that, like me, many readers of this blog prepared written submissions, attended conferences and did all of the things that, in democracies, we take for granted as being part and parcel of how our systems of government work.

In respect of children, the first draft of the GDPR, at Article 4(18), defined a child as being anyone under the age of 18, but Article 8(1) made clear that for practical purposes in the online space 13 would be the operative age. Below that age companies would need to obtain parental consent before they could collect or process a youngster’s personal data. However, if a site set the bar at 13, with one bound they were free of any such troublesome and potentially expensive obligation.

The Commission advanced no argument or evidence to support setting 13 as the standard other than a pusillanimous acknowledgement that it was already in widespread use because under US Federal law all the US companies were obliged to follow it. I hadn’t realised we had contracted out our policy making quite so comprehensively. Moving on.

To the best of my knowledge what nobody ever argued for, at any rate not in public, was what was finally adopted. I say “not in public” but the truth is I’ve never heard anyone argue for it in private either.

Just to remind you: what we now have is 16 as the default age with Member States being given an option to adopt 15, 14 or 13 instead. Either way this will change the law in the UK and probably also in several other countries.

So where did this idea come from?

I haven’t a clue. But following some modest super-sleuthing I can now tell you where it didn’t come from:

  1. The UK’s data protection authority – the Office of the Information Commissioner
  2. The Office of the Children’s Commissioner for England
  3. The UK Council for Child Internet Safety
  4. Any of the UK’s Children’s Charities
  5. I believe the Article 29 Working Party wasn’t consulted either but I’m still awaiting final confirmation of that

The proposal to establish 16 emerged during the Trialogue, so the UK Government must have been made aware of it at some point, but they did not seek to engage any of the above. Maybe they were bound by rules of confidentiality but, whatever the reason, because the outcome was so ridiculous we have to consider the possibility that this was due, at least in part, to the process.

My guess would be that 16 sprang from nowhere as a rushed political fix that emerged in Brussels on or about 27th November, and it was a done deal by 17th December, when the LIBE Committee met in Strasbourg to adopt the final text. Four years reduced to two and a bit frantic weeks in which transparency ceased to exist.

Further comment seems superfluous.

At this stage. But watch this space.


Posted in Age verification, Default settings, E-commerce, Internet governance, Regulation, Self-regulation

The responsibility of platforms

The European Union is conducting a consultation on the responsibility of online platforms. Here is their definition of what constitutes a platform:

“Online platform” refers to an undertaking operating in two (or multi)-sided markets, which uses the Internet to enable interactions between two or more distinct but interdependent groups of users so as to generate value for at least one of the groups.

So typically these are big web sites that lots of people use to connect, communicate or transact with each other.

The definition continues:

Certain platforms also qualify as internet intermediaries. Examples include search engines, specialised search tools (e.g. Google Shopping, Kelkoo, Twenga, Google Local, TripAdvisor, Yelp,), location-based business directories or some maps (e.g. Google or Bing Maps), news aggregators (e.g. Google News), online market places (e.g. Amazon, eBay, Allegro, Booking.com), audio-visual and music platforms (e.g. Deezer, Spotify, Netflix, Canal play, Apple TV), video sharing platforms (e.g. YouTube, Dailymotion), payment systems (e.g. PayPal, Apple Pay), social networks (e.g. Facebook, Linkedin, Twitter, Tuenti), app stores (e.g. Apple App Store, Google Play) or collaborative economy platforms (e.g. AirBnB, Uber, Taskrabbit, Bla-bla car).

Internet access providers fall outside the scope of this definition.

The consultation closes on 6th January. This is an extremely important opportunity to influence a key area of policy. I hope the powers that be seize the moment. Here’s what I think about some of the major issues.

  1. The eCommerce Directive established the notion of “mere conduit” status. As a result there can be no liability for any illegal or unlawful content on the platform unless and until those responsible for its management have actual knowledge of such content.
  2. Alternatively, on being given notice of apparently unlawful or illegal content, liability can only arise if those responsible fail to act expeditiously to remove it.
  3. That principle should remain undisturbed.
  4. It would be unjust to make a company liable for something it knew nothing about or for liability to arise the instant the company was informed.
  5. That said, it would be good if there could be greater clarity about what constitutes “actual knowledge”. A better definition of what “expeditiously” means, and of how “notice” is served or deemed to have been served, would also come in handy. These sorts of issues have already been addressed by the hotlines that deal with child abuse images, so we could draw on their experience.
  6. The whole idea then would be to ensure the courts in Member States always followed common definitions.
  7. But the current incentive for platforms to do nothing should be banished.
  8. This incentive exists because, on a narrow interpretation of the eCommerce Directive, if a company takes any steps to inspect content on its site, even if solely to try to ensure compliance with its own terms and conditions, it runs the risk of being deemed to have become the publisher of everything on the site and therefore to have “actual knowledge” of it.
  9. This remains the case even if, for whatever reason, when inspecting the site the owner missed some offending content.
  10. Some companies choose to ignore the risk of attracting liability. They constantly patrol or monitor activity on their platform and deal with what they consider to be non-compliant content. Other companies see the Directive as an alibi for inaction.
  11. Thus, in future, it should be made clear that any action taken by a company to police its platform solely to enforce compliance with its terms and conditions (one assumes all Ts&Cs forbid unlawful and illegal activity) can never give rise to liability.
  12. Actual knowledge must remain the anchor (see the sketch after this list).
  13. I don’t think I would be averse to excluding the potential for liability even where, through negligence, the company failed to discover unlawful or illegal material that was present, even though it was ostensibly looking for it.
  14. In many circumstances companies in the physical world have obligations to take steps to protect their customers or the wider public, for example under health and safety legislation. In the online space, data privacy laws now impose minimum standards.
  15. A duty of care should be created for online platforms.
  16. Thus, where tools are available which help reduce the level of illegal or unlawful behaviour on a platform, firms should be under an obligation to deploy them.
  17. To put that slightly differently, no one should be free to establish or maintain an online presence in a way that puts other people’s businesses, or individuals, at risk, particularly if there are reasonable and proportionate steps they could take to mitigate or eliminate the risk of that happening.
  18. I acknowledge that this amounts to an amendment to the idea of intermediary liability, but the core principle remains intact.
  19. In the early days of cyberspace there might have been an understandable reluctance to adopt an idea of this kind, but we now know a lot more about how the internet is working and what adverse impacts it is having across a wide spectrum of activity. Much of that downside is avoidable or at any rate can be reduced.
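
To pull points 1 to 17 together, here is a rough sketch, in Python, of the liability rule being proposed. It is a summary of the argument above, not a statement of the law as it stands, and the function and parameter names are mine.

    # Sketch of the proposed rule: actual knowledge stays the anchor, and
    # voluntarily policing a platform's own Ts&Cs never creates liability
    # by itself.

    def liable(notified: bool,
               discovered_by_own_monitoring: bool,
               acted_expeditiously: bool) -> bool:
        """Apply the proposed rule to one piece of illegal/unlawful content."""
        # Actual knowledge arises from a properly served notice, or from the
        # platform's own monitoring actually finding the content. Monitoring
        # that missed the content creates no knowledge and, under the proposed
        # rule, no liability (points 9, 11 and 13).
        actual_knowledge = notified or discovered_by_own_monitoring
        if not actual_knowledge:
            return False  # no liability for what the company knew nothing about
        # With actual knowledge, liability turns on expeditious removal (point 2).
        return not acted_expeditiously

    # A platform that patrols its site but misses an item is not thereby liable:
    assert not liable(notified=False, discovered_by_own_monitoring=False,
                      acted_expeditiously=False)
    # Once on notice, it is the failure to act expeditiously that creates liability:
    assert liable(notified=True, discovered_by_own_monitoring=False,
                  acted_expeditiously=False)

The separate duty of care proposed in points 15 to 17 would sit alongside this rule, obliging platforms to deploy reasonable and proportionate risk-reduction tools without disturbing the actual-knowledge anchor.
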
Posted in Default settings, E-commerce, Internet governance, Regulation, Self-regulation, Uncategorized