Campaign bulletin: child pornography on the internet – bad reasons and non-reasons for opposing blocking

 

A draft Directive on combating the sexual exploitation of children and child pornography is making its way through the decision-making processes of the European Union.

The draft started off as a proposal from the Commission, the EU’s executive arm. For the proposal to become an EU-wide law it must be agreed both by the Council of Ministers, which represents each of the national governments in the 27 Member States, and the European Parliament.

  • The Council of Ministers

The Council of Ministers finished their deliberations and agreed their text before Christmas.  In Article 21 they endorsed two distinct measures which the Commission had proposed to tackle online child pornography.

The first part of Article 21 would make it a requirement for every Member State to establish machinery to facilitate the prompt deletion of all child pornographic images hosted on web servers within their own national jurisdictions. This is not hugely controversial. Nearly all the countries in the EU already do it anyway; nevertheless, it is good that the concept becomes entrenched in law.

  • How long does it take to get an image deleted?

In some instances in a number of EU Member States it can take as little as one hour from an illegal image being identified to it being removed from an in-country server.

But what about images that are hosted outside the EU?

In 2009 over half of all child pornography found online by the British Internet Watch Foundation was on servers outside of Europe. Of the content found within Europe a significant proportion was housed in countries which are not in the EU. Clearly in those instances an EU Member State’s national authorities cannot instruct the foreign host to take down the image in the same way as they could a company within their own borders. All they can do is pass on a request. Machinery exists to allow these requests to be transmitted swiftly, but in some instances the time taken to complete the deletion can be substantial, e.g. weeks, months or more than a year.

The reasons for this state of affairs are many and varied but that is little consolation. For as long as the images remain on view they continue to do several forms of harm. See below.

  • The problem of extended time delays

To deal with this problem of significant time delays a handful of EU Member States, e.g. Sweden, Italy, the UK, Finland and Denmark, with France soon to join them, established a system for blocking access to web addresses where the image is hosted abroad. The block remains in place until the illegal content is deleted.

  • Blocking pending deletion can work with great precision

Blocking pending deletion can work with great precision: pinpointing the exact page on a web site where the illegal material is to be found. This is important because it means there will be no collateral or unintended damage to any innocent web sites or pages.
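As a toy illustration of why this matters, URL-level blocking can be sketched as an exact-match lookup, so that only the offending page is affected (the address and list contents below are entirely invented placeholders, not a description of any real system's implementation):

```python
# A minimal sketch of URL-level ("pinpoint") blocking. The blocklist
# entry below is an invented placeholder, not a real address.
BLOCKLIST = {"http://example.com/albums/page-17.html"}

def is_blocked(url: str) -> bool:
    # Exact-URL matching: only the listed page is blocked, so other
    # pages on the same site remain reachable.
    return url in BLOCKLIST

print(is_blocked("http://example.com/albums/page-17.html"))  # True
print(is_blocked("http://example.com/"))                     # False
```

A domain-level block, by contrast, would match on the host name alone and take the whole site down with it, which is precisely the collateral damage that page-level precision avoids.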

Here is where the second measure in Article 21 comes in. The draft proposes to make it mandatory for every EU Member State to create machinery to enable blocking pending deletion.

  • Controversy in the European Parliament

Members of the European Parliament (MEPs) are mulling over the same text that went to the Council of Ministers. However, the idea of blocking pending deletion is being fiercely opposed by some of them. Right now, it looks as if it could be lost. We will have to fight to save it. All hands are needed on deck.

  • The arguments for deletion

The arguments for deleting child abuse images from the internet are overwhelming. Here are a few of them:

  1. The images are an egregious breach of the right to privacy of the child or children depicted in them
  2. The continued publication of the images re-victimizes the children and raises the possibility that the child or children will be recognised by a third party who can use the existence of these images to humiliate the child or children further or blackmail them into more harmful activity
  3. The continued availability of the images helps to sustain and encourage paedophile activity more generally
  4. New third parties may find the images and, for the first time, become engaged in collecting or downloading them thus embarking on a path which may lead to committing further offences either against their own or other people’s children
  5. The continued publication of the images raises the possibility that the makers and publishers will feel emboldened and encouraged to re-abuse previous victims or find new children to abuse so as to produce fresh images either for paying customers or to swap with collectors
  6. They are illegal

  • Why would you delete but not block pending deletion?

How can anyone be in favour of deletion but be against blocking? Blocking is, after all, a form of deletion. It renders the material inaccessible to the great majority of internet users in the country where blocking happens.

Obviously everyone would prefer that there was no need for blocking. It would be much better if the foreign host removed the image quickly, thereby making it unavailable to everyone in the world, but if that possibility is not on offer it seems very strange to refuse to accept a solution that goes some way towards it pro tem. That seems like a case of letting the best be the enemy of the good.

Opponents of blocking are sort of saying everyone should be able to see the images until no one can.

  • How do the opponents of blocking attempt to justify their position?

It is easy to see that blocking is antithetical to a notion many internet enthusiasts have held dear over the years: that the internet is about, or ought to be about, facilitating information flows, not restricting them. I am sympathetic to that idea but it only holds good if you believe you have zero responsibility for what it is you are helping to propagate.

No one is suggesting that internet service providers or other online publishers or distributors have an obligation to inspect everything that passes through their pipes, but where they know this type of activity is happening they cannot pretend indifference. Of course they can claim legal immunity under the E-Commerce Directive but that hardly makes such a stance ethical.

  • Thin end of wedges and slippery slopes

A number of opponents of blocking make references to “the thin end of the wedge” and to “dangerous precedents”, sometimes referred to jointly or severally as the “slippery slope”. “Where will it all end?” they ask.

  • Enter Professor Cornford

A famous British philosopher, Professor F M Cornford of Cambridge University, demolished these lines of non-reasoning back in 1908 when he published “Microcosmographia Academica”.

The good Professor describes “the wedge” as being a way of saying you would

“…not act justly now for fear of raising expectations that you may act…justly in the future – expectations which you are afraid you will not have the courage to satisfy.”

Cornford adds that anyone using “the wedge” as an argument is in fact acknowledging that they do not believe the proposal itself is wrong because if they did

“….that would be the sole and sufficient reason for not doing it.”

 The “dangerous precedent” is similar. Here Cornford suggests what is being said is

“…you should not now do an admittedly right action for fear you, or your equally timid successors, should not have the courage to do right in some future case which, ex hypothesi, is essentially different, but superficially resembles the present one.”

  • Striking a bargain over child pornography?

More reprehensible in some ways are those who make no attempt to deny that blocking child abuse images is a good thing to do. Instead, and often without any apparent embarrassment, they say they would do it in a trice if only they could be sure blocking would forever be limited to that. Terrorism, anorexia or suicide related materials frequently get mentioned as examples of the types of content it is known others are pushing to be blocked.

This argument turns sexually-abused children into bargaining chips. Its proponents are saying, in effect, “If you promise not to ask us to block anything else, we’ll go along with blocking child abuse images.” Not an honourable position. The obvious difference between child abuse images and web sites that mention or promote anorexia is, of course, the question of legality. One is illegal; the other is not.

  • No one is defending the existence or creation of the images

Everyone agrees that children should not be sexually abused in the first place. Everyone therefore also agrees that these kinds of images should not exist at all. No one has argued the images do no harm to anyone and therefore it is safe to leave the images where they are, or ignore them. No one has argued that the removal of these images denies anyone their right to free expression, free speech or artistic endeavour.

What I have found difficult to understand therefore is, if everyone is agreed there is no basis at all for the images to be created, to be there or remain there, why do they find it difficult to agree that they should be blocked pending their deletion? It does not add up.

  • Worries about hidden agendas

Or rather it does not add up unless you believe that the machinery created to allow blocking to happen would be misused i.e. that instead of it being used only to block child abuse images it will inevitably start being used to block items which the Government of the day or some other powerful interest in your country simply finds inconvenient or undesirable.

This sort of takes us back to the bargaining chips point but the draft Directive simply provides no authority for blocking anything other than child abuse images. With proper scrutiny, with proper checks and balances, there should be no reasonable grounds to suppose the machinery would or could be misused.  There is also something more than slightly bizarre about denying yourself a tool to help enforce the law because you believe the tool itself might foster illegality.

  • Scrutiny and the potential for review are essential

Some say an image should only be blocked, or removed, if a judge or a court has said so. Leaving aside the practicalities of this for the moment, presumably this point is made because of a worry that images that are not child pornography could, innocently or otherwise but nonetheless still wrongly, be selected and put on a blocking list or made the subject of an incorrect take down notice.

Even judges and courts can get things wrong. No system is going to be 100% foolproof 100% of the time. Transparency, scrutiny, accountability and the potential for review are the best guarantors against excess and error. It is entirely possible to create a body which, when deciding whether or not to issue a take down notice or whether or not to include an address on a block list, is bound by established legal principles. Not only is it possible; it has already happened. They are called hotlines.

  • Other routes to the same or similar images

Perhaps the lack of enthusiasm for web blocking is rooted in the knowledge that there are alternative ways of getting at the same or similar images, e.g. by abandoning the web and using a different technology such as peer-to-peer file sharing, or by using known methods to circumvent web blocking. Maybe opponents of blocking think the images are likely to reappear elsewhere sooner or later, so blocking is pointless? But no. That cannot be the reason because they support deletion. If you were against deletion or thought deletion was no use there would be some logic in also opposing blocking, but no one is arguing that the images should not be deleted.

  • Knowledge of how to navigate these other routes is unevenly distributed

The techie world generally dislikes solutions which it believes are “broken” i.e. that can be defeated or circumvented, but the point is the knowledge and the determination to circumvent or defeat blocking are very unevenly distributed.

One academic has also argued that by a sort of reverse engineering it could even be possible to use blocking to help you identify the addresses of sites containing child pornography which you could then go off and access using a workaround. I read the paper that demonstrated this theory. I count myself reasonably technically literate but I was lost quite early on. My point is there is undoubtedly a class of persons – highly technically competent determined paedophiles, devoted collectors – for whom probably no barrier will ever work. Nobody knows or can know how big this group is but common sense and experience tell us it is not very large.

  • Succour and comfort to totalitarians? I don’t think so

Critics say that if the EU gives official blessing to the use of blocking it would enable totalitarian regimes in other parts of the world to point to it to justify their own oppressive use of blocking. It would cut the moral ground from under our feet, or so the argument goes. No it wouldn’t.

First of all, the totalitarian states complained of are already doing it. Clearly they did not wait to see what view we took in Europe before deciding whether or not this was right for them. Secondly, the EU draft Directive conforms with human rights law; what these other states do does not.

  • The technology is neither inherently good nor inherently bad

Just because some people do bad things with certain tools, it does not mean the same tools cannot also do good things. Guns can be used to murder people. They can also be used to shoot diseased rats.

  • Is it proportionate?

Is blocking disproportionate? Proportionality, like beauty, is in the eye of the beholder. Is proportionality something you judge by the cost of implementation? If so what price do you put on an abused child’s rights? What price do you put on the right of tens of thousands of children whose images are on the internet and are being viewed and downloaded millions of times per annum?

Moreover you cannot generalise about cost. Everything hangs on the existing architecture of the individual ISP or company establishing a system to enable blocking. For some ISPs the cost of implementing a blocking regime could be very close to zero.

  • Will it work at scale?

Where is the evidence that blocking works? In 2009 BT calculated that its blocking system was preventing up to 40,000 attempts per day to reach illegal addresses containing child abuse images across its consumer broadband network. Grossed up to cover the whole of the UK consumer broadband network, that means blocking in the UK is preventing up to 58 million attempts per year. A very high proportion of these were undoubtedly generated by botnets and other automated systems but many will not have been. Whatever their source, every one of them was illegal and ought not to have happened.
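The grossing-up in those figures can be checked with simple arithmetic. Note that the implied market share at the end is an inference from the numbers quoted, not something BT or the bulletin states:

```python
# Rough arithmetic behind the BT figures quoted above (a sketch of the
# implied calculation, not BT's actual methodology).
blocked_per_day = 40_000                 # attempts blocked daily on BT's network (2009)
bt_per_year = blocked_per_day * 365      # = 14,600,000 attempts per year on BT alone

uk_per_year = 58_000_000                 # grossed-up UK-wide figure from the text
implied_bt_share = bt_per_year / uk_per_year

print(f"{bt_per_year:,}")                # 14,600,000
print(f"{implied_bt_share:.0%}")         # 25% - i.e. BT carrying roughly a quarter of the market
```

In other words, scaling BT's annual total up to 58 million is consistent with BT serving about a quarter of UK consumer broadband connections at the time.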

  • Scepticism yes, cynicism no

Some say blocking is simply a cunning ruse to allow the police, politicians and bureaucrats to appear as if they are doing something to save children when in fact they are doing little or nothing. If that truly was the motive of these people, which I doubt, it would indeed be reprehensible, but it still would not make blocking wrong.

Then it is said if blocking is in place the police and their political masters will have no incentive to ensure the images are actually deleted. It will look as if they have done their job. The pressure is off. They will not feel the need to ensure the crimes the images represent are properly investigated.

There are a number of things wrong with that perspective. The most obvious is that blocking only works in relation to images which are housed outside of your own country so it should have zero impact on domestic investigations.

An extension of this argument holds that if you have blocking installed in your country you will not bother asking foreign Governments and foreign police services to do more in relation to the images published from machines in their territories.

It is about now you have to pinch yourself and remember to dismiss any unworthy thoughts you may be harbouring about the people who say things like this. To you it may sound like a last desperate attempt to preserve the status quo, to justify doing nothing, but it still has to be confronted as an argument.

And actually I cannot find an answer. All I would say is it requires a huge degree of cynicism to believe that Governments and police forces around the world are looking for ways to avoid protecting children, and that the only thing standing between them doing more and sitting back on their haunches is the fear of, or wish to avoid, criticism from colleagues overseas.

  • Keeping up the pressure for deletion remains hugely important

A more substantial concern is that whilst it might be all very well for you to feel you have done what you can to protect your country by implementing a blocking regime, the fact remains that for as long as the images remain on web servers they can be seen by people in other countries and the children within the images remain in jeopardy.

There is no getting around that, and it is a very good reason for continuing to keep up the pressure worldwide to speed up deletion times everywhere using all diplomatic and other tools that are available. But it would be very curious reasoning to leap from that position and conclude you should therefore still do nothing to restrict or limit the circulation of the images where you can. This again takes us back to an earlier point: it is absurd to argue the images should be available to everyone until they are available to no one.

The opponents of blocking say what they really want is for deletion to be achieved much more swiftly so blocking is unnecessary or redundant. So do I, so do the children’s organizations, but the difference is we are not willing to eschew blocking in the meantime. That amounts to punishing the children for other people’s past failures.

  • The EU should use all available tools and instruments

Absolutely we want the EU to use its clout in trade and other negotiations to get firmer commitments about speedier deletion times for images but we do not think it is right to leave things as they are until that happens. With the best possible will in the world these types of negotiations could take years, yet we have a good partial solution available now. Large sections of the internet industry and mobile phone industry agree. They are already deploying blocking mechanisms against child pornography on their global networks. More will do so if this measure becomes law.

  • The processes followed must be demonstrably fair and reasonable

If an item is legal there is no case and no basis for the State or any of its proxies to insist that it be blocked. But if it is illegal, if the processes to determine the illegality conform with national and international law, if the system is constructed in such a way as to exclude the reasonable possibility of the system being misused for other purposes, if there is adequate scrutiny and accountability, then there is no case for refusing to block.

  • All major children’s organizations are of one mind

Every major children’s organization in Europe that has taken a position on the subject of blocking has come out in favour. If your starting point is the best interests of the child there is no way you can end up concluding that, actually, after a lot of careful thought, a great deal of soul-searching and hand-wringing, it is best to leave pictures of children being raped on full public view for a little while longer.

About John Carr

John Carr is a member of the Executive Board of the UK Council on Child Internet Safety, the British Government's principal advisory body for online safety and security for children and young people. In the summer of 2013 he was appointed as an adviser to Bangkok-based ECPAT International. Amongst other things John is or has been a Senior Expert Adviser to the United Nations, ITU, the European Union, a member of the Executive Board of the European NGO Alliance for Child Safety Online, and Secretary of the UK's Children's Charities' Coalition on Internet Safety. John has advised many of the world's largest internet companies on online child safety. In June 2012, John was appointed a Visiting Senior Fellow at the London School of Economics and Political Science. More: http://johncarrcv.blogspot.com