Mediacheck

Digital Child Porn Watchdog: Too Big a Bite?

'Cleanfeed' a threat to free speech, say critics.

By Bryan Zandberg, 7 Mar 2007, TheTyee.ca

Last month's bust of a worldwide child porn ring was a troubling victory for child advocates and law enforcement.

Although it's a major breakthrough in the fight against sexual exploitation of children, Lianna McDonald says the problem is still far more widespread than people think. She points to a study done in the United States by the National Center for Missing and Exploited Children, which found that a single graphic sexual image of a five-year-old girl was posted to over 800,000 separate pages on the Internet in the span of just six months.

McDonald is the executive director of Project Cleanfeed Canada, a private program that aims to stamp out online child exploitation through a firewall set up in co-operation with major Canadian Internet service providers (ISPs). Following Britain's lead, Project Cleanfeed this past January began compiling a list, which is not released to the public, of foreign websites that host the illegal content. The filter, says McDonald, will curb both accidental and intentional viewings of child sexual exploitation.

But stopping child pornographers has proved a difficult, controversial problem, and opponents say Project Cleanfeed is riddled with flaws. Arguing it will do little to thwart serious pedophiles, critics say Cleanfeed's unpublished blacklist is an invitation for abuse. Some fear it will open the door for an "architecture of control" to be cast over what is increasingly a vital democratic medium.

'Censorship system'

Bill Thompson is a technology columnist for the BBC who has been monitoring the British version of Project Cleanfeed since it was implemented in his country two years ago.

In an interview by telephone from London, he said Canadians ought to be more critical of any plan to implement a similar filtering program here. He describes the British system as "a closed, privately run and privately managed censorship system," because only the staff of Britain's Internet Watch Foundation are privy to the contents of the list.

Thompson doesn't agree with more extreme advocates of digital rights who say the blocking of any information is a form of censorship. While online access to unsavoury material like hate speech and defamation isn't illegal, intentional access to child pornography is, and Thompson thinks that's a good thing.

Thompson does, however, take issue with the way the list is compiled. "I don't mind restrictions on freedom of speech per se, because I'm not an absolutist. What I do mind is it being done in ways which are opaque and completely outside any form of governmental or judicial review."

Project Cleanfeed U.K.'s figures have been staggering: there have been up to 35,000 blocks of blacklisted URLs a day. And these are just for one ISP, British Telecom (BT), the only provider currently participating in the program, although the others will soon be legislated into following suit, according to Thompson.

But Thompson says raw numbers like these mean little when there's no way for a third party to substantiate the URLs on the list, which, once determined, are automatically fed into BT's system.
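A filter of the kind described here, where suspect URLs are fed into an ISP's network and matched against a secret list, can be illustrated with a minimal sketch. The two-stage check below (a coarse host-level test before an exact-URL match) is an assumption about how such filters are commonly built, not a description of BT's actual system, and every hostname and URL in it is a hypothetical stand-in:

```python
# Minimal sketch of ISP-level URL blocklist filtering. The two-stage design
# (coarse host check, then exact-URL match) is assumed, not documented;
# all hosts and URLs below are hypothetical placeholders.
from urllib.parse import urlparse

SUSPECT_HOSTS = {"example-bad-host.test"}                     # stage 1: coarse host filter
BLOCKED_URLS = {"http://example-bad-host.test/abuse/page1"}   # stage 2: the secret URL list

def is_blocked(url: str) -> bool:
    """Return True if the ISP filter would block this request."""
    host = urlparse(url).hostname
    if host not in SUSPECT_HOSTS:
        return False            # most traffic never reaches full-URL inspection
    return url in BLOCKED_URLS  # only suspect hosts get checked against the list
```

Note that in this design the subscriber receives no signal about why a request failed, which is precisely the opacity critics object to: a blocked page and a dead link look identical from the outside.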

Do the numbers reflect the people who actively type in criminal URLs? he asks. Are they blocks of massive spam e-mail campaigns? Were real predators thwarted? Or are the censors blocking far too much?

"BT will occasionally release figures of how many accesses they have blocked, but even then they don't have any way of knowing."

And as to whether there have been any cases of abuse on the part of Cleanfeed -- conflict of interest, vendettas, squelching of leaked government documents -- Thompson said there is no way to tell, because the system is completely closed.

'Grey area'

Back in Canada, Project Cleanfeed Director Lianna McDonald says that her organization wants to avoid over-blocking by only blacklisting clear-cut cases of pre-pubescent child pornography -- even though under the Canadian Criminal Code the definition includes all images of individuals under 18 years of age.

In Britain, no such distinction is made, and any sexual image in which the individual appears to be under 18 is in question.

"One of the things we had to be very careful about was to make sure that we were not blocking legitimate content," she said in an interview by telephone from her office in Winnipeg, Manitoba.

"That takes away that larger grey area."

The list is generated by the staff of a parent charity organization called Cybertip, which is Canada's official tip-line for reporting all forms of online child sexual exploitation -- child pornography, luring, children exploited through prostitution and child sex-tourism.

Cybertip's track record is impressive. Through its partnership with the RCMP, the tip-line has led to 22 arrests and the shutdown of 1,700 Canadian websites.

But while the RCMP works closely with Cybertip to enforce infractions in Canada, foreign-hosted pages are verified solely by Cybertip. As in the U.K., if staff decide a site contains child porn, the suspect URL is added to the list, with no outside scrutiny or oversight.

To date, Bell, Bell Aliant, MTS Allstream, Rogers, Shaw, SaskTel, Telus and Videotron have voluntarily signed on to begin blocking blacklisted traffic.

To McDonald, it's simply business as usual.

"We are Canada's tip-line, so we do this already."

Fatal flaws

But Cory Doctorow, an author, blogger and Fulbright scholar, sees serious cracks in Project Cleanfeed's structure. [Doctorow was travelling at press time, and referred The Tyee to recently posted comments on the issue.]

"The idea is fundamentally broken," wrote Doctorow, who is speaking later this week at SFU to launch his new book.

"First of all, it seems to me that keeping a secret list of 'evil' content is inherently subject to abuse. This is certainly something we've seen in every single other instance of secret blacklisting: axe grinding, personal vendettas and ass-covering are the inevitable outcome of a system in which there is absolute authority, no due process and no accountability."

Doctorow also sees problems in the kind of online activity Cleanfeed purports to stop. The following is how Cleanfeed's website reads on the issue:

"The system is intended to reduce accidental access to child sexual abuse images as well as create a disincentive for those trying to access or distribute child pornography."

In terms of blocking both wilful and accidental viewers, Doctorow finds the above statement misguided.

"These methods only stop stupid child pornographers from gaining access," he said.

"Smart child pornographers use Tor, or IRC, or BitTorrent, or Usenet, or e-mail to get their material. Any dedicated child pornography collector will not be stymied by Cleanfeed."

Michael Geist, the Canada Research Chair in Internet and E-commerce Law at the University of Ottawa and a "cautious supporter" of Cleanfeed, agrees that the project will do little to stop serious child pornographers from getting their hands on new material.

"It shouldn't be mistaken that Project Cleanfeed is going to solve the problem," he said in an interview by telephone at Chicago's O'Hare airport. "It deals with a relatively small slice of the issue."

As for how often Canadians inadvertently stumble across pornographic images of children, both affirmed that, in their experience, it's a rare, if not almost non-existent, occasion.

"I am probably six sigmas [degrees] above normal for adventuresome web-surfing, and I can count the number of times I've inadvertently stumbled on clear child porn on the fingers of one hand," wrote Doctorow.

Common or uncommon?

McDonald concedes that serious predators will continue to access child porn regardless of the firewall.

"Without question, we are not saying that this is going to stop any hardcore users or viewers of this type of content. At the same time, it doesn't mean that if can put something in place, we don't."

But she counters Doctorow's argument about the inadvertent viewer with the assertion that when it comes to this particular issue, anecdotal information is at best a poor indication.

"How often are people going to be open about whether or not they come across child pornography on the Internet?" she asked.

McDonald thinks inadvertent viewings happen all the time. She added that Cybertip.ca has received over 12,000 reports of various forms of online child sexual exploitation in its two years of operation. Of those, 6,000 were forwarded to police, and some 4,500 of those cases were related to URLs, says McDonald.

She believes that number would be much higher if more Canadians knew the tip-line existed. Surveys show that only about 13 per cent of Canadians have heard of Cybertip. The U.S. tip-line receives an average of 700 to 1,100 reports per week.

"To have individual people saying that they're not coming across it, I don't know that that is a statistically or research-robust way of suggesting how much content [there is] or how easy it is to find.

"In terms of report volumes, that's not what we are hearing and that's not what we are seeing."

Accustomed to Customs

While there are appeals processes in both countries, critics say it's wrong that blocked parties are not informed when they are blocked.

Thompson says the only way for a person to discover if he or she is being blocked would be to try to access his or her own website through the server of the ISP that uses the filtering technology.

Doctorow thinks that's unfair. He calls the appeals process "flawed" because of the numerous censorship scenarios he thinks can be expected as a result of the program.

"There's no way that we can achieve democratic transparency while keeping this list a secret. What do we do about orphan works on the list? Works by authors who have no economic or social agency [to launch an appeal]? How do we know what's on the list if we can't see it? How do we know it's not badly construed?

"If I publish a comic that is potentially classed as child pornography (such as Alan Moore's brilliant Lost Girls), and it gets stopped by Canada Customs, that entire process is public and subject to scrutiny. The publisher, the public, an importer, a reader, or the author can all petition to have this reconsidered. In the end, the comic can be found to be artistic and taken off the blacklist.

"That's how Canada handles child porn today. It works. Canada doesn't treat the list of banned child porn as a state secret. What you're proposing is a radical shift in the transparency of Canada's official censorship regime."

While he understands that handing out a list of blocked URLs would be illegal, Thompson says people should nonetheless be told when they've been prevented from seeing an illegal web page.

"You can make an argument for the inadvertent user being protected, but you can't make an argument for not letting people know that they are going to a site that is being blocked."

Cleanfeed Canada's website offers the following detail: "Since BT implemented Cleanfeed in 2004, the IWF has not received a single request for appeal."

'Architecture of control'

More so than in the U.K., Cleanfeed Canada, now in its third month of operation, has shown itself open to allowing its lists to be vetted by third parties. One proposed plan would see a random sample inspected by expert judges.

McDonald says her organization is currently reviewing how to implement some form of judicial oversight, but to date no such protocol has been added to the project's constitution.

Regardless of whether or not that safeguard is put in place, some civil rights activists fear more blocking will be on its way once the technology is in place.

In the U.K., for example, the Internet Watch Foundation -- the group that provides the list of URLs to BT -- "is trying to extend its remit to cover incitement to racial hatred, and possibly even other material," relates Thompson.

With a near complete absence of network neutrality provisions to safeguard speech on the Internet in Canada, similar censorship could be in the works on this side of the pond, says Geist.

"Is it possible that there are those that will use this to try to seek broader filtering or blocking? I think there is that potential, and I think it's important that people guard against that and speak out against that where that happens."

John Dixon, a member of the executive of the B.C. Civil Liberties Association, is concerned about what would happen if Canada decided to change its mind about limiting blocking to pre-pubescent images and opted instead to follow the definition in the Criminal Code.

"I don't think most Canadians understand the breadth of materials that are defined as child pornography in Canada," he cautioned.

"Any attempt to firewall child pornography as defined by Canadian law, if a genuine attempt, would involve enormous abuses of the civil liberties of Canadians [and] capture so much innocent material that many freedoms would be curtailed."

Under Canada's rules, a pencil drawing of a nude 17-year-old could be taken as child porn, he said.

For Thompson, the threat of ever-expanding categories of censored material once a firewall is implemented is no light matter.

"This technology presents a very straightforward and simple mechanism to extend those restrictions onto the Internet." he said. "We have built the architecture of control, it just needs to be deployed."

'Narrow goal'

Despite McDonald's figures about the frequency of inadvertent users, Doctorow is unconvinced.

"It seems that the use-case here is quite muddled. Is this really about stopping people from 'inadvertently' seeing bad content?" he asks. "That's a pretty narrow goal for such a sweeping program.

"Like so many other systems that 'keep honest users honest,' Cleanfeed will only serve to keep honest users in chains, and allow bad actors to skip off without any substantial inconvenience."

When asked to weigh the program's potential for abuse against the benefit of protecting accidental viewers, McDonald maintained that even if some can't see the value, Canadians have a moral obligation to act.

"In each image is a victim," she reasoned, "If that was your five-year-old daughter who was subject to one of those photographs and you could stop -- or try and attempt to limit -- that image from being viewed, you'd want that done."

She returns to the example of that single image that fanned out to 800,000 different pages.

"What we are seeing is a proliferation or lack of control of these images once they are out."
