The UK government recently unveiled a strategy to make adult-oriented content harder for minors to access via filtering, and to make illegal content more difficult to find via search. The strategy was announced in a speech by Prime Minister David Cameron calling for greater protection of childhood innocence.
The filtering, which households can opt out of, will be applied at the ISP level and rolled out voluntarily by ISPs, without the need for new government legislation or regulation. Initially, only new Internet subscribers in the UK will have to opt out of the filtering, but eventually all UK subscribers will have to tell their ISP that they want access to filtered material.
The nationwide filter will block sites that contain pornography, along with a range of other material deemed objectionable, including web forums, violent material, tools for circumventing filters, and content related to extremism, terrorism, anorexia, eating disorders and suicide, according to advocacy organization Open Rights Group. Of course, since what is considered “adult” is subjective, ORG anticipates that many sites will be miscategorized and blocked.
Beyond these concerns, there is little indication that the typical web host in the UK or elsewhere will be affected by the filtering, as long as its customers’ sites don’t display adult material. According to University of Cambridge security researcher Richard Clayton, the outcome depends on how individual ISPs implement the filtering.
“In general the block will be applied at the domain name level rather than the IP level,” Clayton says.
This means that if pornographic pictures and family vacation photos share an IP and server space, one wouldn’t cause the other to be filtered as long as they’re on different domains. Furthermore, Clayton explains that because of their high storage and bandwidth requirements, most pornography sites are hosted on dedicated servers rather than on shared hosting.
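Clayton’s point can be sketched in a few lines: a domain-level filter matches the hostname of a request against a blocklist, so two sites sharing one IP address are treated independently, whereas an IP-level filter would block both at once. This is an illustrative sketch only; the blocklist entries and function names are hypothetical and not drawn from any ISP’s actual implementation.

```python
# Hypothetical sketch of domain-level vs IP-level filtering.
# Blocklist entries and helper names are illustrative, not from a real ISP.

BLOCKED_DOMAINS = {"adult-example.com"}   # hypothetical domain blocklist
BLOCKED_IPS = {"203.0.113.10"}            # hypothetical IP blocklist

def blocked_by_domain(hostname: str) -> bool:
    # Match the blocked domain itself and any of its subdomains.
    return any(hostname == d or hostname.endswith("." + d)
               for d in BLOCKED_DOMAINS)

def blocked_by_ip(ip: str) -> bool:
    return ip in BLOCKED_IPS

# Two sites on the same shared server (same IP, different domains):
shared_ip = "203.0.113.10"

# Domain-level filtering blocks only the adult domain...
print(blocked_by_domain("adult-example.com"))          # True
print(blocked_by_domain("family-photos.example.org"))  # False

# ...while IP-level filtering would catch every site on that server.
print(blocked_by_ip(shared_ip))                        # True
```

Note that the subdomain match also illustrates the Tumblr-style risk discussed below: blocking a parent domain sweeps in every subdomain hosted under it.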
A site that draws content from other sites, like Google Image Search, could be held accountable for the material that appears on the final page. And because filtering will likely be done at the domain level, a site like Tumblr, which gives each user a separate sub-domain, could be classified as adult content simply because of its raciest users.
Yet both Google Image Search and Tumblr already classify content as adult or NSFW, providing features such as “safe search” to keep mature results out, along with other indications of the age-appropriateness of the content they host. While certainly subject to over-filtering and under-filtering, these self-imposed content filters do take into account that audiences come to a site with different goals.
These tactics can’t guarantee a site will be whitelisted, but making sure users can “turn off” mature content could well improve its chances of being approved by those responsible for evaluating content, and of avoiding public complaints, says Clayton. “They’re not going to block Google,” he says, but the filtering could make it harder for UK visitors to reach lower-profile services that host diverse content on a single domain.
When it comes to legal adult material, web hosts have little to fear, except that customer sites could be blocked for households that choose not to opt out of filtering, are unaware they can opt out, or have no means of circumventing the filters. The regular procedure for identifying and prosecuting those in possession of illegal adult material will also remain largely unchanged: a warrant is issued, then evidence on the suspect’s systems is searched and seized.
In fact, the relative ease with which hosts will be able to deal with these changes may be indicative of how the UK plan fails to address the real issues at play.
“A lot of people have argued that this basically provides a false sense of security for parents in that they’re going to believe that nothing can be seen by their precious little Johnny,” Clayton says. “And once you start taking technical measures in order to evade these blocks, they can see anything they want.”
Cameron’s speech brought up many issues related to how the Internet is affecting British society. And it’s clear that web hosts are implicated even if they’re not directly affected. In instances where they are (such as many social media and free blogging sites), service providers are taking measures to make sure their content is clearly labeled based on its age appropriateness.
When it comes to the eradication of illegal material such as child pornography, the government’s plan to make such material more difficult to find via search should likewise have no impact on typical web hosts.
But it also won’t affect what Clayton suspects is the wider breadth of illegal content that neither appears in Google results nor would be easily found by law enforcement officials. Instead, he says, most participants in illegal networks strive to be untraceable. Prosecution is easier when people “are foolish enough to access the sites directly,” he says, “rather than using something like Tor.” Anonymizing technologies like Tor are becoming well known among the tech-savvy set, not just for accessing illegal content but for circumventing surveillance.
Labeling and categorizing mature content, and alerting authorities to illegal content, can often be a more effective solution than blocking content outright, which drives people to seek greater online anonymity and forces content further underground, where law enforcement has difficulty policing it. Web hosts can and often do play a large role in what’s available online, and striking a balance between free speech and harmful content is in the best interests of hosts and countries alike.