Tuesday, 17 September 2013

Web filtering - making the best of a bad job

I've had an idea.  It probably isn't a very good one, but I offer it as a provisional solution to the looming problem of overblocking by internet filters that are supposed to protect children from porn but invariably end up by annoying everyone, children and adults alike.

It came to me while watching a debate on "protecting children from harm on the internet" at the Lib Dem conference on Sunday.  The motion being debated was proposed by the former Play School presenter Floella Benjamin, now apparently a baroness.  It was a strongly worded, Mumsnet-friendly motion, proclaiming inter alia that "the long term effects on young minds of early exposure to often violent and abusive sexual material is highly damaging to impressionable young people and may significantly alter their attitudes to sex and violence."  It called, not merely for opt-in filters as the government wants and as the ISP industry has agreed to deliver, but for mandatory age-verification for sites offering explicit material.

There was also an amendment tabled by Julia Cambridge, which offered slightly different proposals.  It demanded that filters should be offered to all new broadband customers and on an annual basis to existing customers.  But it also - and here's the interesting bit - demanded that:
"...when a customer who does not have a filter installed or who has a filter switched off starts loading a website which would have been filtered out that customer is made aware they have not got a filter installed/switched on and is provided with the opportunity to install/switch on a filter."

Neither the motion nor the amendment found much favour with the hall.  Most who took the floor spoke out against them, many making the point that filters are notoriously unreliable and catch material that is far from pornographic, including sites that are important educational resources for enquiring teenagers.  Cambridge councillor Sarah Brown complained that her own blog was banned from filtered public wifi.  Jess Palmer, a possible Lib Dem star of the future (assuming the party has a future, of course) spoke passionately and movingly about growing up as a lesbian and finding on the internet resources denied to her either at school or at home, as well as of the joys of fanfic that would undoubtedly be banned by the web filters that will soon be the default setting for UK households.

Sarah Brown compared web filters to her inaccurate SatNav, which apparently confused Plymouth with Warsaw: "Automatic systems behave like this because they don’t know enough to realise when they’re doing something obviously ridiculous. They just do it anyway."

But such concerns, though well founded, don't seem to have deflected the government from its determination to be seen to be doing something to protect children from the big, bad internet, and now that the larger ISPs are on board it seems inevitable that the majority of the population - those who "simply click through", as David Cameron put it, or who are too embarrassed to demand adult content, or fearful of being put on some GCHQ list of porn-addicts - will soon be consigned to the "family-friendly" web.  Others, perhaps including major news sites and many independent writers and bloggers, will censor themselves rather than run the risk of being mislabelled as "adult" or "explicit".  Discussions will be circumscribed, language will be censored, creativity will be stifled.  Caution will run riot.  Most people's online experience will be of an internet designed for children and for corporate giants.

How to prevent that scenario, which even politicians acknowledge is undesirable but which seems inevitable given the fallible and play-safe set-up of algorithmic filters?

I suggest crowdsourcing.  The way to do it is hinted at in the aforementioned amendment. 

If you're a customer of TalkTalk, the one major ISP that currently offers default filtering at source, all your traffic is routed through servers run by the Chinese company Huawei.  Even customers who do not want filtering still have their traffic routed through the system, although matches to Huawei's database of blockable sites are in that case disregarded.  It seems likely that other filter systems will work the same way - opening up the possibility of customers who haven't opted for blocking nevertheless being informed that they are about to visit a contentious site. 

The conference amendment suggested that customers would then be offered the chance to change their minds about filtering, a proposal that seems frankly bizarre.  Perhaps, though, they could be given a different choice - a button to report the block as inappropriate.  Sites reported by users as having been improperly flagged could then be reviewed by human beings and, if found to be educational, or literary, or simply not pornographic, then placed on a safe list.  Thus customers who choose not to have their web experience mediated by cyber-nannies could help improve the service for those who do, mitigating the worst aspects of default filters.
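For what it's worth, the mechanism I have in mind can be sketched in a few lines of code.  This is only an illustration, not anyone's actual system - all the names here are hypothetical - but it shows the flow: a site on the ISP's blocklist is blocked only for filtered customers; an unfiltered customer who hits the notice can report the block; reported sites queue for human review; anything a reviewer judges non-pornographic joins a safe list that overrides the blocklist for everyone.

```python
# Hypothetical sketch of the crowdsourced whitelisting flow described above.
# None of these names correspond to any real ISP system.

class CrowdsourcedFilter:
    def __init__(self, blocklist):
        self.blocklist = set(blocklist)   # the filter's database of blockable sites
        self.safe_list = set()            # sites cleared by human review
        self.review_queue = {}            # site -> number of overblock reports

    def check(self, site, filtering_enabled):
        """Return True if this customer should be blocked from the site."""
        if site in self.safe_list or site not in self.blocklist:
            return False
        # Unfiltered customers are merely notified, and may report the block.
        return filtering_enabled

    def report_overblock(self, site):
        """An unfiltered customer flags a blocked site as wrongly filtered."""
        if site in self.blocklist and site not in self.safe_list:
            self.review_queue[site] = self.review_queue.get(site, 0) + 1

    def human_review(self, site, is_actually_adult):
        """A human reviewer rules on a reported site."""
        self.review_queue.pop(site, None)
        if not is_actually_adult:
            self.safe_list.add(site)
```

The point of the design is that the judgment call is made once, by a person, and then benefits every filtered customer thereafter - rather than each blocked reader separately discovering that an educational site has been misclassified.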

I don't pretend that this is a perfect solution, or even a workable one.  But default ISP-level filtering is now an inevitability in the UK, whether we like it or not.  The question is how to make the best of a bad job.