Abusing the evidence
Yesterday the Guardian delivered the coup de grace to one of the Home Office's most notorious and shameful campaigns of public disinformation. In two impeccably researched and faultlessly argued pieces, Nick Davies ripped apart the statistics that the government has used to justify its policies on people trafficking and prostitution more generally. One of his reports showed how, despite high-profile police raids and the lavish use of resources to tackle what was claimed to be a widespread problem, very few perpetrators have been brought to justice. A six-month campaign by "government departments, specialist agencies and every police force in the country" - Operation Pentameter 2 - was hailed by Jacqui Smith as "a great success", yet yielded only 15 convictions.
Davies' other article laid bare the process by which the figures commonly used by ministers and repeated uncritically by the media actually came into being. First, estimates were made using flawed methodology and unjustified assumptions - often by researchers with a settled view (that all prostitutes are by definition abused victims, for example, or that all foreign sex-workers are by definition "trafficked"). Next, caveats were disregarded and figures rounded up. Then different sets of dodgy statistics were lumped together without regard to accepted scientific practice. "Up to" became "at least" and then "by the most conservative estimate: the actual figure is probably much higher".
Some numbers were plucked out of thin air to decorate speculative press articles, whence they were inserted uncritically into Home Office reports and thus became official statistics. In the end, a figure which itself had no basis in research became so widely quoted that its very repetition lent it weight, until it came to be described routinely (and not inaccurately) as "the most widely-accepted figure".
In this way, a survey which identified 71 probable victims of sex trafficking yielded a figure of 4,000. This bogus statistic then served as the basis for sensationalist reporting, policy formulation and finally legislation.
This dubious use of statistics has continued in the face of strongly voiced opposition from more careful researchers, such as Dr Belinda Brooks-Gordon (who gives her response on Charlotte Gore's blog) and Dr Petra Boynton. And it's not hard to see why. As Boynton notes:
Davies mentions that academic research was ignored or exaggerated. But there’s more to this story than that. Politicians have systematically disregarded academic research, holding in higher regard studies without ethical approval and full of methodological flaws. It’s not that politicians such as Harriet Harman and others were not aware of the wider evidence that discusses the real health and social needs of prostitutes. It isn’t that academics have not been trying very hard to explain to journalists and ministers that there is reliable evidence that could underpin policy. It’s that a number of politicians have deliberately disregarded the advice from academics specialising in prostitution research, opting instead for cherry picked studies and unreliable statistics to suit their agenda.
In the end, it's the women the government claims to want to help who are the main victims of the government's warped priorities. They're also the least likely to be listened to. As Brooks-Gordon points out,
For all her frothing at the mouth over ordinary punters, [Harriet] Harman will not listen to ordinary (ie active) sex working women and to my knowledge has refused to meet with any of the real sex worker organisations so far, preferring second hand information from those who have received money from her department.
Davies has done more than anyone in Britain to expose bad journalistic practice, above all in his book Flat Earth News. The Guardian, by contrast, has done more than any other media outlet to propagate the views of anti-prostitution campaigners like Julie Bindel, for whom the statistics - real or invented, plausible or absurd - are in any case of less importance than their ideological crusade. Their convictions have an almost religious intensity: it is not because of the evidence that they believe in the prevalence of sex trafficking, but despite it.
Rahila Gupta, for example, chides Nick Davies for not mentioning "the report into trafficking by a home affairs committee, published in May, which gave an estimate of 5000 trafficked women and children in the UK, based on an aggregation of the figures provided by those working in this field", apparently oblivious to the fact that those were precisely the figures whose credibility he had so comprehensively destroyed. Meanwhile Denis MacShane, who used an unsubstantiated headline in the Mirror to claim in Parliament that there were no fewer than 25,000 victims of trafficking, describes those people who bother to check their facts as "self-appointed experts indulging in a futile war of statistics". Priceless. Or, as Brooks-Gordon puts it, "beyond parody".
Part of the problem is that "trafficking" is hard to define. At one extreme, unwilling girls are kidnapped, drugged or otherwise coerced from their home countries and end up as sex slaves in brothels run by criminal gangs. This is what most people think of when they hear the phrase "sex traffic"; and, as Davies stresses, it does occur. The government, however, has sought to conflate this fairly small-scale problem with all instances of women coming into Britain, legally or otherwise, and ending up as prostitutes. And in so doing they ignore good evidence that a significant proportion of such women are already involved in the sex trade and see Britain as a more lucrative market. According to Davies, even mail-order brides are counted as trafficked women in some officially approved surveys.
But it would be a mistake to blame confusions of this type for prompting the government's legislative folly. Nor have they been hoodwinked by either radical feminists or the religious groups with whom feminist campaigners have entered into an unholy alliance. Davies tellingly compares the government's wilful misuse of research evidence to the scaremongering over Saddam Hussein's alleged WMD. But such an abuse of research has become second nature in the Home Office.
A more recent and even closer parallel can be found in the document put out to justify the government's proposals to continue with its illegal storage of DNA data of the innocent. The research used was tendentious and was subjected to devastating scrutiny by others in the field, but even those responsible for it distanced themselves from the way the government had attempted to use it. Last week, in what was widely seen as a climbdown (but which was more likely a way of heading off an expected Lords defeat), the government shelved plans to write its proposals into law. The retention of DNA samples by the police still goes on, however, in defiance of rulings by the European Court of Human Rights, and will do for as long as the government can get away with it.
The pattern described by Davies recurs time and time again. The government, often having already decided on a particular policy, adopts or commissions parti pris research, which it proceeds both to cherry-pick and to exaggerate. Its press releases, backed by authoritative-sounding "official figures", are then regurgitated by the press, exaggerated some more, emotionalised (and usually personalised via some well-chosen case study) and editorialised; action is demanded, until some minister shows up on the Today programme to be berated by a fired-up John Humphrys demanding "what are you doing about X?" X, of course, being the problem (or imagined problem) highlighted and exaggerated by the government's misuse of statistics and research. And the minister proudly announces the policy the government had had in mind all along - only to be told by a spokesperson for a pressure group or fake charity that it's a good start, but not nearly enough.
We've seen this process in campaigns to tackle binge drinking or obesity (aka "the obesity epidemic"), in demographic predictions and assessments of terrorist threats, in education policy, in foreign policy - in fact, there can be few areas of government policy untainted by the selective use of statistics and tendentious research. The government has loudly proclaimed its commitment to "evidence-based policy-making" while instead pursuing policy-based evidence-making. In the short run, this has produced some extremely bad legislation. In the longer term, it risks destroying public faith in any evidence put forward by the government. Even when their figures are accurate, no-one will believe them.
There's a simple way round this: for all research reports and statistics issued by government departments to be independently validated and peer-reviewed, and departmental press releases checked for accuracy and balance by outside experts before being issued to the media. That, though, would present politicians with a dilemma. Either they really would have to base policy-making on evidence - which might lead generally to better laws, but would have the effect of increasing their dependence on experts, which risks reducing democracy to a cipher. Or they could drop the pretence and admit that scientific research is only one of several factors that influence policy-making, others being popular prejudice, knee-jerk responses to moral panics and the desire to please their supporters and backers. At least then we would be free to have a proper democratic debate.