Addiction Commonality

Alcohol, Opiates, Fat and Sugar are all Addictive Substances: this blog is about that "addiction sameness".

Butter Pig Family: a butter sculpture of a sow and her piglets

Wednesday, December 29, 2010

‘Alcohol is more harmful than heroin’ part 2: Subjectivity Overdose?

Authors: Diamanto Mamuneas and Maria Viskaduraki

"If scientists are not allowed to engage in the debate at this interface (between scientific advice and policy making) then you devalue their contribution to policy making and undermine a major source of carefully considered and evidence-based advice."

[Image: heroin bottle. Heroin use has changed since this bottle was on open and legal sale. Photo: Wikimedia Commons]

These were Professor David Nutt’s words, as reported by the BBC on Friday 30th October 2009, soon after Alan Johnson MP – then Home Secretary – forced him to resign as head of the UK’s Advisory Council on the Misuse of Drugs. The dispute had been over the reclassification of cannabis from a Class C to a Class B drug; Prof. Nutt had said that the recreational drug poses only a “relatively small risk” of psychotic illness. Just over a year and several more resignations later, Nutt is back with a paper in The Lancet (November 2010) that claims to have settled this dispute using the “multicriteria decision analysis (MCDA) approach”.

“Dawkins’ Law of the Conservation of Difficulty states that obscurantism in an academic subject expands to fill the vacuum of its intrinsic simplicity.”

The MCDA approach is a prime example of this principle. In essence, “multicriteria decision analysis” is an obscurantist name for what ordinary people call “open discussion”. Where is the “scientific” and “evidence-based” approach in MCDA? Where is the data? A short description of MCDA shows that there is precious little of either.

So how does MCDA work in this case? Over the course of one day, a group of experts convened to decide how each of 20 drugs deemed relevant in the UK scored (from 0 to 100) on 16 criteria. This was no technical endeavour – the group got together for a discussion where “scores [were] often changed… as participants share(d) their different experiences and revise(d) their views”. One must wonder how accurate these experts could be if they didn’t agree to begin with, and what effect a single dominant voice could have had on the group’s output. Clearly, these experts were providing subjective opinions – and some were learning as they went along!
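To see how little machinery is involved, the core arithmetic of a weighted-sum MCDA can be written in a few lines of Python. The sketch below is purely illustrative: the criterion weights and scores are invented, not the Lancet paper’s figures, and only the shape of the calculation is shown.

```python
# Toy illustration of the arithmetic behind a weighted-sum MCDA.
# Criterion names echo the paper's, but every weight and score below is
# invented for illustration -- these are NOT the Lancet paper's figures.

criteria_weights = {            # hypothetical weights, summing to 1
    "drug-specific mortality": 0.40,
    "dependence": 0.35,
    "economic cost": 0.25,
}

expert_scores = {               # hypothetical 0-100 panel scores
    "alcohol": {"drug-specific mortality": 40, "dependence": 60, "economic cost": 90},
    "heroin":  {"drug-specific mortality": 100, "dependence": 100, "economic cost": 55},
}

def overall_harm(drug: str) -> float:
    """Weighted sum of one drug's per-criterion scores."""
    return sum(criteria_weights[c] * expert_scores[drug][c] for c in criteria_weights)

for drug in expert_scores:
    print(f"{drug}: overall harm = {overall_harm(drug):.1f}")
```

Everything of substance – the weights and the scores themselves – still comes from the panel’s discussion; the “analysis” is just a weighted average.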

The facilitator of these discussions was an “independent specialist in decision analysis modelling” who applied “techniques that enable groups to work effectively as a team”. This individual’s illustrious title is another example of Dawkins’ Law of Obscurantism, although the particular teamwork-promoting techniques employed are not elucidated. (Could it have been offering to get in the coffee and biscuits?)


The authors suggest that at least two drugs were assigned the top score of 100 in all criteria: “Weighting subsequently compares the drugs that scored 100 across all the criteria…” Rather than allowing a tie, the panel of experts was then also allowed to choose which of these they thought was more harmful. Though this isn’t made clear, the implication is that (at least) the top two most harmful drugs were considered equivalent – in other words, that alcohol and heroin are equally dangerous. The group then chose to consider alcohol more harmful, in line with Professor David Nutt’s past assertions.

Even if Prof. Nutt’s publication had had appropriate data at its disposal, the 16 arbitrarily chosen criteria could not have been used to assess harm in a statistically valid manner. After all, what exactly is the ‘harm’ that drug addiction causes? Consider my (mythical) friend Ben, who has recently acquired an addiction to heroin. He suffers health problems from an overdose, which costs the taxpayer (via the NHS) some money. Meanwhile, his wife leaves him, which causes him some emotional distress, and he takes this out on his children, leaving them with serious injuries (that also cost the taxpayer). His wife comes back but kicks him out, so he experiences a loss of tangibles, leading him to hold up a bank to obtain cash to replace his losses and fund his addiction… And so on. One thing leads to another and Ben now has first-hand experience of Dependence, Drug-specific Damage, Family Adversities, Economic Cost, Crime, Loss of Tangibles, Injury, Loss of Relationships and Drug-related Impairment of Mental Functioning – at least nine of those 16 judgement criteria! And those criteria are clearly linked with each other. With the criteria so interdependent, it would be impossible to determine how much each one independently contributes to the harm of a given drug. Any attempt at weighting or ranking would be skewed by the correlations between different criteria, and different drugs may differ in which of their criteria are interdependent, making comparisons impossible.
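The double-counting problem can be made concrete with a toy calculation. In the sketch below (every number is invented), a single chain of events like Ben’s registers on four linked criteria for one drug, so the same underlying harm is added into the weighted total four times over.

```python
# Toy illustration of how interdependent criteria double-count harm in a
# weighted sum. Criterion names echo the paper's; all numbers are invented.

weights = {"crime": 0.25, "economic cost": 0.25,
           "injury": 0.25, "family adversities": 0.25}

# One chain of events (like Ben's) registers on four linked criteria for
# drug A; a comparable single harm for drug B registers on only one.
drug_a = {"crime": 70, "economic cost": 70, "injury": 70, "family adversities": 70}
drug_b = {"crime": 70, "economic cost": 0, "injury": 0, "family adversities": 0}

def weighted_harm(scores):
    """Weighted sum over all criteria."""
    return sum(weights[c] * scores[c] for c in weights)

print("drug A:", weighted_harm(drug_a))  # 70.0 -- one episode counted four times
print("drug B:", weighted_harm(drug_b))  # 17.5 -- the same episode counted once
```

Whether the two totals should really differ by a factor of four depends entirely on how much the criteria overlap – which is exactly what the method never measures.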

Nutt and colleagues attempt to compare their own conclusions to those of reports published in the USA and the Netherlands, and suggest that a handful of correlations between these experts’ subjective judgements and the results generated in these foreign examples lends credibility to this paper. However, they themselves go on to point out that “availability and legal status” vary across countries and that these contexts matter to many of the harmful effects of drugs. Furthermore, finding a correlation between an expert’s opinion on heroin and a published study is not surprising. Changes of mind aside, you would expect an expert to have knowledge of such studies and to be informed by them (after the fact). In other words, the correlation isn’t a chance event that lends credence to the experts’ expertise – it is simply a sign that they keep track of the studies relevant to their field. We do not know that their judgement translates well to a UK context (a far lower correlation was reported in the one example given), or that they are capable of judging the relative effects of different drugs; here the experts are essentially expressing informed opinion at best, or guessing at worst. I would be far more impressed if these experts were able to generate predictions that are then shown to hold true by ‘proper’ data.
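For perspective, such a correlation is a one-line computation. The sketch below uses invented scores standing in for the UK panel’s and a foreign study’s; it shows what the reported agreement amounts to – a measure of whether two lists order the drugs similarly, and nothing more.

```python
# A rank correlation between two lists of harm scores, computed with SciPy.
# The numbers are invented stand-ins for the UK panel's scores and a foreign
# study's scores for the same six drugs, listed in the same order.
from scipy.stats import spearmanr

uk_panel      = [72, 55, 33, 20, 15, 9]
foreign_study = [68, 60, 30, 25, 10, 12]

rho, p_value = spearmanr(uk_panel, foreign_study)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
# A high rho only says the two lists rank the drugs similarly;
# it says nothing about whether either ranking predicts real-world harm.
```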

The thing about ‘proper’ data, however, is that for many of the chosen criteria, any attempt to collect it would be ambitious. Though the authors imply that user numbers were taken into account for some of the criteria (assuming illegal drug use can reliably be estimated), the effects of illegality are more wide-ranging. By way of illustration, one is more likely to drive while under the influence of a legal drug than an illegal one because, while both may lead to conspicuously erratic driving behaviour, the overall cost of being caught is greater in the latter case. By the authors’ own admission, legal status is linked to effect, so any data collected under the current legal context cannot be used to argue about the impact under a different legal context. For example, data collected now, while cannabis is illegal, cannot predict the effect of any decision to reclassify cannabis as legal: the very act of reclassifying it would change user behaviour and make the previous data inapplicable.

Unfortunately, the method employed by this paper to implicitly support reclassification does more to damage trust in science than to lend credence to the ideology. Government is so often the target of demands for transparency, and politicians are so often ill-equipped to interpret multifaceted evidence, that science which seeks to inform policy-making ought to stand on the side of clarity and aim to lend some objectivity to what is already an intrinsically highly subjective process.



References

  • Cannabis row drugs adviser sacked, BBC News, 30 October 2009.
  • Nutt, D. J., King, L. A., Phillips, L. D., 2010. Drug harms in the UK: a multicriteria decision analysis. The Lancet 376(9752): 1558–1565.


