Today’s BBC Article About Skunk is Objectively Terrible. Here’s Why.

Woman smoking marijuana (stock image)

“Good science should be about proving something beyond any reasonable doubt, not about drawing hasty conclusions because of a correlation.”

 

I was very disappointed to see the BBC run this story about the dangers of skunk today – not because it says something bad about marijuana, but because it is based on a study that is, objectively, a terrible example of research design and analysis.

Now, before I break down the reasoning behind my firm belief that this study is terrible, I want to be clear that I understand how important scientific research is. Without proper testing, it would be all but impossible to make educated, informed decisions about our lifestyle choices – whether they relate to drug use or to any other recreational activity. It would also be all but impossible to remove the stigma associated with marijuana without genuine scientific evidence that it is not a danger to the overwhelming majority of people who use it.

 

That is why fair, unbiased studies into the impact of marijuana use on our health are vital – particularly when such studies conclude that certain strains or variants genuinely are detrimental to our well-being. Studies that are not fair and unbiased are a very different story, though. I am completely convinced that bad science is the enemy of everything good science achieves: it obscures truth, muddies the water, and gives rise to the kind of hysterical soapbox narratives that inspired the War on Drugs in the first place.

The BBC’s article: ‘Skunk-like cannabis’ increases risk of psychosis, study suggests

 

How does this apply to the study mentioned in today’s BBC article? Well, take a look at the actual study the article references and it quickly becomes apparent that it is flawed in several straightforward ways. Firstly, the validity of its variables is highly questionable – not least because the researchers rely on the honest testimony of their participants to quantify their skunk use. Unverified, self-reported data is notoriously unreliable, and it has no place at the heart of a study making causal claims. Secondly, the researchers make no serious attempt to rule out other possible causes of psychosis. It has long been established that people can have a genetic predisposition to psychosis, yet the study makes no attempt to exclude first-time psychosis sufferers with a family history of the condition.

 

It is also interesting to note that we are meant to assume the participants are all ordinary people from the same area of south London, despite the researchers’ failure to explain how the control group was selected. There is nothing to show that members of the control group had not themselves consumed a lot of skunk for years without negative effects, just as there is nothing to show that the half of the patient group that reported regular skunk use was not either exaggerating its consumption or simply mistaken about which marijuana variant it had been consuming.

This is a retrospective study – it is focused on finding a potential cause for pre-existing conditions. It is like walking into a doctor’s waiting room, noticing that 50% of the people there are wearing red sweaters, and concluding that wearing a red sweater makes you more likely to catch a cold. In that example we would all assume something else was to blame, because we know damn well that the red-sweaters-to-colds ratio is an absolute coincidence, irrespective of the implied statistical significance of the result. Here we assume the opposite just because a drug is involved, and I think that is a real problem. Good science should be about proving something beyond any reasonable doubt, not about drawing hasty conclusions from a correlation.
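The red-sweater coincidence is easy to demonstrate with a quick simulation (a purely hypothetical sketch – none of these numbers come from the study): even when sweater colour and colds are generated completely independently, small waiting rooms routinely throw up large, entirely spurious “risk ratios”.

```python
import random

random.seed(42)

def apparent_ratio(n):
    """One simulated waiting room of n people.
    Sweater colour and having a cold are drawn independently,
    so any association we observe is pure coincidence."""
    red = [random.random() < 0.5 for _ in range(n)]
    cold = [random.random() < 0.3 for _ in range(n)]
    red_cold = sum(r and c for r, c in zip(red, cold))
    other_cold = sum((not r) and c for r, c in zip(red, cold))
    n_red = sum(red)
    n_other = n - n_red
    if n_red == 0 or n_other == 0 or other_cold == 0:
        return None  # ratio undefined for this room
    return (red_cold / n_red) / (other_cold / n_other)

# In rooms of 20 people, coincidence alone often produces an
# apparent "risk ratio" of 2x or more for the red-sweater group.
big_flukes = sum(
    1 for _ in range(1000)
    if (r := apparent_ratio(20)) is not None and r >= 2
)
print(f"{big_flukes} of 1000 small rooms showed a 2x+ spurious 'risk'")
```

A correlation in a single small retrospective sample, in other words, is exactly the kind of thing chance produces all the time.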

If the researchers at King’s College London wanted to prove that skunk actually causes psychosis, they should have had one group consume varying (controlled) amounts of skunk over a period of time, and then compared the number of psychosis sufferers in that group with the number in a similar control group who were forbidden from smoking skunk over the same period. It should also have involved a thousand or so people, so that any difference between the groups could actually reach statistical significance. That is how the clinical trial for a new medicine would be run. I suppose such a study might be almost impossible, given that skunk – and marijuana in general – is a controlled drug. However, that does not excuse a reliance on thoroughly unscientific practice, particularly when that same practice is being used to justify incredibly bold claims, like: “scientists found the risk of psychosis was five times higher for those who use it every day compared with non-users.”
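To see why the size of the groups matters, here is a minimal sketch (again with purely hypothetical counts – none of these figures come from the study) of how the same fivefold risk ratio carries very different weight depending on how many people sit behind it. It uses the standard Wald confidence interval for a risk ratio:

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio with a 95% Wald confidence interval.
    a of n1 exposed people are cases; b of n2 unexposed people are cases."""
    rr = (a / n1) / (b / n2)
    # Standard error of log(rr) for two independent proportions
    se = math.sqrt((1 / a - 1 / n1) + (1 / b - 1 / n2))
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: the same 5x ratio at two sample sizes.
print(risk_ratio_ci(10, 20, 2, 20))        # 20 per group
print(risk_ratio_ci(500, 1000, 100, 1000)) # 1000 per group
```

With 20 people per group, that headline “5x” comes with a confidence interval stretching from roughly 1.3 to 20 – close to meaningless. With 1,000 per group, the interval tightens to roughly 4.1–6.1. That is the difference a properly sized trial makes, and it is entirely separate from the deeper problem that no interval, however tight, turns a correlation into a cause.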

 

Now, it may well turn out that these concerns about skunk are perfectly valid, but it will take a lot more than the identification of a correlation to establish that. Until robust scientific analysis proves that we should be avoiding skunk, we are absolutely not on board with vilifying it “just because”.