I’ve never particularly disliked math itself; there’s something comforting about having at least one endeavor in life that has an identifiable “right” answer. In fact, if you’re ever stressed about a major life decision, try reviewing high school algebra; balancing equations can be oddly – yet incredibly – therapeutic (Yes, “Nerd alert.” But don’t knock it till you’ve tried it).
Statistics, however, takes all that comfort of certainty away.
In my first statistics class, I once made the fatal flaw of statistical newbie-ism and used the word “prove” in a sentence. Before I could finish saying “The results prove that…” the professor held up his hand and said – what has since become my mantra when it comes to statistics – “We don’t PROVE anything in statistics.”
Yes, statistics is a powerful tool for exploring relationships and probabilities, but it doesn’t necessarily set out to prove things. Take vitamins, for example: a hypothetical statistical study can show us that people who take vitamins tend to be healthier than those who don’t. Most of us would interpret this to mean that taking vitamins makes people healthier – a causal relationship.
BUT, the statistical conclusion doesn’t actually say that: Yes, the statistics demonstrated that people who take vitamins tend to be healthier, but not that the vitamins CAUSED this health. Maybe it was healthy user bias – the type of people who take vitamins are probably more health conscious in the first place, and will therefore be healthier regardless. Or perhaps those who can afford the additional expense of purchasing vitamins can also afford gym memberships and more preventative health care. Either way, the news article would lead with something like, “Studies show vitamins make you healthier,” and even though there’s some shy statistician in the background saying, “Wait, I didn’t quite say that,” we all go out and stock up on Centrum anyway.
I should probably insert a hackneyed “correlation does not equal causation” statement here, but it’s more fun just to look at it: Here’s a fabulous website showing factors that, from a statistical standpoint, are highly correlated, but logically have nothing to do with each other. Like this 99% correlation between the divorce rate in Maine and per capita margarine consumption in the U.S.
Forget marriage counseling, look how much we as a nation have helped those couples in Maine by consuming less margarine. It’s laughable, but don’t forget that our initial vitamin conclusion was based on the same thought process.
As Charles Wheelan, author of Naked Statistics says, “Our ability to analyze data has grown far more sophisticated than our thinking about what we ought to do with the results... Statistics cannot be any smarter than the people who use them.”
This is why I say statistics lacks the comforting certainty of other kinds of math: Other mathematical genres tend to measure and quantify in relatively objective, authoritative ways; the angles in a triangle are going to add up to 180° whether you like it or not. I find statistics to be much more interpretive. And here’s the scary part about that: In any other realm in which people make interpretive claims, we question those subjective claims with healthy skepticism. Statistics, however, has all these numbers, facts, and graphs that make it look like people are making objective, irrefutable conclusions – interpretations disguised as mathematical certainties. That’s terrifying.
So isn’t the solution just to learn more about doing statistics? Not necessarily. Again, as Wheelan put it:
“…statistical malfeasance has very little to do with bad math. If anything, impressive calculations can obscure nefarious motives…. Judgment and integrity turn out to be surprisingly important. A detailed knowledge of statistics does not deter wrongdoing any more than a detailed knowledge of the law averts criminal behavior. With both statistics and crime, the bad guys often know exactly what they’re doing!”
(For a good book about these “bad guys,” pick up Joel Best’s Damned Lies and Statistics.)
So for me, studying statistics has had the same effect that some of my theologically-inclined friends say comes with studying religion: The more you learn, the less certain you are about what you, or anyone, actually knows.
More and more, when I read statistical claims, I feel like I’m hanging out in a room full of double agents – I’m always on guard and don’t know who is plotting to murder me. In other words, not a room I enjoy being in…
So I guess that’s my New Year’s resolution: Not just to learn more about statistics, but to find ways to identify, trust, and – dare I say – like the “good” statistics.
Knowing myself, this will be an uphill endeavor, so if anyone has any tips on how to facilitate this, I’m all ears…
Feel free to comment below or on the blog’s Facebook page.
Follow on Twitter @chriskbacon