Statistics are the table saw of truth discovery
4/4/13 / Beth Mulligan
Along with most of the statistically savvy world, we’re excited that 2013 is the International Year of Statistics. With so much new interest in numeracy and in successful statistical prediction models, like Nate Silver’s impressive election model, we’re hoping that people begin to think of statistical analysis less as “lies, damned lies, and statistics” and more as a tool for truth discovery. However, this tool can be dangerous when used without proper training – like a table saw. (For statistical support for this analogy, read this.)
Yes, statistical analysis is the table saw of truth discovery. DIY projects may go horribly awry for the uninitiated. You can either use it at your own risk or hire an expert.
Experts, as noted by Carl Bialik in a recent Wall Street Journal article, like to see certain pieces of information when numbers are reported. For the most part, we look for the margin of error, the sample size, and how the data were collected.
Why do we want to know those things? Because that’s the bare minimum you need to assess quality. And when you’re going to put your money on those numbers (or your clients are), you want quality.
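To see why those two numbers matter so much together, consider the textbook margin of error for a percentage from a simple random sample: it shrinks only with the square root of the sample size. The minimal Python sketch below assumes a hypothetical survey in which 52% of respondents pick option A, uses the standard normal-approximation formula, and ignores every source of error other than random sampling.

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """Approximate margin of error for a sample proportion from a
    simple random sample (normal approximation; z=1.96 for ~95%)."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Hypothetical result: 52% of respondents prefer option A.
for n in (100, 400, 1600):
    moe = margin_of_error(0.52, n)
    print(f"n={n:5d}: 52% +/- {moe * 100:.1f} points")
# Quadrupling the sample size only halves the margin of error.
```

Note that this only captures sampling error; it says nothing about how the data were collected, which is exactly why that third question belongs on the checklist.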
Of course, that doesn’t cover everything you need to know to assess research quality. Getting accurate results from research (market research or otherwise) requires making the right decisions at every stage: what to measure, how to measure it, how to collect the data, and how to analyze them. There is expertise involved at every step, and for the best results, you want someone who can lead you through the whole process. Designing a study with the analysis in mind is key to making sure you can answer the questions you set out to ask.
Because each stage of research is important, whenever we encounter an article or report with statistical findings, we immediately look for the methodology and caveats so we can assess how much faith to place in the results and determine how far they apply. We never take stray numbers at face value. If we can’t find any details on the methodology, our skepticism radar goes off. Higher-quality data can often be identified by the extensive discussion of caveats that accompany them. Strong research methodology and careful interpretation allow us to cut through the numbers without cutting off a finger.
So if you want to build your organization on a foundation of insight in 2013, don’t be the amateur with a table saw.
Photo Credit: Orlando Lane