In July, I brought my 15-month-old and 4-year-old sons to a medical center in Palo Alto. Eight weeks earlier, I had moved to California to launch my company Hexigo in the U.S., and it was time to see a doctor so my family could have health insurance.

At the Palo Alto Medical Center, we handed over documents from our insurance company detailing what tests had to be done before we could get coverage. It turned out the insurer required my baby and toddler to have their cholesterol tested, which meant drawing blood from their little arms.

Now, anyone with a grain of common sense knows that a toddler is not going to test positive for cholesterol problems. How many 15-month-olds have a heart attack from high cholesterol?

Stunned, my doctor called the insurance company to ask what in the world they were thinking. A representative on the phone, who was clearly reading the standard company policy, insisted the doctor had to test for cholesterol in order to get my kids insured.

Why were my kids subjected to completely unnecessary, and painful, medical testing? For that matter, why are some of our experiences with businesses becoming less pleasant than a trip to the DMV (a “welcome to America” rant I can save for another occasion)?

The answer is that as companies come to rely more heavily on data, they risk applying data-driven solutions without any regard for common sense and the human experience.

If you look at the entire U.S. population, yes, testing cholesterol levels is good practice for determining the right price for coverage. This, after all, is a country where more than one third of adults and 17 percent of kids are obese, according to the CDC. That obesity costs the U.S. more than $147 billion per year. As an insurer, you want to charge a premium if the data suggests you’ll be paying for a customer’s triple bypass or stroke in a few years.

However, testing babies for cholesterol is not something a human being comes up with. It is clearly a decision driven by machines, one that, we can only assume, rational human beings failed to catch. It is big data run amok, decision-making without the human touch.

The hype will have you believe that any company that doesn’t leverage (the favored term) big data will fail. Amongst this noise, one critical fact gets overlooked: big data is only as useful as the human decisions behind it. It’s the people who interrogate the data, form hypotheses, test conclusions and then determine the final direction who make big data succeed or fail. In my insurer’s case, however, someone overlooked the fact that babies and toddlers don’t have cholesterol problems and don’t need to be subjected to expensive, unnecessary testing.

The expression “big data” has become so overused that most people don’t know what it means anymore. Is it processing huge volumes of data, is it business intelligence, or is it filling spreadsheets and rifling through reams of information? Is it the technology or the processes? Confusingly, it seems to be all of these things lumped into one.

The most common interpretation of big data is the systematic analysis of huge volumes of data to find patterns and behaviors that are not readily apparent. It is about finding the diamonds in the rough.
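To make that definition concrete, here is a minimal Python sketch of the kind of analysis involved, using the insurance story as the example. Everything in it is hypothetical: the claims file, the column names (age_months, test_name, abnormal_result), the age bands and the 0.1 percent threshold are invented purely for illustration.

```python
# Hypothetical sketch: surface tests that are routinely ordered for an age
# group but almost never come back abnormal there. The file, columns and
# threshold below are invented for illustration, not real insurer data.
import pandas as pd

claims = pd.read_csv("claims.csv")  # imagined export, one row per test ordered

# Bucket patients into broad age bands.
claims["age_band"] = pd.cut(
    claims["age_months"],
    bins=[0, 24, 216, 1200],
    labels=["infant/toddler", "child/teen", "adult"],
)

# How often does each test actually come back abnormal, per age band?
abnormal_rate = (
    claims.groupby(["test_name", "age_band"], observed=True)["abnormal_result"]
    .mean()
)

# Tests with a near-zero abnormal rate for a group -- cholesterol panels
# for toddlers, say -- are the non-obvious pattern worth surfacing.
print(abnormal_rate[abnormal_rate < 0.001])
```

Even then, the printout is only a list of candidates. It still takes a person to look at it and decide that drawing blood from a 15-month-old for a cholesterol panel makes no sense.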

Big data has rapidly created an entire sub-industry that generated $11.59 billion in 2012, according to the research community Wikibon, which predicts the big data market will be worth $47 billion by 2017. The International Data Corporation (IDC) reports that the digital world will grow 300-fold between 2005 and 2020 to contain 40 trillion gigabytes of data. Yet only one percent of this data is currently being analyzed.

By James Cattermole