Big data. Anyone sick of it yet? Not the technology: the pulling together and cross-matching of disparate data on a massive scale is arguably producing more social, commercial and political change than any other innovation over the last couple of decades. No, we’re talking about the term: “big data”.

Once a handy collective noun for a bunch of technologies such as Hadoop, NoSQL and distributed computing that evolved in tandem to handle the extreme end of the storage and analytics spectrum, “big data” has been adopted by an enormous volume and variety of marketing departments seeking to rebadge their products for a big data age, to such an extent that, as a descriptive term, it is now virtually meaningless.

It has been noticeable that, in contrast to previous years, many of the key industry players we’ve interviewed during 2013 have displayed a marked reluctance to use the term ‘big data’ when explaining what they do.

The result of attempts by every man and his elephant to crash the big data party has, inevitably, been a growing despondency and cynicism around the term. That is unfortunate, as there are many interesting and varied use cases emerging around these nascent technologies, some of which are featured below, with doubtless many more to come as projects currently in pilot phase (as, let’s face it, most are) come to fruition.

So, perhaps this time next year we won’t be featuring the top 10 big data stories at all, instead highlighting advances in real-time or streaming analytics. But for now here they are: the top 10 Computing big data stories of 2013.

10. Big data hype manufactured by analysts and media, says SAS CEO Jim Goodnight

One person keen to differentiate his organisation from what he views as “big data hype” is SAS CEO Jim Goodnight.

Goodnight argued in October that “big data” is just another buzzword following on from other recent trends in the IT industry, even suggesting that analysts promote them in order to generate business.

“The term big data is being used today because computer analysts and journalists got tired of writing about cloud computing. Before cloud computing it was data warehousing or ‘software as a service’. There’s a new buzzword every two years and the computer analysts come out with these things so that they will have something to consult about,” he said.

9. ‘Lump in the learning curve’ for big data adoption – Met Office

James Tomkins, the Met Office’s portfolio technical lead, has no doubts about the potential of big data technologies to reveal new patterns by analysing large datasets. However, he cautioned in August that moving away from the traditional relational data model to a schemaless NoSQL architecture can be quite a challenge.

“In terms of non-relational data structure, I think there’s definitely a lump in the learning curve for people to take their first steps away from the more traditional, more well-known relational data-model handling,” Tomkins told Computing.

However, he went on, despite the difficulties, it is a nettle that organisations like his definitely need to grasp.

“I think there will always be a place for relational data, but I think that as we look more and more around the organisation now, we’re identifying requirements where, perhaps, we’re not modelling and storing and using our data as best we can, because we’ve had to always stick with the one representation of our data,” he said.
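For readers wondering what that first step away from a single, fixed representation might look like, below is a minimal sketch contrasting a fixed-schema relational table with a schemaless, document-style record. It is purely illustrative: the Python code, station identifiers and field names are our own invention and bear no relation to the Met Office’s actual systems.

```python
import sqlite3
import json

# Relational approach: one fixed schema for every observation.
# Adding a new attribute later (say, wind gusts) means altering the
# table and updating every consumer that reads it.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE observations (
        station_id    TEXT,
        observed_at   TEXT,
        temperature_c REAL,
        rainfall_mm   REAL
    )
""")
conn.execute(
    "INSERT INTO observations VALUES (?, ?, ?, ?)",
    ("EGLL", "2013-08-01T12:00:00Z", 22.4, 0.0),
)
print(conn.execute("SELECT * FROM observations").fetchall())

# Schemaless, document-style approach: each record carries its own
# structure, so richer or sparser observations can sit side by side
# without a schema migration.
documents = [
    {"station_id": "EGLL", "observed_at": "2013-08-01T12:00:00Z",
     "temperature_c": 22.4, "rainfall_mm": 0.0},
    {"station_id": "EGPF", "observed_at": "2013-08-01T12:00:00Z",
     "temperature_c": 17.1, "wind": {"speed_kt": 14, "gust_kt": 22}},
]
for doc in documents:
    print(json.dumps(doc))
```

In practice the document-style records would live in a NoSQL store rather than in plain Python dictionaries, but the contrast is the same: the relational table commits you to one representation up front, while the document model lets each record describe itself.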