7 Reasons Big Data Analytics initiatives fail
Every business worth its multi-million-dollar tagline wants to understand Big Data Analytics and leverage it. What I have realized is that it takes not only skill but also finesse to understand and appreciate Big Data in all its beauty, and to derive true and timely business value from it.
Of the many reasons Big Data initiatives could fail, I have picked out seven. Bear in mind that these are drawn purely from my own experience in the field. For better context, I have grouped the reasons under three themes: organizational quagmire, old habits dying hard, and new approaches creating new problems.
Organizational Quagmire
- Queue of Unabated Requests
An investment bank elevated a lead technologist to Chief Data Officer. On taking up office, she was inundated with requests.
Here’s a look at the top five requests:
- The CRO wanted to address regulatory and risk management gaps using Big Data solutions.
- The CMO wanted a deeper understanding of customers to inform marketing spend and drive revenue through personalization.
- The COO wanted to benchmark operations against the competition and build dashboards to optimize and improve efficiencies.
- Trading Managers sought solutions around portfolio optimization, automated research and similar needs.
- The CIO wanted to reduce costs by leveraging artificial intelligence for production monitoring and other technology operations.
Big Data can address a range of problems, but a steering group to funnel, prioritize, filter and sequence requests becomes critical. Otherwise, there’s a risk that the Big Data initiative becomes the ‘panacea’ for all the organization’s problems.
- Lacking Synergy and Accountability
A retail company’s IT team was structured horizontally, with sub-teams for program management, business analysis, user experience, deployment and database management. Each sub-team had its own lead and sub-culture. A little further down the line, growing awareness of and interest in Big Data led the company to add three more sub-teams: data science, data engineering and data visualization.
Soon, a high-profile Big Data project came along, and people from each of the IT sub-teams were assembled to collaborate and deliver a solution. However, communication between the individual groups was poor, and with each hand-off, a lot was lost in translation.
A lean-startup delivery mode, coupled with collective responsibility for the product, could have mitigated this scenario. It also helps to ensure strong vertical synergy among the Data Scientists, Data Engineers, Data Visualizers and Business Analysts on the project.
- Poor Business User Analysis and Engagement
The IT arm of a retail organization sponsored a prestigious Data Analytics group. The group quickly gained free rein within the organization because the business viewed it as an ‘IT project’ and did not intervene. Six months went into building sophisticated models for demand forecasting and inventory management.
When a demo was arranged with the business users, an entirely predictable response ensued: ‘How does this model compare with our older model, archaic as it is?’ and ‘The new model is not really necessary. The old model does what we want, at least for now.’
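A side-by-side backtest against the incumbent model is the natural answer to that first question, and it is cheap to prepare before the demo. Below is a minimal, hypothetical sketch in Python; the demand data and both ‘models’ are stand-ins, not the retailer’s actual systems.

```python
# A hypothetical benchmark: new demand-forecasting model vs. the incumbent
# on the same held-out weeks. All data and both "models" are stand-ins.
import numpy as np

rng = np.random.default_rng(42)
actual = rng.poisson(lam=100, size=52)  # one year of weekly demand (synthetic)

def old_model(history):
    """Incumbent rule of thumb: next week's demand equals this week's."""
    return history[:-1]

def new_model(history):
    """Stand-in for the new model: a 4-week moving average."""
    return np.convolve(history, np.ones(4) / 4, mode="valid")[:-1]

def mape(forecast, truth):
    """Mean absolute percentage error; lower is better."""
    return np.mean(np.abs(forecast - truth) / truth) * 100

# Align each forecast with the weeks it actually predicts.
print(f"old model MAPE: {mape(old_model(actual), actual[1:]):.1f}%")
print(f"new model MAPE: {mape(new_model(actual), actual[4:]):.1f}%")
```

Walking business users through a comparison like this, in their own error metric, reframes the demo from ‘here is a new model’ to ‘here is what you gain by switching’.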
- Careless Positioning Statements
The aforementioned retail company could also have suffered from misinterpreted internal PR.
Quite often, Big Data Analytics tools are meant to provide inferences such as trading recommendations, process and asset recommendations, and more. This crosses paths with what senior business analytics and functional experts also work on, so it is important to be prudent when positioning these tools for the larger organization.
People should see Data Analytics as a force multiplier rather than a threat. This helps create the right human-machine symbiosis and interactions.
Old Habits Die Hard
- Falling for Big Data, Unprepared
A product company’s CTO, placing Big Data at the fulcrum of his discretionary investment, secured a sizeable budget for data initiatives through sheer passion. The CTO invited popular Big Data firms to present demos, and the company, blown away, bought the best-looking algorithm for 50% of its hardware budget.
A year down the line, the company brought ten new customers on board and data volume increased a hundredfold. But the company’s infrastructure could now support only 10% of its use cases, and the CTO was caught between a rock and a hard place.
New Approaches Create New Problems
- Expanding Without Foresight
A heavy equipment manufacturer wanted to use predictive analytics for effective lead generation. It planned to aggregate wear-and-tear telematics data from on-field heavy equipment along with machinery specifications and customer service schedules. This information would be used to predict service visits and recommend preventive maintenance interventions to customers.
The manufacturer then decided to use brute force to integrate the accumulated data into the legacy Data Warehousing and Business Intelligence infrastructure. They wanted to build a stored procedure engine for preventive predictions.
The result was an engine seizure.
The reason is that it is never going to be enough to re-label a company’s DW/BI investments as Big Data Analytics. Big Data is a specific set of tools meant to solve a specific set of problems, and a deeper data mindset needs to be introduced into the organization when it expands its offerings to include the likes of predictive analytics.
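To make the contrast concrete, here is a minimal sketch of how the same preventive-maintenance prediction might be expressed in a distributed framework such as Apache Spark, rather than as a row-by-row stored procedure. The paths, column names and thresholds are hypothetical illustrations, not the manufacturer’s actual setup.

```python
# A hypothetical sketch: the preventive-maintenance prediction as a
# distributed PySpark job instead of a warehouse stored procedure.
# Paths, schemas and thresholds are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("preventive-maintenance").getOrCreate()

# Telematics files can be read where they land; they need not be forced
# through the legacy warehouse first.
telematics = spark.read.parquet("s3://telematics/wear-and-tear/")  # hypothetical path
specs = spark.read.parquet("s3://machinery/specifications/")       # hypothetical path
schedules = spark.read.parquet("s3://crm/service-schedules/")      # hypothetical path

# Aggregate wear indicators per machine, then join with specifications
# and service history.
features = (
    telematics.groupBy("machine_id")
    .agg(
        F.avg("wear_index").alias("avg_wear"),
        F.max("operating_hours").alias("hours"),
    )
    .join(specs, "machine_id")
    .join(schedules, "machine_id")
)

# Flag machines trending past an assumed share of their rated wear limit
# and overdue against their service interval.
at_risk = features.filter(
    (F.col("avg_wear") > 0.8 * F.col("rated_wear_limit"))
    & (F.col("hours") > F.col("service_interval_hours"))
)

at_risk.select("machine_id", "customer_id", "avg_wear").write.mode(
    "overwrite"
).parquet("s3://analytics/preventive-maintenance-leads/")
```

The point is not the specific code but the mindset: the data stays distributed, the computation moves to it, and the workload can grow with data volume instead of seizing inside the warehouse.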
- Building Organizational Trust in Data Products and Algorithms
An oil major wanted Big Data Analytics to provide insights and optimize their oil pipeline and storage assets. One of their quantitative experts developed an ingenious algorithm that pulled in all pipeline and storage capacity and consumption details plus demand and supply scenarios. Using this information, the algorithm presented recommendations.
For the leads to trust the algorithm and the results it promised, however, the Data team had to develop detailed and creative data visualizations. This requires extensive pre-planning and proper instrumentation of the code and pipeline to record the right intermediate snapshots.
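As an illustration of the kind of instrumentation this calls for, here is a minimal, hypothetical sketch: a decorator that persists each pipeline stage’s output as a snapshot, so a visualization can later walk stakeholders through how a recommendation was reached. The stage logic below is a placeholder, not the oil major’s algorithm.

```python
# A hypothetical sketch of pipeline instrumentation: every stage writes an
# intermediate snapshot that visualizations can replay later. The stages
# and numbers are illustrative placeholders.
import json
import os
import time
from functools import wraps

SNAPSHOT_DIR = "snapshots"
os.makedirs(SNAPSHOT_DIR, exist_ok=True)

def snapshot(stage_name):
    """Persist a stage's output to disk before passing it downstream."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            path = os.path.join(SNAPSHOT_DIR, f"{stage_name}-{int(time.time())}.json")
            with open(path, "w") as handle:
                json.dump(result, handle, default=str)
            return result
        return wrapper
    return decorator

@snapshot("capacity")
def load_capacity():
    # Placeholder for pulling pipeline and storage capacity details.
    return {"pipeline_capacity_bpd": 400_000, "storage_capacity_bbl": 2_000_000}

@snapshot("recommendation")
def recommend(capacity):
    # Placeholder logic; the snapshot preserves both inputs and output so a
    # visualization can show leads why the recommendation was made.
    utilization = capacity["pipeline_capacity_bpd"] / capacity["storage_capacity_bbl"]
    return {"inputs": capacity, "action": "rebalance" if utilization > 0.15 else "hold"}

recommend(load_capacity())
```

Planning these snapshot points up front is far cheaper than retrofitting them once the leads start asking ‘why?’.

Via Prakash Kini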