5 Big Data Trends Impacting Financial Institutions in 2016
The adoption of industry standards and more mature platforms will shift big data’s focus from IT-driven infrastructure projects to business-driven data solutions. Those who adopt big data strategies early and aggressively will realize operational efficiencies and top-line growth.
In 2016, we will see solutions to many of big data's early challenges and the emergence of powerfully differentiated strategies from organizations that leverage their big data assets. Here are some key developments to look for in 2016.
- The Emergence of Powerful Big Data Use Cases
One of the challenges with big data solution adoption has been a disconnect between business and IT. In many instances, IT has led the way, building out a big data infrastructure and adopting a myriad of new tools, often without the context of a specific business problem. The result is frequently a solution looking for a problem to solve.
Smarter organizations have taken a different approach, building solutions to specific business problems or building a data-as-a-service offering that gives the business the flexibility to select the tools it needs to solve its specific problem. In 2016, we will see far more of both approaches.
Some of the key use cases driving big data adoption include compliance, regulatory risk reporting, cyber security and trade surveillance. In 2016, we will see increased interest in revenue-generating use cases such as customer 360.
- The Smart (Semantic) Data Lake
In 2015, we saw the emergence of the data lake — a single store for all enterprise data characterized by the ability to collect vast amounts of data in its native, untransformed format at a very low cost.
The data lake offers much promise but it also has limitations. Cataloging data sources, harmonizing disparate data and adding meaning to the data continue to be challenging for many organizations. In 2016, we will see the data lake evolve into the smart data lake: a data lake enriched with semantic models that describe, link and add business meaning to the underlying data.
- Democratization of Data Access
Smart data lake tools also solve another challenge with data lakes: end-user access. Most data lake solutions require manual coding to transform and prepare data for consumption by BI tools. With a smart data lake, the semantic models used to add meaning to the data can also provide critical end-user capabilities such as data cataloging, data meaning, data provenance and self-service data analytics.
Data described by semantic models does not presuppose the queries and analytics it needs to support. The semantic descriptions enable end users to find the data they need and to query it in business terms, without any coding.
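The idea above can be sketched in a few lines. This is a minimal toy illustration, not a real smart data lake implementation: it assumes data landed as subject-predicate-object triples and a hypothetical semantic model that maps business terms onto source-system predicates, so an end user queries by business vocabulary rather than by physical schema.

```python
# Toy sketch of semantic-model-driven querying (illustrative only;
# production smart data lakes use RDF stores and SPARQL).

# Raw data from two source systems, landed untransformed as triples:
triples = [
    ("cust:001", "crm:full_name", "Ada Lovelace"),
    ("cust:001", "core:acct_bal", 1200.50),
    ("cust:002", "crm:full_name", "Alan Turing"),
    ("cust:002", "core:acct_bal", 87.00),
]

# Semantic model: business vocabulary mapped to source-system predicates.
# A real model would also capture relationships, hierarchies and provenance.
semantic_model = {
    "customer name": ["crm:full_name"],
    "account balance": ["core:acct_bal"],
}

def query(business_term):
    """Resolve a business term via the semantic model and return
    (subject, value) pairs -- the user never touches source schemas."""
    predicates = semantic_model.get(business_term, [])
    return [(s, o) for (s, p, o) in triples if p in predicates]

print(query("customer name"))
# [('cust:001', 'Ada Lovelace'), ('cust:002', 'Alan Turing')]
```

Because the mapping lives in the semantic model rather than in hand-written transformation code, adding a new source system means extending the model, not rewriting every downstream query.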
- Broad Deployment of Big Data Solutions to Mid-Sized Organizations
Thus far, the complexity and immaturity of the tools required to implement big data solutions have kept them mainly in the domain of large, technically sophisticated organizations. There is also a perception that "big" is the most important dimension of big data. However, variety is an equally important dimension, and organizations of all sizes have variety in their data.
- The Rise of Big Data Governance
A recent survey by the EDM Council highlighted that we are at an inflection point for data governance in the financial services industry. Most organizations have recognized the need for enterprise data governance and have set, or are setting, their strategy. BCBS 239 is the key driver for these programs, but measurement of their business value is still in its infancy. In 2016, big data initiatives will create even more demand for data governance, as processes, controls and security, currently applied in data silos, will need to be enforced on the shared enterprise data lake.