Big data has been defined as data whose volume, variety and velocity cannot be handled by traditional database technologies. To this end a large number of database management systems, distributed database architectures and associated utilities have emerged which handle these requirements affordably. And yet handling the data is only a prerequisite for the real task of creating value. Achieving this requires a stack of analytics tools which not only exploit big data, but allow users to explore it in a manner that hides the considerable complexities. Indeed, complexity is the necessary price paid for handling massive volumes of more diverse data.
The four traditional methods of deriving value from data apply equally to big data, and in this sense there is nothing new here. These methods (business intelligence, business rules, predictive analytics and prescriptive analytics) serve different applications, with decision management providing the overarching infrastructure that ties them together:
- Business Intelligence has been democratized in recent years, with business users gaining direct access to the data and tools that support their own needs. Big data should not inhibit this process, and so tools are needed which give not only business users, but also analysts and data scientists, a broad range of capabilities to explore data and discover relationships within it. Hiding complexity is a key requirement here.
- Business rule management is concerned with the application of rules to operational activity, and in many organizations these rules number in the thousands. Since these rules are not only created manually but also result from activities such as data mining, their number can increase dramatically, and so a robust, high-volume business rules management system is essential (a minimal sketch of a rule set appears after this list).
- Predictive analytics, which uses data mining techniques to discover patterns in historical data, represents a core value creation method (see the fit-then-score sketch after this list). With the emergence of big data we are witnessing the creation of many more predictive models, and greater rigor is needed to manage these models effectively. This is required by regulators in many cases, but is also needed to ensure the accuracy, easy modification, and adequate monitoring and documentation of models. A failure to do this results in predictive model chaos, presenting real dangers to the organization.
- Prescriptive analytics uses optimization techniques to find the best deployment of resources given a set of constraints and objectives (a worked miniature follows this list). Big data broadens the use of these techniques considerably, supporting more complex optimization problems, which in turn drive greater efficiency.
- Decision management is an overarching infrastructure for solutions which accelerate the cycle from analyzing data to delivering operational decisions that improve efficiency and efficacy. It gives business experts greater control to manage and improve business strategies through the use of business rules, predictive modeling and optimization technologies.
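To make the business rules idea concrete, here is a minimal sketch in Python. The rule names, transaction fields and thresholds are invented for illustration; a real business rules management system adds authoring, versioning, conflict resolution and audit trails on top of this basic condition-action pattern.

```python
# A hypothetical rule set applied to individual transactions.
# Fields and thresholds are illustrative, not drawn from any
# particular business rules management system.
RULES = [
    ("flag_large_transfer",
     lambda txn: txn["amount"] > 10_000,
     "route to manual review"),
    ("waive_fee_for_loyal_customer",
     lambda txn: txn["customer_years"] >= 5 and txn["amount"] < 100,
     "waive transaction fee"),
]

def apply_rules(txn):
    """Return the actions triggered by every matching rule."""
    return [action for name, condition, action in RULES if condition(txn)]

print(apply_rules({"amount": 25_000, "customer_years": 2}))
# ['route to manual review']
```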
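The predictive pattern reduces to fitting a model on historical data and scoring new cases with it. The sketch below assumes scikit-learn is available; the churn features and figures are synthetic, chosen only to show the train-then-score cycle that model management must keep under control.

```python
# Fit a simple churn model on synthetic historical data,
# then score a new customer. Feature values are invented.
from sklearn.linear_model import LogisticRegression

# columns: [monthly_spend, support_calls]
X_history = [[20, 0], [35, 1], [15, 4], [50, 0], [10, 6], [40, 2]]
y_churned = [0, 0, 1, 0, 1, 0]  # 1 = customer churned

model = LogisticRegression().fit(X_history, y_churned)

# Probability that a new customer churns.
p_churn = model.predict_proba([[18, 5]])[0][1]
print(f"churn risk: {p_churn:.2f}")
```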
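And prescriptive analytics in miniature: a linear program that allocates production capacity between two products under resource constraints, using SciPy's linprog. All coefficients are invented; real problems may involve thousands of variables and constraints, which is where big data broadens what is tractable.

```python
# Choose production quantities x1, x2 to maximize profit
# subject to machine-hour and labour limits.
from scipy.optimize import linprog

# linprog minimizes, so negate the profit per unit (30*x1 + 40*x2).
c = [-30, -40]
A_ub = [[2, 3],   # machine hours per unit, limit 120
        [4, 2]]   # labour hours per unit,  limit 160
b_ub = [120, 160]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(result.x, -result.fun)  # optimal quantities and total profit
```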
Big data brings with it big analytics – otherwise the data will simply be underutilized. The analytics in turn enable the automation of decisions, and this is the new territory that is up for grabs in all businesses. Managers are becoming increasingly aware that efficiency and efficacy are no longer simply the province of transaction and process automation, and that decision automation can deliver very high returns on investment – in some cases an order of magnitude greater than traditional investments in systems and applications. Just as transaction and process automation require an integrated infrastructure, so do decision automation systems, and this will become a more pressing issue with the adoption of big data infrastructure and the proliferation of decision automation it will enable. Decision management provides the infrastructure, methods and disciplines required to address this.
The previous article in this series is Why Big Analytics?
The next article in this series is Big Analytics Methods