Prescriptive Analytics – Business Efficiency Redefined.
The central problem for most businesses is this: how best to use a limited set of resources, given various constraints, so that some measure of performance (usually profit) is maximized. The resources of a retail business, for example, will include shelf space, staff, warehousing and cash. Constraints will include legal working practices, the maximum throughput of various stores, and the need to keep customers happy (minimum waiting times at the checkout, for example). The measure of performance will revolve around profit, but can also include measures of customer satisfaction, employee churn and so on. Obviously this is simplified in the extreme, but it gives a feel for the nature of the problem.
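To make the shape of the problem concrete, here is a minimal sketch using SciPy's linear programming solver. The two products, their margins and the resource figures are invented purely for illustration:

```python
# A toy version of the resource-allocation problem: maximize profit from
# two hypothetical products subject to shelf space and staff-time limits.
# All numbers are illustrative, not drawn from any real retailer.
import numpy as np
from scipy.optimize import linprog

margin = np.array([4.0, 6.5])        # profit per unit of each product

A_ub = np.array([
    [0.5, 1.2],   # shelf metres consumed per unit
    [0.1, 0.3],   # staff hours consumed per unit
])
b_ub = np.array([400.0, 90.0])       # total shelf metres, total staff hours

# linprog minimizes, so we negate the margins to maximize profit.
result = linprog(-margin, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
print("units to stock:", result.x)
print("maximum profit:", -result.fun)
```

Real problems differ from this sketch mainly in scale: thousands of products, stores and constraints rather than two of each.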
Once upon a time the technology and methods used to solve problems of this nature came under the heading of operations research, but that label has become unfashionable, and so it has been renamed prescriptive analytics. Optimization of resource allocation, given a set of constraints and the need to maximize some measure of performance, is nothing new. The methods date back to the Second World War and saw rapid adoption in the 1950s. However, much has changed since then, most importantly the availability of data captured from business operations and from external sources (economic and social data, for example). We also have much more powerful computers and algorithms, and so the scale and complexity of the optimization problems that can be addressed are much greater. It is not uncommon to find problems that involve hundreds of thousands, or even millions, of decision variables and constraints.
Make no mistake, however: optimization is an enterprise undertaking. If the operational activity of a business is to be optimized, it will need buy-in from senior management and cooperation between people with diverse skill sets. While the technology has evolved to the point where some optimization problems can be executed in near real-time, the success of such a project depends not on technology, but on an in-depth understanding of the business and an ability to translate this understanding into a model that is amenable to optimization. A solution that is optimal for the business will often be sub-optimal for various departments, and so there needs to be sponsorship at the highest level to smooth ruffled feathers.
Until recently optimization was a lengthy affair. Building a model might take many months or even years, with compute time measured in days or weeks. Nor was this the end of it, since many optimal solutions might exist and each had to be checked to ensure it would actually work. Today it is still the case that model creation is the major part of the exercise, but it might be measured in weeks or a few months instead of years. It was also the case that data was scarce, and so models were built using best approximations (of how the demand for a product might vary with price, for example). Today we have the data, and the most exciting aspect of optimization is the ability to feed real parameters into a model on a near real-time basis – thus optimizing business operations as conditions change.
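As a sketch of what that looks like in practice, the toy model below is re-solved each time fresh parameters arrive. The fetch_latest_margins function is a hypothetical stand-in for a live data feed from operational systems:

```python
# Sketch of near real-time re-optimization: the model structure is fixed,
# but its coefficients are refreshed from live data before each solve.
# fetch_latest_margins() is a hypothetical stand-in for a real data feed.
import numpy as np
from scipy.optimize import linprog

A_ub = np.array([[0.5, 1.2], [0.1, 0.3]])   # resource usage per unit
b_ub = np.array([400.0, 90.0])              # available resources

def fetch_latest_margins():
    # Placeholder: simulate margins drifting as market conditions change.
    return np.array([4.0, 6.5]) + np.random.normal(0, 0.2, size=2)

for _ in range(3):  # e.g. one solve per pricing update
    margin = fetch_latest_margins()
    result = linprog(-margin, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
    print("margins", margin.round(2), "-> plan", result.x.round(1))
```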
On the technology front we now have a wide range of algorithms that address a much broader class of problems. In the early days it was necessary to assume that everything was linear, so the relationship between the cost of a component and the quantity used was linear in nature (double the quantity used and we double the cost). Today we can use nonlinear models, models with integer variables (the number of factories to build for example – since we can’t build half a factory!), and variations on these. Even more exciting is the prospect of combining predictive analytics with optimization. A predictive model may be used to forecast sales, raw material prices, or anything else that is amenable to predictive modeling. Since these models often produce probabilities, their outputs can be fed into an optimizing model using stochastic optimization – finding the best use of resources given the probabilities of various situations arising.
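The sketch below combines both ideas in miniature: an integer decision (how many factories to build) plus two demand scenarios weighted by probability, so the objective is expected profit. It uses SciPy's mixed-integer solver, and every figure in it is invented for illustration:

```python
# Toy two-stage problem: choose an integer number of factories now, then
# production per demand scenario; maximize probability-weighted profit.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

p = np.array([0.6, 0.4])           # scenario probabilities (high, low demand)
demand = np.array([250.0, 120.0])  # demand in each scenario
margin, cap, build_cost = 0.2, 100.0, 10.0  # $M/unit, units/factory, $M

# Variables: [n_factories, x_high, x_low]; milp minimizes, so negate profit.
c = np.array([build_cost, -p[0] * margin, -p[1] * margin])
integrality = np.array([1, 0, 0])  # only the factory count is integer

# Production in each scenario cannot exceed capacity: x_s - cap*n <= 0
capacity = LinearConstraint(np.array([[-cap, 1, 0],
                                      [-cap, 0, 1]]), -np.inf, 0)
bounds = Bounds([0, 0, 0], [5, demand[0], demand[1]])

res = milp(c=c, constraints=capacity, integrality=integrality, bounds=bounds)
n, x_hi, x_lo = res.x
print(f"build {n:.0f} factories; expected profit ${-res.fun:.1f}M")
```

Note how the second factory pays for itself under the high-demand scenario even though it sits partly idle under the low-demand one; that trade-off is exactly what the probability weighting resolves.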
Optimization was once a very stand-alone activity. There was little or no integration with operational systems and other analytical activities. The main challenge today is to integrate with the various data sources and with operational applications. The profusion of data presents its own problems, since it becomes necessary to establish which data is actually important to a model. Data mining can be used to aid in this task, as can other forms of business analytics (data exploration and visualization for example).
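As an illustration of that winnowing step, the sketch below ranks candidate inputs by feature importance using a random forest. The data is synthetic; in practice the candidate drivers would come from the operational and external sources described above:

```python
# Sketch of using data mining to decide which inputs matter to a model:
# rank candidate drivers of demand by feature importance. Synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))               # price, promo, weather, noise
demand = 50 - 8 * X[:, 0] + 5 * X[:, 1] + rng.normal(size=500)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, demand)
for name, score in zip(["price", "promotion", "weather", "noise"],
                       model.feature_importances_):
    print(f"{name:10s} importance {score:.2f}")
```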
Prescriptive analytics is the key component in making analysis actionable. Other forms of analysis may highlight anomalies, show trends, establish correlations and so on, but it is prescriptive analytics, and specifically optimization, that puts this analysis to good use by specifying exactly how resources should be deployed for maximum effect.
Suppliers of optimization technology include FICO and IBM, which are probably the frontrunners for large scale optimization problems, simply because of their integrated toolsets and application integration capabilities. However, depending on the complexity and scale of the optimization task, other suppliers are also good options that should be explored.