The Business Intelligence Productivity Paradox


You may or may not be aware that the use of information technology in business has never correlated well with productivity. This goes back a long way, to Robert Solow's famous 1987 remark: “You can see the computer age everywhere but in the productivity statistics.” More recently I knew someone who analyzed in great detail organizations' IT spend against their success, and found no correlation. His research was treated as a skunk at an otherwise very lucrative party, so he abandoned it and joined the party.

Fast forward to today, and once again productivity is stagnant despite huge investments in big data, analytics, business process management and the rest. There is an unwelcome truth associated with technology, one Nate Silver touches on in The Signal and the Noise: when businesses get hold of new technologies, they tend to misuse them at first, and this drags productivity down. It is no accident that the productivity paradox, as it came to be called, was noticed in the 1970s and 1980s, just as businesses started to adopt minicomputers and then the PC. Silly things were done that had almost no bearing on reality (complex statistical and analytical models, for example), and the overall effect was negative. Today we see the same phenomenon. Everyone is a ‘data ninja’, producing largely meaningless analysis with very easy-to-use visual tools and twisting the facts to suit their own purposes.

I have to admit that I have been emboldened to write this by the growing number of authors prepared to offer a sane analysis of the big data and analytics fantasy. One such article is The stupidity of Business Intelligence and why this ‘hot’ sector needs an A.I. overhaul, written by the former CTO of a technology company. Meaningful business analytics may simply be beyond the unaided human intellect. Sure, there is low-hanging fruit, but surely most of it has been picked by now. We face increasing numbers of data sources, increasing data complexity, proliferating analytical techniques, and the ridiculous notion that by playing around with charts and graphs we can somehow get a handle on it all. Artificial intelligence is needed to plough through the combinations of data, analytical techniques and parameters, and to separate the relatively small number of meaningful signals from the ocean of meaningless ones. Fortunately such products are starting to appear: BeyondCore (now snapped up by Salesforce) and a product called Stories, both of which take the grunt work out of looking for meaningful patterns in data.
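The "ocean of meaningless signals" is not a rhetorical flourish; it follows from basic statistics. A minimal sketch (my own illustration, not from the article or from any of the products named above) shows the multiple-comparisons problem that swamps brute-force pattern hunting: test enough variable pairs in pure noise and hundreds of them will look "significant".

```python
# Sketch: why brute-force "pattern hunting" in data mostly finds noise.
# With enough variable pairs, purely random data yields many correlations
# that pass a naive significance test -- the multiple-comparisons problem.
import numpy as np

rng = np.random.default_rng(0)
n_vars, n_obs = 100, 50
data = rng.standard_normal((n_vars, n_obs))   # pure noise: no real signal at all

corr = np.corrcoef(data)                      # all pairwise correlations
pairs = np.triu_indices(n_vars, k=1)          # each of the 4,950 pairs once
strong = np.abs(corr[pairs]) > 0.28           # roughly the p < 0.05 cutoff for n = 50

print(f"{strong.sum()} 'significant' correlations out of {len(pairs[0])} pairs")
```

With 100 random series there are 4,950 pairs, and around 5% of them, a couple of hundred spurious "discoveries", clear the naive threshold. A human with a charting tool cannot tell these from real relationships by eye, which is exactly the filtering job the AI-assisted products claim to do.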

It seems quite likely that naive analytics is doing more harm than good, and indeed I have spoken with senior managers who see the danger. They are not at all happy at the prospect of a good fraction of their employees producing unskilled analysis, particularly when they should be doing other things. The success of easy-to-use analytics tools is easily understood: it boils down to personal drivers rather than business need.

The next few years will see something of a shakeout, as the adverse productivity implications of widespread analytics use become better understood, and as the technology itself, in the form of AI-assisted analysis, makes such laborious manual work unnecessary.

Whether we like it or not, machines will soon be analyzing our data far more efficiently and far more meaningfully than even the most skilled human effort. Resistance will be futile.