Tableau has become the most widely used visual analytics platform thanks to its relentless focus on the visual paradigm, excellent sales and marketing, and significant ongoing investment in product development. This focus on visual analytics means there is no scripting language in Tableau, and every effort has been made to ensure that functionality is available via menu systems and a drag-and-drop interface. However, the world of business analytics is moving fast and converging. If Tableau had insisted on a visual-only product, with no support for other forms of analytics, it would have found the market drifting away from it. More recently it has provided excellent integrations with R and Python, and is in the process of rolling out a beefier analytics database engine under the name of Hyper. It was quite well known that analysis involving several large data sets could cause Tableau scaling problems. Hyper will alleviate this issue in many, but not all, instances.
The visual environment provided by Tableau is relatively easy to use, although meaningful analysis will require some level of training. It also automates many common tasks, and while this is often useful it can sometimes assume too much – by treating a field as a measure, for example, when the user wants to treat it as a dimension. Even so, users generally like the attractive interface and visuals, and Tableau has recently put great emphasis on the mapping functionality within the product. Data preparation has often been cited as a weak point in Tableau, but here again the product will soon see the addition of extra capabilities. Maestro is the name given to this new functionality, and, true to style, Tableau is placing great emphasis on the visual interface so that users can see a real-time sample of data that reflects the transformations they are making.
Positioning
Tableau really grasped the moment when visual analytics became fashionable. When a vendor experiences something like this it is all too easy for it to believe that what fueled success three years ago will continue to do so. It doesn’t work like that in the largely fashion-driven IT industry. What was cool last year may be decidedly uncool next year. If Tableau were absolutely resolute in its belief that visual analytics will remain flavor of the month for the next five years, we might well see its fortunes change for the worse. The good news is that it does seem to accept that people may want to do more than look at graphs and maps, by providing good integrations with R and Python. It is also anticipating the growing need to handle greater data volumes and data streams with its Hyper database engine. But the reality for visual analytics is that it is the presentation front-end of analysis. Behind this needs to be robust analytical capability and a rich data environment. Tableau actually comes out quite well when we consider these trends, and its quest to sell very large numbers of licenses into businesses may be rather more successful when it is seen as a front-end rather than an analytical platform per se. Obviously there is a population of users in all businesses who need to slice and dice data on a regular basis, but most users simply want support for their day-to-day operational activities, and an analytical user interface fulfills this need. Just to add to the dynamics of this situation, the embedding of visuals into production applications is also becoming more popular, and so the visual interface is being disassembled to some extent as it is inserted into other applications. Again Tableau offers embedding capability, but it does not compete with the likes of Izenda or Logi Analytics – this was never its forte.
Overall Tableau is making some good moves, but the notion that large numbers of businesses will want hundreds or thousands of users slicing and dicing data on a regular basis is just misguided. Oddly enough, it is becoming very well positioned as an analysis front-end, in much the same sense that Microsoft Windows is simply a user-friendly front-end for an operating system. Compared with other platforms it continues to move up the ladder of sophistication, although products such as Spotfire and SAS Visual Analytics set the benchmark in this respect. The next few years could see a transformation in the positioning and technology stack offered by Tableau, and with it considerable acceleration in the growth of the company. The worst thing Tableau could do is continue beating the ‘everyone is a data hero’ drum – it was a sexy message in its day, but things are moving on. And finally, much analytical activity will be driven by AI within five years, so the need for hundreds of people slicing and dicing data will be reduced even further. Exciting times ahead, and treacherous waters for technology suppliers.
Product Versions
- Tableau Desktop can connect to a wide variety of data sources and, with the new Hyper data engine, offers considerably increased speed and the ability to handle much larger data volumes. Resulting dashboards can be shared using Tableau Server or Tableau Online. It comes in two editions: the Professional Edition supports more data connections and the sharing options mentioned previously, while the Personal Edition has fewer data connections, with results shared by packaging up the data and visualizations in a file.
- Tableau Server comes with additional facilities for managing a distributed analytics capability. It supports scheduling of data refreshes, authorization and authentication, and broadcasting of visualizations to the community of users. Mobile support includes native iPad and Android apps.
- Tableau Online is a cloud-hosted version of Tableau Server, accessed through a web browser.
- Tableau also offers Tableau Public, a web-based facility for creating visualizations that can then be embedded into a website, and Tableau Reader for viewing Tableau visualizations.
Advanced Analytics
All business intelligence platforms need to support more advanced forms of analytics, and Tableau has responded to this with the addition of clustering analysis within the product, and good support for R and Python.
Tableau’s cluster analysis – a method of segmenting data based on the values of various attributes – allows users to identify naturally occurring groupings in their data. This is often used in marketing to segment customers. Tableau will suggest what it considers the best clustering, although users can modify the parameters if needed.
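To make the idea concrete, the sketch below shows the general style of segmentation involved, using k-means in scikit-learn as a stand-in for Tableau’s own built-in implementation. The customer attributes, data and cluster count are purely hypothetical.

```python
# Illustrative only: a k-means segmentation of customers on two attributes,
# roughly the kind of grouping Tableau's built-in clustering produces.
# scikit-learn is used purely for this sketch; Tableau ships its own implementation.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical customer data: annual spend and order frequency, drawn from three groups
spend = rng.normal(loc=[500, 2000, 5000], scale=200, size=(100, 3)).T.ravel()
orders = rng.normal(loc=[2, 10, 25], scale=2, size=(100, 3)).T.ravel()
X = np.column_stack([spend, orders])

# Standardise the attributes so neither dominates the distance measure
X_scaled = StandardScaler().fit_transform(X)

# Ask for three clusters; Tableau suggests the number automatically but lets users override it
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_scaled)
print(np.bincount(labels))  # size of each customer segment
```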
Tableau’s support for R is quite extensive, and for users it can be made almost transparent. Data are passed to R via a simple script, and once R has completed processing, the results are passed back to Tableau for display. Users need to install Rserve, a TCP/IP server for processing R requests; the process is straightforward, and for performance reasons Rserve can be installed on a separate server. R analysis that proves to be useful can be incorporated into the Tableau environment in a transparent manner by making functions available on the functions menu, so users will not be aware that R is being called. This extends the analytic capability of Tableau considerably, and Tableau will make an initial suggestion for plotting the results returned from R, which can be modified if needed.
Late in 2016 Tableau announced TabPy, an API that enables evaluation of Python code from within a Tableau workbook. This gateway provides access to significant machine learning capabilities, such as those found in scikit-learn, and it is often the case that just a few lines of Python code can perform complex analysis which can then be displayed in Tableau. Once again, if a model proves to be useful it can be incorporated within Tableau and simply referred to by name, so the code remains invisible to users.
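As a rough illustration of how little Python is typically involved, the sketch below shows the kind of scikit-learn snippet TabPy might evaluate. It is not taken from Tableau’s documentation: the function name, column names and data are hypothetical, and in a real workbook the body would normally sit inside a calculated field that passes Tableau’s aggregated columns into the script. The R integration described above follows the same pattern, with the script dispatched to Rserve instead.

```python
# A sketch of the kind of Python that TabPy evaluates on Tableau's behalf.
# Tableau would normally pass each aggregated column into the script as a list;
# here the logic is a standalone function so it can be run and tested directly.
from sklearn.linear_model import LinearRegression

def fitted_sales(marketing_spend, store_footfall, sales):
    """Return fitted sales values from a simple two-variable regression."""
    X = list(zip(marketing_spend, store_footfall))
    model = LinearRegression().fit(X, sales)
    return list(model.predict(X))

# Hypothetical usage with three short columns of data
print(fitted_sales([10, 20, 30, 40], [100, 150, 170, 200], [1.2, 2.1, 2.9, 4.2]))
```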
The ability to generate a meaningful text narrative from data is also available in Tableau through the use of platforms such as Narrative Science and WordSmith. These natural language generation tools take the data behind a Tableau chart and analyze it for significance. Analysis of sales data, for example, might result in a narrative describing how sales compare with previous periods, which product groups have underperformed and which have exceeded expectations.
Hyper
The Hyper database, acquired by Tableau in 2016, is the gateway to better performance and the accommodation of streaming data. Hyper is just seven years old, having been developed at the Technical University of Munich. It possesses the rare quality of being able to handle data updates and insertions at the same time as queries. For now Tableau is focusing on query execution and is coy when pressed on the simultaneous update capability. However, it is clear that this would support real-time streaming of data, although Tableau would need modifications to be able to display such data in real time.
Hyper will replace the Tableau Data Engine, and it seems that for many queries users will see something around a ten-fold increase in query performance. These performance gains come from the nature of the Hyper data structures, but also from clever use of contemporary hardware, particularly NVRAM. Performance scales roughly linearly with additional cores, and distributed query processing is likely to be a future addition.
It is the concurrency of updates and querying that is the most interesting feature, and, like Apache Kudu, Hyper will usher in architectures where transactional activity and querying happen in an integrated environment. The Internet of Things is an obvious application, with its streaming data sources, but traditional applications such as ERP and CRM will also be able to stream data into Tableau without the need for a refresh. This is a very important addition to the Tableau product architecture.