Information Builders vs Tableau Summary
The only similarity between Information Builders and Tableau is that both platforms support the creation of charts and dashboards. Tableau is highly focused on a purely visual analytics environment, and its whole architecture is geared around the creation and sharing of visual artifacts. It does not provide a programming language and has no ambitions beyond visual analytics. Information Builders, on the other hand, provides a business intelligence platform in WebFOCUS, data integration capability and a level of application integration. WebFOCUS does support a language, and this adds considerable power to the platform, although the threshold at which hand coding becomes necessary can be uncomfortably low.
Tableau is the easier platform to use, and is wholly geared toward visual analytics. It does not support paginated reporting, or what is sometimes referred to as legacy BI functionality – although businesses do still need conventional reports. There are definite limitations associated with Tableau, and complex analytics may require additional products. However, since visual analytics is currently in vogue, Tableau scores much higher than Information Builders on features such as collaboration, ease of use and the management of the visual objects environment.
Information Builders will be of more interest to very large organizations with integration issues and a need to address conventional reporting. It is to some extent a one-stop shop and is better compared with the likes of IBM Cognos and MicroStrategy. However, its broad capability comes at the price of being something of a ‘jack of all trades and master of none’. It is quite feasible that businesses would want to use Information Builders for ‘legacy’ BI needs, and Tableau for self-service visual analytics. Ultimately it depends on need.
Information Builders goes back a long way – to 1975. In its early days it provided an application language and database known as FOCUS. It has since moved on and now delivers a BI platform, data mining tools, integration technology, and various applications. It is one of the world’s largest privately held technology businesses, and as such numbers are hard to come by. The latest revenue figures are from 2007 at $315 million. One can only assume that revenues are now in excess of a billion dollars annually. In 2001 it launched a wholly owned subsidiary called iWay Software, which provides data and process integration products exploiting service-oriented architecture (SOA).
WebFOCUS is the product most people are familiar with, representing most of the BI functionality we have come to expect. It supports the creation of charts and dashboards, as well as more traditional reporting, and serves a range of users, from developers through to self-service business users. The platform is very scalable, well governed and comes with a flexible scripting language. Business users are reasonably well catered for, although a common complaint is that more sophisticated needs can only be met by reverting to code. The interface is graphical in nature, with drag and drop placement of various artifacts. The platform does not distinguish itself particularly, but it does provide a good set of capabilities for everyday use. Integration with the R language means more complex forms of analysis are available – but it does of course mean users need familiarity with R. Predictive analytics is perhaps the most common use, with an ability to create scoring applications such as credit worthiness.
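The scoring applications mentioned above are typically fitted in R (for example with `glm`) and then invoked from the BI layer. A minimal sketch of what such a credit scorer computes is shown below; the feature names and weights are invented for illustration and do not come from any Information Builders product.

```python
import math

# Hypothetical model weights, invented for illustration. In practice
# these would be fitted in R and the model called via the R integration.
WEIGHTS = {"income": 0.00002, "debt_ratio": -3.0, "late_payments": -0.8}
BIAS = 1.5

def credit_score(applicant: dict) -> float:
    """Return a probability-like creditworthiness score in [0, 1]."""
    z = BIAS + sum(WEIGHTS[k] * applicant[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))  # logistic link function

applicant = {"income": 55_000, "debt_ratio": 0.35, "late_payments": 1}
print(round(credit_score(applicant), 3))  # prints 0.679
```

The logistic link keeps the output in the 0 to 1 range, so it can be read as a probability of repayment and thresholded into accept/reject bands.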
Since the creation of reports and visuals is wholly reliant on good quality data, Information Builders offers Omni-Gen as a platform for data integration, quality and master data management. It comes in three editions. The Integration Edition supports application and data integration capabilities. The Data Quality Edition, as the name suggests, addresses data completeness, validity, timeliness and accuracy. The Master Data Management Edition supports the creation of a single version of the truth.
No BI platform would be complete without some ability to address ‘big data’. The iWay Big Data Integrator simplifies the creation, management, and use of Hadoop-based data lakes, providing a native approach to Hadoop-based data integration and management. It runs on all major Hadoop distributions, ensuring high portability, and ingests and cleanses traditional, mobile, social media, sensor, and other data in batch or streams, using native Hadoop facilities. It also runs under YARN, taking advantage of Hadoop’s performance and resource negotiation, and leverages the Spark processing engine where available.
iWay Tools represent a complex set of capabilities addressing B2B integration and data integration.
Without doubt Tableau set the pace for easy-to-use data visualization and exploration software. In practical terms this means business users can get to their data, typically without assistance from IT, and create graphs, charts and dashboards in a way that is most meaningful to them. Authoring takes place on Tableau Desktop which, as a stand-alone environment, can perform its own analysis, either against the Tableau in-memory database, or against external data sources – databases, cloud data sources, spreadsheets and so on. In a group or enterprise setting Tableau Server acts as a central facility for data access, delivering visualizations, enforcing security and managing user access. Tableau Server distributes visualizations through the web browser to almost any device – desktop or mobile.
The architecture of Tableau Server is scalable, as is well demonstrated by the free Tableau Public service, where millions of visualizations (albeit simple ones) are served up every day. The platform does support some level of extensibility, particularly the coding of bespoke applications that are not natively supported, but users have to resort to XML code to achieve this.
One of the more intriguing aspects of Tableau is its integration with the analytic language R. It is such a stark contrast – the easy-to-use Tableau product set, and the not-so-easy-to-use R programming language. Even so it does give advanced users and programmers the ability to add other forms of analysis into the Tableau environment, particularly statistical analysis and predictive analytics. This contrasts with some of the competition (Spotfire particularly) who, in addition to an easy-to-use visualization capability, also offer easy-to-use statistics and predictive analytics tools.
I set out by saying that Tableau set the pace, but in reality it is now at least equalled by several other products. Qlik Sense and Spotfire have both been reengineered for an easy-to-use experience, and there are cloud-based products such as Sisense and GoodData. And of course we should not forget Microsoft’s latest foray into the world of data visualization and exploration with Power BI Designer. It’s immature, but it will be disruptive.
As with most platforms of this type, Tableau presents a drag and drop data exploration interface. It is Tableau Desktop, which can be installed on Mac and PC, that provides the visualization authoring environment. It provides most of the chart types and tabular representations a user might need, with intelligent assistance during the visualization creation phase. Tableau Desktop serves multiple purposes in addition to authoring: it allows users to manipulate metadata, and to publish a workbook (a complete visualization package that can be executed by Tableau Server).
Users of Tableau Desktop can elect to load data into the columnar, in-memory, compressed database. Provided the data fits, it’s very fast – although data can also be cached on disk with an inevitable degradation in performance. This has become almost a standard way of delivering fast desktop analytics, and it’s very effective. If high performance analytics databases such as HP Vertica are used, then the user can connect directly to these.
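The reason a compressed columnar store can hold so much data in memory is that repeated values in a column are stored only once. The toy sketch below illustrates the idea with dictionary encoding; it is a conceptual illustration only, not how Tableau's engine is actually implemented.

```python
# Toy illustration of dictionary encoding, the kind of trick a
# columnar in-memory store uses: each distinct value in a column
# is stored once, and rows hold only small integer codes.

def dictionary_encode(column):
    """Encode a column as (unique values, per-row integer codes)."""
    values = sorted(set(column))
    index = {v: i for i, v in enumerate(values)}
    return values, [index[v] for v in column]

region = ["East", "West", "East", "East", "North", "West"] * 1000
values, codes = dictionary_encode(region)

print(len(region))  # 6000 raw cells in the column
print(len(values))  # only 3 distinct strings actually stored
```

Aggregations and filters can then run over the compact integer codes, which is also why columnar engines scan so quickly.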
Tableau Server users are presented with ready-made workbooks displaying dashboards and reports. These are not static entities, and provide all the facilities for data manipulation a user might wish to perform – drill-down and drill-through, for example. Accessing data sources is fairly straightforward, and it is a simple matter to blend data from several data sources.
The support for geographic data is particularly well regarded by Tableau users, and finds wide application where location is an important part of the information set.
Tableau will handle almost any form of data – databases, cloud data sources, OLAP cubes (with some limitations), big data databases, Excel spreadsheets – and so on. It also allows users to combine data from as many data sources as necessary. There are two basic forms of data access in Tableau. The first is the live connection, where Tableau issues dynamic SQL or MDX (for OLAP cubes) directly to the data source. The second is the in-memory database – a highly compressed columnar engine that, because of the compression factor, can hold very large amounts of data.
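The difference between the two access modes can be sketched in a few lines. Below, Python's built-in sqlite3 stands in for the external data source; this is a conceptual sketch of the distinction, not Tableau code.

```python
import sqlite3

# sqlite3 stands in for the external data source.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE sales (region TEXT, amount REAL)")
src.executemany("INSERT INTO sales VALUES (?, ?)",
                [("East", 100.0), ("West", 250.0), ("East", 50.0)])

# Live connection: dynamic SQL is issued directly against the source
# every time the user interacts with a view.
live_total = src.execute("SELECT SUM(amount) FROM sales").fetchone()[0]

# In-memory engine: the data is copied out once, laid out column-wise,
# and all subsequent analysis runs against the local copy.
rows = src.execute("SELECT region, amount FROM sales").fetchall()
amount_column = [amt for _, amt in rows]  # columnar layout
extract_total = sum(amount_column)

print(live_total, extract_total)  # prints 400.0 400.0
```

The trade-off is the usual one: live connections always reflect the source but pay a round trip per query, while the local copy is fast and self-contained but only as fresh as its last refresh.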
Extracts are a major feature in Tableau, and as the name suggests they are ready-built subsets of data, possibly drawn from much larger databases. They are stored in the columnar database, and most data sources can be treated in this way, with the exception of OLAP cubes. The sharing of packaged workbooks depends on these extracts.