Tableau Prepares for Future with Data Preparation and Collaboration


More than 13,000 self-described “data and visualization nerds” gathered in Austin, TX, recently for Tableau Software’s annual customer conference. In his inaugural keynote, Tableau’s new CEO, Adam Selipsky, said that nearly 9,000 were first-time attendees. I was impressed with the enthusiasm of the customers who had gathered for the event, cheering as company officials reviewed product plans and demonstrated new features. This enthusiasm suggests Tableau has provided capabilities that resonate with its users. Among other things, the company used the conference to outline a number of planned product enhancements.

Not resting on its laurels as a data visualization software provider, Tableau demonstrated a variety of new capabilities for displaying data. Enhanced tooltips are common to several of the new features. For example, visualization within visualization displays a second graph on top of another as the user hovers over data points. Hover lines, which are vertical lines on charts, show a variety of data values, not just the value at that point in the chart. Interactive tooltips showing values used to filter the displayed data can highlight items with similar characteristics in the charts or graphs.

Mapping capabilities have been enhanced so that layers of different data sets or different types of location data can be displayed simultaneously. Tableau executives also demonstrated a natural-language query interface that uses machine-learning techniques to interpret concepts such as “near” or “expensive.” All in all, these interface enhancements make it easier for users to get more information as they navigate through data.
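To make the idea concrete, a vague term such as “expensive” can be resolved against the distribution of the data itself, and “near” against a distance threshold. The minimal Python sketch below is my own illustration of that general technique, not Tableau’s implementation; the thresholds and values are assumptions.

```python
import statistics

# Toy resolvers for vague query terms; the cut-offs here are
# assumptions for illustration, not Tableau's actual heuristics.
def resolve_expensive(prices):
    """Treat 'expensive' as values above the 75th percentile."""
    cuts = statistics.quantiles(prices, n=4)  # quartile cut points
    threshold = cuts[2]                       # 75th percentile
    return [p for p in prices if p > threshold]

def resolve_near(points, origin, radius_km=5.0):
    """Treat 'near' as within a fixed radius (assumed 5 km here)."""
    def dist(a, b):
        # Simple Euclidean distance on projected coordinates;
        # a real system would use geodesic distance.
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    return [p for p in points if dist(p, origin) <= radius_km]

prices = [12, 95, 40, 310, 58, 220, 75]
print(resolve_expensive(prices))                  # -> [310]
print(resolve_near([(0, 0), (3, 4), (10, 10)],
                   origin=(0, 0)))                # -> [(0, 0), (3, 4)]
```

The point of deriving “expensive” from the data rather than a fixed number is that the interpretation adapts to whatever data set the user is exploring.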

Data preparation is a hot topic, as I have written. For its part, Tableau is adding self-service data preparation through an effort called Project Maestro. Data preparation will be a stand-alone application that enables users to access and prepare data prior to visualization. With interactive capabilities to join tables, split fields and profile data, users can be more confident that the data is ready for the analyses they want to perform. Having long focused on business users, Tableau is also trying to appeal to IT by introducing new governance capabilities to Tableau Server. As data sources are published, they can be reviewed and certified: IT personnel can ensure that the proper joins, security rules and performance optimizations are incorporated, and business users can trust the content on which they rely. The governance features will also include usage-frequency reporting and impact analysis to identify where data sources are used.
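To make the join, split and profile operations mentioned above concrete, here is a minimal pandas sketch of the kind of preparation involved. It is illustrative only, with hypothetical table and column names; it is not Project Maestro.

```python
import pandas as pd

# Hypothetical source tables standing in for data a user would access.
orders = pd.DataFrame({"order_id": [1, 2, 3],
                       "cust": ["A-101", "B-202", "A-103"],
                       "amount": [250.0, None, 75.5]})
customers = pd.DataFrame({"cust_id": ["A-101", "B-202", "A-103"],
                          "region": ["East", "West", "East"]})

# Join the tables on the customer key.
joined = orders.merge(customers, left_on="cust", right_on="cust_id")

# Split a field: "A-101" -> segment "A", customer number "101".
joined[["segment", "cust_num"]] = joined["cust"].str.split("-", expand=True)

# Profile the data: null counts, types and summary statistics per column.
print(joined.isna().sum())
print(joined.dtypes)
print(joined.describe(include="all"))
```

The profiling step is what gives users the confidence the paragraph above describes: it surfaces missing values and type problems before the data reaches a visualization.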

In some other areas Tableau is catching up with the competition, adding features that users value but that its products previously lacked. One example is alerts. Another is collaboration, including threaded discussions in which analyses can be embedded. I’ve written about the value of collaboration for years. Our data and analytics in the cloud benchmark research shows that more than half (52%) of organizations use or intend to use collaboration with analytics. Users should welcome these new features, but Tableau could make collaboration even more valuable by adding tasks or assignments so that analysis can lead to action, not just discussion. Tableau also continues to round out its cloud capabilities, but like some other vendors, its cloud implementation has not achieved parity with the desktop product. As part of its cloud efforts, Tableau announced a Linux version of its server; Linux is a common operating system for cloud-based deployments.

Tableau will also be introducing a new data engine based on its acquisition of Hyper earlier this year. Tableau uses an in-memory data engine to accelerate the speed of interacting with data, but the current engine has posed two limitations that the company hopes to address with Hyper: scalability and near-real-time capabilities. During the conference the company demonstrated a development version of Hyper loading more than 300,000 rows per second. The Hyper architecture allows data loading to happen simultaneously with querying, eliminating downtime. Hyper will appear first in Tableau’s cloud offerings as a proving ground and later in the on-premises versions.
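The architectural point is that queries do not have to wait for loads to finish; they see progressively more data as loading proceeds. The toy Python sketch below illustrates that pattern in the simplest possible terms; a locked list stands in for what in Hyper is a far more sophisticated in-memory engine.

```python
import threading
import time

# Toy model of concurrent load-and-query; illustrative only.
store = []
lock = threading.Lock()
done = threading.Event()

def loader(batches):
    """Append batches of rows while queries continue to run."""
    for batch in batches:
        with lock:
            store.extend(batch)
        time.sleep(0.01)  # simulate load pacing
    done.set()

def querier():
    """Run aggregate queries against whatever has loaded so far."""
    while not done.is_set():
        with lock:
            total, count = sum(store), len(store)
        if count:
            print(f"rows={count} avg={total / count:.1f}")
        time.sleep(0.02)

batches = [[i] * 1000 for i in range(10)]
t1 = threading.Thread(target=loader, args=(batches,))
t2 = threading.Thread(target=querier)
t1.start(); t2.start()
t1.join(); t2.join()
```

Each query reflects the rows loaded so far, which is the “no downtime” behavior the Hyper demonstration emphasized.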

The conference allowed Tableau to polish its core strengths of visualization and usability. The company promises a set of valuable enhancements as well as some features that will help it keep up with the competition. Most of these new features are scheduled to be released in 2017. Tableau has also advanced its mobile capabilities to address gaps I identified in the 2016 Ventana Research Mobile Business Intelligence Value Index, making it easier to consume analytics on smartphones and tablets. Tableau has a loyal following of customers that allows it the luxury of time to deliver these capabilities. You may want to check them out, too, especially the clever visualization techniques being added.

Regards,

David Menninger

SVP & Research Director

Follow Me on Twitter @dmenningerVR and Connect with me on LinkedIn.

Research Agenda: Using Business Analytics to Make the Most of Data in 2016


Throughout our research in 2016, we’ll be exploring ways in which organizations can maximize the value of their data. Ventana Research believes that analytics is the engine and data is the fuel to power better business decisions. Several themes emerged from our benchmark research on incorporating data and analytics into organizational processes, and we will follow them in our 2016 Business Analytics Research Agenda:

  • Analytics enabling continuous optimization
  • Streamlining analytics with data and information technology
  • Digital technologies transforming analytics for business

Organizations generate data continuously, and they should analyze and refine it continuously (that is, optimize it) to improve their actions, decisions and processes. Our research shows predictive analytics to be a key tool for such optimization and indicates the convergence of analytics with applications to accomplish these goals. Nearly half (49%) of participants in our next-generation predictive analytics benchmark research said they prefer to deploy their predictive analytics within business applications to enable real-time execution. Finding that more than nine out of 10 (92%) participants plan to deploy more predictive analytics, we will monitor the role of data science and analytics teams in these activities. We will also follow how the exploration of big data sets intersects with the discovery portion of analytic processes.
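As a simple illustration of what deploying predictive analytics within a business application can look like, the sketch below trains a model offline and then scores each transaction inside an application function in real time. The library, features, labels and threshold are my assumptions for illustration, not findings of the research.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Train offline on historical data (features and labels are made up:
# monthly spend, support calls -> hypothetical churn label).
X_train = np.array([[1200, 3], [400, 1], [2500, 7], [150, 0], [1800, 5]])
y_train = np.array([1, 0, 1, 0, 1])
model = LogisticRegression().fit(X_train, y_train)

# Embed scoring in the application so each transaction is evaluated
# in real time rather than in a separate batch process.
def handle_transaction(spend, support_calls):
    score = model.predict_proba([[spend, support_calls]])[0, 1]
    if score > 0.5:  # action threshold is an assumption
        return f"flag for retention outreach (risk={score:.2f})"
    return f"no action (risk={score:.2f})"

print(handle_transaction(2000, 6))
```

Embedding the scoring call in the transaction path, rather than scoring overnight in batch, is what enables the real-time execution that survey participants said they prefer.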

Organizations need to use data and information technology better to streamline analytics. Our research suggests that they will need to invest in data preparation for analytics, as three out of five (62%) participants identified accessing and preparing data as a challenge in their predictive analytics process. The same percentage (62%) cited accessing and integrating data as a main reason for dissatisfaction with their analytic processes. Organizations can address some of these challenges in gaining access to information by virtualizing data. We also will follow the progress of organizations in using cloud-based technologies to help streamline their analytics; more than three-fourths (76%) of participants in our data and analytics in the cloud benchmark research said that accessing data from cloud-based systems is important. We will continue to investigate and report on these topics in the course of planned benchmark research on data preparation.

Other digital technologies will affect analytics for business. More than four-fifths (82%) of organizations in our data and analytics in the cloud research can access and review data and analytics on mobile devices. Simplification of mobile technologies can speed access to an organization’s information, yet access alone is not sufficient. We will track developments in collaboration across the organization to enable action on analytics. The Internet of Things (IoT), the network of devices, vehicles, buildings and other items that are embedded with electronics, software, sensors and connectivity, brings data-in-motion challenges due to the combination of large volumes of information and requirements for low-latency processing. We will be studying these topics in our Internet of Things and Operational Intelligence benchmark research. Finally, we will follow the evolution of natural-language processing and cognitive systems as methods for searching for and presenting information to ordinary business users.

We’ll be studying these issues and more throughout the year. Please download and review our full Business Analytics agenda. I invite you to participate in this research as it is conducted during the year. I look forward to sharing the insights it provides and to helping your organization apply those insights to its business needs.

Regards,

David Menninger

SVP & Research Director