Informatica Navigates Carefully to Broader Data Management


This has been a dramatic year for Informatica, a major provider of data integration software. In August it was acquired and taken private by Permira funds and Canada Pension Plan Investment Board for about US$5.3 billion. The change in ownership was accompanied by shifts in management: CEO Sohaib Abbasi became chairman and has since left, many executives were replaced, and Anil Chakravarthy moved from chief product officer to CEO. The new owners appear to have shifted the company’s strategic priorities toward profitability, with reduced headcount, to deliver a return on the purchase investment. Despite these changes, during the past six months Informatica has made key product announcements that will shape its future and the future of data management.

First, Informatica sharpened its focus on master data management (MDM) through a set of new applications announced in September of this year. Along with version 10.1 of the Informatica MDM platform, it announced Customer 360, Supplier 360 and Cloud Customer 360 for Salesforce, which join the already available Product 360. The “360” moniker reflects Informatica’s positioning that these applications can collect data from any interaction or system and thus provide a 360-degree view to support the corresponding business functions. I hope this focus will guide its advances in integrating and processing data within the lines of business. Some of these areas are complicated; for example, Customer 360 requires getting interaction data from contact center systems such as those of Genesys, NICE Systems, Verint and others. Previously this has not been a priority for Informatica, but it is required to get a true 360-degree view of customers. This is confirmed in our next-generation customer engagement benchmark research, which finds that organizations support as many as 17 channels of interaction and that the most often reported issue in supporting multiple channels is difficulty in integrating systems.

Release dates vary for these new 360 applications, as do their capabilities to support mastering and integration of data. Product 360 is available now, and it earned Informatica a Hot Vendor rating in our 2015 Value Index on Product Information Management. It has advanced support for integration through templates that work with e-commerce products including Demandware, IBM WebSphere Commerce and Oracle ATG, and it can monitor data for competitive product and pricing changes. In 2014 we selected Informatica customer Geiger as the Leadership Award Winner in Overall Information Technology. At this point, however, with the expanding set of MDM applications and altered priorities after the acquisition, we cannot know to what extent Informatica will maintain its investment in and development priorities for Product 360; this will be worth watching as a gauge of the company’s focus going forward.

In another area, Informatica recently announced new capabilities in its support for cloud computing. It appears to be focused on supporting its partner salesforce.com, particularly the need for lines of business to move data in and out of its environments, integration with Salesforce Wave Analytics and the new Salesforce IoT Cloud for the Internet of Things. Informatica has also advanced its ability to ensure data privacy through Informatica Cloud Test Data Management.

In recent public conversations with me at Informatica World, top company executives showed a lack of support for and knowledge of the ecosystems of cloud business applications from major providers including Infor, Oracle, SAP and others that have seen significant growth in customers adopting and deploying their applications, though two of these three top-level Informatica executives are no longer with the company. Informatica seems to be placing a strategic bet on aligning mostly with salesforce.com; here again only time will tell how well this narrow objective compares with those of companies that take a more diverse approach to business applications in the cloud. On the other hand, Informatica has released more prebuilt connectors for cloud systems, mostly focused on marketing, such as those for Eloqua, MailChimp and Salesforce Marketing Cloud (formerly ExactTarget).

Informatica has progressed in integration with analytics tools, as noted above with Salesforce Wave Analytics; it also demonstrated integration with Domo’s cloud-based analytics product at Informatica World. Our analysis is that in each of the lines of business Informatica has greater potential if it focuses on business use of cloud-based applications. Our data and analytics in the cloud research finds that the top external data source priority in three out of five (61%) organizations is business applications.

Informatica’s most important recent announcement is the release of Informatica v10, which is the foundation of its products not just for data integration but for data management overall. With this new version, the packaging options have shifted to a choice of Premium, Advanced and Standard editions to suit the needs and budgets of organizations. The highest level, Premium, includes data validation, monitoring and sophisticated data transformation; the midrange Advanced edition has increased capabilities in scaling, metadata and glossary as well as a real-time option. V10 of Data Quality has advanced profiling with visualizations, workflow and voting for feedback, and it has improved rules, monitoring and matching capabilities along with better administration through partitioning, versioning and parameterization. All of these are essential for enterprise-class data quality processing. V10 of Data Integration Hub supports publish-and-subscribe interactions, management of cloud and on-premises systems, secure FTP, and Hadoop for storage and indexing.
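To make the publish-and-subscribe model concrete, here is a minimal sketch in Python of how a hub decouples the systems that produce data from those that consume it. This is a generic illustration of the pattern, not Informatica’s API; the IntegrationHub class, topic name and records are hypothetical.

```python
from collections import defaultdict
from typing import Callable, Dict, List

class IntegrationHub:
    """Toy hub: producers publish records to named topics and consumers
    subscribe to topics, so neither side needs point-to-point connections."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        # Register a consuming system's callback for a topic.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, record: dict) -> None:
        # Deliver the record to every subscriber of the topic.
        for handler in self._subscribers[topic]:
            handler(record)

hub = IntegrationHub()
hub.subscribe("customer_updates", lambda rec: print("CRM consumer received:", rec))
hub.subscribe("customer_updates", lambda rec: print("warehouse consumer received:", rec))
hub.publish("customer_updates", {"id": 42, "email": "jane@example.com"})
```

The appeal of the hub model is that adding another consumer is a new subscription rather than another point-to-point interface to build and maintain.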

On yet another front, Informatica has advanced its big data management technologies, taking a comprehensive approach that goes beyond big data integration to support data quality, governance and security of data in Hadoop deployments. Our 2014 research on big data integration identified issues in these areas that create difficulty and increase costs in using Hadoop, which the research found is at the top of the list of plans for adopting big data technology. Informatica’s new release supports dynamic mapping and smarter execution of jobs in Hadoop environments and adds Live Data Map, a metadata catalog for data assets. In addition, Informatica has created Blaze, a big data environment that runs in-memory and utilizes YARN and Apache Spark to optimize computation and processing of data. The most valuable element, I believe, is that design and development can be done in one computing environment while deployment can be done in a range of execution systems including Hadoop. Flexibility in a distributed data architecture is critical for any enterprise deployment that spans on-premises and cloud environments.
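As a rough illustration of that design-once, deploy-anywhere idea, the sketch below uses Apache Spark’s Python API as a stand-in execution engine (not Informatica Blaze itself): the same transformation logic can run locally on a developer’s machine or be submitted to a Hadoop/YARN cluster simply by changing the master setting. The file path, column name and matching rule are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def build_session(master: str) -> SparkSession:
    # "local[*]" runs the job on the developer's machine; "yarn" submits the
    # identical logic to a Hadoop cluster; the transformation code is unchanged.
    return SparkSession.builder.master(master).appName("dedupe-demo").getOrCreate()

def deduplicate_customers(spark: SparkSession, path: str):
    df = spark.read.parquet(path)                      # local disk or HDFS path
    return (df.withColumn("email", F.lower(F.trim(F.col("email"))))
              .dropDuplicates(["email"]))              # simplistic matching rule

if __name__ == "__main__":
    spark = build_session("local[*]")                  # switch to "yarn" on the cluster
    deduplicate_customers(spark, "customers.parquet").show(5)
```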

I see significant potential in Informatica Secure@Source, which addresses data security. Informatica has been advancing security across its products from monitoring to masking, but its focus on data security with this product has real promise. It earned a Ventana Research Technology Innovation Award for Overall Information Technology in 2015. This product aims to ensure appropriate levels of discovery in data assets through notifications, analytics and visualization, capabilities in which IT management should invest. Hopefully Informatica will continue to invest in data security as its initial improvements appear to be heading in the right direction.

I have been involved in the database industry for three decades, from being a database architect to helping launch the data warehouse and OLAP markets to now analyzing the market for 15 years, and I am impressed by Informatica’s efforts to help IT organizations manage diverse database portfolios. Informatica finally has moved beyond integration alone to management, a simple-seeming but critical shift that takes advantage of what its product portfolio actually does and provides what its customers expect. Informatica continues to expand the range of data management challenges it addresses for IT and now for lines of business. In data preparation, its Informatica Rev product has promise but will need better marketing and sales to the analysts and operations teams that so far are adopting this type of product from other providers.

Looking to the near future, Informatica has an opportunity to position itself and further unify its products for supporting IoT, which includes streams and processing of events for a wide range of operational and analytical needs. It has demonstrated how it can process log data and parse data for Amazon technologies and has depth in its near-real-time, zero-latency Ultra Messaging products. Expanding into this set of data processing requirements is natural for Informatica with its range of processing and management tools.

Technologically, Informatica is poised to meet the diversity of data integration process needs that operate across on-premises and cloud computing in just about any configuration today. Our data and analytics in the cloud research identifies the growth in exchange of data between cloud systems and with on-premises ones, which more than half of organizations said they will increase into 2017. Stepping up its commitment to data management is a subtle but critical shift for Informatica. While it will be under more management scrutiny on its priorities and its contribution to financial growth to meet its new owners’ expectations, those owners will need to realize that investment is still required not just to innovate but also to help its customers succeed while hopefully exceeding expected profit margins.

I expect people to come into and leave the organization throughout 2016 as employees determine whether the new Informatica is right for them. But Informatica has a vocal set of customers that appreciate its technology; as a case in point, we selected its customer CIT Group and BJ Fesq for our 2015 Leadership Award in Overall Information Technology. Its move to a subscription pricing model for its on-premises software, from the past license-and-maintenance approach, should bind customers to it more tightly while giving them ready access to the latest capabilities; this is a wise step that is occurring widely across the software industry. As organizations consider how best to manage and use data assets located in any application or system, Informatica remains a vendor to keep on the short list of those that can fit an organization’s needs for managing and processing data assets.

Regards,

Mark Smith

CEO and Chief Research Officer

Integration of Big Data Involves Challenges


Big data has great promise for many organizations today, but they also need technology to facilitate integration of various data stores, as I recently pointed out. Our big data integration benchmark research makes it clear that organizations are aware of the need to integrate big data, but most have yet to address it: In this area our Performance Index analysis, which assesses competency and maturity of organizations, concludes that only 13 percent reach the highest of four levels, Innovative. Furthermore, while many organizations are sophisticated in dealing with the information, they are less able to handle the people-related areas, lacking the right level of training in the skills required to integrate big data. Most said that the training they provide is only somewhat adequate or inadequate.

Big data is still new to many organizations, and they face challenges in integrating big data that prevent them from gaining full value from their existing and potential investments. Our research finds that many lack confidence in processing large volumes of data. More than half (55%) of organizations characterized themselves as only somewhat confident or not confident in their ability to accomplish that task. They have even less confidence in their ability to process data that arrives at high velocity: Only 29 percent said they are confident or very confident in that. In dealing with the variety of big data, confidence is somewhat stronger, as more than half (56%) declared themselves confident or very confident. Assurance in one aspect is often found in others: 86 percent of organizations that said they are very confident in their ability to integrate the variety of big data are satisfied with how they manage the storage of big data. Similarly, 91 percent of those that are confident or very confident in their data quality are satisfied with the way they manage the storage of big data.

Turning to the technology being used, we find only one-third (32%) of organizations satisfied with their current data integration technology, but twice as many (66%) are satisfied with their data integration processes for loading and creating big data. A substantial majority (86%) of those very confident in their ability to integrate the needed variety of big data are satisfied with their existing data integration processes. Those that are not satisfied said the process is too slow (61%), analytics are hard to build and maintain (50%) and data is not readily available (39%). These findings indicate that making a commitment to data integration, for big data and otherwise, can pay off in confidence and satisfaction with the processes for doing it. Additionally, organizations that use dedicated data integration technology (86%) are satisfied much more often than those that don’t use dedicated technology (52%).

New types of big data technologies are being introduced to meet expanding demand for storage and use of information across the enterprise. One of those fast-growing technologies is the open source Apache Hadoop and commercial enterprise versions of it that provide a distributed file system to manage large volumes of data. The research finds that currently 28 percent of organizations use Hadoop and about as many more (25%) plan to use it in the next two years. Nearly half (47%) have Hadoop-specific skills to support big data integration. For those that have limited resources, open source Hadoop can be affordable, and to automate and interface with it, adopters can use SQL in addition to its native interfaces; about three in five organizations now use each of these options. Hadoop can be a capable tool to implement big data but must be integrated with other information and operational systems.
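For readers unfamiliar with how a SQL interface sits on top of Hadoop, here is a minimal sketch using Spark SQL as one common example (Hive and other SQL-on-Hadoop engines play a similar role). The HDFS path, table name and columns are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-on-hadoop-demo").getOrCreate()

# Expose a file stored in the distributed file system as a queryable table ...
orders = spark.read.parquet("hdfs:///data/orders")
orders.createOrReplaceTempView("orders")

# ... and query it with plain SQL instead of Hadoop's native interfaces.
daily_totals = spark.sql("""
    SELECT order_date, SUM(amount) AS total_amount
    FROM orders
    GROUP BY order_date
    ORDER BY order_date
""")
daily_totals.show(10)
```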

Big data is not found only in conventional in-house information environments. Our research finds that data integration processes are most often applied between systems deployed on-premises (58%), but more than one-third (35%) are integrating cloud-based systems, which reflects the progress cloud computing has made. Nonetheless, cloud-to-cloud integration remains least common (18%). In the next year or two, 20 to 25 percent of organizations plan additional support for all types of integration; those being considered most often are cloud-to-cloud (25%) and on-premises-to-cloud (23%), further reflecting movement into the cloud. In addition, nearly all (95%) organizations using cloud-to-cloud integration said they have improved their activities and processes. This finding confirms the value of integration of big data regardless of what types of systems hold it. With a growing number of organizations using cloud computing, data integration is a critical requirement for big data projects; more than one-quarter (28%) of organizations are deploying big data integration into cloud computing environments.

Because business units and their processes have an intense need for big data, integration requires IT and business people to work together to build efficient processes. The largest percentage of organizations in the research (44%) have business analysts work with IT to design and deploy big data integration. Another one-third assign IT to build the integration, and half that many (16%) have IT use a dedicated data integration tool. The research finds some distrust in involving the business side. Almost one in four (23%) said they are resistant or very resistant to allowing business users to integrate big data that IT has not prepared first, and the majority (51%) are somewhat resistant. For more than half (58%), the IT group responsible for BI and data warehouse systems also is the key stakeholder for designing and deploying big data integration; no other option is used by more than 11 percent.

It is not surprising that IT is the department that most often facilitates big data and needs integration the most (55%). The most frequent issue arising between business units and IT is entrenchment of budgets and priorities (in 42% of organizations). Funding of big data initiatives most often comes from the general IT budget (50%); line-of-business IT budgets (38%) are the second-most commonly used. It is understandable that IT dominates this heavily technical function, but big data is beneficial only when it advances the organization’s goals for information that is needed by business. Management should ensure that IT works with the lines of business to enable them to get the information they need to improve business processes and decision-making and not settle for creating a more cost-effective and efficient method to store it.

Overcoming these challenges is a critical step in the planning process for big data. My analysis that big data won’t work well without integration is confirmed by the research. We urge organizations to take a comprehensive approach to big data and evaluate dedicated tools that can mitigate risks that others have already encountered.

Regards,

Mark Smith

CEO and Chief Research Officer