In its February 2010 edition, the Economist, through an ominously titled cover story, warned of the oncoming ‘data deluge’, highlighting the challenges and opportunities that were just starting to become apparent with the arrival of ‘Big Data’. Today, over half a decade later, research validates many of the concerns the Economist raised: enterprises have access to large amounts of customer data, yet the ability to turn it all into actionable information remains elusive for most. In a recent health industry survey (KPMG, May 2015), only 10% of the participating organizations reported using data analytics to its full potential.
What is causing this disconnect between the ready availability of customer data and the dearth of high-quality data analysis? The answer lies in what enterprises should recognize as the new normal of customer data.
CUSTOMER DATA MIX: THE NEW NORMAL
Not long ago, building a data warehouse meant integrating your Oracle, SAP, IBM and a handful of other on-premise structured data systems. While still a complicated endeavor, the limited number of data sources made it straightforward enough for most IT departments to tackle in-house with limited outside support.
The proliferation of mobile devices, applications and social media, combined with public, on-premise and hybrid cloud applications, has exponentially exacerbated the data handling challenge. Speed, volume and disparate data formats mean that the enterprise data landscape has fundamentally changed. The introduction of unstructured data (email, social apps, etc.) is just one aspect of this new data mix. Customer communication platforms do a decent job of generating multichannel output; however, there are no out-of-the-box solutions for collecting and integrating data from these disparate channels for enhanced engagement.
ENTERPRISE DATA HUB: A SINGLE SOURCE OF TRUTH FOR ALL CUSTOMER ENGAGEMENT
Traditional enterprise data, stored in relational database management systems, comes with quality and performance guarantees. Very large data sets, on the other hand, reside in the Hadoop Distributed File System (HDFS) and serve cloud applications. Before any meaningful insights can be gleaned, these disparate data sets need to be brought together to deliver a single source of truth.
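Conceptually, building that single source of truth means consolidating records from both worlds under one customer key, with the authoritative relational data taking precedence when the two sources disagree. The sketch below illustrates the idea in plain Python; the field names and in-memory records are hypothetical stand-ins (a real pipeline would pull from a JDBC source and HDFS, not lists).

```python
def build_single_source_of_truth(db_rows, hdfs_records, key="customer_id"):
    """Merge structured DB rows with semi-structured HDFS records into
    one record per customer, preferring DB values on field conflicts."""
    merged = {}
    # Apply high-volume, lower-trust HDFS records first...
    for rec in hdfs_records:
        merged.setdefault(rec[key], {}).update(rec)
    # ...then let authoritative DB values overwrite any conflicting fields.
    for row in db_rows:
        merged.setdefault(row[key], {}).update(row)
    return merged

# Hypothetical example: customer 1 appears in both sources with a
# conflicting 'segment' field; customer 2 exists only in HDFS data.
db_rows = [{"customer_id": 1, "name": "Acme Corp", "segment": "enterprise"}]
hdfs_records = [
    {"customer_id": 1, "last_click": "2015-05-01", "segment": "unknown"},
    {"customer_id": 2, "last_click": "2015-05-02"},
]

truth = build_single_source_of_truth(db_rows, hdfs_records)
```

Here `truth[1]` carries the DB's "enterprise" segment alongside the HDFS click data, and customer 2 is retained even though no DB row exists for it.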
Over the past year, the number of organizations that have implemented data-driven projects has increased by 125% (IDC, March 2015). However, recent studies (Robert Half, April 2014) indicate that the vast majority (96%) of IT directors found it either ‘somewhat’ or ‘very challenging’ to implement such solutions.
Apexon’s pre-built connectors, open API framework and custom web services, all delivered by a veteran team of data experts, provide a rapid solution for enterprise data acquisition, integration and transformation. With a 100-day data-to-output delivery model, Apexon’s experts, armed with a purpose-built analytics platform and deep customer communication management (CCM) experience, eliminate disruptive IT challenges to deliver solutions that reduce both cost and risk.
Want to learn more about Apexon? Consult with an expert here.