Can data processing performance affect the bottom line?

24th August 2016 by Brytlyt

In a recent survey conducted by PwC, senior executives said they wanted decision-making to be “faster and more sophisticated”. Data analytics plays an important role in the enterprise by turning big data into a competitive advantage, and building awareness of that role helps maximise the return on investment in data analytics.

…and yet,

72% of respondents say they still struggle to perform sophisticated analytics. The logic for improving both speed and sophistication? The right investment in data analytics can develop capabilities that push the boundaries of efficiency.

In-memory databases are already shifting how enterprises gather, store, and aggregate data, but the fundamental objective of an analytics strategy should be how quickly an enterprise can improve its sophisticated decision-making. In one survey, 1,189 business leaders across numerous industries were asked whether they had an overall big data strategy; only 23.5% said they did.

According to the 2,100 executives surveyed by PwC, improving decision-making by 2020 entails two dimensions, which the survey defined as follows: speed means “getting the right information to the right places at the right time”, and sophistication means “applying the right level of insight to the right problem to create the right value”. Sophistication, however, is to a large extent a function of speed. Decision-making will therefore require a data processing system that is both truly fast and truly sophisticated.

Legacy systems with speed capabilities are one solution, but even at these speeds processing can still take hours, days or even weeks, and warehousing vast amounts of data is an unproductive strategy for an enterprise. For example, 20% of the Heads of Data Analytics interviewed said they struggle to overcome the challenges of working with legacy systems in their Fortune 500 companies. Warehoused data can be useful for forecasting, but it holds no value for effective real-time insight.

So what can truly drive ROI from data analytics? A platform with an agile in-memory database component that can integrate with any legacy system. With that agility, the platform can absorb data faster and apply analytics in real time. According to Michael Minelli, “It’s all about pure speed.” Giving analysts the speed to do on-the-fly business intelligence is what lets an enterprise realise the value of faster-performing analytics.

Embracing analytics technology on a faster-performing platform helps an enterprise establish a set of key differentiators against the competition. Of the 1,189 business leaders who recently participated in a survey, 23% reported an increase in efficiency, 16% improved their decision-making and 11% achieved financial savings. Leveraging speed and sophistication to improve data quality and value creation thus increases an enterprise’s market share.

The Brytlyt data processing platform uses Graphics Processing Units (GPUs) to massively accelerate data processing performance. Remarkably, benchmarking shows Brytlyt is an order of magnitude faster than the world’s fastest in-memory database.
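To give a feel for why this kind of acceleration matters, here is a small illustrative sketch (an analogy, not Brytlyt’s implementation): it contrasts aggregating values one row at a time with aggregating them in a single data-parallel operation, the style of work GPUs excel at. NumPy’s vectorised `sum` stands in here for hardware parallelism.

```python
# Illustrative sketch only: data-parallel hardware gains its speed by
# applying one operation to many values at once. NumPy vectorisation
# stands in for GPU-style parallelism in this analogy.
import time
import numpy as np

values = np.random.rand(2_000_000)

# Row-by-row aggregation, as a scalar loop might do it.
start = time.perf_counter()
total = 0.0
for v in values:
    total += v
loop_seconds = time.perf_counter() - start

# The same aggregation expressed as a single data-parallel operation.
start = time.perf_counter()
vector_total = values.sum()
vector_seconds = time.perf_counter() - start

print(f"loop:   {loop_seconds:.3f}s")
print(f"vector: {vector_seconds:.5f}s")
```

On typical hardware the vectorised version is orders of magnitude faster for the same result, which is the shape of the gap between row-at-a-time processing and massively parallel execution.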

Over the course of my next three posts I will look at how data processing performance can be augmented with key capabilities in integration, ease of use and extensibility to further lift ROI.