Optimised servers are the next step for big data analytics
Chroniclers of IT jargon have hesitated to downgrade big data's currency as an industry buzzword, but times appear to be changing. Enterprise IT planners are certainly aware that its status in business-enabling programmes is evolving, and of the technological developments driving that evolution. A bit of history helps clarify what's afoot – especially why big data has been waiting for vendors such as Lenovo to provide the platforms that enable it to deliver its full ROI potential.
At the start of the decade, the big data proposition was seen as something of a solution in search of an application. Business computing boffins knew that the voluminous datasets generated by global enterprise IT systems contained trend information and other useful statistics that could inform business intelligence (BI), reveal commercial opportunities and identify operational inefficiencies, among other valuable findings.
The drawback was that the standard data processing platforms of the time were hardly up to the task of tearing through the terabytes to retrieve the useful stuff at a cost that justified the investment. Even massed server farms took too long – by the time today's actionable insights were gleaned, they'd become yesterday's lost chances.
There were, however, enough pioneering big data projects – run on specialist high-performance systems in academic and scientific environments, then the only places with access to the computing power needed to process such data and derive meaning from the exercise – to keep the idea alive. Online forces like Google and Amazon were also starting to make massive compute power central to their business models.
Big data, big rethink…
Nonetheless, the potential for big data-derived business advantage continued to fascinate enterprises. This was particularly the case for those concerned by the ever-expanding datasets they were obliged to preserve at escalating storage costs, but which did not seem to be ‘earning their keep’ in terms of retention value.
Neither did the IT industry itself give up on the ultimate promise of big data, but it came to realise that, to achieve appreciable uptake among customers, the big data proposition had to be reconsidered. One realisation was that big data had been mislabelled as a solution, when it is really an infrastructural resource that delivers the most value in the service of other fields, such as artificial intelligence (AI) and machine learning (ML). On the data-processing side, in-memory computing (IMC) and real-time analytics (of the market-leading kind provided by SAP HANA, as powered by Lenovo) increasingly came to provide the extra layers of the ‘stack’ that bring the big data proposition alive.
Faster analytics, speedier insights
Real-time analytics is key to understanding the technological forces that transform big data into something more valuable: ‘fast data’. Fast data is the application of big data analytics to datasets in near-real time or real time – analysing data as it arrives, rather than after it has been stored – to extract greater business value. Allied to effective BI systems, fast data enables decision-makers to action insights more speedily, and thereby win competitive advantage.
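To make the pattern concrete, here is a minimal sketch of the fast-data idea in plain Python: a rolling metric is updated the moment each event arrives, rather than computed later from a stored dataset. The event stream, the 60-second window and the revenue field are all hypothetical stand-ins for whatever a real deployment would use.

import time
from collections import deque

WINDOW_SECONDS = 60   # hypothetical analysis window

window = deque()      # (timestamp, revenue) pairs currently in the window
running_total = 0.0

def on_event(timestamp, revenue):
    """Ingest one event and return the up-to-the-moment rolling revenue."""
    global running_total
    window.append((timestamp, revenue))
    running_total += revenue
    # Evict events that have aged out of the window.
    while window and window[0][0] < timestamp - WINDOW_SECONDS:
        _, expired = window.popleft()
        running_total -= expired
    return running_total  # the insight is available as the event lands

# Simulated stream of three sales events.
for ts, rev in [(time.time(), 19.99), (time.time() + 1, 5.00), (time.time() + 2, 42.50)]:
    print(f"rolling 60s revenue: {on_event(ts, rev):.2f}")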
Matt Turck, VC at FirstMark, says the real-time convergence of big data, fast data, analytics and intelligence adds up to the emergence of “a new stack”. This is where big data technologies are used to handle core ‘data engineering’ requirements, and AI/ML is used to extract value.
“Big data provides the pipes, and AI provides the smarts,” Turck says. “This symbiotic relationship has existed for years, but its implementation was available only to a privileged few.”
This ‘privileged few’ refers to the academic and scientific researchers mentioned earlier, but Turck believes startups and Fortune 1000 companies will now leverage this new stack model.
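As a toy illustration of that division of labour – sketched here in Python, with made-up figures and a deliberately simple model standing in for a real pipeline – the ‘pipes’ stage shapes raw records into clean features, and the ‘smarts’ stage extracts value from them.

import numpy as np
from sklearn.linear_model import LinearRegression

# The 'pipes': a stand-in data-engineering step that filters and shapes
# raw records (in reality, this layer would be Hadoop/Spark pipelines).
raw = [("2017-01", "1200"), ("2017-02", "1350"), ("2017-03", None),
       ("2017-04", "1580"), ("2017-05", "1710")]
clean = [(i, float(v)) for i, (_, v) in enumerate(raw) if v is not None]
X = np.array([[i] for i, _ in clean])   # month index as the feature
y = np.array([v for _, v in clean])     # monthly revenue as the target

# The 'smarts': an ML model extracting value from the engineered data.
model = LinearRegression().fit(X, y)
print(f"projected revenue next month: {model.predict(np.array([[5]]))[0]:.0f}")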
And indeed they are. IDC forecast worldwide big data and analytics (BDA) revenues of more than $150 billion for 2017, an estimated increase of 12.4 per cent over 2016. Purchases of BDA-related hardware, software and services are expected to maintain a compound annual growth rate (CAGR) of 11.9 per cent through 2020, when revenues will be more than $210 billion.
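The two projections are consistent, as a quick back-of-the-envelope check confirms: compounding $150 billion at 11.9 per cent over the three years to 2020 gives roughly $150bn × 1.119³ ≈ $210bn.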
Servers are innovation engines
With investment levels like these, big data analytics solutions like those from Lenovo and its partners must deliver big-time ROI and drive tangible insights that bring quantifiable business advantage. But a driver is only as good as the engine powering the vehicle: the better the engine is tuned to the task, the better the car performs and the better the driver can respond.
This is a fancy way of explaining the advantages that Lenovo servers – such as the System x3550 M5 and System x3650 M5, and the ThinkSystem SR630 and ThinkSystem SR650 ranges – have in the big data, analytics and AI/ML spaces.
First, in terms of sheer power, Lenovo servers have proved their superiority in benchmarks and other field tests. Lenovo's client roster also attests to the confidence many organisations place in the hardware, from market-leading businesses to world-changing research institutes.
High uptime just as critical
Server superiority is not only about speed and workload optimisation – it's also about reliability. The advantages of real-time analytics bring risks of their own: the more value that's whizzing through your analytical systems, the more there is to lose should a system fault cause downtime. As Rod Lappin, SVP/Global Sales & Marketing, Lenovo Data Centre Group, explains, downtime can be incredibly costly, and downtime in real-time big data analytics can prove the costliest of all.
On top of power and reliability, servers should be optimised for the workloads they are assigned. To gain better business insights, the volume, variety and velocity of data must all be managed while analytics are applied. It's essential that the underlying hardware platform comes optimised for the task, so that performance and CAPEX/OPEX efficiencies can be fully realised.
Lenovo partners with leading big data and analytics platform providers – Apache, Cloudera, Hortonworks, IBM BigInsights and MapR – and provides reference architectures through which the business benefits of data analytics can be fully exploited. Analytics ‘engines’ running on Lenovo's best-of-breed solution building blocks can tackle the gigantic task of taming big data analytics for the benefit of any organisation. Their reliability also enables best use of valuable human expertise in this burgeoning IT discipline.
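For a flavour of what such an analytics ‘engine’ looks like in practice, here is a minimal sketch using Apache Spark, the processing engine at the heart of several of the platforms named above. The input path, column names and aggregation are hypothetical placeholders; a production deployment would follow the appropriate Lenovo reference architecture.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal Spark job: roll raw event data up into a BI-ready summary.
# The path and column names below are hypothetical placeholders.
spark = SparkSession.builder.appName("bda-sketch").getOrCreate()

events = spark.read.csv("hdfs:///data/sales_events.csv",
                        header=True, inferSchema=True)

# Aggregate raw events by region: total and average order value.
summary = (events
           .groupBy("region")
           .agg(F.sum("order_value").alias("total_revenue"),
                F.avg("order_value").alias("avg_order_value"))
           .orderBy(F.desc("total_revenue")))

summary.show()  # in production, this output would feed a BI dashboard
spark.stop()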