
VMblog DataCore Predictions: Parallel Processing Software Will be a ‘Productivity Disrupter’ and Game Changer in 2017

VMblog 2017 Virtualization and Cloud Prediction 
Contributed by George Teixeira, President and CEO, DataCore Software

Despite years of remarkable technology advances, much of today’s computing
power still sits idle. In 2017, the time is right for parallel processing
software to go mainstream and unleash the immense processing power of
today’s multicore systems, disrupting the economics and productivity of
what computing can do and where it can be applied.

New software innovations will make 2017 a breakout year for parallel
processing. The key is that the software must become simple to use and
non-disruptive to applications, so it can move from specialized use cases
to general application usage. The impact will be massive: application
performance, enterprise workloads and consolidation densities on virtual
platforms and in the cloud, long stifled by the growing gap between
compute and I/O, will no longer be held back. New parallel I/O software
technologies now available make this practical; they are easy to use,
require no changes to applications, and fully leverage the power of
multicore processors to dramatically increase productivity and overcome
the I/O bottleneck that has been holding back our industry. This is the
catalyst of change.

Parallel processing software can now move beyond specialized uses such as
HPC and genomics, which have focused primarily on computation, into the
broader world of applications that require real-time responses and
interactions. This includes the mainstream applications and storage that
drive business transactions, cloud computing, databases and data
analytics, as well as the interactive worlds of machine learning and the
Internet of Things (IoT).

The real driver of change is the economic and productivity disruption.
Today, many new applications such as analytics are impractical because
they require hundreds if not thousands of servers to get the job done;
yet each server is becoming capable of supporting hundreds of
multi-threaded computing cores, most of which sit idle, waiting for work
to do. We are ushering in an era where one server will do the work of 10,
or even 100, servers of the past. This will be the result of parallel
processing software that unlocks the full utilization of multicore
processors, leading to a revolution in productivity and making a new
world of applications affordable to mainstream IT in 2017.
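
To see why utilization is the lever, consider Amdahl’s law, which bounds
the speedup a workload can gain from n cores. Below is a minimal Python
sketch; the parallel fractions chosen are illustrative assumptions, not
DataCore measurements.

```python
# Amdahl's law: bound on speedup when a fraction p of the work can be
# spread across n cores. The fractions below are illustrative assumptions.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup for a given parallelizable fraction."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

for p in (0.50, 0.90, 0.99):
    print(f"parallel fraction {p:.0%}: " +
          ", ".join(f"{n} cores -> {amdahl_speedup(p, n):.1f}x"
                    for n in (4, 16, 64)))
```

At a 99 percent parallel fraction the bound approaches 100x, the spirit of
the ‘one server does the work of 100’ claim; serialized I/O is precisely
what keeps that fraction, and therefore the payoff from adding cores, low.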


The Impact on Real-time Analytics and Big Data Performance will be Profound

The combination of faster response times and the multiplying effect of
parallelization on productivity will fuel the next step forward in
‘real-time’ analytics, big data and database performance in 2017.
DataCore’s background in parallel processing, real-time I/O and
software-defined storage positions the company to take on this challenge:
a world that requires far more interactions and transactions, at a far
faster pace, with much faster response times.

The ability to do more work by doing it in parallel, and to react quickly,
is the key. DataCore sees itself as helping to drive the step-function
change needed to make real-time analytics and big data performance
practical and affordable. The implications for productivity and for
business decisions based on insights from data, in areas such as finance,
banking, retail, fraud detection, healthcare and genomics, as well as
machine learning and Internet of Things applications, will be profound.
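
The pattern behind ‘doing more work by doing it in parallel’ is ordinary
fan-out and aggregation. Here is a minimal sketch using Python’s standard
multiprocessing pool; the per-chunk statistic is a stand-in for any
analytics reduction and does not come from DataCore’s products.

```python
# Fan-out/aggregate sketch: split a dataset across all cores, compute a
# partial result per chunk, then combine. Illustrative pattern only.
from multiprocessing import Pool, cpu_count

def chunk_stats(chunk):
    """Partial result for one chunk: record count and value sum."""
    return len(chunk), sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))            # stand-in for transaction records
    n = cpu_count()
    chunks = [data[i::n] for i in range(n)]  # one slice of work per core

    with Pool(processes=n) as pool:
        partials = pool.map(chunk_stats, chunks)

    count = sum(c for c, _ in partials)
    total = sum(s for _, s in partials)
    print(f"{count} records, mean value {total / count:.2f}, {n} cores")
```

The same shape, partition, process in parallel, merge, is what lets
response times shrink as cores are added, provided the software splits the
work instead of queuing it behind a single thread.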


The Microsoft Impact Arrives: Azure Stack, Hybrid Cloud, Windows and SQL Server 2016

The success and growth of Microsoft’s Azure cloud is already evident; the
real impact, however, lies in the larger strategy of how Microsoft has
reconciled the worlds of on-premises and cloud computing. Microsoft was
one of the first cloud vendors to recognize that the future is not just
public clouds but a continuing mix of on-premises and cloud. Microsoft’s
Azure Stack continues to advance in making it seamless to get the benefits
of cloud-like computing, whether in the public cloud or within a private
one, and it has become the model for hybrid cloud computing. Likewise,
Microsoft continues to integrate its Windows and server solutions to work
more seamlessly with cloud capabilities.

While Windows and Azure get most of the attention, one of the most
dramatic changes at Microsoft has been how it has reinvented its database
offerings into a true big data and analytics platform for the future. It
is time to take another look at SQL Server 2016: it is far more powerful
and capable, and now handles all types of data. As a platform, it is
primed to work with Microsoft’s large ecosystem of marketplace partners,
including DataCore with its parallel processing innovations, to redefine
what is possible in the enterprise, in the cloud, and in big data
performance and real-time analytics, for traditional business applications
as well as emerging use cases in machine learning, cognitive computing and
the Internet of Things.


Storage has Transformed; It’s Servers + Software-Defined Infrastructure!

We are in the midst of an inevitable and accelerating trend in which
servers are defining what storage is. Escalating this trend, DataCore has
used parallel I/O software to power off-the-shelf multicore servers
driving the world’s fastest storage systems in terms of performance,
latency and price-performance. Traditional storage systems can no longer
keep up; they are in decline and are increasingly being replaced by
commodity servers and software-defined infrastructure solutions that
leverage their power to solve the growing data storage problem. The
storage function and its associated data services are now driven by
software, becoming just another “application workload” running on these
cost-efficient server platforms, and this wave of flexible server-based
storage systems is already having a disruptive impact on the industry.

Marketed as server SANs, virtual SANs, web-scale, scale-out and
hyper-converged systems, these are collections of standard off-the-shelf
servers, flash cards and disk drives; it is the software that truly
differentiates their value. Storage has become a server game. Parallel
processing software, and the ability it provides to leverage multicore
server technology, is the major game-changer. In combination with
software-defined infrastructure, it will lead to a productivity revolution
and further solidify “servers as the new storage.” For additional
information, see the following report:
http://wikibon.com/server-san-readies-for-enterprise-and-cloud-domination/

What’s Beyond Flash?

Remember when flash was the next big thing? Now it is here. So what is the
next step: how do we go faster and do more with less? If flash has arrived
and performance and productivity are still an issue for many enterprise
applications, especially database use cases, then we need to parallelize
the I/O processing. Why? Many compute engines working in parallel can
remove bottlenecks and queuing delays higher up in the stack, near the
application, avoiding as much device-level I/O as possible and driving
performance and response times far beyond any single device-level
optimization that flash/SSD alone can deliver. The power of the ‘many’
far exceeds what only ‘one’ can do: combining flash and parallel I/O lets
users run more applications faster, do more work and open up use cases
that were previously impossible.
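
The multiplier is easiest to see with independent I/O requests: issued
serially, their wait times add up; issued in parallel, the waiting
overlaps. Below is a minimal, self-contained Python sketch of that
contrast. Real parallel I/O software operates far deeper in the stack;
the file names and sizes here are arbitrary.

```python
# Serial vs. parallel reads: overlapping independent I/O requests so
# waiting is paid once, not summed. Illustrative sketch only.
import os, tempfile, time
from concurrent.futures import ThreadPoolExecutor

# Create a few throwaway files so the sketch is self-contained.
tmpdir = tempfile.mkdtemp()
paths = []
for i in range(8):
    p = os.path.join(tmpdir, f"chunk{i}.bin")
    with open(p, "wb") as f:
        f.write(os.urandom(4 * 1024 * 1024))  # 4 MiB each
    paths.append(p)

def read_file(path: str) -> int:
    """Read one file end to end and return its size in bytes."""
    with open(path, "rb") as f:
        return len(f.read())

t0 = time.perf_counter()
serial = [read_file(p) for p in paths]       # each read waits on the last
print(f"serial:   {time.perf_counter() - t0:.3f}s, {sum(serial)} bytes")

t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=len(paths)) as pool:
    parallel = list(pool.map(read_file, paths))  # reads in flight together
print(f"parallel: {time.perf_counter() - t0:.3f}s, {sum(parallel)} bytes")
```

Flash makes each individual request fast; parallelizing the I/O path is
what keeps many fast requests from queuing behind one another.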


Going Beyond Hyper-Convergence: Hyper-Productivity is the Real Objective

As 2017 progresses, hyper-converged software will continue to grow in
popularity, but to cement its success, users will need to be able to take
full advantage of its productivity promise. The power of parallel
processing software will let users exploit everything their hardware and
software can do (see this video from ESG as an example).

Hyper-converged systems today are in essence a server plus a
software-defined infrastructure, but they are often severely restricted in
performance and use cases, and too often lack the flexibility and
integration path needed within the larger IT environment (for instance,
not supporting Fibre Channel, which is often key to enterprise and
database connectivity). Powerful software-defined storage technologies
that do parallel I/O effectively provide a higher level of flexibility and
leverage the power of multicore servers, so fewer nodes are needed to get
the work done, making them more cost-effective. Likewise, the software can
incorporate existing flash and disk storage without creating additional
silos; migrate and manage data across the entire storage infrastructure;
and effectively utilize data stored in the cloud.

Data infrastructures, including hyper-converged systems, can all benefit
from advanced parallel I/O software that dramatically increases their
productivity by tapping the power that lies within standard multicore
servers. While hyper-converged has become the buzzword of the day,
remember that the real objective is to achieve the most productivity at
the lowest cost; better utilization of one’s storage and servers to drive
applications is the key.


The Next Giant Leap Forward – Leveraging the Multiplier Impact of Parallel Processing on Productivity

This combination of powerful software and servers will drive greater
functionality, more automation, and comprehensive services to productively
manage and store data across the entire data infrastructure. It will lead
to a new era where the benefits of multicore parallel processing can be
applied universally. These advances (which are already before us) are key
to solving the problems caused by slow I/O and inadequate response times
that have been responsible for holding back application workload
performance and cost savings from consolidation. The advances in multicore
processing, parallel processing software and software-defined
infrastructure, collectively, are fundamental to achieving the next giant
leap forward in business productivity.
