Gerardo Dada

Software-Defined Storage is Critical for Modernizing the Data Center

The line between ‘traditional’ data centers and the cloud is increasingly blurring; many data centers today primarily consist of virtualized resources in a co-location environment. As part of this continuing evolution, IT will require more flexibility and freedom than ever before.

Software-defined infrastructure has four main characteristics:

  • Pooled resources that eliminate silos, delivered as a service
  • Virtualization: hardware abstraction that is vendor-agnostic
  • Automation: software-based provisioning, control, and reporting via API or management interface
  • Services that enhance capabilities

To modernize the data center effectively, an organization must break down silos, achieve vendor independence, and escape vendor-imposed hardware refresh cycles, all while gaining the freedom and flexibility to meet future challenges. Solving these issues is the promise of software-defined infrastructure, and that promise is real and available today.

Hardware refresh cycles, particularly for storage hardware, represent one of the most challenging aspects for IT. An IT department will typically go through a storage refresh cycle every three to five years, even though in some cases the hardware could serve for a longer period of time.

These requirements are the primary reasons why software-defined infrastructure is quickly becoming the critical foundation of the modern data center. Software-defined storage is extremely flexible and enables new storage hardware and technologies (AFAs, NVMe, containers, or cloud storage) to be added non-disruptively.

When new hardware is released, software-defined storage enables it to be easily integrated into the environment, helping to increase infrastructure agility. There’s no need to rip and replace. Software-defined storage also makes it easier to adopt different types of storage (direct attached, arrays, JBODs, NVMe, etc.) and new configurations like hyperconverged or hybrid-converged.

In many IT environments today, storage teams spend up to 50% of their time migrating data. Once you have adopted software-defined storage, migrations become a thing of the past. Data is moved seamlessly, leveraging machine learning-assisted auto-tiering across storage arrays to dynamically optimize performance and make best use of the performance profile of each array. Capacity planning is simpler, as the system manages capacity at a pool level and allows thin-provisioning to maximize utilization.
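
To make the auto-tiering idea concrete, here is a minimal, hypothetical sketch of how a tiering engine might promote and demote blocks based on access frequency. The class name, tier labels, and thresholds are illustrative assumptions, not any particular product's API, and a real engine would use far richer heuristics than a simple access count.

```python
from collections import defaultdict

# Illustrative only: a toy auto-tiering decision engine.
# Tiers are ordered fastest-first; "heat" is simply the access count
# for each block within the current measurement window.
TIERS = ["nvme", "ssd", "hdd"]


class AutoTierer:
    def __init__(self, hot_threshold=1000, cold_threshold=10):
        self.access_counts = defaultdict(int)  # block_id -> accesses this window
        self.placement = {}                    # block_id -> current tier
        self.hot_threshold = hot_threshold
        self.cold_threshold = cold_threshold

    def record_access(self, block_id):
        self.access_counts[block_id] += 1

    def rebalance(self):
        """Promote hot blocks toward NVMe, demote cold blocks toward HDD."""
        moves = []
        for block_id, count in self.access_counts.items():
            current = self.placement.get(block_id, "hdd")
            if count >= self.hot_threshold and current != "nvme":
                moves.append((block_id, current, "nvme"))
            elif count <= self.cold_threshold and current != "hdd":
                moves.append((block_id, current, "hdd"))
        for block_id, _, destination in moves:
            self.placement[block_id] = destination  # data copy happens in the background
        self.access_counts.clear()  # start a new measurement window
        return moves
```

In practice such a rebalance would run continuously in the background, which is what allows data to move between arrays without a disruptive migration project.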

The modern data center also requires data to be reliable and available all the time. This usually means incorporating storage technologies that support synchronous replication in local and metro clusters, asynchronous replication for disaster recovery, and continuous data protection (which works like a time machine, able to undo any damage from ransomware attacks).
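
The "time machine" behavior of continuous data protection comes from journaling every write with a timestamp, so the volume can be reconstructed as it existed at any prior moment. The sketch below is a simplified assumption of how such a journal might work, not a description of any vendor's implementation.

```python
import time


# Illustrative only: a toy continuous-data-protection journal.
# Every write is appended with a timestamp so the volume can be rolled
# back to any point in time, e.g. just before a ransomware attack began.
class CDPJournal:
    def __init__(self):
        self.entries = []  # append-only list of (timestamp, block_id, data)

    def record_write(self, block_id, data, timestamp=None):
        self.entries.append((timestamp or time.time(), block_id, data))

    def restore(self, point_in_time):
        """Rebuild the block map as it existed at the given timestamp."""
        image = {}
        for ts, block_id, data in self.entries:
            if ts <= point_in_time:
                image[block_id] = data  # the last write before the cutoff wins
        return image
```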

Those who have gone through a disaster or a storage failure understand that the most important aspect of ensuring availability is the recovery process: it should be instantaneous and automatic, require no human intervention, and have zero impact on users and applications. Once the problem has been resolved and the failed storage system is back online, the ideal is a zero-touch failback and rebuild that returns the environment to its original state, ready for a future failure.
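
As a rough illustration of that recovery flow, the following sketch shows a synchronous mirror that fails over automatically and resynchronizes a recovered copy without operator intervention. The class and methods are hypothetical; they only convey the shape of the logic, not an actual product interface.

```python
# Illustrative only: a toy synchronous mirror with automatic failover
# and zero-touch failback. Each copy object is assumed to expose
# read(), write(), and is_healthy() methods.
class MirroredVolume:
    def __init__(self, primary, secondary):
        self.copies = [primary, secondary]
        self.missed = {c: set() for c in self.copies}  # blocks each copy missed while down

    def write(self, block_id, data):
        for copy in self.copies:
            if copy.is_healthy():
                copy.write(block_id, data)       # synchronous mirror: every healthy copy
            else:
                self.missed[copy].add(block_id)  # remember what the failed copy missed

    def read(self, block_id):
        for copy in self.copies:
            if copy.is_healthy():
                return copy.read(block_id)       # failover: first healthy copy serves reads
        raise RuntimeError("no healthy copy available")

    def resync(self):
        """Zero-touch failback: replay missed writes to copies that have recovered."""
        for copy in self.copies:
            if copy.is_healthy() and self.missed[copy]:
                source = next(c for c in self.copies
                              if c is not copy and c.is_healthy())
                for block_id in self.missed[copy]:
                    copy.write(block_id, source.read(block_id))
                self.missed[copy].clear()
```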

As IT departments look to reap the benefits of the software-defined data center, they will increasingly realize the freedom and flexibility advantages of software-defined storage, in addition to its performance and availability benefits. This will help them spend less time on repetitive tasks and extend the technology to cover more of their IT footprint, including additional workloads or data centers.
