White Papers

The white papers below will help you to better understand technical aspects of DataCore solutions. Some provide more detailed descriptions of specific use cases, while others drill down on the underlying technology.

IDC: ExpertROI Spotlight
Healthcare providers worldwide are under tremendous pressure to change because of increased demand and spiraling costs. IDC interviewed SRH Klinikverbund, a German healthcare organization, to assess and measure the impact of its DataCore SANsymphony software-defined storage deployment. SRH has been able to offer significantly higher levels of service to its IT users and patients, realizing productivity gains, infrastructure cost savings, and the elimination of storage system downtime.

A Kusnetzky Group Report
Compact, low-cost, hyper-converged systems built on standard x86 servers using Microsoft Hyper-V hypervisor and DataCore Virtual SAN software are particularly well suited for ROBO, VDI and even latency-sensitive transaction processing databases. Kusnetzky Group Analysts have determined that these scenarios will greatly benefit from equipment consolidation, faster performance, and easier administration, while significantly reducing costs.

By Storage Switzerland
The data center is becoming increasingly dense: more virtual machines are stacked on each virtual host, legacy applications are expected to support more users per server, and more IOPS are expected from the storage infrastructure. While the storage infrastructure now has the right storage media (flash) in place to support this consolidation, the storage software must also be able to harness the available compute power.

The problem is that compute power is now delivered via multiple cores per processor rather than a single faster processor. Storage software built for parallel I/O can take full advantage of this reality and support these dense architectures with a storage infrastructure that is equally dense.

With the introduction of Adaptive Parallel I/O technology, DataCore Software has found a way to gain significant application performance and cost efficiencies from hyper-converged infrastructure appliances. Recent Storage Performance Council SPC-1 Benchmark results confirm the acceleration and latency reduction that DataCore is achieving with its “new” technology – which in fact derives from concepts and algorithms developed for multi-processor computers in the 1970s. Besides delivering significant gains in both non-virtualized database and virtual machine performance, DataCore’s Adaptive Parallel I/O effectively decouples performance from cost, enabling “Tier 0” latency and throughput at “Tier 3” storage prices. Taken together, the improvements delivered by Adaptive Parallel I/O might just be establishing a new “tick-tock” in application performance.
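
To illustrate the underlying idea (a simplified model, not DataCore’s implementation), the Python sketch below contrasts a serial I/O path, where every request queues behind one thread, with a parallel path that fans requests out across all available cores. The handle_io stand-in and its timings are hypothetical.

```python
# Hypothetical illustration of serial vs. parallel I/O scheduling.
# Not DataCore's implementation; it only shows why fanning requests
# out across CPU cores removes the single-queue bottleneck.
import os
import time
from concurrent.futures import ThreadPoolExecutor

def handle_io(request_id: int) -> int:
    """Stand-in for servicing one I/O request (e.g., a 4 KB read)."""
    time.sleep(0.001)  # simulate ~1 ms of device latency
    return request_id

requests = list(range(1000))

# Serial model: a single thread drains the whole queue.
start = time.perf_counter()
for r in requests:
    handle_io(r)
serial_s = time.perf_counter() - start

# Parallel model: one worker per core services requests concurrently.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=os.cpu_count() or 4) as pool:
    list(pool.map(handle_io, requests))
parallel_s = time.perf_counter() - start

print(f"serial: {serial_s:.2f}s  parallel: {parallel_s:.2f}s")
```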

By the Kusnetzky Group
While hyper-converged systems promise to reduce complexity, space and power consumption, current implementations also present challenges in the areas of storage compatibility, integration and management.

In particular, organizations need ways to complement the storage inside hyper-converged systems with that housed in established storage assets, or face recreating costly islands of storage.

DataCore software-defined storage solutions address many of those challenges and shortcomings by integrating hyper-converged systems with SAN and cloud storage. They help enterprises fully realize the combined value of their varied storage assets.

By the Enterprise Strategy Group
In the modern information age, data is a core competitive asset, driving business advantage and impacting every industry. As innovations such as solid-state media, software-defined storage, and multi-core processing arise to better serve that data, archaic storage architectures often hold back its potential. Simply storing and protecting data is not enough. DataCore’s Parallel I/O technology enables software-defined storage to cost-effectively deliver the extreme and adaptable performance required to power the most demanding business-critical databases, virtualization projects, and web-scale environments by harnessing the full power of multi-core servers.

By the Data Management Institute
Unplanned interruption events, aka “disasters,” hit virtually all data centers at one time or another. While the preponderance of annual downtime results from interruptions that have a limited or localized scope of impact, IT planners must also prepare for the possibility of a catastrophic event with a broader geographical footprint. 

Such disasters cannot be circumvented simply by using high-availability configurations in servers or storage. What is needed, especially for mission-critical applications and databases, are strategies that help organizations prevail in the wake of “big footprint” disasters, yet can also be applied in scaled-down form to interruption events with a narrower impact profile.

DataCore Software’s storage platform provides several capabilities for data protection and disaster recovery that are well-suited to today’s most mission-critical databases and applications.

A Kusnetzky Group Report
Distributed enterprises with numerous remote offices and branch offices (ROBO) face unique needs for reliability, simplicity and cost-effectiveness. Organizations in industries such as retail, hospitality, manufacturing, healthcare and financial services frequently cite these challenges.

ROBO sites often have little available floor space in which to deploy IT systems. The limited staff charged with managing and maintaining these environments cannot possibly specialize in line-of-business applications, operating systems, databases, networks and storage technology. Nor do budgets allow for traditional three-tier architectures in which compute, networking and storage elements, and responsibility for each, are cleanly split. Needless to say, these satellite locations operate under very tight IT budgets.

With the emergence of hyper-converged solutions, such as that offered by DataCore, it is now possible to address these requirements, to "right size" ROBO IT, and to create independent, highly autonomous "IT cells". Application access can be delivered where it is needed, while critical data assets can be maintained wherever it is most convenient and cost-effective, using the best technology, whether onsite, back at headquarters or out in the cloud.

By the Kusnetzky Group
Enterprises are now facing pressure to optimize the use of their IT resources and to reduce their overall IT investment. They are addressing this requirement by deploying virtualization technology to consolidate workloads onto a smaller number of systems. They are also using storage virtualization technology that allows them to use all of their different types of storage as a unified pool.

They expect this technology to overcome the storage silos they created at an earlier time and allow them to find and use under-utilized storage resources rather than having to get out their checkbook and purchase more storage every time an application needs more resources.

Enterprises that have adopted VMware-based virtual environments want the benefits that VMware's VVols offer. They also want to make it possible for vSphere administrators to be in charge of their own storage resources, eliminating the back-and-forth between virtualization and storage administrators.

Technology from DataCore can address these business requirements today.

A Data Management Institute Report
When hypervisor-based computing was pressed forward as an application hosting option in the early 2000s, a large part of the value proposition had to do with the benefits of server consolidation that would be realized from server virtualization. Leading analysts claimed that the consolidation value would be so significant that consumers would see a sharp decline in CAPEX spending that would more than compensate for the cost of hypervisor software and services. Plus, with the reduction in physical hardware kit and the resulting reduction in floor space and environmental costs, IT shops would also realize OPEX advantages: fewer servers translating to fewer server administrators, a smaller IT staff and reduced labor costs.

For many firms, this was a compelling value case. Only, it didn’t materialize as planned. One obstacle to realizing consolidation goals has been the perpetuation of legacy storage arrays and storage area networks (SANs). With the continuing evolution of technologies for deploying converged and hyper-converged infrastructure (HCI), this obstacle might be surmounted in many cases, enabling “hyper-consolidation” along the way. That’s where DataCore Software technologies can play a big role.

By the Data Management Institute
For cloud technology to move from merely interesting to fully operational in larger data centers, there must be no compromise in the performance and availability organizations enjoy today. Cloud must deliver greater agility to minimize cost and maximize responsiveness to business needs, thereby improving business productivity.

Storage is the main obstacle to realizing cloud objectives. Organizations must rationalize the storage behind different application platforms with different requirements at different physical locations, and seamlessly blend on-premises storage with public cloud services for maximum value to the organization.

Of the solutions available in the market today, only DataCore Software provides robust support for the diverse storage requirements of the application and database workloads organizations generate today. It is worth a look.

Building High-Performance Databases
By the Data Management Institute

Databases drive 60% of the applications in most companies, so database performance is extremely important to an organization's ability to process transactions as well as analyze data. Recently, many technology vendors have positioned all-flash products or other storage wares as “the right infrastructure” for databases and database-dependent applications. However, companies purchasing and deploying these high-performance servers and storage kits still find their databases underperforming.

By Augie Gonzalez, Virtualization Expert at DataCore Software
In this paper, we will discuss DataCore’s underlying parallel architecture, how it evolved over the years and how it results in a markedly different way to address the craving for IOPS (input/output operations per second) in a software-defined world.

By George Teixeira, CEO of DataCore Software
The major bottleneck holding back the industry is I/O performance. Current systems still rely on device-level optimizations tied to specific disk and flash technologies because they lack software optimizations that can fully harness the latest advances in server technology, such as multi-core architectures.

By the Data Management Institute (DMI)
This paper examines storage cost of ownership and seeks to identify ways to bend the cost curve without shortchanging applications and their data of the performance, capacity, availability, and other services they require.

By the Data Management Institute (DMI)
This white paper outlines best practices for improving overall business application availability by building a highly available data infrastructure.

The survey was conducted in April 2015
For the fifth consecutive year, DataCore Software explored the impact of Software-Defined Storage (SDS) on organizations across the globe. The 2015 survey distills the expectations and experiences of 477 IT professionals who are currently using or evaluating SDS technology to solve critical data storage challenges. The results yield surprising insights from a cross-section of industries over a wide range of workloads.

Solving the problems of traditional shared storage
One of the fundamental requirements for virtualizing applications is shared storage. Typically, shared storage is delivered over a storage area network (SAN). However, SANs often run into issues in virtual environments, so organizations are looking for new options. Hyper-converged infrastructure is a solution well suited to address these issues.

By Storage Insider
This paper highlights how a software-defined storage solution can help you manage all storage resources centrally and automatically, irrespective of hardware manufacturer.

By Storage Insider
This paper highlights how a software-defined storage solution can help you optimize your various existing storage devices to remove I/O bottlenecks, meet your application performance requirements, and enable your infrastructure to operate at peak performance levels.

By Storage Insider
This paper highlights practical considerations on how the integrated features of this software-defined storage solution can help you guarantee high availability and overcome the threat of data loss for your business.

SDS Safeguards, Stores, and Accelerates Data
Software-defined Storage is a proven solution to the serious challenges experienced in the healthcare industry: data volumes projected to grow by 75% per year, legal requirements to protect data, and the reality that hospitals cannot afford to experience downtime. Together, these challenges have led to an enormous rise in costs and serious logistical IT problems.

A Perspective for Buyers by Data Management Institute
Learn about hyper-converged storage, Virtual SANs and how savvy firms choose the best hyper-converged solution for their business.

The Gold Standard for Software-Defined Storage

SANsymphony-V storage virtualization software by DataCore addresses the biggest problem in modern IT infrastructure: an outmoded, broken, hardware-centric approach to storage.

Learn more: SSG-NOW Snapshot Report

A Software-defined Storage Solution
How will you stay ahead of data growth and the demands it places on your IT infrastructure? Learn how you can maintain uninterrupted service, scale with increasing capacity demands, deploy high-performance architectures, and reduce costs with SANsymphony-V 10.

A Deep Dive into Converged Storage
DataCore Virtual SAN introduces the next evolution in Software-defined Storage (SDS) by creating high-performance and highly-available shared storage pools using the disks and flash storage in your servers. It addresses the requirements for fast and reliable access to storage across a cluster of servers at remote sites as well as in high-performance applications.

Enterprise Strategy Group Storage Systems Brief

DataCore recently launched the newest version of its storage virtualization software, SANsymphony-V R9, expanding its capabilities to become a “Storage Hypervisor for the Cloud.” With increased flexibility, scalability, and management automation, R9 is ideally suited for midsize and enterprise IT organizations alike.

Learn more: ESG: The Storage Hypervisor for the Cloud

By NT4ADMINS Magazine

Greater scalability, improved administration functions and close integration with vSphere environments and system management suites are the core characteristics of Release 9 of the SANsymphony™-V storage hypervisor. Above all, the 'group operations' make life easier for the administrator.

Download: SANsymphony-V R9.0 Product Review

An IDC Whitepaper
IDC assesses DataCore in the SDS space. IDC believes that SDS is going to become the preferred architecture for storage environments in the near future, and DataCore is well positioned to help customers make the transition rapidly and smoothly.

By George Teixeira, DataCore President & CEO 
There are plenty of reasons to adopt a software-defined storage platform... Here we provide you with the 10 most compelling reasons to adopt DataCore Storage Virtualization.

DataCore Market Research conducted in March 2014
DataCore Software conducted a survey of global IT professionals to identify the current storage challenges organizations are facing and the forces driving demand for software-defined storage. This year’s State of Software-Defined Storage report shows that these organizations look to SDS both to simplify management of their diverse storage devices and to future-proof their infrastructure.

Clearing the Performance Hurdle
This paper covers our solution for circumventing the physical constraints of storage devices while making the best use of their capacity and connectivity. The solution has been shown to affordably overcome the storage-related bottlenecks encountered when virtualizing Tier 1 apps.

By DataCore CEO George Teixeira
What does “software-defined” really mean? This paper explores the true definition and cuts through the misconceptions and the claims of vendors that deliver hardware but talk “software-defined.”

Storage Strategies NOW Snapshot Report
The virtualization and consolidation of business-critical applications is a high priority for IT operations in organizations of all sizes. Explore solutions to virtualize business-critical applications with this Storage Strategies NOW Snapshot Report.

Keeping it Light and Inexpensive
In the past two to three years, the storage side of private clouds has been weighed down by heavy iron, big spending and a lot of complexity. This white paper acquaints you with recent software developments and practices that reshape storage into a far more appealing and inexpensive component of your next-generation IT environment.

Intelligent tradeoffs between cost and performance
The science of automated storage tiering distills down to monitoring I/O behavior, determining frequency of use, and then dynamically moving blocks of information to the most suitable class or tier of storage device. DataCore™ SANsymphony™-V software manages this placement automatically to make the best use of your storage.
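
To make that monitor-classify-migrate loop concrete, here is a minimal Python sketch of frequency-based tiering. It is an illustration only, not SANsymphony™-V’s actual algorithm; the tier names and heat thresholds are hypothetical.

```python
# Simplified sketch of frequency-based auto-tiering: count accesses per
# block, then place the hottest blocks on the fastest tier. Hypothetical
# tier names and thresholds; not DataCore's actual algorithm.
from collections import Counter

access_counts: Counter[int] = Counter()   # block id -> access count

def record_io(block: int) -> None:
    """Called on every read/write to track frequency of use."""
    access_counts[block] += 1

def rebalance(placement: dict[int, str], hot: int = 100, warm: int = 10) -> None:
    """Periodically move each block to the tier matching its heat."""
    for block, count in access_counts.items():
        tier = "flash" if count >= hot else "sas" if count >= warm else "sata"
        if placement.get(block) != tier:
            placement[block] = tier   # stand-in for a background block move
    access_counts.clear()             # age out old statistics

placement: dict[int, str] = {}
for _ in range(500):
    record_io(42)                     # block 42 is hot
record_io(7)                          # block 7 is cold
rebalance(placement)
print(placement)                      # {42: 'flash', 7: 'sata'}
```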

IDC Whitepaper
Are you struggling with how to choose the right storage virtualization solution, or just looking to achieve a scalable software-based storage virtualization solution that fits your budget?

SANsymphony-V Storage Virtualization Software: Music to the Ears of Server Virtualization Users Stalled by Storage Challenges
With server virtualization ubiquity as a foregone conclusion, the next logical areas to investigate are those which hinder its progress. Storage, at least storage as it has been done for decades, is a significant obstacle that is slowing or stalling the successful and optimal advance of IT.

Sell Compelling Benefits and Business Value
How do you get CIOs to jump on the storage virtualization bandwagon if they’re not on it already? Use these five compelling points to persuade them that storage virtualization is right for their organization.

A $32 hardware cost per virtual desktop
This number is achieved using a configuration with dual-node, cross-mirrored, high-availability storage. Compared to previously published reports, which put the storage infrastructure costs alone of VDI at fifty to several hundred dollars per virtual machine, the significance of the data becomes self-evident.

Storage Virtualization: An Insider's Guide, Part 1
Written by Jon Toigo, Chairman of The Data Management Institute LLC. This paper introduces storage virtualization and surveys the challenges that it can help to solve in the critical areas of engineering, operational and financial efficiency.

Storage Virtualization: An Insider's Guide, Part 2
This paper examines the role of storage virtualization as an enabler of effective capacity management from an architectural, operational and financial standpoint. It also examines features of storage virtualization such as dynamic storage pooling, thin provisioning and more.

Storage Virtualization: An Insider's Guide, Part 3
Storage needs to be nimble, delivering not only capacity sufficient to meet the growing volume of data but also the speed to write data as rapidly as it is presented and to retrieve it immediately upon request from applications and decision-makers. Storage virtualization has the potential to break speed barriers without breaking the storage budget.

Storage Virtualization: An Insider's Guide, Part 4
A confluence of three trends is making disaster preparedness and data protection more important than ever before. These trends include the increased use of server and desktop virtualization, growing legal and regulatory mandates around data governance, privacy and preservation, and increased dependency on automation in a challenging business environment as a means to make fewer staff more productive.

Storage Virtualization: An Insider's Guide, Part 5
IT management is being challenged to demonstrate the business value of storage investments. They need to find ways to demonstrate that they are containing the costs of storage, while at the same time minimizing risk and improving productivity. The good news is that storage virtualization can help achieve these goals.

Storage Virtualization: An Insider's Guide, Part 6
IT administrators want storage to be more manageable and predictable in terms of cost and performance – and even power consumption – while delivering measurable efficiency improvements in every category that matters to the business they serve. Ultimately, they want to be technology leaders who deliver improved service levels while cutting the costs of operation.

Storage Virtualization Finds Its Mojo in the Server Consolidation Market
DataCore has made it easy for either an IT department or an IT integration partner to set up a SAN that can serve physical or virtual machines (VMs). What’s more, the software can run within a physical environment, such as a blade server, or in many of the top VM environments, including VMware, Xen, Microsoft and others.

Synchronously Mirrored Across Metropolitan Hot Sites
In this paper you will learn how DataCore Software solves a longstanding stumbling block for clustered systems spread across metropolitan sites by providing uninterrupted access to the Cluster Shared Volume (CSV) despite the many technical and environmental conditions that conspire to disrupt it.

An IDC Viewpoint Paper
Virtualization is among the technologies that have become increasingly attractive in the current economic climate. Organizations are implementing virtualization solutions to focus on efficiency and cost reduction, simplify management and maintenance, and improve availability and disaster recovery.

Internet Research Group Paper
In this brief we’ll review what’s going on with the “business continuity journey” every IT organization faces, and discuss how device-independent storage virtualization software not only makes business continuity more affordable, but offers other cost savings and can have performance advantages as well.

Compelling Storage Virtualization Software
This ESG Lab Validation documents the results of hands-on testing of DataCore SANsymphony-V R8 Storage Virtualization Software. The goal was to assess the value of infrastructure-wide storage virtualization, including centralized management, ease of use, capacity efficiency, data mobility, and performance.

Migrating VHD LUNs between Storage Devices without Downtime
DataCore™ SANsymphony™-V software migrates VHD LUNs non-disruptively behind the scenes and later reclaims the disk space from the decommissioned location. Transparent virtual LUN migration is one of several device-independent functions provided by SANsymphony™-V for the Microsoft Windows Server 2008 R2 with Hyper-V platform.
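
As a rough illustration of how such non-disruptive migration can work in general (a generic live-migration pattern, not SANsymphony™-V’s actual implementation; all names are hypothetical), the sketch below copies blocks in the background, tracks blocks dirtied by ongoing host writes, re-copies them, and then reclaims the source.

```python
# Sketch of non-disruptive block migration: copy blocks in the background
# while tracking writes that dirty already-copied blocks, re-copy those,
# then reclaim the source. A generic live-migration pattern with
# hypothetical names; not SANsymphony-V's implementation.

def migrate(src: dict[int, bytes], dst: dict[int, bytes],
            incoming_writes: list[tuple[int, bytes]]) -> None:
    dirty: set[int] = set()

    def host_write(lba: int, data: bytes) -> None:
        src[lba] = data          # host I/O keeps landing on the source...
        dirty.add(lba)           # ...and marks the block for re-copy

    for lba in list(src):        # bulk background copy
        dst[lba] = src[lba]
        if incoming_writes:      # host I/O continues during the copy
            host_write(*incoming_writes.pop(0))

    while dirty:                 # converge: re-copy blocks dirtied meanwhile
        dst[lba := dirty.pop()] = src[lba]

    src.clear()                  # switch-over done; reclaim the old location

source = {0: b"a", 1: b"b", 2: b"c"}
target: dict[int, bytes] = {}
migrate(source, target, incoming_writes=[(0, b"A")])
print(target)   # {0: b'A', 1: b'b', 2: b'c'}; source space reclaimed
```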

Eliminate Storage-related Downtime, Not Just Hardware Failures
If you are serious about high availability, look for storage virtualization software that can stretch mirroring across two locations, potentially several kilometers apart, while still presenting the mirrored virtual disks to applications as if they were single, high-performance, multi-ported drives.
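
The following minimal sketch, with hypothetical names, shows the essence of such a synchronous mirror: a write is issued to both locations and acknowledged only after both copies land, so applications see one logical drive while either site can fail without data loss.

```python
# Minimal sketch of a synchronous mirrored write: the write is applied to
# both locations and acknowledged only after BOTH copies succeed.
# Hypothetical names; not DataCore's code.
from concurrent.futures import ThreadPoolExecutor

class Replica:
    def __init__(self, name: str):
        self.name = name
        self.blocks: dict[int, bytes] = {}

    def write(self, lba: int, data: bytes) -> bool:
        self.blocks[lba] = data        # stand-in for a real disk write
        return True

def mirrored_write(primary: Replica, secondary: Replica,
                   lba: int, data: bytes) -> bool:
    """Issue the write to both sites in parallel; ack only if both land."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        results = pool.map(lambda r: r.write(lba, data), (primary, secondary))
    return all(results)

site_a, site_b = Replica("site-a"), Replica("site-b")
assert mirrored_write(site_a, site_b, lba=0, data=b"payload")
assert site_a.blocks[0] == site_b.blocks[0]   # both copies identical
```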


Why Windows?

Curious why we didn’t choose some open source kernel or Linux-derivative to host our storage hypervisor? Learn the technical and commercial reasons that drove DataCore to select Windows as the underlying operating system for its SANsymphony™-V product line. It’s all explained here.

Ready to see how DataCore can take your enterprise IT to the next level?
Request a personalized demo today!

 