IT teams are often constrained by tight budgets. Especially during the current market recession, many IT organizations are dealing with budget cuts and prioritizing their spend. Infrastructure expansion projects and hardware refresh initiatives are being put on hold. With the ever-increasing requirement for more storage capacity driven by data growth, how can IT meet this demand on a budget?
Watch this short 30-minute recorded webinar from DataCore and learn how to optimize your existing storage resources and get the most out of current infrastructure investments. Find ways to maximize capacity without straining your IT budget with new storage equipment purchases. Lower costs and improve ROI.
Vinod Mohan: Greetings from DataCore Software. Welcome to this short webinar. My name is Vinod Mohan and I’m a Senior Product Marketing Manager at DataCore; I will be your presenter today. Today’s topic is how to reduce data storage costs and save on IT expenses. In the wake of the ongoing Coronavirus pandemic and the consequent market recession, let us discuss how IT budgets are getting impacted, and how IT teams can manage and foster their data storage requirements without having to spend a lot of money on procuring expensive storage hardware. On to IT challenges during a market recession: budget cuts are something IT teams deal with quite often. It could be because of a slowdown in the company’s revenue, perhaps a bad quarter.
Or a failed IT project that led to tighter budgets and scrutiny of new investments. But it could also be a global recession, like the one we are facing now, impacting the overall market. In the present scenario of the Coronavirus pandemic, economists are predicting a potential recession. Amidst this uncertainty, organizations have already started cutting down their budgets. Speaking to our customers, we found that many IT infrastructure expansion projects and hardware refresh initiatives are being stalled. IT teams are prioritizing their spend on what’s absolutely essential, such as remote connectivity and troubleshooting tools, security tools, VPN gateways, etcetera.
And with most employees working from home, they’re analyzing what spend can be put on the back burner for now. When data continues to grow and storage demand continues to surge, it is a challenge for IT teams to manage capacity on a budget. IT teams must look for smarter, cost-effective ways to optimize existing storage resources to support business demand without breaking the bank with new storage hardware purchases, at least for the time being. With smaller budgets, IT teams will certainly be under pressure to accomplish more with fewer resources. IT also has to prioritize between what’s essential and what’s not, and choose its spend wisely.
Whether you and your employees are working from home or not, as long as the business is operational, data continues to grow and data storage needs continue to grow along with it. There are more files being shared, more meetings being recorded and stored, and increased use of virtual desktops may increase storage requirements too. Even if your organization has temporarily adopted remote work due to the COVID-19 crisis, being on the IT team you need to make sure to address capacity issues so that business applications are not impacted. One of the most significant impacts of the Coronavirus pandemic is being felt on the global supply chain, where production has slowed and suppliers are not able to fulfill commitments on delivering storage hardware.
We are hearing of disk shortages from our customers and partners worldwide due to manufacturing slowdowns. Even the large storage manufacturers such as Dell, HP, etcetera are facing this impact. So when you’re not able to get your expected storage hardware delivered on time, you’ll look for a cost-effective alternative. Given all these challenges, many IT organizations, as we discussed, are pushing out their storage expansion projects and hardware refresh initiatives for the moment. Therefore the best course of action for now, for IT teams, is to optimize their existing storage as much as possible to address capacity demands without immediately resorting to new hardware purchases; there will always be unused capacity in your storage devices.
Let’s now dive into three best practices to optimize your existing infrastructure, which will give you the chance to defer new purchases for some more time. The first best practice is to pool your entire storage infrastructure in a centralized manner so you get centralized control of all available and allocated capacity. The second is to automatically place data on the most appropriate storage to ensure that your fastest storage doesn’t get used up too soon with warm or cold data. The third and last one is to gain independence from storage hardware vendor lock-in so you can leverage attractively priced alternatives for buying storage instead of the expensive ones thrown at you by your primary vendor.
In all three best practices, we’ll see how DataCore’s software-defined storage solutions can help you achieve capacity optimization. Best practice number one: pool your entire storage. Let’s see how to do this. Over years of adding, removing and changing storage, many organizations run into storage silos where storage devices are tied to specific application workloads. Every storage block becomes its own island, leading to a situation where your primary and most expensive storage gets filled up very quickly. If you can assimilate all your disparate and disaggregated storage into a virtual storage pool, you can gain centralized visibility and control across your entire storage infrastructure.
When you have islands of storage, you often have unused capacity in different devices that you wish you could reclaim and use. When you pool that storage together into a logical layer abstracted away from the hardware layer, it becomes easy to reclaim unused storage from anywhere. Once you have all storage available from a central storage pool, you can easily provision volumes of storage capacity and allocate them to specific workloads. Let’s take the example of an IT environment with different applications connected to disparate storage resources: a set of workloads running on virtual machines is connected to a storage cluster in the backend. A standalone application server is connected to some [unintelligible 00:05:52]. A premium storage device is serving up a set of mission-critical workloads, again hosted on a virtual environment.
And we have a database connected to flash storage and a standard HDD RAID. Now we see that each of the storage devices is at a different level of capacity utilization: some at full capacity, some near the capacity limit, and some still having a lot of free storage space. Typically, if one of your storage devices has reached full capacity, you would add additional hardware to expand the capacity serving its corresponding application. Instead, you can use DataCore’s software-defined storage solutions to pool your storage into a central, logically partitionable entity. You could have in your data center solid-state disks or NVMe flash arrays, traditional HDD-based SAN arrays, or even JBODs.
Any storage array, whether it’s direct-attached or an externally connected SAN, over Fibre Channel or iSCSI, can easily be pooled using DataCore. This example highlighted storage pooling in a block storage environment. If you have a file and object storage setup in your data center and would like to aggregate storage capacity across your NAS devices, file servers and object stores, both on premises and in the cloud, DataCore can also help you with that. After all capacity is pooled, you can avoid siloed islands of storage filling up quickly, and instead balance capacity uniformly across all your storage hardware.
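To make the pooling idea concrete, here is a minimal conceptual sketch in Python. It is not DataCore’s implementation; the class, device and volume names are hypothetical. It shows capacity from disparate devices being aggregated into one pool, with volumes provisioned against the total pooled capacity and allocations spread so that no single device fills up before the rest:

```python
# Conceptual sketch only -- not DataCore's implementation. Device and volume
# names are hypothetical; capacities are in GB.

class StoragePool:
    """Aggregates disparate storage devices into one logical pool."""

    def __init__(self):
        self.devices = {}  # device name -> {"capacity": GB, "used": GB}
        self.volumes = {}  # volume name -> provisioned size in GB

    def add_device(self, name, capacity_gb, used_gb=0.0):
        self.devices[name] = {"capacity": capacity_gb, "used": used_gb}

    def free_capacity(self):
        # Unused space on any device becomes reclaimable pool capacity.
        return sum(d["capacity"] - d["used"] for d in self.devices.values())

    def provision_volume(self, name, size_gb):
        free = self.free_capacity()
        if size_gb > free:
            raise ValueError("insufficient pooled capacity")
        # Spread the allocation proportionally to each device's free space,
        # so capacity is consumed uniformly instead of filling one silo.
        for d in self.devices.values():
            d["used"] += size_gb * (d["capacity"] - d["used"]) / free
        self.volumes[name] = size_gb

# Usage: a nearly full flash array and a mostly empty HDD SAN, pooled.
pool = StoragePool()
pool.add_device("flash-array", 100, used_gb=80)
pool.add_device("hdd-san", 100, used_gb=20)
pool.provision_volume("vm-datastore", 50)
```

Notice that the 50 GB volume lands mostly on the device with the most free space, which is exactly the balancing behavior described above.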
This helps you avoid buying new storage and instead use existing capacity to its fullest. The end result is significant cost savings for your budget-constrained IT department. The second best practice is to automatically place your data on the most appropriate storage. A big challenge that storage administrators face is controlling what data gets stored on what storage equipment. There may be data coming from a mission-critical database that you want to store on the fastest flash available, and there may be a lot of archival data that is not accessed much, which you don’t want to put on your primary storage but on secondary commodity hardware.
What would be helpful is an intelligent medium that learns data access patterns based on how frequently data is being accessed by the application. DataCore’s software-defined storage solutions have this built-in machine learning capability to distinguish between hot, warm and cold data and move it to the appropriate storage tiers as defined by the storage administrator. This ensures only your mission-critical workloads get to use your fastest storage, and it avoids using up space on your premium storage devices for non-critical data. On this slide you can see various storage devices connected to different data sources. This is the state of isolation before we bring DataCore into the IT environment.
After introducing DataCore to the environment, as we already saw, all storage is aggregated into a storage pool with logical storage classification. This means there needs to be no tightly coupled connection between the application and the backend storage; there’s just a logical connection which can change over time, giving the advantage of moving data between storage tiers for cost and performance needs. DataCore will automatically move data between storage tiers based on data access patterns, and the application on the frontend never needs to know which storage device is being used for its data storage requirement. DataCore uses machine learning to understand data access temperatures.
Based on custom-defined storage tiers set by the IT administrator, DataCore automatically moves hot data to premium storage, warm data to secondary storage, and cold data to low-cost commodity storage and even to the cloud. Data promotion is also possible: if cold data stored on a commodity device is suddenly accessed frequently, DataCore will recognize this and move it back to the storage tier where hot data is stored. Because all this is automated and happens independent of the application’s knowledge, there’s no impact on application performance. You can customize and define which tiers of storage you want to be used according to data temperatures, and DataCore will automatically read data access patterns and place data accordingly on the respective tier.
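The temperature-based placement described above can be sketched as follows. This is illustrative only, not DataCore’s actual machine-learning algorithm: the tier names and the fixed access-rate thresholds are hypothetical, and a real system would learn access patterns over time rather than use static cutoffs:

```python
# Illustrative sketch only -- not DataCore's actual algorithm. Tier names
# and access-rate thresholds are hypothetical tunables.

TIERS = {"hot": "flash", "warm": "secondary", "cold": "commodity/cloud"}

def temperature(accesses_per_day, hot_threshold=100, warm_threshold=10):
    """Classify data by how frequently it is being accessed."""
    if accesses_per_day >= hot_threshold:
        return "hot"
    if accesses_per_day >= warm_threshold:
        return "warm"
    return "cold"

def place(blocks):
    """Map each block of data to its target tier. A cold block that turns
    hot gets 'promoted' back to flash by the same rule on the next pass."""
    return {block_id: TIERS[temperature(rate)]
            for block_id, rate in blocks.items()}

# Usage: a busy database index, occasionally read reports, untouched archives.
placement = place({"db-index": 500, "reports": 25, "archive-2018": 0})
```

Because placement is recomputed from observed access rates, the mission-critical data naturally ends up on flash while archives drift down to commodity storage or the cloud.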
Also, because there is controlled movement and placement of data to the appropriate storage equipment, I/O bottlenecks and storage hotspots can be avoided and high performance of the storage tier can be ensured. This automatic data placement capability allows you to save space on your premium storage as much as possible, instead of throwing more hardware at the problem whenever your premium storage is at full capacity. For unstructured data stored on file and object storage, DataCore’s software-defined storage solutions use custom-defined policies to move data between storage tiers. Based on the access frequency of a file, the age or type of the file, or other criteria, data can be moved to the appropriate storage equipment, including the cloud.
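A policy of the kind described for file and object data might look like the sketch below. The policy fields, rule order, and tier names here are hypothetical, not DataCore’s actual policy schema; the point is simply that custom rules on age, file type, and similar criteria route each file to a target tier:

```python
# Illustrative sketch only -- the policy fields and tier names are
# hypothetical, not DataCore's actual policy schema.

POLICIES = [
    # (predicate, target tier) -- evaluated in order, first match wins.
    (lambda f: f["age_days"] > 365, "cloud-object-storage"),
    (lambda f: f["ext"] in {".log", ".bak"}, "commodity-nas"),
    (lambda f: f["age_days"] > 30, "secondary-nas"),
]

def target_tier(file_info, default="primary-nas"):
    """Route a file to a storage tier based on custom rules (age, type, etc.)."""
    for predicate, tier in POLICIES:
        if predicate(file_info):
            return tier
    return default
```

Under these example rules, a year-old PDF is sent to cloud object storage while a freshly created spreadsheet stays on primary NAS.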
In the case of the cloud, you can easily move backup and archival data from your data center to low-cost S3-based cloud storage. The end result is significant cost savings, again, for the IT team. Best practice number three: gain independence from hardware vendor lock-in. Many organizations end up buying most of their storage equipment from a single manufacturer; this is definitely not a bad thing. But when you are on a shoestring budget and your supplier is a high-end manufacturer, you will have to shell out a lot of money to expand your storage capacity. Buying hardware from the same vendor also leads to a vendor lock-in scenario: your vendor may increase the price, or you may encounter delivery delays or even support issues.
A specific model which you have bought and want to expand may have reached its end-of-life or end-of-manufacturing date and is not available anymore. IT teams must have the flexibility to choose the vendor, storage type and model of their choice based on their requirements and budgets. The existing hardware make should not dictate what new gear to purchase; you need to be able to shop around for the best value.
You need the flexibility of choice and the capability to manage any heterogeneous storage environment that you choose to put together. You need to be able to add any new storage easily, and if needed, replace old storage with new storage of your choice non-disruptively. Data, as we know, is growing exponentially. In most enterprises there is more inactive data stored than active data. Inactive data is typically data that is not very frequently accessed, and it occupies about 80% or so of storage. As we saw earlier, with auto-tiering and auto-placement of your data, you can leverage your secondary storage and even the cloud for storing inactive data, which will significantly add to your data storage cost savings.
You may have storage resources from more than one storage vendor, different models of storage devices from the same vendor, or even different proprietary extensions or replacements to your storage hardware. The flexibility of adding new storage of your choice is always a good thing, but adding new storage, whether from a different manufacturer or a different model than what you currently have, is not always an easy task. Also, when new storage from a different vendor is brought in to replace outdated or expensive equipment, you may run into challenges with data migration. Data migration usually consumes a lot of time, money and effort, which translates into overhead cost.
Production could be impacted until data migration is complete, leading to productivity losses and associated costs. Let’s now see how DataCore addresses these challenges. When low-cost storage is brought in to replace an existing storage array, it can easily be done with DataCore, as all storage is now centrally pooled into a logically partitionable cluster. New storage capacity can easily be added to the storage pool. Without having to stop application activity, data migration can happen transparently and non-disruptively in the background. DataCore software-defined storage solutions enable seamless data migration from one target storage device to another, allowing IT teams to cut over to a new storage device without storage downtime or production impact.
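The general idea of migrating in the background while applications keep running can be sketched as below. This is a minimal model under simplifying assumptions, not DataCore’s actual migration mechanism: devices are modeled as plain dicts of block-to-data, writes during migration are mirrored to both devices, and a background task copies the remaining blocks a few at a time:

```python
# Minimal sketch under simplifying assumptions -- not DataCore's actual
# mechanism. Devices are modeled as dicts of block id -> data.

class MigratingVolume:
    """A virtual volume that keeps serving I/O while its blocks are copied
    from an old device to a new one in the background."""

    def __init__(self, old_dev, new_dev):
        self.old, self.new = old_dev, new_dev
        self.migrated = set()

    def write(self, block, data):
        # Mirror writes to both devices so the new one never goes stale.
        self.old[block] = data
        self.new[block] = data
        self.migrated.add(block)

    def read(self, block):
        # Read from wherever the current copy of the block lives.
        return (self.new if block in self.migrated else self.old)[block]

    def migrate_step(self, n=1):
        # Background task: copy up to n not-yet-migrated blocks per step.
        for b in [b for b in self.old if b not in self.migrated][:n]:
            self.new[b] = self.old[b]
            self.migrated.add(b)
        return self.done()

    def done(self):
        return set(self.old) <= self.migrated

# Usage: application writes keep flowing while migration runs.
vol = MigratingVolume({"a": 1, "b": 2}, {})
vol.write("c", 3)
while not vol.migrate_step():
    pass
```

Once `done()` reports true, all reads are served from the new device and the old one can be decommissioned, which mirrors the no-downtime cutover described above.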
Likewise, older storage can be smoothly decommissioned out of the storage pool after data migration is complete. This seamless data migration capability allows you to add any storage of your choice to your storage infrastructure; your applications need not be tied to any one specific storage type or vendor. The end result: you gain complete vendor independence and scale capacity cost-effectively. Let’s now quickly go over some of the key capabilities of DataCore software-defined storage solutions. As we saw earlier, DataCore allows you to pool all your storage capacity into a logical cluster and centralize control of your primary and secondary storage from one place, regardless of the make or model of the storage devices in your infrastructure.
You can use DataCore to manage storage capacity assignment and optimization. Because you have complete visibility and control of all your storage capacity, you can thin-provision resources and increase allocation on demand; this helps you avoid over-provisioning of storage and capacity getting allocated but remaining unused. You can consolidate mixed protocols, hardware vendors and models under the hood, without the application having to know which storage equipment is provisioning its resources. You have full control to move data to any storage device and manage a completely heterogeneous storage environment that you choose to have.
No more vendor lock-in. You can automate data placement on the appropriate storage tiers based on how hot, warm or cold the data access is; this allows you to ensure your fastest premium storage device doesn’t get used up with inactive and non-critical data. And because DataCore allows you to add any storage to the DataCore-managed storage pool, you can easily expand your storage capacity by adding new storage devices of your choice. Migrate data to a new storage device you just added and refresh old hardware non-disruptively, with no impact on your production and no storage downtime.
Load balancing capacity between unlike storage systems also becomes easy: just choose any device you want and balance capacity uniformly so that no single storage device gets filled up before the rest and available capacity is uniformly distributed across all storage hardware. DataCore offers two best-in-class software-defined storage solutions to cater to your structured and unstructured data management needs. For structured data stored on block storage, DataCore offers SANsymphony, a purpose-built solution that helps virtualize your storage hardware across externally attached SAN devices, all-flash arrays and direct-attached storage.
SANsymphony is a proven block-based SDS solution that has been on the market for over 20 years and has been implemented across diverse customer environments around the world. Some noteworthy features of SANsymphony include automated data tiering, synchronous mirroring for high availability, asynchronous replication for disaster recovery, snapshots, and point-in-time data rollback for data protection, among many more. For unstructured data stored on file and object storage, DataCore offers vFilO, a software-defined storage solution that helps unify and manage the data scattered across diverse file shares, object stores and cloud storage.
You can use vFilO to aggregate multiple NAS systems, file servers and shares into a single virtual namespace for simplified and unified access to all your unstructured data. You can also use vFilO to move data to low-cost object storage and the cloud for storage cost reduction. In summary: don’t throw hardware at the problem whenever you run into a capacity shortage. Let DataCore help you optimize and make the most of your existing storage resources. You don’t have to keep using up your reduced budget on new hardware procurement, especially on the storage front. Optimization will take you a long way in saving money and allowing you to use your current capacity most efficiently.
When cost reduction is a priority for your organization, you can easily achieve significant savings on your storage hardware expenses by using DataCore. Optimize your current capacity, customize your storage resource allocation, and economize your spend by gaining the freedom to add the storage of your choice to your infrastructure. Here are some benefits that thousands of our customers across the globe have realized after using DataCore’s software-defined storage solutions: gain full control of your storage resources across your infrastructure. Use your current storage capacity to its fullest. Defer the purchase of new storage hardware, and achieve the negotiating clout to get the best deal on future purchases from your suppliers.
You can also ensure high availability of your storage and save manual effort, time and cost in managing your storage. For a live demonstration of how our products work, you can visit datacore.com/try-it-now and schedule a personalized demo with our solution architects. I’m sure we’ll be able to help you reduce the total cost of ownership of your storage infrastructure and help you make your storage great again. Thank you very much for your attention. I hope you all have a great day ahead and stay safe with your family and friends as we all get through this Coronavirus outbreak. Take care, everyone. Goodbye.