The "X" Factor in Virtualization and Building Private Clouds

I was very interested to see VMware’s announcement claiming that it can implement private clouds for customers within 30 days. It’s certainly a brave new virtual world, and VMware has done a great job of speeding these types of deployments with its platform.

In eWeek’s coverage of the news there is an insightful comment that deserves a bit more exploration. In weighing the validity of VMware’s vCloud Accelerator service and the ability to build a private cloud in 30 days, Chris Preimesberger wrote “It’s not an earthshaking claim; after all, as long as there is enough storage, server power and networking bandwidth in a data center or small IT setup, a private cloud is mostly software that simply needs to be installed, tested, loaded with virtual machines, made secure and deployed. But no other company yet has been specific to claim a ceiling of 30 days on such an installation/deployment.”

Did you catch that? Almost subtle enough to miss, he hit on a critical piece of enabling this kind of rapid deployment, one that is more often than not the biggest obstacle to server and desktop virtualization projects: the storage problem. VMware’s claim assumes you can commit the storage capacity needed to make this kind of deployment work, and maybe some organizations can. But most organizations likely don’t have a suitable storage environment. And once the private cloud is configured, they could face a number of application performance, availability, and disaster recovery issues as all their applications compete for resources in the newly centralized environment.

So how then do you get highly available, dynamic and responsive access to storage in a private cloud infrastructure without busting the budget? The answer is to virtualize the physical storage devices using storage virtualization software, thereby decoupling the virtual infrastructure from the underlying disks. This optimizes the I/O response obtained from standard storage devices that would have otherwise needed to be augmented or replaced. If you do end up needing to expand capacity at some point, you’ll also then have the option to choose the best deal at that time from a number of competitive suppliers, rather than be limited to a single vendor’s hardware.
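To make the decoupling idea concrete, here is a toy sketch in Python of what a storage virtualization layer does conceptually: it aggregates heterogeneous physical disks into one pool and hands out virtual volumes that aren’t tied to any single device or vendor. This is an illustrative model only, not DataCore’s (or any vendor’s) actual implementation; all class and method names here are hypothetical.

```python
# Toy model of storage virtualization (illustrative only, not a real product's design):
# many physical disks from different vendors are pooled, and virtual volumes
# are carved out of the pool without the consumer knowing which disks back them.
from dataclasses import dataclass


@dataclass
class PhysicalDisk:
    vendor: str        # disks from any supplier can join the pool
    capacity_gb: int
    used_gb: int = 0

    @property
    def free_gb(self) -> int:
        return self.capacity_gb - self.used_gb


class StoragePool:
    """Presents many physical disks as a single logical capacity pool."""

    def __init__(self):
        self.disks = []
        self.volumes = {}  # volume name -> list of (disk, gb) extents

    def add_disk(self, disk: PhysicalDisk) -> None:
        # Expanding capacity is just adding hardware from whichever
        # supplier offers the best deal at the time.
        self.disks.append(disk)

    @property
    def free_gb(self) -> int:
        return sum(d.free_gb for d in self.disks)

    def create_volume(self, name: str, size_gb: int):
        # A virtual volume may span several physical disks; the consumer
        # (e.g. a hypervisor datastore) never sees which ones.
        if size_gb > self.free_gb:
            raise ValueError("insufficient pooled capacity")
        extents, remaining = [], size_gb
        for disk in self.disks:
            take = min(disk.free_gb, remaining)
            if take:
                disk.used_gb += take
                extents.append((disk, take))
                remaining -= take
            if remaining == 0:
                break
        self.volumes[name] = extents
        return extents


pool = StoragePool()
pool.add_disk(PhysicalDisk("VendorA", 500))
pool.add_disk(PhysicalDisk("VendorB", 300))
extents = pool.create_volume("vm-datastore", 600)  # spans both vendors' disks
```

The point of the sketch is the last line: the 600 GB volume transparently spans two different vendors’ disks, and adding a third vendor’s disk later simply grows the pool.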

Is your organization debating a private cloud deployment in 2011? If so, be prepared to address the third dimension of virtualization (alongside server and desktop virtualization). Virtualizing your storage assets will be critical to getting off the ground and scaling out successfully.

Get a Live Demo of SANsymphony

Talk with a solution advisor about how DataCore Software-Defined Storage can make your storage infrastructure modern, performant, and flexible.