
How M&E Orgs Can Lower Storage TCO & Improve Efficiency

As studios and post-production houses develop new content, it is critical to lower storage TCO and improve efficiency for remote video workflows

In today’s economic climate, it’s necessary to do more with less, perhaps now more than ever. Studios and production houses are greenlighting projects, but global supply chain issues and the growing complexity of shooting and producing make it difficult to invest in new infrastructure, even when the need is there.

Tier 1 Storage Capacity for Mission-Critical Applications & Data

With tier 1 storage (also referred to as production storage), available capacity must be there when it is needed for a project. Media organizations look to tier 1 storage to provide the high performance that demanding applications require. It must also protect against hardware failures, ensure data durability, and offer the flexibility to support multiple diverse applications simultaneously.

But with these requirements comes cost. For example, Quantum’s F1000 entry-level NVMe SAN provides 38 TB of capacity for just over $80,000 MSRP, or more than $2,000 per TB. It’s easy to see how quickly that capacity can be consumed when an uncompressed, hour-long 4K video file often exceeds 80 GB.

What About Data and Content That is Aging?

Even though tier 1 storage is expensive and designed for high performance, it tends to also house aging data and content that needs to be instantly accessible. Most tier 1 storage is designed for a three-year lifecycle and often becomes cost-prohibitive to maintain beyond that.

While most organizations back up files to tape or cloud storage, a substantial amount of this content remains on tier 1 storage. This may be because a client or user wants it online and available if needed, or because a project still in process has already rendered content. Other times, data growth has spun out of control and become difficult to manage. The latter is a reality (and maybe a nightmare) for many organizations today.

Implementing a second tier of less expensive storage while maintaining the resilience and instant availability of the content is the best way to solve this challenge.

How Often Do You Access Data on Primary Storage?

Keeping only recently accessed content on tier 1 and shifting less frequently accessed content to a second tier of storage allows an organization to keep costs down. For example, pairing DataCore Swarm with a 1U standard x86 server optimized for video delivers 168 TB of Swarm capacity, including hardware, software, and support, for 80% less than the Quantum model mentioned above.

With the tiered model, approximately 20% of the capacity would sit on tier 1 (38 TB) and about 80% on tier 2 (168 TB). When you scale this to the PB range, how much would your organization save?
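The arithmetic can be sketched with the figures above. Note the $16,000 tier 2 price is an assumption derived from "80% less" than the $80,000 tier 1 MSRP, used here purely for illustration:

```python
# Rough TCO comparison using the figures from this post.
# The $16,000 tier 2 price is an assumed figure derived from
# "80% less" than the $80,000 tier 1 MSRP.
TIER1_USD, TIER1_TB = 80_000, 38    # Quantum F1000 (MSRP)
TIER2_USD, TIER2_TB = 16_000, 168   # Swarm on a 1U x86 server (assumed)

tier1_per_tb = TIER1_USD / TIER1_TB   # roughly $2,105 per TB
tier2_per_tb = TIER2_USD / TIER2_TB   # roughly $95 per TB

# Scale the 20/80 hot/warm split to 1 PB (1,000 TB).
total_tb = 1_000
all_tier1 = total_tb * tier1_per_tb
tiered = 0.2 * total_tb * tier1_per_tb + 0.8 * total_tb * tier2_per_tb
savings = 1 - tiered / all_tier1

print(f"${all_tier1:,.0f} vs ${tiered:,.0f} -> {savings:.0%} saved")
```

Under these assumptions, the tiered model comes out roughly three quarters cheaper at petabyte scale than keeping everything on tier 1.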


Should I Move Some of My Data Off of Tier 1 Storage?

While lowering TCO on tier 1 storage and using it more efficiently is important, it’s equally if not more important to be able to access that data quickly when needed. You also need to ensure it lives on reliable, scalable storage designed for the long term.

DataCore Swarm Object Storage is a great fit for that. It uses standard servers and is scalable to hundreds of PBs. Swarm also has built-in data protection, security, multi-tenancy, metadata search and accessibility via S3 and HTTP(S).

How Can I Move My Data Off Tier 1 SAN and NAS Storage to Object Storage?

By integrating with tier 1 storage platforms such as NetApp, EditShare, Pixit, and StorNext, content and data can easily be transferred from tier 1 production storage to your Swarm cluster. Because most media asset managers (MAMs) support S3, moving data between storage tiers is straightforward. And with Swarm’s built-in versioning, you can select the object with the corresponding metadata that needs to be retrieved.
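As a minimal sketch of what such a transfer looks like at the protocol level, the snippet below builds an HTTP(S) PUT that writes an object plus custom metadata to a bucket. The endpoint, bucket, key, and metadata header names are all illustrative, not Swarm specifics; a production setup would use the S3 API (for example, boto3 pointed at the Swarm endpoint) with proper authentication:

```python
# Sketch: staging a rendered asset for transfer from tier 1 storage
# into an object storage bucket over HTTP(S). Endpoint, bucket, and
# header names are hypothetical placeholders for illustration.
import urllib.request


def build_put_request(endpoint: str, bucket: str, key: str,
                      data: bytes, metadata: dict) -> urllib.request.Request:
    """Build an HTTP PUT that writes an object plus custom metadata."""
    url = f"{endpoint}/{bucket}/{key}"
    req = urllib.request.Request(url, data=data, method="PUT")
    req.add_header("Content-Type", "application/octet-stream")
    for name, value in metadata.items():
        # Custom metadata travels as headers so it can be indexed
        # and searched later (the exact header scheme depends on
        # the interface used, e.g. x-amz-meta-* for S3).
        req.add_header(f"x-meta-{name}", value)
    return req


# Build (but do not send) a request for an hour-long render.
req = build_put_request(
    "https://swarm.example.com", "archive", "projectX/final_v3.mov",
    b"...", {"project": "projectX", "codec": "prores"})
```

The request is only constructed here; sending it (and authenticating) is left to whichever S3 or HTTP client the environment actually uses.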

Swarm also has built-in, browser-based content management, so objects can be securely viewed or streamed directly from storage and then shared with another user. Lastly, Swarm can back up to any S3-compliant third-party service or S3 endpoint. This allows organizations to keep data and content on-prem for a required period and then back it up to a public cloud such as Amazon or Wasabi for long-term storage.
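The "on-prem for a certain period, then off to the cloud" policy boils down to a simple age check. Below is a small sketch of that selection step; the 90-day window and object inventory are hypothetical, and the actual copy to Amazon or Wasabi would be done with an S3 client against the chosen endpoint:

```python
# Sketch: deciding which on-prem objects are past their client-required
# retention window and should be replicated to a public cloud bucket.
# The 90-day window and inventory below are assumptions for illustration.
from datetime import datetime, timedelta

ON_PREM_DAYS = 90  # hypothetical client requirement


def due_for_cloud_copy(objects, now):
    """Return keys of objects older than the on-prem window."""
    cutoff = now - timedelta(days=ON_PREM_DAYS)
    return [key for key, created in objects if created < cutoff]


inventory = [
    ("projectX/final_v3.mov", datetime(2022, 1, 10)),
    ("projectY/rough_cut.mov", datetime(2022, 6, 1)),
]
due = due_for_cloud_copy(inventory, now=datetime(2022, 6, 15))
# projectX is past the 90-day window; projectY is not
```

Each key returned would then be copied to the long-term S3 endpoint and, if policy allows, deleted from the on-prem tier.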

Ready to Learn More About How to Lower Your Storage TCO?

If you are ready to learn more about how to lower your storage TCO while keeping your content and data protected and accessible, contact us today!
