As the importance of content continues to grow, so does the amount of data created. 300 hours of video are uploaded to YouTube every minute, while Facebook users are consuming 2.77 million videos per minute. If you’re a savvy, modern organization, it’s likely that you’re producing ample content of your own and find these statistics unsurprising. Given this, it is also likely that you’re aware of one of the leading problems with all of this content: how to deal with the rising cost of data storage.
More versions mean more data
The problem of data storage is compounded when companies require multiple versions of their media assets. Let’s look at a couple of standard scenarios. You’re working on a 100MB Photoshop document that you want to collaborate on from within your Media Asset Management (MAM) system. At your organization, the average number of revisions for a project such as this is ten, and the average number of similar ongoing projects per month is fifty. With most MAM repositories, you are required to ingest every revision as an entire copy of the original 100MB file. With ten versions of each project, this means you will be required to store 50GB of new Photoshop files every month.
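The storage math above is simple to verify; here is a quick sanity check using the figures from the example (100MB files, ten revisions, fifty projects per month):

```python
file_size_mb = 100       # size of each Photoshop document
revisions = 10           # average revisions per project
projects_per_month = 50  # similar ongoing projects per month

# With every revision ingested as a full copy of the file:
total_mb = file_size_mb * revisions * projects_per_month
total_gb = total_mb / 1000
# total_gb == 50.0 -> 50GB of new Photoshop files each month
```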
Reduce storage requirements dramatically with front-end deduplication
This is where front-end deduplication comes into play. The process of front-end deduplication is as follows: when ingesting a new version of a file into your MAM system, the MAM compares your new version to the original. It then ingests only the data (bytes) from your version that differs from the original. What this means is that you end up with significantly less data to store than you would otherwise. Rather than storing an extra 100MB just because you want to add text to your Photoshop file, as in the example above, the change would likely result in only a few kilobytes or megabytes of new data.
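One common way to implement this kind of comparison is block-level deduplication: split each file into fixed-size chunks, hash every chunk, and upload only the chunks whose hashes differ from the previous version. The sketch below is a minimal illustration of that idea, not Evolphin Zoom's actual algorithm; the 64KB chunk size is an arbitrary choice for the example.

```python
import hashlib

CHUNK_SIZE = 64 * 1024  # 64KB chunks (illustrative choice)

def chunk_hashes(data: bytes) -> list:
    """Split a file's bytes into fixed-size chunks and hash each one."""
    return [
        hashlib.sha256(data[i:i + CHUNK_SIZE]).hexdigest()
        for i in range(0, len(data), CHUNK_SIZE)
    ]

def changed_chunks(original: bytes, revision: bytes) -> list:
    """Return the indices of chunks in the revision that differ from the original."""
    old = chunk_hashes(original)
    new = chunk_hashes(revision)
    return [
        i for i, h in enumerate(new)
        if i >= len(old) or h != old[i]
    ]

# A 1MB "original" and a revision with a small edit near the end.
original = bytes(1024 * 1024)
revision = original[:-10] + b"new text!!"

delta = changed_chunks(original, revision)
# Only the final 64KB chunk differs, so only ~64KB needs to be stored
# and uploaded instead of the full 1MB file.
```

In practice, production systems typically use content-defined (variable-size) chunking so that insertions near the start of a file do not shift every subsequent chunk boundary, but the fixed-size version above shows the core idea.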
Reduced storage requirements become even more dramatic in the case of video, especially when a five-minute cut may only require a five-second edit. Let’s use an example: your editor gets the raw footage and starts cutting a five-minute YouTube video. After each version is edited in Adobe Premiere, she exports it to the database for review. At this particular organization, these kinds of projects typically require between three and twenty-five iterations, and with each iteration normally resulting in a 400-500MB export, that would mean up to 12GB of storage is needed for all the rough cuts. But with deduplication, the export sizes are drastically reduced, so that minor edits can result in only a few additional megabytes needing to be stored in the MAM for the entire new cut. For an organization that produces hundreds of videos a week, terabytes can be saved.
Front-end deduplication also has the benefit of significantly reducing bandwidth costs for files stored in a cloud repository, such as Photoshop, InDesign, and other document files. This is because deduplication happens on a user’s PC before the data is sent to the cloud—hence the “front-end” qualification. As a result, you’re able to sync your new versions to the MAM in the cloud in a fraction of the time it would take to upload the whole document.
To learn more about the only MAM system with fully-enabled front-end deduplication for the hybrid cloud, click here to check out Evolphin Zoom today.