As 2011 Nears, a Transformation in Backup Awaits

The combination of cloud computing, cloud storage, inexpensive hardware, virtualization and heightened user demands for near real-time backup and recovery is creating a crisis in traditional backup methodologies. It is a crisis in the sense that no emerging virtualized data center is going to find how these backups work and are managed even slightly acceptable in the very near future. This suggests that in 2011 the transformation in backup that many have predicted will occur, and it will go well beyond just deduplicating backup data stored to disk.

The signs that traditional backup is breaking have been there for some time for anyone willing to see them. The emerging virtual data center means fewer people managing more virtual machines, all applications becoming subject to 24×7 availability, heightened user expectations of near real-time recoveries from application failures, and emerging service providers creating new options for companies to fail over their data centers into data centers hosted by these providers.

These problems cannot be solved by simply switching from tape to disk as a backup target and then deduplicating the data stored on disk. While deduplicating backup data stored to disk has solved some of the long-standing problems of traditional backup methodologies (backup windows, slow backups and recoveries), in solving them it has given companies the opportunity and the time to re-examine their backup strategies as a whole.

For instance, they are beginning to realize:

  • They need to use the “right” backup software everywhere. Right now I am hearing a number of stories about how enterprise companies are separating the backup of their virtual and physical machines, choosing the “right” backup software to protect their virtual servers and the “right” backup software to protect their physical servers. I see this approach as akin to the use of deduplicating disk-based backup appliances as a backup target: it solves the immediate problem, but eventually these two approaches will need to be merged and one backup software product will need to manage both environments (if a physical server environment even remains).
  • Server-based backup is dying. Most traditional backup products are built around the concept of streaming all of the data on production servers to a backup target daily or weekly. While there have been modifications to this approach over the years (incrementals, differentials) to reduce the amount of data sent, the point is that they put much of the workload on the server. But as organizations adopt virtualization and increase the number of virtual machines on each physical server, putting all of the backup workload on individual VMs is no longer an option.
  • The majority of their virtualized application servers now use networked storage. The full impact of this architectural shift on backup, and the new options it makes available, is still largely overlooked. While it is no secret that many if not most virtualization deployments leverage networked storage, most backup products do not really take advantage of the features these networked storage systems offer, such as snapshots, clones and asynchronous replication.
  • Backup data can be multi-purposed. It is probably fair to say that most organizations don’t think much about how they can re-purpose their backup data, since re-purposing was never practical while it sat on tape. But now that a near real-time copy of production data is on disk and can be relatively easily accessed and copied, why not examine new possibilities for using it: near real-time recoveries, testing, data mining, quality assurance or even moving more of this data to the cloud?
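To make the incremental approach mentioned above concrete, here is a minimal, hypothetical sketch in Python of how an incremental pass selects only the data changed since the last run. This is purely illustrative, not how any particular backup product works; real products typically rely on change journals or changed-block tracking rather than walking the filesystem, and the function name is my own invention.

```python
import os


def incremental_backup_candidates(root, last_backup_time):
    """Return paths under root modified since the last backup ran.

    A toy sketch of an incremental backup: rather than streaming
    every file to the backup target (a full backup), only files whose
    modification time is newer than the previous backup's timestamp
    are selected, reducing the amount of data sent.
    """
    changed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) > last_backup_time:
                    changed.append(path)
            except OSError:
                continue  # file vanished between listing and stat
    return changed
```

Even this trivial version shows why the workload lands on the server: the walk and the comparison run on the machine being protected, which is exactly the cost that becomes untenable when dozens of VMs share one physical host.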

It is as organizations realize the benefits they stand to reap by changing their approach to backup that a transformation in backup only makes sense. Nor does it appear that this transformation will take years or decades to occur; rather, it is already well underway and poised for accelerated adoption in 2011. So those backup vendors that have recognized these trends and have either re-architected or properly architected their products to account for them will be in the right position to help users deliver on these new requirements.

So as DCIG and SMB Research finish their work on the current Virtual Server Backup Software Buyer’s Guide, which we plan to publish no later than November, we hope to be in a better position to provide users authoritative insight and guidance as to which backup software products are best positioned to deliver on this transformation in backup that seems almost destined to occur in 2011.

Jerome M. Wendt

About Jerome M. Wendt

Jerome Wendt is the President and Founder of DCIG, LLC, an independent storage analyst and consulting firm. He founded the company in November 2007.
