Data protection has evolved well beyond the point where one can back up and recover data using once-a-day backups. Continuous data protection, array-based snapshots, asynchronous replication, high availability, disaster recovery, backup and recovery in the cloud, and long-term backup retention are now all part of managing backup. However, the real question becomes, “Can one product even manage all of these different facets of backup and recovery? Or should a backup solution even try to accomplish this feat?” In this fifth installment of my interview series with Brett Roscoe, General Manager, Data Protection for Dell Software, we discuss this very important question of whether one backup product can do it all in today’s data center.
Category Archives: Deduplication
Hybrid storage arrays, which dynamically place data in storage pools that combine flash memory and HDDs, are rapidly expanding their market share in the enterprise space. These arrays use the latest generation of hardware – including multi-core CPUs and DRAM and flash caches – to offer high levels of performance and inline data optimization. However, the Oracle ZS4-4’s underlying architecture and its unique ability to integrate with Oracle Database 12c make it a superior storage platform to accelerate Oracle Database performance and reduce storage capacity requirements.
Rarely does a day go by at DCIG when deduplication is not mentioned in some context. Instead of storing every chunk of data, deduplication removes redundant chunks and stores each unique chunk just once. Offering up to 20x reductions in data, deduplication directly equates to lower backup storage costs for almost any size data center, as less hardware is needed to store backups.
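The chunk-level mechanism behind those reduction ratios can be illustrated with a minimal sketch. This is a simplified fixed-block scheme (production appliances typically use variable-length chunking and more elaborate indexes); every function and name here is illustrative, not any vendor’s actual implementation:

```python
import hashlib

def deduplicate(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks and store each unique chunk once.

    Returns a chunk store (hash -> chunk) plus the ordered list of hashes
    ("recipe") needed to reconstruct the original stream.
    """
    store = {}   # unique chunks, keyed by content hash
    recipe = []  # ordered hashes used to rebuild the stream
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # keep only the first copy
        recipe.append(digest)
    return store, recipe

def reconstruct(store, recipe):
    """Rebuild the original stream from the chunk store and recipe."""
    return b"".join(store[d] for d in recipe)

# A highly redundant "backup" stream: the same 4 KiB block repeated 20 times
# deduplicates down to a single stored chunk, i.e. a 20x reduction.
backup = b"A" * 4096 * 20
store, recipe = deduplicate(backup)
assert reconstruct(store, recipe) == backup
print(len(backup) // sum(len(c) for c in store.values()))  # prints 20
```

In practice the achievable ratio depends entirely on how repetitive the backup data is, which is why vendors quote “up to” figures such as 20x.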
DCIG is pleased to announce the availability of its DCIG 2014-15 Deduplicating Backup Appliance Buyer’s Guide, which weights, scores and ranks over 100 features on 47 different deduplicating backup appliances from 10 different providers. This Buyer’s Guide provides the critical information that organizations of all sizes need when selecting deduplicating backup appliances to protect environments ranging from remote offices to enterprise data centers.
There are so many options available in today’s next generation of backup and recovery tools that sometimes it can be tough to prioritize which features to implement. In this third installment of my interview series with Dell Software’s General Manager, Data Protection, Brett Roscoe, we discuss four (4) best practices that organizations should prioritize as they implement next generation backup and recovery tools.
As DCIG readies the third release of its Deduplicating Backup Appliance Buyer’s Guide, it inevitably encounters certain trends and the emergence of new features in the products covered in each respective Guide. DCIG’s experience was no different in its preparations for this Guide. Virtual appliances and scale-out and scale-up architectures in particular caught our eye as DCIG prepared to release this Guide.
Physical, purpose-built deduplicating backup appliances have found their way into many enterprise data centers as they expedite installation and simplify ongoing management of backup data. However, there is a growing business case for virtual appliances that offer the benefits of deduplication without the associated hardware costs. To determine when and if a virtual appliance is the correct choice, there are key factors that enterprises must evaluate to arrive at the right decision for a specific office or environment.
The use of data reduction technologies such as compression and deduplication to reduce storage costs is nothing new. Tape drives have used compression for decades to increase backup data densities on tape, while many modern deduplicating backup appliances use compression and deduplication together to further reduce backup data stores. Even a select number of existing HDD-based storage arrays use data compression and deduplication to minimize data stores for large amounts of file data stored in archives or on network attached file servers.
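The two techniques are complementary: deduplication removes repeated chunks across the stream, and compression then shrinks each unique chunk that remains, so the reductions multiply. The following sketch combines fixed-block deduplication with zlib compression; it is a simplified illustration under those assumptions, not any appliance’s actual pipeline:

```python
import hashlib
import zlib

def dedupe_and_compress(data: bytes, chunk_size: int = 4096) -> int:
    """Deduplicate fixed-size chunks, compress each unique chunk,
    and return the total number of bytes actually stored."""
    unique = {}  # content hash -> compressed unique chunk
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        unique.setdefault(digest, zlib.compress(chunk))
    return sum(len(c) for c in unique.values())

# 100 copies of a compressible 4 KiB block: deduplication keeps one copy,
# and compression then shrinks that single copy further.
data = (b"log entry: OK\n" * 293)[:4096] * 100
stored = dedupe_and_compress(data)
print(len(data), stored)  # stored bytes are a small fraction of the input
```

As with deduplication alone, the combined ratio is workload-dependent: already-compressed or encrypted data sees little benefit from either stage.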
There is backup and then there is backup. To meet their backup and recovery needs, today’s organizations must verify that their selected backup appliance includes the features needed to protect their environment today and positions them to meet their needs into the foreseeable future. In this third installment of DCIG’s interview with STORServer President Bill Smoldt, he describes the new must-have features that backup appliances must offer.
One of the more difficult tasks for anyone deeply involved in technology is seeing the forest for the trees. For those responsible for supporting the technical components that make up today’s enterprise infrastructures, stepping back to recommend which technologies are the right choices for their organization going forward is an even more difficult feat. While there is no one right answer that applies to all organizations, five (5) technologies – some new as well as some old technologies that are getting a refresh – merit prioritization by organizations in the coming months and years.
Anyone who is close to backup recognizes that some types of data deduplicate better than others. However, translating that understanding of the environment into meaningful backup policies is nearly impossible, since it is both complicated and time consuming to implement successfully. The new Sepaton VirtuoSO platform can instead choose the best form of deduplication for each backup stream on the fly. In this third part of my interview series with Sepaton’s Director of Product Management, Peter Quirk, we discuss how the VirtuoSO platform detects the nature of incoming backup data and then automatically invokes the best deduplication method for it.
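To make the idea of per-stream method selection concrete, here is a hypothetical heuristic: sample the head of each incoming stream, test how well it reduces, and route it accordingly. This is purely an illustration of the concept – it is not Sepaton’s actual algorithm, and the thresholds and method names are invented for the sketch:

```python
import os
import zlib

def pick_dedup_method(sample: bytes) -> str:
    """Hypothetical per-stream heuristic (NOT Sepaton's algorithm):
    use a quick compression probe on a sample of the stream to gauge
    redundancy, then pick a deduplication strategy to match."""
    ratio = len(zlib.compress(sample)) / max(len(sample), 1)
    if ratio > 0.95:
        return "none"             # encrypted/pre-compressed: pass through
    elif ratio > 0.5:
        return "fixed-block"      # modest redundancy: cheap hashing suffices
    else:
        return "byte-differential"  # highly redundant: fine-grained dedup pays off

print(pick_dedup_method(b"abc" * 2000))     # highly redundant text
print(pick_dedup_method(os.urandom(4096)))  # random data mimics encryption
```

The design point the sketch captures is that detection happens once per stream, so the (cheap) probe cost is amortized over the entire backup job.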
A trend that DCIG is seeing among new products being introduced into the enterprise space is a proclivity to combine the best of what has been developed previously with new technologies that meet the emerging requirements of today’s organizations. The new VirtuoSO offering from Sepaton reflects this broader industry trend. In this second part of my interview series with Sepaton’s Director of Product Management, Peter Quirk, we discuss what features Sepaton brought forward from its existing S2100 product line and what new features its VirtuoSO platform introduced.
Ever since using disk as a preferred backup target gained momentum in the late 2000s, there have been those who opine that disk’s life in this role would be short lived. But those providers who deliver disk-based backup solutions and are betting their future on them see no slowdown in their adoption. In this first interview with Sepaton’s Director of Product Management, Peter Quirk, we discuss how databases and virtual machines (VMs) are just beginning to take full advantage of the benefits that disk offers as a backup target.
DCIG is pleased to announce the availability of its DCIG 2013 Midrange Deduplicating Backup Appliance Buyer’s Guide. In this Buyer’s Guide, DCIG weights, scores and ranks 46 midrange deduplicating backup appliances from ten (10) different providers. Like all previous DCIG Buyer’s Guides, this Buyer’s Guide provides the critical information that organizations of all sizes need when selecting a midrange deduplicating backup appliance to help protect their fast growing data-intensive applications.
DCIG is pleased to announce the availability of its DCIG 2013 Midrange Deduplicating Backup Appliances Buyer’s Guides. In these two Buyer’s Guides, DCIG weights, scores and ranks 20 and 29 midrange deduplicating backup appliances respectively from nine (9) different providers.
As DCIG prepares to release a number of Buyer’s Guides on midrange deduplicating backup appliances in the next few weeks, we thought we would share some of our observations that came out of our evaluation of these products. Like all Buyer’s Guides that DCIG prepares, these involved a comprehensive review of available deduplicating backup appliances. In doing so, DCIG found that deduplication itself has moved well beyond the breakthrough technology it was a decade or so ago to provide an assortment of features that leaves plenty for organizations to consider when buying one of these appliances.
It was not that long ago – like no more than five (5) years ago – that if as a storage administrator you could configure a storage system to provide average response times of around 2 milliseconds for any application, you were a hero to everyone you supported. Fast forward to today’s hybrid and all-flash memory systems, and 2 millisecond response times are the new “slow.” In this first installment of my interview series with Tegile Systems’ VP of Marketing, Rob Commins, we discuss how hybrid and all-flash memory systems are redefining the “Gold” standard for performance in storage systems.
This past week I received an email from someone asking for my help in their process of buying a backup appliance. This individual had just downloaded the DCIG 2012 Backup Appliance Buyer’s Guide but, due to the number of models included in the Buyer’s Guide (over 60), was looking for some recommendations from me as to which one to buy. While I sent this individual a list of backup appliances to look at more closely, it brought to my attention that there are five questions every organization should ask and answer before buying a backup appliance.
It’s no secret that ‘Big Data’ is becoming a ‘Big Problem’ for organizations from a data and storage management perspective. However what organizations may fail to realize is that the best way to solve their Big Data problems is NOT by mindlessly throwing more resources at them. Rather it is to look at Big Data more strategically and then tackle the data management problems it creates in one fell swoop using software like CommVault® Simpana® and its OnePass technology.
In this fourth and final part of our interview series with GreenBytes CEO Bob Petrocelli, we hear about a three-second failover between canisters used in Solidarity, a solid-state storage array solution. If you’re not looking, says Petrocelli, you could miss the failover.