The phrase “Cloud Data Protection Appliance” appears in the name of DCIG’s forthcoming Buyer’s Guide, but the end game of each appliance covered in that Guide is squarely recovery. While successful recoveries have, in theory, always been the objective of backup appliances, vendors too often only paid lip service to that ideal, as most of their new product features centered on providing better means of doing backups. Recent technology advancements have flipped this premise on its head.
The business case for organizations with petabytes of file data under management to classify and then place it across multiple tiers of storage has never been greater. By distributing this data across disk, flash, tape and the cloud, they stand to realize significant cost savings. The catch is finding a cost-effective solution that makes it easier to administer and manage file data than simply storing it all on flash storage. This is where a solution such as the one Quantum now offers comes into play.
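To see why the business case is so strong, consider a rough back-of-the-envelope comparison. The sketch below uses entirely hypothetical per-GB monthly costs and a hypothetical tier distribution (none of these figures come from Quantum or DCIG) to contrast an all-flash approach with a tiered one for 1 PB of file data:

```python
# Illustrative only: the $/GB/month costs and the tier percentages
# below are assumptions for the sake of the example, not vendor figures.

TIER_COST_PER_GB = {  # assumed monthly cost per GB for each tier
    "flash": 0.20,
    "disk": 0.05,
    "cloud": 0.02,
    "tape": 0.005,
}

def monthly_cost(total_gb, distribution):
    """Monthly cost of total_gb spread across tiers.

    `distribution` maps tier name -> fraction of data on that tier;
    the fractions must sum to 1.
    """
    assert abs(sum(distribution.values()) - 1.0) < 1e-9
    return sum(total_gb * frac * TIER_COST_PER_GB[tier]
               for tier, frac in distribution.items())

PETABYTE_GB = 1_000_000  # 1 PB in decimal GB

all_flash = monthly_cost(PETABYTE_GB,
                         {"flash": 1.0, "disk": 0, "cloud": 0, "tape": 0})
tiered = monthly_cost(PETABYTE_GB,
                      {"flash": 0.10, "disk": 0.30, "cloud": 0.20, "tape": 0.40})
print(f"All-flash: ${all_flash:,.0f}/month, tiered: ${tiered:,.0f}/month")
```

Under these assumed prices, keeping only the hot 10% of the data on flash cuts the monthly bill by roughly a factor of five, which is the gap a tiering solution must close against the simplicity of an all-flash deployment.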
As recently as a few years ago, support for private and/or public cloud storage providers by enterprise data protection products was still a hit-or-miss proposition. Those days are essentially over. The vast majority of products minimally leverage cloud providers as cloud storage targets and, in many cases, use them for more advanced recovery options. But as cloud support has become commonplace, three specific new features are appearing on more of these products, making them more flexible, manageable, and scalable while also foretelling what all these products will offer in the very near future.
Today organizations more than ever are looking to move to software-defined data centers. Whether they adopt software-defined storage, networking, computing, servers, security, or all of them as part of this initiative, they are starting to conclude that a software-defined world trumps the existing hardware-defined one. While I agree with this philosophy in principle, organizations need to carefully dip their toes into the software-defined waters rather than dive in head-first.
IT professionals need to exercise caution before making two assumptions when evaluating cloud data protection products. One is that all products share some feature or features in common. The other is that one product possesses some feature or characteristic that no other product on the market offers. DCIG’s recent research into cloud data protection products shows that neither assumption holds, even for features such as deduplication, encryption, and replication that one might expect these products to adopt in comparable ways.
Backup products have always sought to differentiate themselves by offering specific features that met different organizational needs. But at the end of the day, backup products primarily had to account for protecting an organization’s data, with these products placing a lower priority on recovery and cloud connectivity. Those days are largely over, with all but a few backup products having transformed to offer cloud data protection and many of them providing a variety of cloud recovery options.
Ever since I got my first job in IT in the mid-1990s, everyone has used a cloud in some form. Whether they referred to it as outsourcing, virtualization, central IT, or something else, the cloud existed and grew, but it did little to stem the adoption of distributed computing. Yet at some point over the past few years, the parallel growth of these two technologies stopped and the cloud forged ahead. This shift indicates that companies have now fully embraced the cloud but remain unclear about how best and how soon to transition their IT infrastructure to the cloud and then how to manage it once it is there.
Organizations have come to the realization that using disk as a backup storage target does more than simply solve backup problems. It creates entirely new possibilities for recovery. But as they recognize these new opportunities, they also see the need for backup solutions that offer them new options for application availability and recoverability backed by ease of management. The latest DataPlatform 4.0 release from Cohesity moves organizations closer to this ideal.
In today’s business world, where new technologies constantly come to market, there are signs that indicate when certain ones are gaining broader market adoption and are ready to go mainstream. Such an event occurred this month when Comtrade Software announced a backup solution purpose-built for Nutanix.
The Internet has eliminated any excuses for not having access to the information that individuals need to make informed buying decisions about products and/or services. However, providing ready insight that quickly and easily compares IT infrastructure solutions… well, let’s just say Google does not address that challenge. Using DCIG and its Competitive Intelligence Suite, organizations get the tools and services they need to first aggregate research on IT infrastructure solutions and then quickly generate reports that compare product features and services.
While the overall economy and even the broader technology sector largely boom, the enterprise storage space is feeling the pinch. As storage revenues level off and even drop, many people with whom I spoke at this past week’s HPE Discover 2017 event shared their thoughts as to what is causing this situation. The short answer: there does not appear to be a single reason for the pullback in storage revenue but rather a perfect storm of events that is contributing to this situation. The good news is that this retrenching should ultimately benefit end-users.
It’s summertime, and nothing typifies it more in the United States than a parade on one of its summer holidays. Keeping with this tradition, the Acronis Backup 12.5 release rolls out a parade of new features that help differentiate it in a crowded market. Leading the parade are the introduction of security software to authenticate preexisting backups, the flexibility to customize the names of archived backups, and event-based backup scheduling, all of which caught my eye as features that few other backup software products currently offer.
When one looks at today’s lineup of software products classified as cloud data protection, one might assume that every such product natively offers source-side deduplication. That assumption would be wrong. Software such as HPE Data Protector does not natively offer source-side deduplication, but its reasons for opting out make sense once one takes a deeper look at the product.
Today’s backup mantra seems to be backup to the cloud or bust! But backup to the cloud is more than just redirecting backup streams from a local file share to a file share presented by a cloud storage provider and clicking the “Start” button. Organizations must examine to which cloud storage providers they can send their data as well as how their backup software packages and sends the data to the cloud. BackupAssist 10.0 answers many of these tough questions about cloud data protection that businesses face while providing them some welcome flexibility in their choice of cloud storage providers.
If you assume that leading enterprise midrange all-flash arrays (AFAs) support deduplication, your assumption would be correct. But if you assume that these arrays implement and deliver deduplication in the same way, you would be mistaken. These differences should influence any all-flash array buying decision, as deduplication’s implementation affects the array’s total effective capacity, performance, usability, and, ultimately, your bottom line.
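The link between deduplication and the bottom line comes down to simple arithmetic: the same raw capacity at the same price yields very different effective capacities, and thus costs per usable GB, depending on the deduplication ratio an array actually delivers. The sketch below uses a hypothetical array price, raw capacity, and ratio range (none are vendor figures) to make that relationship concrete:

```python
# Hypothetical sketch of how a deduplication ratio changes an AFA's
# effective capacity and cost per effective GB. The raw capacity,
# price, and ratios below are illustrative assumptions only.

def effective_capacity_tb(raw_tb, dedup_ratio):
    """Effective capacity = raw capacity x deduplication ratio."""
    return raw_tb * dedup_ratio

def cost_per_effective_gb(price_usd, raw_tb, dedup_ratio):
    """Array price spread over every effective (post-dedup) GB."""
    effective_gb = effective_capacity_tb(raw_tb, dedup_ratio) * 1000
    return price_usd / effective_gb

RAW_TB = 100       # assumed raw flash capacity of the array
PRICE = 250_000    # assumed array price in USD

for ratio in (2, 4, 6):  # a plausible range of claimed ratios
    print(f"{ratio}:1 -> {effective_capacity_tb(RAW_TB, ratio)} TB effective, "
          f"${cost_per_effective_gb(PRICE, RAW_TB, ratio):.3f}/GB")
```

Under these assumptions, the difference between a 2:1 and a 6:1 ratio is a threefold swing in cost per effective GB, which is why how an array implements deduplication, and what ratio it achieves on your data, matters as much as whether it supports the feature at all.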
Detect. Protect. Recover. I often see those three words when someone discusses the best methods for companies to deal with the scourge of ransomware. But stringing three words together in a marketing slogan does not a solution make. While understanding the steps needed to protect oneself against ransomware is certainly a requirement, knowing which features backup software should possess and which products possess those features is equally important.
A few years ago, when all-flash arrays (AFAs) were still gaining momentum, newcomers like Nimbus Data appeared poised to take the storage world by storm. But as the big boys of storage (Dell, HDS, and HPE, among others) entered the AFA market, Nimbus opted to retrench and rethink the value proposition of its all-flash arrays. Its latest AFA line, the ExaFlash D-Series, is one of the outcomes of that repositioning, as these arrays answer the call of today’s hosting providers. They deliver the high levels of availability, flexibility, performance, and storage density that these providers seek, backed by one of the lowest cost-per-GB price points in the market.
The success and popularity of the DCIG Buyer’s Guides stem first from the methodology that DCIG uses to gather and synthesize product data and then from how it publishes its findings. DCIG applies five internal guidelines to best define and identify products from a DCIG Body of Research to include in each Buyer’s Guide Edition. Further, each DCIG Buyer’s Guide discloses why it may not cover certain products and provides guidance to readers on how to best use the Guide.
Each passing week seems to bring new use cases for solid-state drives (SSDs) to the forefront and calls into question the viability of disk and tape for them. This week was no exception. NGD Systems’ announcement of its 24TB Catalina SSD directly targets use cases such as active archive, where tape predominates but for which the 24TB Catalina SSD emerges as a potential replacement.
Last week HPE announced its acquisition of SimpliVity, a provider of enterprise hyper-converged infrastructure solutions. While that announcement certainly made news in the IT industry, its broader implications signaled that enterprise IT providers such as HPE could no longer sit on the sidelines and merely be content to partner with providers such as SimpliVity as hyper-converged solutions become a growing percentage of enterprise IT. If HPE wanted its fair share of this market, it had to act sooner rather than later to remain a leading player in this rapidly growing space.