The business case for organizations with petabytes of file data under management to classify that data and then place it across multiple tiers of storage has never been greater. By distributing this data across disk, flash, tape and the cloud, they stand to realize significant cost savings. The catch is finding a cost-effective solution that makes it easier to administer and manage file data than simply storing it all on flash storage. This is where a solution such as the one Quantum now offers comes into play.
Organizations have come to the realization that using disk as a backup storage target does more than simply solve backup problems. It creates entirely new possibilities for recovery. But as they recognize these new opportunities, they also see the need for backup solutions that offer them new options for application availability and recoverability backed by ease of management. The latest DataPlatform 4.0 release from Cohesity moves organizations closer to this ideal.
In today’s business world where new technologies constantly come to market, there are signs that indicate when certain ones are gaining broader market adoption and are ready to go mainstream. Such an event occurred this month when Comtrade Software announced a backup solution purpose-built for Nutanix.
The Internet has eliminated any excuses for not having access to the information that individuals need to make informed buying decisions about products and/or services. However, providing ready insight that compares IT infrastructure solutions… well, let’s just say Google does not address that challenge. Using DCIG and its Competitive Intelligence Suite, organizations get the tools and services they need to first aggregate research on IT infrastructure solutions and then quickly and easily generate reports that compare product features and services.
While the overall economy and even the broader technology sector largely boom, the enterprise storage space is feeling the pinch. As storage revenues level off and even drop, many people with whom I spoke at this past week’s HPE Discover 2017 event shared their thoughts as to what is causing this situation. The short answer: there does not appear to be a single reason for the pullback in storage revenue but rather a perfect storm of events that is contributing to this situation. The good news is that this retrenching should ultimately benefit end-users.
It’s summertime, and nothing typifies it more in the United States than a parade on one of its summer holidays. Keeping with this tradition, the Acronis Backup 12.5 release rolls out a parade of new features that help differentiate it in a crowded market. Leading its feature parade, three capabilities caught my eye as features that few other backup software products currently offer: the introduction of security software to authenticate preexisting backups; the flexibility to customize the names of archived backups; and event-based backup scheduling.
When one looks at today’s lineup of software products that one would classify as cloud data protection, one might assume that every such product natively offers source-side deduplication. That assumption would be wrong. Software such as HPE Data Protector does not natively offer source-side deduplication, but its reasons for opting out make sense once one takes a deeper look at the product.
Today’s backup mantra seems to be backup to the cloud or bust! But backup to the cloud is more than just redirecting backup streams from a local file share to a file share presented by a cloud storage provider and clicking the “Start” button. Organizations must examine to which cloud storage providers they can send their data as well as how their backup software packages and sends the data to the cloud. BackupAssist 10.0 answers many of these tough questions about cloud data protection that businesses face while providing them some welcome flexibility in their choice of cloud storage providers.
If you assume that leading enterprise midrange all-flash arrays (AFAs) support deduplication, your assumption would be correct. But if you assume that these arrays implement and deliver deduplication’s features in the same way, you would be mistaken. These differences in deduplication should influence any all-flash array buying decision as deduplication’s implementation affects the array’s total effective capacity, performance, usability, and, ultimately, your bottom line.
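To see why these implementation differences matter to the bottom line, consider a rough illustration. This is my own sketch with hypothetical numbers, not any vendor’s sizing formula: an array’s total effective capacity is roughly its usable capacity multiplied by the deduplication ratio it actually achieves, which in turn drives the real cost per GB stored.

```python
# Rough sketch (hypothetical numbers, not a vendor sizing formula):
# how the deduplication ratio an array actually achieves changes its
# total effective capacity and its cost per effective GB.

def effective_capacity_tb(usable_tb: float, dedup_ratio: float) -> float:
    """Effective capacity = usable capacity x achieved reduction ratio."""
    return usable_tb * dedup_ratio

def cost_per_effective_gb(array_cost: float, usable_tb: float,
                          dedup_ratio: float) -> float:
    """Price paid per GB of data actually stored after deduplication."""
    return array_cost / (effective_capacity_tb(usable_tb, dedup_ratio) * 1000)

# Two hypothetical arrays with identical usable capacity (100 TB) and
# price ($150,000), whose different deduplication implementations land
# at different real-world reduction ratios.
for ratio in (3.0, 5.0):
    print(f"{ratio}:1 -> {effective_capacity_tb(100, ratio):.0f} TB effective, "
          f"${cost_per_effective_gb(150_000, 100, ratio):.2f}/GB")
```

With these made-up figures, a 3:1 ratio yields 300 TB effective at $0.50/GB, while 5:1 yields 500 TB at $0.30/GB from the same hardware, which is why how an array implements deduplication, not merely whether it supports it, belongs in the buying decision.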
Detect. Protect. Recover. I often see those three words when someone discusses the best methods for companies to deal with the scourge of ransomware. But stringing three words together in a marketing slogan does not a solution make. While understanding the steps needed to protect oneself against ransomware is certainly a requirement, knowing which features backup software should possess, and which products possess those features, is equally important.
A few years ago when all-flash arrays (AFAs) were still gaining momentum, newcomers like Nimbus Data appeared poised to take the storage world by storm. But as the big boys of storage (Dell, HDS, and HPE, among others) entered the AFA market, Nimbus opted to retrench and rethink the value proposition of its all-flash arrays. Its latest AFA models, the ExaFlash D-Series, are one outcome of that repositioning, as these arrays answer the call of today’s hosting providers. These arrays deliver the high levels of availability, flexibility, performance, and storage density that these providers seek, backed by one of the lowest cost-per-GB price points in the market.
The success and popularity of the DCIG Buyer’s Guides stem first from the methodology that DCIG uses to gather and synthesize product data and then how it publishes its findings. DCIG applies five internal guidelines to best define and identify products from a DCIG Body of Research to include in each Buyer’s Guide Edition. Further, each DCIG Buyer’s Guide discloses why it may not cover certain products and provides guidance to readers on how to best use the Guide.
Each passing week seems to bring new use cases for solid state drives (SSDs) to the forefront and to call into question the viability of disk and tape for those use cases. This week was no exception. The announcement of NGD Systems’ 24TB Catalina SSD directly targets use cases such as active archive, where tape predominates but for which the 24TB Catalina SSD emerges as a potential replacement.
Last week HPE announced its acquisition of SimpliVity, a provider of enterprise hyper-converged infrastructure solutions. While that announcement certainly made news in the IT industry, the broader implications of this acquisition signaled that enterprise IT providers such as HPE could no longer sit on the sidelines and merely be content to partner with providers such as SimpliVity as hyper-converged solutions rapidly become a growing percentage of enterprise IT. If HPE wanted its fair share of this market, it was imperative that it act sooner rather than later to ensure it remained a leading player in this rapidly growing market.
Recently cloud backup and Disaster Recovery as a Service (DRaaS) have gone from niche markets into the mainstream, with companies of ever larger sizes bringing these two technologies in-house. Zetta is one such provider that has largely grown up with this market, having first started out as a cloud storage provider in 2008 before adding cloud backup and DRaaS offerings in recent years. Last week I had the opportunity to speak with its CEO Mike Grossman, who provided me with an update on Zetta and its technology offerings. Here are the key points that I took away from that conversation.
Approximately a month ago I posted a blog entry that examined which features constitute and separate Tier 1 providers from Tier 2 or lower providers in the marketplace. In that blog entry, I concluded that product features alone are insufficient to classify a provider as Tier 1. Rather, when one lays aside product features, four other characteristics emerge that a Tier 1 provider must possess, and which DCIG can objectively evaluate, that one may use to classify it as such.
In early November DCIG finalized its research into all-flash arrays and, in the coming weeks and months, will be announcing its rankings in its various Buyer’s Guide Editions as well as in its new All-flash Array Product Ranking Bulletins. It is as DCIG prepares to release its all-flash array rankings that we also find ourselves remarking on just how quickly interest in hard disk drive (HDD)-based arrays has declined this year alone. While we are not ready to declare HDDs dead by any stretch, finding any sparks of interest or innovation in HDDs is getting increasingly difficult.
The Buyer’s Guides that DCIG produces are some of the most referenced and relied upon documents in the technology industry for evaluating data protection and data storage products. However, as individuals and/or organizations evaluate these Buyer’s Guides, they need to remain circumspect in how they view and use the information contained in them. Specifically, as they use these Buyer’s Guides to create product short lists, they need to take steps to verify that the product features shown as supported in these Guides work in the manner that they need for their environment.
In almost every industry there is a tendency to use phrases such as Tier 1, Tier 2, and Tier 3 to describe providers, the products in a specific market, the quality of service provided, or some combination thereof. It is when one applies these three terms to the storage industry and attempts to properly classify storage providers into one of these tiers that the conversation becomes intriguing. After all, how does one define what constitutes and separates a Tier 1 storage provider from other providers in the market?
When most organizations look at backup appliances, they segregate them into one of two categories: those that function as integrated backup appliances (which include backup software) and those that function as target-based deduplicating backup appliances. Cohesity effectively blurs these lines by giving organizations the option to use its appliances to satisfy either or both of these use cases in their environment. In this fourth and final installment in my interview series with system architect Fidel Michieli, he describes how he leverages Cohesity’s backup software feature for VM protection and uses Cohesity as a deduplicating backup target for his NetBackup backup software.