Today’s backup mantra seems to be backup to the cloud or bust! But backing up to the cloud involves more than redirecting backup streams from a local file share to one presented by a cloud storage provider and clicking the “Start” button. Organizations must examine which cloud storage providers can receive their data as well as how their backup software packages and sends that data to the cloud. BackupAssist 10.0 answers many of the tough questions about cloud data protection that businesses face while providing them welcome flexibility in their choice of cloud storage providers.
If you assume that leading enterprise midrange all-flash arrays (AFAs) support deduplication, you would be correct. But if you assume that these arrays implement and deliver deduplication’s features in the same way, you would be mistaken. These differences should influence any all-flash array buying decision, as deduplication’s implementation affects the array’s total effective capacity, performance, usability, and, ultimately, your bottom line.
Detect. Protect. Recover. I often see those three words when someone discusses the best methods for companies to deal with the scourge of ransomware. But stringing three words together in a marketing slogan does not a solution make. While understanding the steps needed to protect oneself against ransomware is certainly a requirement, knowing which features backup software should possess, and which products possess those features, is equally important.
A few years ago, when all-flash arrays (AFAs) were still gaining momentum, newcomers like Nimbus Data appeared poised to take the storage world by storm. But as the big boys of storage (Dell, HDS, and HPE, among others) entered the AFA market, Nimbus opted to retrench and rethink the value proposition of its all-flash arrays. Its latest AFA line, the ExaFlash D-Series, is one outcome of that repositioning, as these arrays answer the call of today’s hosting providers. They deliver the high levels of availability, flexibility, performance, and storage density that these providers seek, backed by one of the lowest cost-per-GB price points in the market.
The success and popularity of the DCIG Buyer’s Guides stem first from the methodology that DCIG uses to gather and synthesize product data and then from how it publishes its findings. DCIG applies five internal guidelines to best define and identify products from a DCIG Body of Research to include in each Buyer’s Guide Edition. Further, each DCIG Buyer’s Guide discloses why it may not cover certain products and provides guidance to readers on how to best use the Guide.
Each passing week seems to bring new use cases for solid state drives (SSDs) to the forefront and to call into question the viability of disk and tape for those workloads. This week was no exception. The announcement of NGD Systems’ 24TB Catalina SSD directly targets use cases such as active archive, where tape still predominates but for which the 24TB Catalina SSD emerges as a potential replacement.
Last week HPE announced its acquisition of SimpliVity, a provider of enterprise hyper-converged infrastructure solutions. While that announcement certainly made news in the IT industry, its broader implications signaled that enterprise IT providers such as HPE could no longer sit on the sidelines and merely be content to partner with providers such as SimpliVity as hyper-converged solutions become a growing percentage of enterprise IT. If HPE wanted its fair share of this market, it had to act sooner rather than later to remain a leading player in this rapidly growing space.
Recently, cloud backup and Disaster Recovery as a Service (DRaaS) have gone from niche markets into the mainstream, with companies of ever larger sizes bringing these two technologies in-house. Zetta is one provider that has largely grown up with this market, having started out as a cloud storage provider in 2008 before adding cloud backup and DRaaS offerings in recent years. Last week I had the opportunity to speak with its CEO, Mike Grossman, who provided me with an update on Zetta and its technology offerings. Here are the key points that I took away from that conversation.
Approximately a month ago I posted a blog entry that examined what features constitute and separate Tier 1 providers from Tier 2 or lower providers in the marketplace. In that blog entry, I concluded that product features alone are insufficient to classify a provider as Tier 1. Only when one lays aside product features do four other characteristics emerge – characteristics that a provider must possess, and that DCIG can objectively evaluate – which one may use to classify a provider as Tier 1.
In early November DCIG finalized its research into all-flash arrays and, in the coming weeks and months, will be announcing its rankings in its various Buyer’s Guide Editions as well as in its new All-flash Array Product Ranking Bulletins. It is as DCIG prepares to release its all-flash array rankings that we also find ourselves remarking on just how quickly interest in HDD-based arrays has declined this year alone. While we are not ready to declare HDDs dead by any stretch, finding any sparks of interest or innovation in hard disk drives (HDDs) is getting increasingly difficult.
The Buyer’s Guides that DCIG produces are some of the most referenced and relied upon documents in the technology industry for evaluating data protection and data storage products. However, as individuals and organizations evaluate these Buyer’s Guides, they need to remain circumspect in how they view and use the information contained in them. Specifically, as they use these Buyer’s Guides to create product short lists, they need to take steps to verify that the product features shown as supported in these Guides work in the manner that they need for their environment.
In almost every industry there is a tendency to use phrases such as Tier 1, Tier 2, and Tier 3 to describe providers, the products in a specific market, the quality of service provided, or some combination thereof. It is when one applies these three terms to the storage industry and attempts to properly classify storage providers into one of these tiers that the conversation becomes intriguing. After all, how does one define what constitutes and separates a Tier 1 storage provider from other providers in the market?
When most organizations look at backup appliances, they segregate them into one of two categories: those that function as integrated backup appliances (which include backup software) and those that function as target-based deduplicating backup appliances. Cohesity effectively blurs these lines by giving organizations the option to use its appliances to satisfy either or both of these use cases in their environment. In this fourth and final installment of my interview series, system architect Fidel Michieli describes how he leverages Cohesity’s backup software for VM protection and its appliances as a deduplicating backup target for his NetBackup backup software.
DCIG is pleased to announce the availability of the DCIG 2016-17 High End Storage Array Buyer’s Guide, developed from the enterprise storage array body of research. The DCIG 2016-17 High End Storage Array Buyer’s Guide weights, scores, and ranks more than 100 features of fifteen (15) products from seven (7) different storage vendors. Using the ranking categories of Best-in-Class, Recommended, and Excellent, this Buyer’s Guide offers much of the information an organization needs to make a highly informed decision as to which high end storage array will best suit its needs.
Enterprises now demand higher levels of automation, integration, simplicity, and scalability from every component deployed into their IT infrastructure, and the integrated backup appliances found in DCIG’s forthcoming Buyer’s Guide Editions are a clear output of those expectations. Intended for organizations that want to protect applications and data and then keep them behind corporate firewalls, these backup appliances come fully equipped from both hardware and software perspectives to do so.
Usually when I talk to backup and system administrators, they willingly talk about how great a product installation was. But it then becomes almost impossible to find anyone who wants to comment on what life is like after their backup appliance is installed. This blog entry represents a bit of an anomaly in that someone willingly pulled back the curtain on that experience. In this third installment of my interview series, system architect Fidel Michieli describes how the implementation of Cohesity went in his environment and how Cohesity responded to issues that arose.
Evaluating product features, comparing prices, and conducting proofs of concept are important steps in the process of adopting almost any new product. But once one completes those steps, the time arrives to roll the product out and implement it. In this second installment of my interview series, system architect Fidel Michieli shares how his company gained a comfort level with Cohesity for backup and disaster recovery (DR) and how broadly it decided to deploy the product in its primary and secondary data centers.
Every year at VMworld I have conversations that broaden my understanding and appreciation of new products on the market. This year was no exception, as I had the opportunity to talk at length with Fidel Michieli, a system architect at a SaaS provider, who shared his experiences with backup and recovery and how he came to choose Cohesity. In this first installment of my interview series, Fidel shares the challenges that his company was facing with its existing backup configuration as well as his struggles in identifying a backup solution that scaled to meet his dynamically changing and growing environment.
Anyone who has ever had to make a product choice involving tens or hundreds of thousands of dollars knows that one of the more challenging aspects at the conclusion of the process is separating product fact from fiction. Often, the closer an organization gets to finalizing its buying decision, the more aggressive the competing vendors become in spreading fear, uncertainty, and doubt (FUD) to discredit the products and/or services of the other vendors. Using DCIG’s Competitive Research services, organizations may gain access to the critical data that they need to help separate myth from reality and reach a proper conclusion.
This year’s Veritas Vision 2016 conference held a lot of intrigue for me. The show itself was not new. Vision has been an ongoing event for years, though this was the first time in more than a decade that Veritas was free to set its own agenda for the entire show. Rather, the intrigue was in what direction Veritas would take going forward. It answered that question by communicating that it plans to align its product portfolio and strategy to deliver on an objective that has eluded enterprise organizations and vendors alike for at least two decades: enterprise data management.