Last week HPE announced its acquisition of SimpliVity, a provider of enterprise hyper-converged infrastructure solutions. While that announcement certainly made news in the IT industry, its broader implication was that enterprise IT providers such as HPE can no longer sit on the sidelines, content merely to partner with providers such as SimpliVity, as hyper-converged solutions account for a rapidly growing share of enterprise IT. If HPE wanted its fair share of this market, it had to act sooner rather than later to remain a leading player in it.
Cloud backup and Disaster Recovery as a Service (DRaaS) have recently gone from niche markets to the mainstream, with ever larger companies bringing these two technologies in-house. Zetta is one provider that has largely grown up with this market, having started out as a cloud storage provider in 2008 before adding cloud backup and DRaaS offerings in recent years. Last week I had the opportunity to speak with its CEO, Mike Grossman, who gave me an update on Zetta and its technology offerings. Here are the key points I took away from that conversation.
Approximately a month ago I posted a blog entry that examined which features constitute and separate Tier 1 providers from Tier 2 or lower providers in the marketplace. In that blog entry, I concluded that product features alone are insufficient to classify a provider as Tier 1. Only when one lays product features aside do four other characteristics emerge – characteristics a provider must possess, and which DCIG can objectively evaluate – that one may use to classify it as Tier 1.
In early November DCIG finalized its research into all-flash arrays and, in the coming weeks and months, will announce its rankings in its various Buyer’s Guide Editions as well as in its new All-flash Array Product Ranking Bulletins. As DCIG prepares to release its all-flash array rankings, we also find ourselves remarking on just how quickly interest in HDD-based arrays has declined this year alone. While we are not ready to declare HDDs dead by any stretch, finding any sparks of interest or innovation in hard disk drives (HDDs) is getting increasingly difficult.
The Buyer’s Guides that DCIG produces are some of the most referenced and relied upon documents in the technology industry for evaluating data protection and data storage products. However, as individuals and organizations evaluate these Buyer’s Guides, they need to remain circumspect in how they view and use the information contained in them. Specifically, as they use these Buyer’s Guides to create product short lists, they need to take steps to verify that the product features shown as supported in these Guides work in the manner that they need for their environment.
In almost every industry there is a tendency to use phrases such as Tier 1, Tier 2, and Tier 3 to describe providers, the products in a specific market, the quality of service provided, or some combination thereof. It is when one applies these three terms to the storage industry and attempts to properly classify storage providers into these tiers that the conversation becomes intriguing. After all, what constitutes and separates a Tier 1 storage provider from the other providers in the market?
When most organizations look at backup appliances, they have to segregate them into one of two categories: those that function as integrated backup appliances (which include backup software) and those that function as target-based deduplicating backup appliances. Cohesity effectively blurs these lines by giving organizations the option to use its appliances to satisfy either or both of these use cases in their environment. In this fourth and final installment in my interview series with system architect Fidel Michieli, he describes how he leverages Cohesity’s backup software for VM protection and its appliances as a deduplicating backup target for his NetBackup backup software.
DCIG is pleased to announce the availability of the DCIG 2016-17 High End Storage Array Buyer’s Guide developed from its enterprise storage array body of research. The DCIG 2016-17 High End Storage Array Buyer’s Guide weights, scores and ranks more than 100 features of fifteen (15) products from seven (7) different storage vendors. Using ranking categories of Best-in-Class, Recommended and Excellent, this Buyer’s Guide offers much of the information an organization needs to make a highly informed decision as to which high end storage array will suit its needs.
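To illustrate the general idea behind this kind of weighted feature-scoring methodology, a minimal sketch follows. Note that DCIG’s actual weights, formulas, and ranking cutoffs are not public; the feature names, weights, and thresholds below are hypothetical and for illustration only.

```python
# Minimal sketch of weighted feature scoring with ranking categories.
# All feature names, weights, and cutoffs are hypothetical assumptions,
# not DCIG's actual methodology.

FEATURE_WEIGHTS = {       # hypothetical per-feature weights
    "replication": 5,
    "qos": 4,
    "snapshots": 3,
    "nvme_support": 2,
}

RANKING_THRESHOLDS = [    # hypothetical cutoffs, highest first
    (12, "Best-in-Class"),
    (8, "Recommended"),
    (0, "Excellent"),
]

def score_product(supported_features):
    """Sum the weights of the features a product supports."""
    return sum(FEATURE_WEIGHTS[f] for f in supported_features
               if f in FEATURE_WEIGHTS)

def rank_product(score):
    """Map a numeric score onto a ranking category."""
    for cutoff, category in RANKING_THRESHOLDS:
        if score >= cutoff:
            return category
    return "Unranked"

if __name__ == "__main__":
    features = ["replication", "qos", "snapshots"]
    total = score_product(features)          # 5 + 4 + 3 = 12
    print(total, rank_product(total))        # 12 Best-in-Class
```

In practice, a product’s placement depends on where its weighted total falls relative to the cutoffs, which is why two products supporting different feature sets can still land in the same ranking category.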
Enterprises now demand higher levels of automation, integration, simplicity, and scalability from every component deployed into their IT infrastructure, and the integrated backup appliances found in DCIG’s forthcoming Buyer’s Guide Editions are a clear outgrowth of those expectations. Intended for organizations that want to protect applications and data while keeping them behind corporate firewalls, these backup appliances come fully equipped, from both hardware and software perspectives, to do so.
Usually when I talk to backup and system administrators, they willingly talk about how great a product installation was. But it then becomes almost impossible to find anyone who wants to comment on what life is like after their backup appliance is installed. This blog entry represents a bit of an anomaly in that someone willingly pulled back the curtain on his experience after the appliance was installed. In this third installment in my interview series with system architect Fidel Michieli, he describes how the Cohesity implementation went in his environment and how Cohesity responded to issues that arose.
Evaluating product features, comparing prices, and doing proofs of concept are important steps in the process of adopting almost any new product. But once one completes those steps, the time arrives to roll the product out and implement it. In this second installment of my interview series with system architect Fidel Michieli, he shares how his company gained a comfort level with Cohesity for backup and disaster recovery (DR) and how broadly it decided to deploy the product in its primary and secondary data centers.
Every year at VMworld I have conversations that broaden my understanding of, and appreciation for, new products on the market. This year was no exception, as I had the opportunity to talk at length with Fidel Michieli, a system architect at a SaaS provider, who shared his experiences with backup and recovery and how he came to choose Cohesity. In this first installment in my interview series with Fidel, he shares the challenges that his company was facing with its existing backup configuration as well as the struggles he had in identifying a backup solution that scaled to meet his dynamically changing and growing environment.
Anyone who has ever had to make a product choice that involves tens or hundreds of thousands of dollars knows that one of the more challenging aspects at the conclusion of the process is separating product fact from fiction. Often, the closer an organization gets to finalizing its buying decision, the more aggressive the competing vendors become in spreading fear, uncertainty, and doubt (FUD) to discredit the products and/or services of the other vendors. Using DCIG’s Competitive Research services, organizations may gain access to the critical data they need to help separate myth from reality and reach a proper conclusion.
This year’s Veritas Vision 2016 conference held a lot of intrigue for me. The show itself was not new. The Vision show has been an ongoing event for years, though this was the first time in more than a decade that Veritas was free to set its own agenda for the entire show. Rather, the intrigue was in what direction it would take going forward. Veritas answered that by communicating that it plans to align its product portfolio and strategy to deliver on an objective that has eluded enterprise organizations and vendors alike for at least two decades: enterprise data management.
To help organizations evaluate available enterprise storage arrays and make informed decisions about the most appropriate array for their needs, DCIG is pleased to announce the availability of its body of research into enterprise storage arrays. This research, presented and made available through the DCIG Analysis Portal, directly addresses the challenge that organizations routinely encounter when buying storage arrays.
Anyone in attendance at VMworld last week in Las Vegas and walking through the exhibit hall where all of the vendors showcased their wares could hardly miss the vast number of hyper-converged infrastructure and hyper-converged-like vendors in attendance: Cisco, Dell, EMC, HPE, Nutanix, Maxta, Cohesity, Pivot3, Rubrik, SimpliVity, and Datrium, just to name a few, and I am sure there were others. Yet what caught my attention in speaking to their representatives and some of their users is how storage remains a factor in the architecture of hyper-converged infrastructure (HCI) solutions.
Hyperconverged infrastructure solutions stand poised to disrupt traditional IT architectures in every way possible. Combining compute, data protection, networking, memory, scale out, storage, and virtualization on a single platform, they deliver the benefits of traditional IT infrastructures without their associated complexities. But as organizations look to consolidate on hyperconverged infrastructure solutions, they need data protection services such as the Quality of Service (QoS) feature now found on Pivot3’s vSTAC SLX Hyperconverged product, which enables organizations to better protect their applications.
Each Buyer’s Guide Edition released by DCIG generates a tremendous amount of interest in the technology industry as a whole. However, as people consider each Buyer’s Guide Edition that DCIG puts out, many assume that DCIG seeks to make each Guide Edition “all-inclusive.” This assumption is mistaken, and it leads some who claim or position themselves as “in the know” to jump to misleading and erroneous conclusions.
Whether companies like it or not, over the last few decades individuals within their organizations have adopted the technologies they need to do their jobs more effectively. One such adoption has been the use of public file sync-n-share technologies that put data – and the control of it – outside the purview of corporate IT. In this third and final installment in my interview series with Nexsan’s CEO Robert Fernander, he explains how Nexsan’s UNITY empowers organizations to bring this corner of shadow IT back under corporate control.
Yesterday I broke away from my normal routine of analyzing enterprise data protection and data storage technologies to take a closer look at enterprise security. To do so, I stopped by the Omaha Tech Security Conference held at the local Hilton Omaha conference center and visited some of the vendors’ booths to learn more about their respective technologies. From my conversations with a number of security providers, it quickly became evident that they recognize the need to introduce Big Data analytics into their products to convert the data, events, and incidents they record and log into meaningful analysis that organizations can consume and act upon.