Both Hitachi Vantara and NetApp refreshed their respective F-Series and A-Series lines of all-flash arrays (AFAs) in the first half of 2018. While some of these changes reinforced the respective strengths of each of their product lines, other changes provided some key insights into how these two vendors see the AFA market shaping up in the years to come. Features such as host-to-storage networking connectivity, predictive analytics, support for public clouds, and data protection and flash performance optimization are key areas where these two products differentiate themselves.
Mainstream enterprise storage vendors are embracing NVMe. HPE, NetApp, Pure Storage, Dell EMC, Kaminario and Tegile all offer all-NVMe arrays. According to these vendors, the products will soon support storage class memory as well. NVMe protocol access to flash memory SSDs is a big deal. Support for storage class memory may become an even bigger deal.
Businesses are finally adopting the public cloud, in large part because a large and rapidly growing catalog of services is now available from multiple cloud providers. This shift has many implications for businesses. This article addresses four of those implications plus several cloud-specific risks.
DCIG is pleased to announce the availability of the DCIG 2018-19 All-flash Array Buyer’s Guide edition developed from its enterprise storage array body of research. This 64-page report presents a fresh snapshot of the dynamic all-flash array (AFA) marketplace. It evaluates and ranks thirty-two (32) enterprise-class all-flash arrays that achieved rankings of Recommended or Excellent based on a comprehensive scoring of product features. These products come from seven (7) vendors: Dell EMC, Hitachi Vantara, HPE, Huawei, NetApp, Pure Storage and Tegile.
Much has changed since DCIG published the DCIG 2017-18 All-Flash Array Buyer’s Guide just one year ago. The DCIG analyst team is in the final stages of preparing a fresh snapshot of the all-flash array (AFA) marketplace. As we reflected on the fresh all-flash array data and compared it to the data we collected just a year ago, we observed seven significant trends in the all-flash array marketplace that will influence buying decisions through 2019.
Enterprise storage startups are pushing the storage industry forward faster and in directions it may never have gone without them. It is because of these startups that flash memory is now the preferred place to store critical enterprise data. Startups also advanced the customer-friendly all-inclusive approach to software licensing, evergreen hardware refreshes, and pay-as-you-grow utility pricing. These startup-inspired changes delight customers, who are rewarding these startups with large follow-on purchases and Net Promoter Scores (NPS) previously unseen in this industry. Yet the greatest contribution startups may make to the enterprise storage industry is applying predictive analytics to storage.
Early in my IT career, a friend who owns a software company told me he had been informed by a peer that he wasn’t charging enough for his software. This peer advised him to adopt a “flinch-based” approach to pricing. He said my friend should start with a base licensing cost that meets margin requirements, and then keep adding on other costs until the prospective customer flinches. My friend found that approach offensive, and so do I.
Many organizations are using all-flash arrays in their data centers today. When asked about the benefits they have achieved, two benefits are almost always top of mind. The first benefit mentioned is the increase in application performance. Indeed, increased performance was the primary rationale for the purchase of the all-flash array. The second benefit came as an unexpected bonus: the decrease in time spent managing storage. As organizations consolidate many applications onto each all-flash array, they are discovering that data tiering and quality of service features are important for preserving these benefits.
When one examines enterprise data protection and data storage products through the lens of hyper-converged infrastructure (HCI) designs, one would think each product either supports an HCI architecture or it does not. But as one begins to see when one scrutinizes this topic, the answer is not a simple “Yes” or “No”. To assess how well, or even whether, a product fits into an HCI design, one first needs to think about the question, or even the series of questions, that he or she should ask to properly make this assessment.
Every organization, consciously or unconsciously, views and evaluates new technologies through a lens. In recent years, organizations have largely evaluated new data center technologies in the context of virtualization and how easily each technology enabled them to achieve that end. That viewpoint has begun to change. Having largely virtualized their infrastructure, they increasingly view and evaluate new and existing data center technologies through the lens of hyper-converged infrastructure and how well these technologies support and enable the adoption of hyper-converged infrastructure platforms in their environment.
As human beings, we have a proclivity to believe that whatever we experience, as individuals or as a society, we are the first to go through an event like it. By way of example, corporate IT is undergoing a transformation as enterprises change IT to better align it with the broader business. Ironically, this IT reformation is taking place on the 500th anniversary of a better-known Reformation that began in 1517.
DCIG Pocket Analyst Report Compares Dell EMC Data Domain and ExaGrid Product Families
Technology conversations within enterprises increasingly focus on the “data center stack” with an emphasis on cloud enablement. While I agree with this shift in thinking, one can too easily overlook the merits of underlying individual technologies when only considering the “Big Picture”. Such is happening with deduplication technology. Deduplication is a key enabler of enterprise archiving, data protection, and disaster recovery solutions. As DCIG’s most recent 4-page Pocket Analyst Report reveals, vendors such as Dell EMC and ExaGrid deliver deduplication technology in different ways that make each product family better suited for specific use cases.
2017 might well be the year that backup and recovery went from being viewed as a corporate insurance policy for data to a key business enabler, and for good reason. Natural disasters and ransomware attacks have heightened the need for fast, reliable backups and recoveries, while new backup product architectures are changing the conversation around what services cloud data protection appliances can deliver. As organizations go to select one of the current generation of cloud data protection appliances, here are four considerations they should keep in mind.
Comtrade Software’s release of HYCU backup software a few months ago validated that Nutanix’s impact on the enterprise data center was real as HYCU specifically targets Nutanix environments. But one release does not a product make. That’s what makes Comtrade’s follow-on announcement notable. HYCU’s expanded support for more applications and cloud solutions, addition of new encryption features, and tweaks to more quickly complete backups reflect Comtrade’s commitment to diving deeper into the protection of Nutanix AOS environments.
DCIG is pleased to announce the availability of the DCIG 2017-18 Cloud Data Protection Appliance Buyer’s Guide developed from the backup appliance body of research. The DCIG 2017-18 Cloud Data Protection Appliance Buyer’s Guide weights, scores and ranks more than 100 features of twenty-two (22) products from six (6) vendors. Using ranking categories of Recommended and Excellent, this Buyer’s Guide offers much of the information an organization should need to make a highly informed decision as to which cloud data protection appliance will suit its needs.
Next-generation all-flash arrays will provide dramatic improvements in performance and density over the prior generation of all-flash arrays. These new levels of performance and density will bring the benefits of real-time analysis to a whole new set of problems and organizations, creating tremendous value. They will also enable organizations to achieve significant budget savings through a fresh wave of data center consolidations. But unlocking the ability of any next-generation array to deliver these savings depends on a key set of features that enable workload consolidation and simplified management.
The phrase “Cloud Data Protection Appliance” is included in the name of DCIG’s forthcoming Buyer’s Guide, but the focus of each appliance covered in that Guide is squarely on recovery. While successful recoveries have theoretically always been the objective of backup appliances, vendors too often only paid lip service to that ideal, as most of their new product features centered on providing better means for doing backups. Recent technology advancements have flipped this premise on its head.
The business case for organizations with petabytes of file data under management to classify that data and then place it across multiple tiers of storage has never been greater. By distributing this data across disk, flash, tape and the cloud, they stand to realize significant cost savings. The catch is finding a cost-effective solution that makes it easier to administer and manage file data than simply storing it all on flash storage. This is where a solution such as what Quantum now offers comes into play.
The annual Flash Memory Summit is where vendors reveal to the world the future of storage technology. Many companies announced innovative products and technical advances at last week’s 2017 Flash Memory Summit that give enterprises a good understanding of what to expect from today’s all-flash products as well as a glimpse into tomorrow’s. These previews into the next generation of flash products revealed four flash memory trends sure to influence the development of the next generation of all-flash arrays.
When one looks at today’s lineup of software products that one would classify as cloud data protection, one might assume that every such product natively offers source-side deduplication. That assumption would be wrong. Software such as HPE Data Protector does not natively offer source-side deduplication, but its reasons for opting out make sense once one takes a deeper look at the product.
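To make the concept concrete: source-side deduplication means the backup client fingerprints data chunks locally and transmits only chunks the backup target has not already stored. The sketch below is a minimal, hypothetical illustration of that principle (fixed-size chunks, SHA-256 fingerprints, a set standing in for the server’s chunk index); it is not how HPE Data Protector or any specific product implements it.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking; real products often use variable-size chunks


def backup(data: bytes, server_index: set) -> tuple:
    """Simulate a source-side deduplicating backup.

    Hashes each chunk on the client and "transfers" only chunks whose
    fingerprint is not already present in the server's index.
    Returns (bytes_sent, bytes_total).
    """
    sent = 0
    total = len(data)
    for offset in range(0, total, CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        fingerprint = hashlib.sha256(chunk).hexdigest()
        if fingerprint not in server_index:
            server_index.add(fingerprint)
            sent += len(chunk)  # only previously unseen chunks cross the network
    return sent, total


index = set()
# 8 KiB of zeros = two identical 4 KiB chunks; only one is unique
first_sent, first_total = backup(bytes(8192), index)
# a repeat backup of the same data transfers nothing new
second_sent, _ = backup(bytes(8192), index)
```

The payoff this sketch illustrates is bandwidth: a repeat backup of unchanged data sends no chunk payload at all, which is precisely why source-side deduplication matters for backups over WAN links to a cloud target.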