Early in my IT career, a friend who owns a software company told me he had been informed by a peer that he wasn’t charging enough for his software. This peer advised him to adopt a “flinch-based” approach to pricing. He said my friend should start with a base licensing cost that meets margin requirements, and then keep adding on other costs until the prospective customer flinches. My friend found that approach offensive, and so do I.
Many organizations are using all-flash arrays in their data centers today. When asked about the benefits they have achieved, two benefits are almost always top of mind. The first benefit mentioned is the increase in application performance. Indeed, increased performance was the primary rationale for the purchase of the all-flash array. The second benefit came as an unexpected bonus: the decrease in time spent managing storage. As organizations consolidate many applications on each all-flash array, they are discovering that data tiering and quality of service features are important for preserving these benefits.
When one examines enterprise data protection and data storage products through the lens of hyper-converged infrastructure (HCI) designs, one might think each product either supports an HCI architecture or it does not. But as one begins to see when scrutinizing this topic, the answer is not a simple “Yes” or “No”. To assess how well, or even whether, a product fits into an HCI design, one first needs to think about the question, or even the series of questions, that he or she should ask to properly make this assessment.
Every organization, consciously or unconsciously, views and evaluates new technologies through a lens. In recent years, organizations have largely evaluated new data center technologies in the context of virtualization and how easily each technology enabled them to achieve that end. That viewpoint has begun to change. Having largely virtualized their infrastructure, they increasingly view and evaluate new and existing data center technologies through the lens of hyper-converged infrastructure and how well these technologies support and enable the adoption of hyper-converged infrastructure platforms in their environment.
As human beings, we have a proclivity to believe that whatever we experience, as individuals or as a society, we are the first to go through an event like it. By way of example, corporate IT is undergoing a transformation as enterprises change IT to better align it with the broader business. Ironically, this IT reformation takes place on the 500th anniversary of a better-known Reformation that occurred in 1517.
DCIG Pocket Analyst Report Compares Dell EMC Data Domain and ExaGrid Product Families
Technology conversations within enterprises increasingly focus on the “data center stack” with an emphasis on cloud enablement. While I agree with this shift in thinking, one can too easily overlook the merits of underlying individual technologies when only considering the “Big Picture”. Such is happening with deduplication technology. Deduplication is a key enabler of enterprise archiving, data protection, and disaster recovery solutions, and vendors such as Dell EMC and ExaGrid deliver it in different ways that, as DCIG’s most recent 4-page Pocket Analyst Report reveals, make each product family better suited for specific use cases.
2017 might well be the year that backup and recovery went from being viewed as a corporate insurance policy for data to a key business enabler, and for good reasons. Natural disasters and ransomware attacks have heightened the need for fast, reliable backups and recoveries, while new backup product architectures are changing the conversation around what services cloud data protection appliances can deliver. As organizations go to select one of the current generation of cloud data protection appliances, here are four considerations they should keep in mind.
Comtrade Software’s release of HYCU backup software a few months ago validated that Nutanix’s impact on the enterprise data center was real as HYCU specifically targets Nutanix environments. But one release does not a product make. That’s what makes Comtrade’s follow-on announcement notable. HYCU’s expanded support for more applications and cloud solutions, addition of new encryption features, and tweaks to more quickly complete backups reflect Comtrade’s commitment to diving deeper into the protection of Nutanix AOS environments.
DCIG is pleased to announce the availability of the DCIG 2017-18 Cloud Data Protection Appliance Buyer’s Guide, developed from the backup appliance body of research. The DCIG 2017-18 Cloud Data Protection Appliance Buyer’s Guide weights, scores and ranks more than 100 features of twenty-two (22) products from six (6) vendors. Using ranking categories of Recommended and Excellent, this Buyer’s Guide offers much of the information an organization should need to make a highly informed decision as to which cloud data protection appliance will best suit its needs.
Next-generation all-flash arrays will provide dramatic improvements in performance and density over the prior generation of all-flash arrays. These new levels of performance and density will bring the benefits of real-time analysis to a whole new set of problems and organizations, creating tremendous value. They will also enable organizations to achieve significant budget savings through a fresh wave of data center consolidations. But unlocking the ability of any next-generation array to deliver these savings depends on a key set of features that enable workload consolidation and simplified management.
The phrase “Cloud Data Protection Appliance” is included in the name of DCIG’s forthcoming Buyer’s Guide, but the end game of each appliance covered in that Guide is squarely recovery. While successful recoveries have theoretically always been the objective of backup appliances, vendors too often only paid lip service to that ideal, as most of their new product features centered on providing better means for doing backups. Recent technology advancements have flipped this premise on its head.
The business case for organizations with petabytes of file data under management to classify that data and then place it across multiple tiers of storage has never been greater. By distributing this data across disk, flash, tape and the cloud, they stand to realize significant cost savings. The catch is finding a cost-effective solution that makes it easier to administer and manage file data than simply storing it all on flash storage. This is where a solution such as what Quantum now offers comes into play.
The annual Flash Memory Summit is where vendors reveal to the world the future of storage technology. Many companies announced innovative products and technical advances at last week’s 2017 Flash Memory Summit that give enterprises a good understanding of what to expect from today’s all-flash products as well as a glimpse into tomorrow’s products. These previews into the next generation of flash products revealed four flash memory trends sure to influence the development of the next generation of all-flash arrays.
When one looks at today’s lineup of software products that one would classify as cloud data protection, one might assume that every such product natively offers source-side deduplication. That assumption would be wrong. Software such as HPE Data Protector does not natively offer source-side deduplication, but its reasons for opting out make sense once one takes a deeper look at the product.
The DCIG 2017-18 Hyperconverged Infrastructure Appliance Buyer’s Guide weights, scores and ranks more than 100 features of twenty-four (24) products from five (5) vendors that achieved rankings of Recommended or Excellent. Using those ranking categories, this Buyer’s Guide offers much of the information an organization should need to make a highly informed decision as to which hyperconverged appliance will best suit its needs.
The DCIG 2017-18 Small/Midsize Enterprise All-flash Array Buyer’s Guide weights, scores and ranks more than 100 features of twenty-four (24) small/midsize enterprise-class all-flash arrays that achieved rankings of Recommended or Excellent. These products come from eleven (11) vendors including Dell EMC, Fujitsu, iXsystems, Kaminario, NEC, NetApp, Nimble Storage, Pivot3, Pure Storage, Tegile and Tintri. This Buyer’s Guide offers much of the information an organization should need to make a highly informed decision as to which all-flash storage array will best suit its needs.
Detect. Protect. Recover. I often see those three words when someone discusses the best methods for companies to deal with the scourge of ransomware. But stringing three words together in a marketing slogan does not a solution make. While understanding the steps needed to protect oneself against ransomware is certainly a requirement, knowing what features backup software should possess and which products possess those features is equally important.
The success and popularity of the DCIG Buyer’s Guides stem first from the methodology that DCIG uses to gather and synthesize product data and then how it publishes its findings. DCIG applies five internal guidelines to best define and identify products from a DCIG Body of Research to include in each Buyer’s Guide Edition. Further, each DCIG Buyer’s Guide discloses why it may not cover certain products and provides guidance to readers on how to best use the Guide.
Recently, cloud backup and Disaster Recovery as a Service (DRaaS) have gone from niche markets into the mainstream, with ever-larger companies bringing these two technologies in-house. Zetta is one provider that has largely grown up with this market, having first started out as a cloud storage provider in 2008 before adding cloud backup and DRaaS offerings in recent years. Last week I had the opportunity to speak with its CEO, Mike Grossman, who provided me with an update on Zetta and its technology offerings. Here are the key points that I took away from that conversation.
Approximately a month ago I posted a blog entry that examined what features constitute and separate Tier 1 providers from Tier 2 or lower providers in the marketplace. In that blog entry, I concluded that product features alone are insufficient to classify a provider as Tier 1. Only when one lays aside product features do four other characteristics emerge, characteristics that a provider must possess and that DCIG can objectively evaluate, which one may use to classify a provider as Tier 1.