Category Archives: Networked Storage

In the early 2000’s I was a big believer in appliance- and controller-based storage virtualization technology. To me, it seemed like the most logical choice for solving the most pressing problems confronting the deployment of storage networks in enterprise data centers, such as data migrations, storage optimization and the overall complexity of managing storage networks. Yet here we find ourselves in 2015 and, while appliance- and controller-based storage virtualization still exists, it never became the runaway success that many envisioned at the time. Here are my top 3 reasons as to what went wrong with this technology and why it has yet to fully realize its promise.
DCIG is preparing to release the DCIG 2015-16 Enterprise Midrange Array Buyer’s Guide. The Buyer’s Guide will include data on 33 arrays or array series from 16 storage providers. The term “Enterprise” in the name Enterprise Midrange Array reflects a class of storage system that has emerged offering key enterprise-class features at prices suitable for mid-sized budgets. The DCIG 2015-16 Enterprise Midrange Array Buyer’s Guide will provide organizations with a valuable tool to cut time and cost from the product research and purchase process.
It has been said that everyone knows what “normal” is but that it is often easier to define “abnormal” than it is to define “normal.” To a certain degree that axiom also applies to defining “high end storage arrays.” Everyone just seems to automatically assume that a certain set of storage arrays belongs in the “high end” category, but when push comes to shove, people are often hard-pressed to provide a working definition of what constitutes a high end storage array in today’s crowded storage space.
The use of data reduction technologies such as compression and deduplication to reduce storage costs is nothing new. Tape drives have used compression for decades to increase backup data densities on tape, while many modern deduplicating backup appliances use compression and deduplication to also reduce backup data stores. Even a select number of existing HDD-based storage arrays use data compression and deduplication to minimize data stores for large amounts of file data stored in archives or on network-attached file servers.
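To make the two techniques concrete, here is a minimal sketch of how deduplication and compression combine to shrink a data store. It is not any vendor's implementation; the fixed 4 KB chunk size, SHA-256 fingerprinting and zlib compression are illustrative assumptions (real appliances often use variable-length chunking and proprietary codecs).

```python
import hashlib
import zlib

def dedupe_and_compress(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks, keep a single copy of each
    unique chunk (deduplication), then compress those unique chunks
    (compression). Returns (original_size, reduced_size)."""
    store = {}
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:          # store each unique chunk only once
            store[digest] = zlib.compress(chunk)
    reduced = sum(len(c) for c in store.values())
    return len(data), reduced

# Highly repetitive data, like successive backup streams, reduces dramatically.
original, reduced = dedupe_and_compress(b"backup-block-" * 100_000)
print(original, reduced)
```

The key design point the sketch illustrates: deduplication removes redundancy *across* chunks while compression removes redundancy *within* each chunk, which is why backup targets that see many near-identical copies of the same data benefit from applying both.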
One of the more difficult tasks for anyone deeply involved in technology is seeing the forest for the trees. For those responsible for supporting the technical components that make up today’s enterprise infrastructures, stepping back to recommend which technologies are the right choices for their organization going forward is an even more difficult feat. While there is no one right answer that applies to all organizations, five (5) technologies, some new and some older technologies getting a refresh, merit that organizations prioritize them in the coming months and years.
Establishing a standard for how an organization uses proprietary and open source code is at best difficult for most organizations. But iXsystems has essentially bet its future on the continued use of open source code in its product line. This makes it imperative that iXsystems get this decision right in order to continue fostering support for its products in the open source community. This fifth entry in my interview series with iXsystems’ CTO Jordan Hubbard discusses his thoughts on iXsystems’ responsibility to the open source community for its contributions and how the company draws the line between proprietary and open source code.
In this second blog entry from our interview with Nimbus Data CEO and Founder Thomas Isakovich, we discuss microsecond latencies and how the recently announced Gemini X-series scale-out all-flash platform performs against the competition.
Providing high levels of capacity is only relevant if a storage array can also deliver high levels of performance. The number of CPU cores, the amount of DRAM and the size of the flash cache are the key hardware components that most heavily influence the performance of a hybrid storage array. In this second blog entry in my series examining the Oracle ZS3 Series storage arrays, I examine how its performance compares to that of other leading enterprise storage arrays using published performance benchmarks.
Recognized as an innovator in storage system technology, Thomas Isakovich sat down with DCIG to discuss the development, capabilities, and innovation in Nimbus Data’s latest release: the Gemini X. In this first blog entry, he guides us through the development of the X-series, and where he sees it fitting into the current market.
Anyone who managed IT infrastructures in the late 1990’s or early 2000’s probably still remembers how external storage arrays were largely a novelty reserved for high end enterprises with big data centers and deep pockets. Fast forward to today and a plethora of storage arrays exist in a variety of shapes and sizes at increasingly low price points. As such, it can be difficult to distinguish between them. To help organizations sort them out, my blog entry today provides a primer on the types of storage arrays currently available on the market.
The time for the release of the refreshed DCIG 2014 Enterprise Midrange Array Buyer’s Guide is rapidly approaching. As that date approaches, we have been evaluating and reviewing the data on the current crop of midrange arrays that will be included in the published Buyer’s Guide (information on over 50 models) as well as the models that will be included in DCIG’s online, cloud-based Interactive Buyer’s Guide (over 100 models). Here is a peek into some of what we are finding out about these models with regard to their ability to deliver on data center automation, VMware integration and flash memory support.
Last week’s acquisition of NexGen Storage by Fusion-io was greeted with quite a bit of fanfare by the storage industry. But as an individual who has covered Fusion-io for many years and talked one-on-one with its top executives on multiple occasions, I see its acquisition of NexGen as a signal that Fusion-io wanted to do more than deliver an external storage array with its technology built in. Rather, Fusion-io felt it was incumbent upon it to take action and accelerate the coming data center transformation that it has talked and written about for years.
In May 2010 DCIG released its first-ever Midrange Array Buyer’s Guide in which we covered 70+ models from over 20 vendors. Fast forward just three (3) short years and DCIG is on track to release not one, not two, not three, no, not even four Buyer’s Guides on enterprise midrange arrays but five distinct Buyer’s Guides on this topic! So what has changed in just three (3) short years that DCIG feels the need to produce so many? To understand this requires a closer look at the forces that are driving the evolution and revolution in enterprise midrange arrays.
DCIG is pleased to announce the availability of its inaugural DCIG 2013 Midrange Unified Storage Array Buyer’s Guide that weights, scores and ranks over 100 features on 30 different storage arrays from eight (8) different storage providers. This Buyer’s Guide provides the critical information that small and midsize enterprises particularly need regarding storage arrays that will serve a variety of purposes within their organizations. These purposes may include storing large amounts of unstructured data such as files and emails, hosting virtualized and high performance applications, and even serving as a target for archival and backup data stores.
In this final installment of our blog series on WhipTail Technologies, a Solid State Drive (SSD) array provider with some impressive features and capabilities, I am continuing my discussion with WhipTail Technologies Chief Technology Officer, James Candelaria. Last time, we looked at how WhipTail implements software RAID on its devices. Today, we will be discussing the different transport protocols supported by the WhipTail array and why the FCoE and iSCSI protocols trump InfiniBand in today’s SSD deployments.
Today is part 2 of an interview I recently did with James Candelaria, Chief Technology Officer of WhipTail Technologies, an emerging provider of SSD storage solutions. In my last entry, he and I discussed one major roadblock to widespread enterprise SSD adoption: the performance penalty incurred by garbage collection. This time, we’ll look at how WhipTail optimizes SSD performance while minimizing the deficiencies of MLC flash.
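For readers new to the garbage collection penalty mentioned above, the toy simulation below shows where it comes from. Flash pages cannot be overwritten in place, so every overwrite lands in a fresh page and leaves a stale copy behind; when free blocks run out, the drive must copy a block's still-live pages before erasing it, and those copies are extra physical writes (write amplification). This is a deliberately simplified model, not WhipTail's or any vendor's design: the block geometry, greedy victim selection and in-place compaction shortcut are all assumptions made for illustration.

```python
import random

PAGES_PER_BLOCK = 32
NUM_BLOCKS = 40                        # physical erase blocks
LOGICAL_PAGES = 32 * PAGES_PER_BLOCK   # host sees less than physical capacity
                                       # (the difference is overprovisioning)

class ToyFTL:
    """Toy flash translation layer used to measure write amplification."""

    def __init__(self):
        self.blocks = [[] for _ in range(NUM_BLOCKS)]  # page slots per block
        self.free = list(range(1, NUM_BLOCKS))
        self.active = 0                # block currently accepting writes
        self.where = {}                # logical page -> block with its live copy
        self.physical_writes = 0

    def _live(self, b):
        # Entries in a block whose mapping still points here are live;
        # the rest are stale copies left behind by later overwrites.
        return {l for l in self.blocks[b] if self.where.get(l) == b}

    def _collect(self):
        # Greedy GC: erase the full block with the fewest live pages,
        # re-copying its live pages (toy shortcut: compacted in place).
        victim = min((b for b in range(NUM_BLOCKS) if b != self.active),
                     key=lambda b: len(self._live(b)))
        live = self._live(victim)
        self.blocks[victim] = list(live)
        self.physical_writes += len(live)  # GC copies are the amplification
        self.free.append(victim)

    def host_write(self, lpn):
        if len(self.blocks[self.active]) == PAGES_PER_BLOCK:
            if not self.free:
                self._collect()
            self.active = self.free.pop()
        self.blocks[self.active].append(lpn)
        self.where[lpn] = self.active
        self.physical_writes += 1

random.seed(0)
ftl = ToyFTL()
N = 50_000
for _ in range(N):                     # sustained random overwrites
    ftl.host_write(random.randrange(LOGICAL_PAGES))
print(f"write amplification: {ftl.physical_writes / N:.2f}")
```

Under sustained random overwrites the simulated drive performs noticeably more physical writes than the host issued, which is exactly the background work that steals performance from foreground I/O on a real SSD.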
Today DCIG is very excited to announce the availability of its updated DCIG 2012 Midrange Array Buyer’s Guide that weights, scores and ranks over 90 features on more than 50 midrange arrays from 18 different storage providers. However, the reason that DCIG believes users will find this guide even more helpful and insightful than the prior DCIG 2010 Midrange Array Buyer’s Guide is that it takes an in-depth look into how well each midrange array integrates with VMware and supports its vStorage APIs.
If you are a regular follower of the DCIG blog site you may have noticed that there has been a noticeable lack of blogging activity on DCIG’s site this week. Unfortunately it is not because I have been taking a vacation, fishing or merely lounging by the lake. Rather I have been locked away in my office completing the background research associated with the upcoming release of the DCIG 2012 Midrange Array Buyer’s Guide due out in the 4th quarter of 2011. Out of that some interesting early observations have emerged.
Today DCIG, LLC, and Foskett Services, LLC, are pleased to jointly announce the availability of an Expanded Edition of the DCIG 2011 Small Enterprise Storage Array Buyer’s Guide that weights, scores and ranks over 35 small enterprise storage array models priced from $5,000 to $30,000 from 19 different vendors.
About a year ago DCIG decided to do something completely different in the analyst space: a side-by-side independent comparison of products in a particular market segment in the form of a Buyer’s Guide. The end result of that was the DCIG 2010 Midrange Array Buyer’s Guide. But believe it or not, a year has already passed since that was produced and it is now time to update and refresh that Buyer’s Guide for a number of reasons.