DCIG creates Buyer’s Guides in order to help end users jumpstart the research and decision-making process. We do this by providing an informed, third-party evaluation of products that scores their features from an end-user viewpoint. DCIG’s product ranking dashboards and standardized one-page data sheets enable “at-a-glance” comparisons that help organizations quickly arrive at a short list of products that may warrant a closer look.
An original version of a recent article on The Register site reported on the release of the DCIG 2014-15 Midrange Unified Storage Array Buyer’s Guide with the observation, “Tech analyst DCIG has released the 2014 refresh of its Mid-Range Array report and there are a few surprises. NetApp tops the list and the landscape is full of upset apple carts.”
Omaha’s city government recently gained unwanted public visibility after twelve filing cabinets containing a hundred years of irreplaceable original building permits were sent from the basement of City Hall to the county dump. It turns out that the head of the permits and inspections division decided to get rid of the cabinets as part of cleaning out the division’s basement storage area. He did not realize that other city employees regularly pulled the permits, which dated from the 1880s through the 1980s. He was also apparently unaware that a local preservation group was developing a plan to move the permits to a new facility in order to make them more secure and accessible to the public.
Like Omaha’s City Hall, businesses often face what appear to be incompatible priorities. IT departments are expected to keep spending in check and know that only 10-20 percent of data is ever accessed more than 60 days after its creation. But knowing which data to keep available and which data to delete or archive can be a challenge. This type of dilemma is one of many drivers behind the development of a new group of storage systems: public cloud gateways.
DCIG is pleased to announce the September 2 release of the DCIG 2014-15 Midrange Unified Storage Array Buyer’s Guide, which weights, scores and ranks more than 200 features of 40 different storage arrays from 14 different storage providers. The plethora of vendors and products in the marketplace, combined with a lack of readily available comparative data, can make product research and selection a daunting task. DCIG creates Buyer’s Guides in order to help end users accelerate the product research and selection process, driving cost out of the research process while simultaneously increasing confidence in the results.
A couple of weeks ago I attended the Flash Memory Summit in Santa Clara, CA, where I had the opportunity to talk with a number of providers, fellow analysts and developers in attendance about the topic of flash memory. The focus of many of these conversations was less about what flash means right now, since its performance ramifications are already well understood by the enterprise. Rather, many are already looking ahead to how to take further advantage of flash’s particular idiosyncrasies, and in so doing they offered some good insight into what will be hot in flash in the years to come.
Enterprise expectations for the availability of applications hosted in the data center are easy to articulate and quantify: enterprises expect all of these applications to be highly available all of the time, with no outages regardless of the circumstances. Meeting those expectations is a far more difficult task and, to date, has been for the most part impossible to accomplish using existing host- and storage array-based technologies. The HP XP7, with its introduction this week of concurrent, bi-directional synchronous replication between paired storage volumes on different XP7 storage arrays and storage virtual arrays, brings enterprises closer than ever to the ideal of attaining 100 percent application availability under almost any circumstances.
DCIG is pleased to announce the availability of its DCIG 2014-15 Virtual Server Backup Software Buyer’s Guide, which weights, scores and ranks over 100 features of 26 different backup software solutions from 22 different backup software providers. This Buyer’s Guide provides the critical information that organizations of all sizes need when selecting backup software that is specifically tuned to protect virtualized environments.
DCIG is preparing to release the DCIG 2014-15 Midrange Unified Storage Array Buyer’s Guide. Although this is a diverse marketplace, there are some themes that emerged as we compared the features being offered in arrays today versus the arrays covered in the 2013 edition of this Buyer’s Guide. Those themes include much larger cache sizes, multiplied storage capacity, public cloud storage connectivity, and support for Microsoft virtualization technologies.
As DCIG prepares to release its forthcoming DCIG 2014-15 Virtual Server Backup Software Buyer’s Guide, its research has uncovered a number of changes in the features offered in virtual server backup software and the ways in which providers offer them. While support for VMware and its various capabilities certainly remains a focal point, support for other hypervisors, connectivity to public storage clouds and even tape support are becoming a bigger part of the conversation. Here are five early insights that DCIG has gleaned from the research that it has completed in recent weeks and months.
Perhaps nowhere does the complexity of the IT infrastructure within today’s organizations come more clearly into focus than when viewed from the perspective of data protection. Backup and recovery software sees firsthand all of the applications and operating systems in an enterprise’s environment. Yet, at the same time, it is expected to account for this complexity by centralizing management, holding the line on costs, and simplifying these tasks even as it meets heightened end-user demands for faster backups and recoveries. To break through this complexity, there are three tips that any organization can follow to help both accelerate and simplify the protection and recovery of data in its environment.
Matt Urmston, StorageCraft’s Chief Evangelist and Director of Product Management, has worked in a variety of roles in backup, archiving, data recovery and high availability. In this third blog entry of this interview series, Matt emphasizes that StorageCraft’s value is in the recovery process–getting systems back online quickly and efficiently, and having that work every time.
It has been said that everyone knows what “normal” is but that it is often easier to define “abnormal” than it is to define “normal.” To a certain degree that axiom also applies to defining “high end storage arrays.” Everyone just seems to automatically assume that a certain set of storage arrays are in the “high end” category but when push comes to shove, people can be hard-pressed to provide a working definition as to what constitutes a high end storage array in today’s crowded storage space.
A divergence is occurring right now in data storage solutions. On one hand, a number of storage providers seek to deliver highly differentiated storage solutions that work with a broad set of applications and operating systems. On the other, a few providers focus on delivering a storage solution that tightly integrates with one or more applications to deliver unparalleled levels of application performance and ease of management. The latest Oracle ZFS Storage Appliance ZS3 Series, with its new OS8.2, provides the best of what both of these categories of storage systems currently have to offer, delivering a storage platform that truly stands apart.
Organizations are becoming increasingly virtualized within their data center infrastructures, which is leading them to aggressively virtualize the storage arrays in their infrastructure to complement their already virtualized server environments. As they do so, it behooves them to distinguish between, and have a clear understanding of, each virtual component that makes up their newly virtualized storage infrastructure. The need to clarify this terminology comes clearly into focus as organizations evaluate the multi-tenancy and virtual storage array capabilities found on many high end storage arrays.
The requirements of integrated backup appliances deployed into small and remote offices are generally modest, as almost any size integrated backup appliance could theoretically meet the data protection and recovery needs of offices of this size. However, the objective of these offices is to identify and deploy an appropriately priced and sized backup appliance that meets their technical needs and fits within their budget while also still meeting the broader needs of the distributed enterprise of which they are a part.
The use of data reduction technologies such as compression and deduplication to reduce storage costs is nothing new. Tape drives have used compression for decades to increase backup data densities on tape, while many modern deduplicating backup appliances use compression and deduplication together to reduce backup data stores. Even a select number of existing HDD-based storage arrays use data compression and deduplication to minimize data stores for the large amounts of file data stored in archives or on network attached file servers.
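How these two techniques combine can be illustrated with a minimal fixed-block sketch in Python. This is a hypothetical illustration, not how any particular appliance works: the helper names are invented, and real systems typically use variable-length chunking rather than fixed 4 KB blocks.

```python
import hashlib
import zlib

def dedupe_and_compress(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks, store each unique block only
    once (deduplication), and compress the unique blocks (compression)."""
    store = {}   # block digest -> compressed block (unique blocks only)
    recipe = []  # ordered list of digests needed to rebuild the data
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:
            store[digest] = zlib.compress(block)
        recipe.append(digest)
    return store, recipe

def rebuild(store, recipe):
    """Reassemble the original data from the block store and recipe."""
    return b"".join(zlib.decompress(store[d]) for d in recipe)

# Highly repetitive data deduplicates well: 100 identical 4 KB blocks
# reduce to a single stored (and compressed) block plus a recipe.
data = b"A" * (4096 * 100)
store, recipe = dedupe_and_compress(data)
assert rebuild(store, recipe) == data
```

The same idea scales up in backup appliances: because successive backups of the same systems repeat most of their blocks, deduplication removes the repeats and compression shrinks whatever unique data remains.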
As the role of IT changes from functioning as specialists to generalists, many IT staff members find themselves in the role of a Business Technologist. In this new role, they serve a two-fold purpose. First, they must understand and document the specific needs and requirements of the business by interfacing with key end-users and product managers. Once they document these needs, they then map those requirements to a specific technology solution that solves them.
Choosing the right backup appliance – physical or virtual – does not have to be complicated so long as an organization knows the right questions to ask and gathers the appropriate information. However, as organizations gather this information, most conclude that a virtual backup appliance is NOT the right answer for their circumstances. In this fifth and final installment of DCIG’s interview with STORServer President Bill Smoldt, he explains how to choose the most appropriate backup appliance for your environment and why a virtual backup appliance is probably not the choice you will be making.
As I attended sessions at Microsoft TechEd 2014 last week and talked with people in the exhibit hall, a number of themes emerged, including “mobile first, cloud first,” hybrid cloud, migration to the cloud, disaster recovery as a service, and flash memory storage as a game-changer in the data center. But as I reflect on the entire experience, a statement made by John Loveall, Principal Program Manager for Microsoft Windows Server, during one of his presentations sums up the overall message of the conference: “Today it is really all about the integrated solution.”
Distributed enterprises with remote offices of varying sizes under their management are no different from any other organization in that they also want to capitalize on the numerous benefits that integrated backup appliances offer. Yet selecting the “right-sized” backup appliance for each office can quickly become very complicated, creating a tangled web of backup and recovery management if neither the appliances nor the backup software can be centrally monitored and managed.