A couple of weeks ago I attended the Flash Memory Summit in Santa Clara, CA, where I had the opportunity to talk to a number of providers, fellow analysts and developers in attendance about the topic of flash memory. The focus of many of these conversations was less about what flash means right now, since its performance ramifications are already well understood by the enterprise. Rather, many are already looking ahead to how they can take further advantage of flash's particular idiosyncrasies and, in so doing, they offer some good insight into what will be hot in flash in the years to come.
Enterprise expectations for the availability of the applications hosted in their data centers are easy to articulate and quantify: they expect all of these applications to be highly available all of the time, with no outages regardless of the circumstances. Meeting those expectations is a far more difficult task and, to date, was for the most part impossible to accomplish using existing host and storage array-based technologies. The HP XP7, with its introduction this week of concurrent, bi-directional synchronous replication between paired storage volumes on different XP7 storage arrays and storage virtual arrays, brings enterprises closer to the ideal of 100 percent application availability under almost any circumstances than they may have ever hoped to come.
DCIG is pleased to announce the availability of its DCIG 2014-15 Virtual Server Backup Software Buyer's Guide, which weights, scores and ranks over 100 features on 26 different backup software solutions from 22 different backup software providers. This Buyer's Guide provides the critical information that organizations of all sizes need when selecting backup software that is specifically tuned to protecting virtualized environments.
DCIG is preparing to release the DCIG 2014-15 Midrange Unified Storage Array Buyer's Guide. Although this is a diverse marketplace, some themes emerged as we compared the features offered in arrays today against the arrays covered in the 2013 edition of this Buyer's Guide. Those themes include much larger cache sizes, greatly expanded storage capacities, public cloud storage connectivity, and support for Microsoft virtualization technologies.
As DCIG prepares to release its forthcoming DCIG 2014-15 Virtual Server Backup Software Buyer's Guide, its research has revealed a number of changes in the features offered by virtual server backup software and the ways in which providers offer them. While support for VMware and its various capabilities certainly remains a focal point, support for other hypervisors, connectivity to public storage clouds and even tape support are becoming a bigger part of the conversation. Here are five early insights that DCIG has gleaned from the research that it has completed in recent weeks and months.
Perhaps nowhere does the complexity of the IT infrastructure within today's organizations come more clearly into focus than when viewed from the perspective of data protection. Backup and recovery software sees firsthand all of the applications and operating systems in an enterprise's environment. Yet, at the same time, it is expected to account for this complexity by centralizing management, holding the line on costs, and simplifying these tasks even as it meets heightened end-user demands for faster backups and recoveries. To break through this complexity, there are three tips that any organization can follow to help both accelerate and simplify the protection and recovery of data in its environment.
Matt Urmston, StorageCraft's Chief Evangelist and Director of Product Management, has worked in a variety of roles in backup, archiving, data recovery and high availability. In this third blog entry of this interview series, Matt emphasizes that StorageCraft's value is in the recovery process: getting systems back online quickly and efficiently, and having that work every time.
It has been said that everyone knows what “normal” is but that it is often easier to define “abnormal” than it is to define “normal.” To a certain degree that axiom also applies to defining “high end storage arrays.” Everyone just seems to automatically assume that a certain set of storage arrays are in the “high end” category but when push comes to shove, people can be hard-pressed to provide a working definition as to what constitutes a high end storage array in today’s crowded storage space.
A divergence is occurring right now in data storage solutions. On one hand, a number of storage providers seek to deliver highly differentiated storage solutions that work with a broad set of applications and operating systems. On the other, a few providers focus on delivering a storage solution that tightly integrates with one or more applications to deliver unparalleled levels of application performance and ease of management. The latest Oracle ZFS Storage Appliance ZS3 Series with its new OS8.2 provides the best of what both of these categories of storage systems currently have to offer to deliver a storage platform that truly stands apart.
Organizations are becoming increasingly virtualized within their data center infrastructures, which is leading them to aggressively virtualize the storage arrays in their infrastructure to complement their already virtualized server environments. As they do so, it behooves them to distinguish between, and have a clear understanding of, each virtual component that makes up their newly virtualized storage infrastructure. The need to clarify this terminology comes clearly into focus as organizations evaluate the multi-tenancy and virtual storage array capabilities found on many high end storage arrays.
The requirements of integrated backup appliances deployed into small and remote offices are generally modest, as almost any size integrated backup appliance could theoretically meet the data protection and recovery needs of offices of this size. However, the objective of these offices is to identify and deploy an appropriately priced and sized backup appliance that meets their technical needs and fits within their budget while still meeting the broader needs of the distributed enterprise of which they are a part.
The use of data reduction technologies such as compression and deduplication to reduce storage costs is nothing new. Tape drives have used compression for decades to increase backup data densities on tape, while many modern deduplicating backup appliances use compression and deduplication to also reduce backup data stores. Even a select number of existing HDD-based storage arrays use data compression and deduplication to minimize data stores for large amounts of file data stored in archives or on network-attached file servers.
As the role of IT staff changes from specialist to generalist, many IT staff members find themselves in the role of a Business Technologist. In this new role, they serve a two-fold purpose. First, they must understand and document the specific needs and requirements of the business by interfacing with key end-users and product managers. Once they document these needs, they then map those requirements to a specific technology solution that solves them.
Choosing the right backup appliance – physical or virtual – does not have to be complicated so long as an organization knows the right questions to ask and gathers the appropriate information. However, as organizations are gathering this information, most conclude that a virtual backup appliance is NOT the right answer in most circumstances. In this fifth and final installment of DCIG’s interview with STORServer President Bill Smoldt, he explains how to choose the most appropriate backup appliance for your environment and why a virtual backup appliance is probably not the choice you will be making.
As I attended sessions at Microsoft TechEd 2014 last week and talked with people in the exhibit hall, a number of themes emerged, including "mobile first, cloud first," hybrid cloud, migration to the cloud, disaster recovery as a service, and flash memory storage as a game-changer in the data center. But as I reflect on the entire experience, a statement made by John Loveall, Principal Program Manager for Microsoft Windows Server, during one of his presentations sums up the overall message of the conference: "Today it is really all about the integrated solution."
Distributed enterprises with remote offices of varying sizes under their management are no different from any other organization in that they also want to capitalize on the numerous benefits that integrated backup appliances offer. Yet selecting the "right-sized" backup appliance for each office can quickly become very complicated, as it can create a tangled web of backup and recovery management if neither the appliances nor the backup software can be centrally monitored and managed.
The disconnect between how quickly and efficiently end users think their IT department can back up and recover data and the IT department’s actual ability to deliver on these expectations can be substantial. Too often, IT departments are not equipped to recover data nearly as fast as end users expect, and they may not even have the data available to recover. In this fourth installment of DCIG’s interview with STORServer President Bill Smoldt, he explains why misconceptions about backup persist and what backup paradigms must change for the benefit of everyone.
At TechEd 2014 in Houston, TX this week, Microsoft made it clear that it is no longer content to just send customers to storage array vendors to meet their storage needs, especially when it comes to embracing a cloud-oriented approach to infrastructure. In the process of improving Windows storage technology, Microsoft is effectively delivering the benefits of–and addressing the barriers to–the adoption of server SAN technology.
DCIG is pleased to announce the availability of its DCIG 2014-15 Security Information and Event Management (SIEM) Appliance Buyer's Guide. In this Buyer's Guide, DCIG weights, scores and ranks 29 SIEM appliances from nine different providers. Like all previous DCIG Buyer's Guides, this Buyer's Guide provides the critical information that organizations of all sizes need when selecting a SIEM appliance to help provide visibility into their security posture by delivering usable and actionable information.
Toward the end of April, Wikibon's David Floyer posted an article on the topic of server SANs entitled "The Rise of Server SANs," which generated a fair amount of attention and was even the focus of a number of conversations that I had at this past week's Symantec Vision 2014 conference in Las Vegas. However, I have to admit that when I first glanced at some of the forecasts and charts included in that piece, I thought Wikibon was smoking something and brushed it off. But after having had some lengthy conversations with attendees at Symantec Vision, I can certainly see why Wikibon made some of the claims that it did.