Perhaps nowhere does the complexity of the IT infrastructure within today’s organizations come more clearly into focus than when viewed from the perspective of data protection. Backup and recovery software sees firsthand all of the applications and operating systems in an enterprise’s environment. Yet, at the same time, it is expected to account for this complexity by centralizing management, holding the line on costs, and simplifying these tasks even as it meets heightened end-user demands for faster backups and recoveries. To break through this complexity, there are three tips that any organization can follow to both accelerate and simplify the protection and recovery of data in its environment.
Matt Urmston, StorageCraft’s Chief Evangelist and Director of Product Management, has worked in a variety of roles in backup, archiving, data recovery and high availability. In this third entry of this interview series, Matt emphasizes that StorageCraft’s value lies in the recovery process: getting systems back online quickly and efficiently, and having that work every time.
It has been said that everyone knows what “normal” is, but that it is often easier to define “abnormal” than “normal.” To a certain degree that axiom also applies to defining “high-end storage arrays.” Everyone just seems to assume that a certain set of storage arrays belongs in the high-end category, but when push comes to shove, people can be hard-pressed to provide a working definition of what constitutes a high-end storage array in today’s crowded storage space.
A divergence is occurring right now in data storage solutions. On one hand, a number of storage providers seek to deliver highly differentiated storage solutions that work with a broad set of applications and operating systems. On the other, a few providers focus on delivering a storage solution that tightly integrates with one or more applications to deliver unparalleled levels of application performance and ease of management. The latest Oracle ZFS Storage Appliance ZS3 Series with its new OS8.2 provides the best of what both of these categories of storage systems currently have to offer, delivering a storage platform that truly stands apart.
Organizations are becoming increasingly virtualized within their data center infrastructures, which is leading them to aggressively virtualize the storage arrays in their infrastructure to complement their already virtualized server environments. As they do so, it behooves them to distinguish between, and have a clear understanding of, each virtual component that makes up their newly virtualized storage infrastructure. The need to clarify this terminology comes clearly into focus as organizations evaluate the multi-tenancy and virtual storage array capabilities found on many high-end storage arrays.
The requirements of integrated backup appliances deployed into small and remote offices are generally modest, as almost any size of integrated backup appliance could theoretically meet the data protection and recovery needs of offices of this size. However, the objective for these offices is to identify and deploy an appropriately priced and sized backup appliance that meets their technical needs and fits within their budget while still meeting the broader needs of the distributed enterprise of which they are a part.
The use of data reduction technologies such as compression and deduplication to reduce storage costs is nothing new. Tape drives have used compression for decades to increase backup data densities on tape, while many modern deduplicating backup appliances use compression and deduplication to reduce backup data stores as well. Even a select number of existing HDD-based storage arrays use compression and deduplication to minimize data stores for large amounts of file data stored in archives or on network-attached file servers.
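To see why these two technologies pair so well on backup data, consider a minimal sketch of fixed-block deduplication followed by compression. The block size, SHA-256 hashing, and zlib compression here are illustrative assumptions for the sketch, not any particular appliance vendor's implementation:

```python
import hashlib
import zlib

def dedupe_and_compress(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks, keep one copy of each unique
    block (deduplication), then compress the unique blocks."""
    store = {}   # block digest -> block bytes (the deduplicated store)
    recipe = []  # ordered list of digests needed to rebuild the stream
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # only first copy is stored
        recipe.append(digest)
    compressed = zlib.compress(b"".join(store.values()))
    return store, recipe, compressed

# Backup streams are typically highly redundant; here 200 blocks
# collapse to 2 unique blocks, which then compress further.
data = (b"A" * 4096 + b"B" * 4096) * 100
store, recipe, compressed = dedupe_and_compress(data)
print(len(data), len(store), len(compressed))
```

The `recipe` list is what makes the reduction lossless: replaying the stored blocks in recipe order reconstructs the original stream exactly, which is the same property a deduplicating backup appliance must guarantee on restore.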
As IT roles shift from specialist to generalist, many IT staff members find themselves in the role of a Business Technologist. In this new role, they serve a two-fold purpose. First, they must understand and document the specific needs and requirements of the business by interfacing with key end users and product managers. Once they have documented these needs, they map those requirements to a specific technology solution that addresses them.
Choosing the right backup appliance – physical or virtual – does not have to be complicated so long as an organization knows the right questions to ask and gathers the appropriate information. However, as organizations are gathering this information, most conclude that a virtual backup appliance is NOT the right answer in most circumstances. In this fifth and final installment of DCIG’s interview with STORServer President Bill Smoldt, he explains how to choose the most appropriate backup appliance for your environment and why a virtual backup appliance is probably not the choice you will be making.
As I attended sessions at Microsoft TechEd 2014 last week and talked with people in the exhibit hall, a number of themes emerged, including “mobile first, cloud first,” hybrid cloud, migration to the cloud, disaster recovery as a service, and flash memory storage as a game-changer in the data center. But as I reflect on the entire experience, a statement made by John Loveall, Principal Program Manager for Microsoft Windows Server, during one of his presentations sums up the overall message of the conference: “Today it is really all about the integrated solution.”
Distributed enterprises with remote offices of varying sizes under their management are no different from any other organization in that they also want to capitalize on the numerous benefits that integrated backup appliances offer. Yet selecting the “right-sized” backup appliance for each office can quickly become very complicated, creating a tangled web of backup and recovery management if neither the appliances nor the backup software can be centrally monitored and managed.
The disconnect between how quickly and efficiently end users think their IT department can back up and recover data and the IT department’s actual ability to deliver on these expectations can be substantial. Too often, IT departments are not equipped to recover data nearly as fast as end users expect, and they may not even have the data available to recover. In this fourth installment of DCIG’s interview with STORServer President Bill Smoldt, he explains why misconceptions about backup persist and what backup paradigms must change for the benefit of everyone.
At TechEd 2014 in Houston, TX this week, Microsoft made it clear that it is no longer content to just send customers to storage array vendors to meet their storage needs, especially when it comes to embracing a cloud-oriented approach to infrastructure. In the process of improving Windows storage technology, Microsoft is effectively delivering the benefits of–and addressing the barriers to–the adoption of server SAN technology.
DCIG is pleased to announce the availability of its DCIG 2014-15 Security Information and Event Management (SIEM) Appliance Buyer’s Guide. In this Buyer’s Guide, DCIG weights, scores and ranks 29 SIEM appliances from nine different providers. Like all previous DCIG Buyer’s Guides, this Buyer’s Guide provides the critical information that organizations of all sizes need when selecting a SIEM appliance to gain visibility into their security posture through usable and actionable information.
Toward the end of April, Wikibon’s David Floyer posted an article on the topic of server SANs entitled “The Rise of Server SANs,” which generated a fair amount of attention and was even the focus of a number of conversations I had at this past week’s Symantec Vision 2014 conference in Las Vegas. However, I have to admit that when I first glanced at some of the forecasts and charts included in that piece, I thought Wikibon was smoking pot and brushed it off. But after some lengthy conversations with attendees at Symantec Vision, I can certainly see why Wikibon made some of the claims that it did.
VMware® VMmark® has quickly become a performance benchmark to which many organizations turn to quantify how many virtual machines (VMs) they can realistically expect to host, and have perform well, on a cluster of physical servers. Yet a published VMmark score for a specified hardware configuration may overstate or, conversely, fail to fully reflect the particular solution’s VM consolidation and performance capabilities. The HP ProLiant BL660c published VMmark performance benchmarks, using a backend HP 3PAR StoreServ 7450 all-flash array, provide the relevant, real-world results that organizations need to achieve maximum VM density levels, maintain or even improve VM performance as they scale, and control costs as they grow.
Last month DCIG published a blog entry addressing feedback that we have received regarding our methodology, credibility, and processes. We believe that our methods are solid and provide every vendor with an opportunity to participate in the process. Further, as we said in that blog entry, DCIG sees no reason to change any of its practices and has not to date been presented with any compelling reasons to do so. However, DCIG does recognize that it needs to be flexible and willing to refine processes as necessary. It’s all part of growing pains.
There is backup and then there is backup. To meet their backup and recovery needs, today’s organizations need to verify that the backup appliance they select includes the features needed to protect their environment today and positions them to meet their needs into the foreseeable future. In this third installment of DCIG’s interview with STORServer President Bill Smoldt, he describes the new must-have features that backup appliances must offer.
It is the end of April, and for those of you still looking to attend a quality conference to learn about Unified Communications (UC), it is not too late. Even though Enterprise Connect 2014, an event that most of the major UC players attend and arguably one of the best conferences of the year, has just ended, several quality conferences remain available to attend in 2014.
Companies all want more reliable backup and recovery, with short recovery times when things go awry. In Part II of this interview series with StorageCraft’s Chief Evangelist Matt Urmston, we expand on how StorageCraft uses StorageCraft ImageManager and StorageCraft HeadStart Restore technology to provide a full DR solution that can offer recovery in as little as five minutes, and also how ShadowProtect performs equally well in physical and virtual environments.