The business case for organizations with petabytes of file data under management to classify that data and then place it across multiple tiers of storage has never been stronger. By distributing this data across disk, flash, tape, and the cloud, they stand to realize significant cost savings. The catch is finding a cost-effective solution that makes administering and managing file data easier than simply storing it all on flash. This is where a solution such as the one Quantum now offers comes into play.
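Solutions in this space typically classify files by attributes such as last access time and then place them on the matching tier. As a minimal sketch of such a policy (the thresholds, tier names, and classify helper below are illustrative assumptions, not Quantum's implementation), it might look like this:

```python
import os
import time

# Illustrative tier thresholds (days since last access) -> tier name.
# These values and names are assumptions for the sketch, not Quantum's policy.
TIERS = [
    (30, "flash"),    # hot data stays on flash
    (180, "disk"),    # warm data moves to disk
    (1095, "cloud"),  # cool data moves to cloud object storage
]

def classify(path, now=None):
    """Pick a storage tier for a file based on its last access time."""
    now = now or time.time()
    age_days = (now - os.stat(path).st_atime) / 86400
    for limit, tier in TIERS:
        if age_days <= limit:
            return tier
    return "tape"  # anything colder lands on the archive tier
```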
Today’s backup mantra seems to be “backup to the cloud or bust!” But backup to the cloud is more than just redirecting backup streams from a local file share to a file share presented by a cloud storage provider and clicking the “Start” button. Organizations must examine which cloud storage providers they can send their data to, as well as how their backup software packages and transmits that data. BackupAssist 10.0 answers many of the tough questions about cloud data protection that businesses face while giving them welcome flexibility in their choice of cloud storage providers.
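To see why the packaging and transport step matters, consider a generic sketch of compressing an archive and pushing it to an S3-compatible endpoint. This is an illustrative pattern only, not how BackupAssist 10.0 does it, and the bucket, key, and endpoint shown are hypothetical:

```python
import gzip
import shutil
import boto3

def upload_backup(archive_path, bucket, key, endpoint_url):
    # Compress before upload to cut transfer time and storage cost.
    compressed = archive_path + ".gz"
    with open(archive_path, "rb") as src, gzip.open(compressed, "wb") as dst:
        shutil.copyfileobj(src, dst)

    # Any S3-compatible provider can be targeted by changing endpoint_url,
    # which is what makes provider choice a real decision point.
    s3 = boto3.client("s3", endpoint_url=endpoint_url)
    s3.upload_file(compressed, bucket, key)

# Hypothetical bucket, key, and provider endpoint for illustration.
upload_backup("nightly.bak", "backups", "2017/nightly.bak.gz",
              "https://s3.example-provider.com")
```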
In almost every industry there is a tendency to use phrases such as Tier 1, Tier 2, and Tier 3 to describe providers, the products in a specific market, the quality of service provided, or some combination thereof. It is when one applies these three terms to the storage industry and tries to properly classify storage providers into one of these tiers that the conversation becomes intriguing. After all, how does one define what constitutes a Tier 1 storage provider and separates it from the other providers in the market?
DCIG is pleased to announce the availability of the 2016-17 Hybrid Cloud Backup Appliance Buyer’s Guide developed from the backup appliance body of research. As core business processes become digitized, the ability to keep services online and to rapidly recover from any service interruption becomes a critical need. Given the growth and maturation of cloud services, many organizations are exploring the advantages of storing application data with cloud providers and even recovering applications in the cloud.
Enterprises now demand higher levels of automation, integration, simplicity, and scalability from every component deployed into their IT infrastructure, and the integrated backup appliances covered in DCIG’s forthcoming Buyer’s Guide Editions are a clear outgrowth of those expectations. Intended for organizations that want to protect applications and data and keep them behind corporate firewalls, these backup appliances come fully equipped, from both hardware and software perspectives, to do so.
Usually when I talk to backup and system administrators, they willingly talk about how great a product installation was. But it then becomes almost impossible to find anyone who wants to comment on what life is like after their backup appliance is installed. This blog entry is a bit of an anomaly in that someone willingly pulled back the curtain on his experience after the appliance was installed. In this third installment of my interview series, system architect Fidel Michieli describes how the Cohesity implementation went in his environment and how Cohesity responded to the issues that arose.
DCIG is pleased to announce the availability of the DCIG 2016-17 Deduplicating Backup Appliance Buyer’s Guide Editions developed from the backup appliance body of research. Other Buyer’s Guide Editions based on this body of research will be published in the coming weeks and months, including the 2016-17 Integrated Backup Appliance Buyer’s Guide and 2016-17 Hybrid Cloud Backup Appliance Buyer’s Guide Editions.
This year’s Veritas Vision 2016 conference held a lot of intrigue for me. The show itself was not new. Vision has been an ongoing event for years, though this was the first time in more than a decade that Veritas was free to set its own agenda for the entire show. Rather, the intrigue was in what direction Veritas would take going forward. It answered that by communicating its plan to align its product portfolio and strategy around an objective that has eluded enterprise organizations and vendors alike for at least two decades: enterprise data management.
Every now and then a technology comes along that prompts enterprises to completely rethink their existing data center infrastructures. This type of dramatic change is already occurring within organizations of all sizes that are adopting and implementing SimpliVity.
Every now and then I hear rumors in the marketplace that the only backup software product Dell puts any investment into is Dell Data Protection | Rapid Recovery, while it lets NetVault and vRanger wither on the vine. Nothing could be further from the truth. In this third and final part of my interview series with Michael Grant, director of data protection product marketing for Dell’s systems and information management group, he refutes those rumors and illustrates how both NetVault and vRanger are alive and kicking within Dell’s software portfolio.
Few data center technologies currently generate more buzz than hyper-converged infrastructure solutions. By combining compute, data protection, flash, scale-out, and virtualization into a single self-contained unit, organizations get the best of what each of these individual technologies has to offer with the flexibility to implement each one in such a way that it matches their specific business needs. Yet organizations must exercise restraint in how many attributes they ascribe to hyper-converged infrastructure solutions as their adoption is a journey, not a destination.
In the last couple of weeks X-IO announced a number of improvements to its iglu line of storage arrays, namely flash-optimized controllers and stretch clustering. But what struck me in listening to X-IO present the array’s new features was how it kept referring to the iglu as “intelligent.” While that term may be accurate, when I look at the iglu’s architecture and data management features and consider them in light of what small and midsize enterprises need today, I see the iglu’s architecture as “thoughtful.”
Almost any hybrid or all-flash storage array will accelerate performance for the applications it hosts. Yet many organizations need a storage array that scales beyond just accelerating the performance of a few hosts. They want a solution that both solves their immediate performance challenges and serves as a launch pad to using flash more broadly in their environment.
DCIG recently released two Buyer’s Guides on Hybrid Storage Arrays – the DCIG 2015-16 SME Hybrid Storage Array and the DCIG 2015-16 Midsize Enterprise Hybrid Storage Array – that examine many of the features that hybrid storage arrays offer. Yet what these Guides can only do at a high level is reveal which features hybrid storage arrays offer, without getting into any real detail about how those features are implemented. One such feature is Quality of Service.
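As one concrete illustration of what “how it is implemented” can mean, consider a per-volume IOPS ceiling enforced with a token bucket, one common way arrays deliver Quality of Service. The sketch below is a generic illustration under that assumption, not any particular vendor’s implementation:

```python
import time

class IopsLimiter:
    """Token-bucket limiter capping a volume at a fixed IOPS ceiling."""

    def __init__(self, max_iops):
        self.max_iops = max_iops
        self.tokens = float(max_iops)
        self.last = time.monotonic()

    def allow(self):
        """Admit one I/O if the volume is under its IOPS ceiling."""
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at one second's worth.
        self.tokens = min(self.max_iops,
                          self.tokens + (now - self.last) * self.max_iops)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # the array would queue or delay this I/O

# Hypothetical "gold" service level allowing 5,000 IOPS on a volume.
gold_volume = IopsLimiter(max_iops=5000)
print(gold_volume.allow())  # True while under the ceiling
```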
It is almost a given in today’s world that for almost any organization to operate at peak efficiency and achieve optimal results, it has to acquire and use multiple forms of technology in its business processes. However, what is not always so clear is the set of forces at work, both inside and outside the business, that drive its technology acquisitions. While by no means a complete list, here are four forces that DCIG often sees at work behind the scenes that influence and drive many of today’s technology infrastructure buying decisions.
During the recent HP Deep Dive Analyst Event at its Fremont, CA, offices, HP shared some notable insights into the percentage of backup jobs that complete successfully (and unsuccessfully) within end-user organizations. Among its observations, drawn from anonymized data gathered from hundreds of backup assessments at end-user organizations of all sizes, HP found that over 60% of them had backup job success rates of 98% or lower, with 12% of organizations showing success rates below 90%. Yet what is more noteworthy is that, through its use of Big Data analytics, HP has identified large backups (those that take more than 12 hours to complete) as the primary contributor to the backup headaches that organizations still experience.
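To illustrate the kind of analysis involved, here is a minimal sketch of flagging long-running jobs and computing per-organization success rates from job records. The record format and sample values are hypothetical assumptions for illustration, not HP’s data or tooling:

```python
from datetime import timedelta

# (org, duration, succeeded) -- hypothetical sample records.
jobs = [
    ("acme", timedelta(hours=14), False),
    ("acme", timedelta(hours=2), True),
    ("globex", timedelta(hours=1), True),
]

LONG_JOB = timedelta(hours=12)  # HP's threshold for a "large" backup

def success_rate(org):
    """Fraction of an organization's backup jobs that succeeded."""
    runs = [ok for o, _, ok in jobs if o == org]
    return sum(runs) / len(runs)

# Jobs exceeding the 12-hour threshold correlate with backup headaches.
long_jobs = [(o, d) for o, d, _ in jobs if d > LONG_JOB]
print(long_jobs)             # [('acme', datetime.timedelta(seconds=50400))]
print(success_rate("acme"))  # 0.5
```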
The closer any new solution comes to being non-disruptively introduced into existing organizational backup infrastructures, the greater the odds that the solution will succeed and be adopted more broadly. By including FIPS 140-2 compliant 256-bit AES encryption and VTL features at no charge in the 3.2 OS release for its existing and new DR Series backup appliances, Dell gives organizations new options to introduce the DR Series appliances without disrupting their existing backup processes.
Backup software has traditionally been one of the “stickiest” products in organizations of all sizes, in part because it has been so painful to deploy and maintain that, once installed and sort of working, no organization wanted to subject itself to that process again. But in recent years, as backup has become easier to install and maintain, swapping it out for another product or consolidating multiple backup software solutions down to a single one has become much more plausible. This puts new impetus on backup software providers to introduce new features into their products to keep them relevant and “sticky” in their customer environments longer term.
Dell has brought together its various data protection products into one suite to make it easier to address multiple backup challenges with a single solution.
Facebook is turning to a disaggregated racks strategy to create a next-gen cloud computing data center infrastructure.