Integrating backup software, cloud services support, deduplication, and virtualization into a single hardware appliance remains a moving target. Even as backup appliance providers merge these technologies into their respective appliances, the methodologies they employ to do so can differ significantly between them. This becomes very apparent when one looks at the growing number of backup appliances in the market today.
Category Archives: Cloud
Whether companies like it or not, individuals within their organizations over the last few decades have adopted the technologies that they need in order to more effectively do their jobs. One such adoption has been the use of public file sync-n-share technologies that put data – and the control of it – outside of the purview of corporate IT. In this third and final installment in my interview series with Nexsan’s CEO Robert Fernander, he explains how Nexsan’s UNITY empowers organizations to bring this part of the world of shadow IT back under corporate control.
Change. Digital transformation. Disrupt. Eat your own young. These were just some of the terms and phrases uttered at this past week’s HPE Discover event in Las Vegas by HPE executives at all levels of the organization. Yet in the face of the changes that are about to sweep through the technology industry, a technology provider that touches as many organizations around the world as HPE does needs to have more than this type of mindset. It needs to have the products and strategy in place to back it up. Based upon what I saw at HPE Discover last week, HPE is executing upon these requirements.
Every now and then a technology comes along that prompts enterprises to undertake a complete do-over of their existing data center infrastructures. This type of dramatic change is already occurring within organizations of all sizes that are adopting and implementing SimpliVity.
Anyone familiar with the Internet of Things (IoT) recognizes its potential value: the ability to capture and assimilate tons of data from devices that enable one to make better decisions. The main problem with IoT is that, unless an organization has billion-dollar resources, it will likely struggle to even deploy IoT, much less effectively capture and analyze the information that IoT devices collect. Recognizing this deficiency, Fujitsu recently brought together cloud analytics, cloud storage and IoT to enable nearly any size organization to reap the benefits that IoT has to offer.
Organizations of all sizes now look to host some or all of their applications with cloud hosting providers, and for good reason. Yet organizations should not assume all cloud hosting providers are created equal. If anything, small and midsized enterprises (SMEs) may be particularly susceptible and even find themselves unnecessarily exposed to unexpected outages or extended periods of downtime if they do not carefully choose their cloud hosting provider.
As the whole technology world (or at least those intimately involved with the enterprise data center space) takes a breath before diving head first into VMworld next week, a few vendors are jumping the gun and making product announcements in advance of it. One of those is SimpliVity which announced its latest hyper-converged offering, OmniStack 3.0, this past Wednesday. In so doing, it continues to put a spotlight on why hyper-converged infrastructures and the companies delivering them are experiencing hyper-growth even in a time of relative market and technology uncertainty.
DCIG is pleased to announce the availability of its 2015-16 Hybrid Cloud Backup Appliance Buyer’s Guide that evaluates and ranks more than 100 features from nearly 60 different hybrid cloud backup appliances from ten (10) different providers.
Backup software has traditionally been one of the “stickiest” products in organizations of all sizes, in part because it has been so painful to deploy and maintain that, once installed and sort of working, no organization wanted to subject itself to that process again. But in recent years, as backup has become easier to install and maintain, swapping it out for another product or consolidating multiple backup software solutions down to a single one has become much more plausible. This puts new impetus on backup software providers to introduce new features into their products to keep them relevant and “sticky” in their customer environments longer term.
DCIG is in the process of researching the Private Cloud Storage Array marketplace with the intention of publishing the DCIG 2015-16 Private Cloud Storage Array Buyers Guide in March/April 2015. This will be an update to the DCIG 2013 Private Cloud Storage Array Buyers Guide. Since the publication of the 2013 edition, nearly every vendor has come out with new models and new vendors have arrived on the scene, warranting a fresh snapshot of this dynamic marketplace. The purpose of this courtesy notice is five-fold: To inform prospective storage purchasers and storage vendors that DCIG intends to publish the DCIG 2015-16 Private Cloud Storage Array Buyers Guide in March/April 2015. To describe the appeal of private cloud storage while clarifying DCIG’s definition of private cloud storage. To disclose DCIG’s inclusion criteria and enumerate the products identified in our preliminary research. To give individuals an opportunity to inform DCIG of additional products that may qualify for inclusion in the guide. (While there is…
Today backup and recovery looks almost nothing like it did 10 years ago. But as one looks at all of the changes still going on in backup and recovery, one can only guess what backup and recovery might look like in another 5-10 years. In this ninth and final installment of my interview series with Brett Roscoe, General Manager, Data Protection for Dell Software, he provides some insight into where he sees backup and recovery going over the next decade.
Facebook is turning to a disaggregated racks strategy to create a next-generation cloud computing data center infrastructure.
Think “Dell” and you may think “PCs,” “servers,” or, even more broadly, “computer hardware.” If so, you are missing out on one of the biggest transformations going on among technology providers today as, over the last 5+ years, Dell has acquired multiple software companies and is using that intellectual property (IP) to drive its internal turnaround. In this sixth installment of my interview series with Brett Roscoe, General Manager, Data Protection for Dell Software, we discuss how these software acquisitions are fueling Dell’s transformation from a hardware provider into becoming a solutions provider.
There are so many options available in today’s next generation of backup and recovery tools that sometimes it can be tough to prioritize which features to implement. In this third installment of my interview series with Dell Software’s General Manager, Data Protection, Brett Roscoe, we discuss four (4) best practices that organizations should prioritize as they implement next generation backup and recovery tools.
2014 may eventually come to be characterized as the year of the tech break up. Tech conglomerates such as CA Technologies, HP, IBM and, most recently, Symantec have all opted to go down the “break up” route while others such as Cisco and EMC continue to experience internal and external pressures to pursue this option. But as enterprises look to create more agile, automated, cohesive infrastructures, it may ultimately be those such as Dell and Oracle that are opting to “make up” that end up best positioned to deliver on these enterprise demands.
An Omaha city employee recently gained unwanted public visibility after sending twelve filing cabinets containing a hundred years of irreplaceable original building permits from the basement of City Hall to the county dump. It turns out that the head of the permits and inspections division decided to get rid of the cabinets as part of cleaning out the division’s basement storage area. The division head did not realize that other city employees regularly pulled the permits, which dated from the 1880s through the 1980s, and was also apparently unaware that a local preservation group was developing a plan to move the permits to a new facility in order to make the permits more secure and accessible to the public.
Like Omaha’s City Hall, businesses often face what appear to be incompatible priorities. IT departments are expected to keep spending in check and know that only 10-20 percent of data is ever accessed again more than 60 days after its creation. But knowing which data to keep available and which data to delete or archive can be a challenge. This type of dilemma is one of many drivers in the development of a new group of storage systems–public cloud gateways.
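To make the dilemma above concrete, a minimal sketch of how an IT team might identify archive candidates follows. It simply walks a file tree and flags files whose last access time is older than 60 days; the function name, the root path, and the 60-day threshold are illustrative assumptions, not any specific gateway product's logic.

```python
import os
import time

# Illustrative threshold: 60 days, expressed in seconds.
STALE_SECONDS = 60 * 24 * 60 * 60

def archive_candidates(root):
    """Yield paths of files not accessed within the last 60 days.

    Hypothetical sketch: real cloud gateways apply policies like this
    transparently, but the same last-access-time idea applies.
    """
    now = time.time()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                atime = os.stat(path).st_atime  # last access time
            except OSError:
                continue  # file vanished or is unreadable; skip it
            if now - atime > STALE_SECONDS:
                yield path
```

Note that on volumes mounted with access-time updates disabled (e.g. `noatime`), `st_atime` is unreliable, which is one reason dedicated gateways track access patterns themselves rather than relying on filesystem metadata.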
A divergence is occurring right now in data storage solutions. On one hand, a number of storage providers seek to deliver highly differentiated storage solutions that work with a broad set of applications and operating systems. On the other, a few providers focus on delivering a storage solution that tightly integrates with one or more applications to deliver unparalleled levels of application performance and ease of management. The latest Oracle ZFS Storage Appliance ZS3 Series with its new OS8.2 provides the best of what both of these categories of storage systems currently have to offer to deliver a storage platform that truly stands apart.
As the role of IT changes from functioning as specialists to generalists, many IT staff members find themselves in the role of a Business Technologist. In this new role, they serve a two-fold purpose. First, they must understand and document the specific needs and requirements of the business by interfacing with key end-users and product managers. Once they document these needs, they then map those requirements to a specific technology solution that solves them.