DCIG is in the process of researching the Private Cloud Storage Array marketplace with the intention of publishing the DCIG 2015-16 Private Cloud Storage Array Buyers Guide in March/April 2015. This will be an update to the DCIG 2013 Private Cloud Storage Array Buyers Guide. Since the publication of the 2013 edition, nearly every vendor has introduced new models and new vendors have arrived on the scene, warranting a fresh snapshot of this dynamic marketplace.
Today backup and recovery looks almost nothing like it did 10 years ago. But as one looks at all of the changes still going on in backup and recovery, one can only guess what backup and recovery might look like in another 5-10 years. In this ninth and final installment of my interview series with Brett Roscoe, General Manager, Data Protection for Dell Software, he provides some insight into where he sees backup and recovery going over the next decade.
Few organizations regardless of their size can claim to have 1.35 billion users, have to manage the upload and ongoing management of 930 million photos a day or be responsible for the transmission of 12 billion messages daily. Yet these are the challenges that Facebook’s data center IT staff routinely encounter. To respond to them, Facebook is turning to a disaggregated racks strategy to create a next gen cloud computing data center infrastructure that delivers the agility, scalability and cost-effective attributes it needs to meet its short and long term compute and storage needs.
Think “Dell” and you may think “PCs,” “servers,” or, even more broadly, “computer hardware.” If so, you are missing out on one of the biggest transformations going on among technology providers today as, over the last 5+ years, Dell has acquired multiple software companies and is using that intellectual property (IP) to drive its internal turnaround. In this sixth installment of my interview series with Brett Roscoe, General Manager, Data Protection for Dell Software, we discuss how these software acquisitions are fueling Dell’s transformation from a hardware provider into a solutions provider.
There are so many options available in today’s next generation of backup and recovery tools that sometimes it can be tough to prioritize which features to implement. In this third installment of my interview series with Dell Software’s General Manager, Data Protection, Brett Roscoe, we discuss four (4) best practices that organizations should prioritize as they implement next generation backup and recovery tools.
2014 may eventually come to be characterized as the year of the tech break-up. Tech conglomerates such as CA Technologies, HP, IBM and, most recently, Symantec have all opted to go down the “break up” route while others such as Cisco and EMC continue to experience internal and external pressures to pursue this option. But as enterprises look to create more agile, automated, cohesive infrastructures, it may ultimately be those such as Dell and Oracle that are opting to “make up” that are best positioned to deliver on these enterprise demands.
An Omaha city employee recently gained unwanted public visibility after sending twelve filing cabinets containing a hundred years of irreplaceable original building permits from the basement of City Hall to the county dump. It turns out that the head of the permits and inspections division decided to get rid of the cabinets as part of cleaning out the department’s basement storage area. The division did not realize that other city employees regularly pulled the permits, which dated from the 1880s through the 1980s. It was also apparently unaware that a local preservation group was developing a plan to move the permits to a new facility in order to make them more secure and accessible to the public.
Like Omaha’s City Hall, businesses often face what appear to be incompatible priorities. IT departments are expected to keep spending in check and know that only 10-20 percent of data is ever accessed again more than 60 days after its creation. But knowing which data to keep available and which data to delete or archive can be a challenge. This type of dilemma is one of many drivers behind the development of a new group of storage systems: public cloud gateways.
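In practice, the first step in that decision is a simple last-access test: anything untouched beyond a cutoff becomes an archive candidate. A minimal sketch of the idea, assuming a 60-day window per the figure above (the inventory data and function names here are hypothetical; a real gateway would read access times from the file system rather than a list):

```python
from datetime import datetime, timedelta

STALE_AFTER = timedelta(days=60)  # cutoff drawn from the 10-20 percent figure above

def archive_candidates(files, now):
    """Return names of files not accessed within the stale window.

    `files` is an iterable of (name, last_access: datetime) pairs --
    a simplified stand-in for walking a file system and reading atimes.
    """
    return [name for name, atime in files if now - atime > STALE_AFTER]

now = datetime(2014, 9, 1)
inventory = [
    ("q2-report.xlsx", datetime(2014, 8, 25)),     # read last week: keep local
    ("2012-invoices.zip", datetime(2013, 1, 10)),  # untouched for months: archive
]
print(archive_candidates(inventory, now))  # -> ['2012-invoices.zip']
```

A cloud gateway automates exactly this kind of policy, transparently tiering the stale files to cloud object storage while keeping recently used data on local disk.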
As DCIG prepares to release its forthcoming DCIG 2014-15 Virtual Server Backup Software Buyer’s Guide, it has uncovered a number of changes in the features offered by virtual server backup software and the ways in which vendors deliver them. While support for VMware and its various capabilities certainly remains a focal point, support for other hypervisors, connectivity to public storage clouds and even tape support are becoming a bigger part of the conversation. Here are five early insights that DCIG has gleaned from the research that it has completed in recent weeks and months.
There is a divergence occurring right now in data storage solutions. On one hand, a number of storage providers seek to deliver highly differentiated storage solutions that work with a broad set of applications and operating systems. On the other, a few providers focus on delivering a storage solution that tightly integrates with one or more applications to deliver unparalleled levels of application performance and ease of management. The latest Oracle ZFS Storage Appliance ZS3 Series with its new OS8.2 provides the best of what both of these categories of storage systems currently have to offer to deliver a storage platform that truly stands apart.
As the role of IT changes from functioning as specialists to generalists, many IT staff members find themselves in the role of a Business Technologist. In this new role, they serve a two-fold purpose. First, they must understand and document the specific needs and requirements of the business by interfacing with key end-users and product managers. Once they document these needs, they then map those requirements to a specific technology solution that solves them.
As I attended sessions at Microsoft TechEd 2014 last week and talked with people in the exhibit hall, a number of themes emerged, including “mobile first, cloud first”, hybrid cloud, migration to the cloud, disaster recovery as a service, and flash memory storage as a game-changer in the data center. But as I reflect on the entire experience, a statement made by John Loveall, Principal Program Manager for Microsoft Windows Server, during one of his presentations sums up the overall message of the conference: “Today it is really all about the integrated solution.”
At TechEd 2014 in Houston, TX this week, Microsoft made it clear that it is no longer content to just send customers to storage array vendors to meet their storage needs, especially when it comes to embracing a cloud-oriented approach to infrastructure. In the process of improving Windows storage technology, Microsoft is effectively delivering the benefits of–and addressing the barriers to–the adoption of server SAN technology.
There is backup and then there is backup. To meet the backup and recovery needs of today’s organizations, they need to verify that their selected backup appliance includes the features needed to protect their environment today and positions them to meet their needs into the foreseeable future. In this third installment of DCIG’s interview with STORServer President Bill Smoldt, he describes the new must-have features that backup appliances must offer.
I arrived in Las Vegas last night to spend three (3) days and nights with a forecasted 90,000 other attendees at the National Association of Broadcasters (NAB) show. One of NAB’s opening events – and my first stop at the show – was the ShowStoppers event at the Wynn Hotel and Casino near the Las Vegas Convention Center. There, analysts and press got to spend a couple of uninterrupted hours talking with select providers about numerous emerging technologies, one of which was software defined storage.
In this final blog entry from our interview with Nimbus Data CEO and Founder Thomas Isakovich, we discuss his company’s latest product, the Gemini X-series. We explore the role of the Flash Director and how the Gemini X-series appeals to enterprises as well as cloud service providers.
One of the more difficult tasks for anyone deeply involved in technology is seeing the forest for the trees. While they are often responsible for supporting the technical components that make up today’s enterprise infrastructures, stepping back to recommend which technologies are the right choices for their organization going forward is a more difficult feat. While there is no one right answer that applies to all organizations, five (5) technologies – some new as well as some old technologies that are getting a refresh – merit prioritization by organizations in the coming months and years.
DCIG is pleased to announce the availability of its DCIG 2014-15 $50K and Under Converged Infrastructure Buyer’s Guide. In this Buyer’s Guide, DCIG weights, scores and ranks 10 converged infrastructure solutions from six (6) different providers. Like previous DCIG Buyer’s Guides, this Buyer’s Guide provides the critical information that organizations of all sizes need when selecting a converged infrastructure solution to help expedite application deployments and then simplify their long-term management.
Anyone who managed IT infrastructures in the late 1990s or early 2000s probably still remembers how external storage arrays were largely a novelty reserved for high-end enterprises with big data centers and deep pockets. Fast forward to today and a plethora of storage arrays exist in a variety of shapes and sizes at increasingly low price points. As such, it can be difficult to distinguish between them. To help organizations sort them out, my blog entry today provides a primer on the types of storage arrays currently available on the market.
Archiving or backing up large amounts of data to the cloud is very appealing until one starts to examine the mechanics of actually sending data to or retrieving it from the cloud. Waiting minutes or hours to send or retrieve data is no longer acceptable to today’s end-users, who are rapidly becoming accustomed to near-instant response times. In this fifth and final part of my interview series with BridgeSTOR’s CEO John Matze, he explains why sending or retrieving data in a piecemeal fashion is the fastest and most effective way to move it to and from the cloud.
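In concept, piecemeal transfer simply means breaking an object into chunks and moving them concurrently rather than streaming it as one long serial upload. A minimal sketch of the pattern, with an in-memory dictionary standing in for the cloud object store (every name here is illustrative, not BridgeSTOR’s actual API; a real gateway would issue HTTP PUT/GET calls per chunk):

```python
from concurrent.futures import ThreadPoolExecutor

CHUNK_SIZE = 8  # bytes per chunk for the demo; real gateways use megabytes

def split_into_chunks(data: bytes, size: int = CHUNK_SIZE):
    """Break a payload into fixed-size chunks, preserving their order."""
    return [data[i:i + size] for i in range(0, len(data), size)]

# A stand-in "cloud" keyed by (object_name, chunk_index).
cloud = {}

def upload_chunk(name: str, index: int, chunk: bytes):
    # In a real gateway this would be one HTTP PUT of one part.
    cloud[(name, index)] = chunk

def upload_piecemeal(name: str, data: bytes) -> int:
    """Upload all chunks concurrently; return the chunk count."""
    chunks = split_into_chunks(data)
    with ThreadPoolExecutor(max_workers=4) as pool:
        list(pool.map(lambda t: upload_chunk(name, *t), enumerate(chunks)))
    return len(chunks)

def download_piecemeal(name: str, count: int) -> bytes:
    """Fetch all chunks concurrently and reassemble them in order."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        parts = list(pool.map(lambda i: cloud[(name, i)], range(count)))
    return b"".join(parts)

n = upload_piecemeal("backup-001", b"archive payload to move in pieces")
print(download_piecemeal("backup-001", n))  # round trip returns the original bytes
```

Because each chunk transfers independently, slow or failed parts can be retried in isolation and several transfers proceed in parallel, which is what makes the piecemeal approach faster in practice than one monolithic stream.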
Storing archival and backup data in the cloud is high on the list of priorities of many organizations, if for no other reason than that the data remains accessible and available without organizations having to bear the burden of managing it locally long term. But as more organizations use cloud storage gateways to store this data, they will find distinct differences in how these appliances manage data in the cloud, with differences sometimes existing even between appliances from the same vendor. In this fourth part of my interview series with BridgeSTOR’s CEO John Matze, he reveals the various ways in which the BridgeSTOR NAS and VTL cloud gateway appliances store, access and manage this data locally and in the cloud.