Mention data management to almost any seasoned IT professional and they will almost immediately greet the term with skepticism. While organizations have found they can manage their data within certain limits, once they remove those boundaries and attempt to do so at scale, those initiatives have historically fallen far short, if not failed outright. It is time for that perception to change. Twenty years in the making, Commvault Activate puts organizations in a position to finally manage their data at scale.
During the recent HP Deep Dive Analyst Event at its Fremont, CA, offices, HP shared some notable insights into the percentage of backup jobs that complete successfully (and unsuccessfully) within end-user organizations. Drawing on anonymized data gathered from hundreds of backup assessments at end-user organizations of all sizes, HP found that over 60% of them had backup job success rates of 98% or lower, and that 12% had success rates below 90%. Even more noteworthy, through its use of Big Data analytics HP has identified large backups (those that take more than 12 hours to complete) as the primary contributor to the backup headaches that organizations still experience.
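As a rough illustration of this kind of analysis (this is not HP’s analytics code; the job-record format and field names below are assumptions), a few lines of Python can tally per-organization backup success rates and count failed jobs that ran past the 12-hour mark:

```python
# Hypothetical sketch: compute per-organization backup success rates and
# count failed "large" backups (more than 12 hours), given simple job records.
from collections import defaultdict

def summarize_backups(jobs):
    """jobs: iterable of dicts like {"org": str, "hours": float, "succeeded": bool}."""
    per_org = defaultdict(lambda: {"total": 0, "ok": 0, "long_failures": 0})
    for job in jobs:
        stats = per_org[job["org"]]
        stats["total"] += 1
        if job["succeeded"]:
            stats["ok"] += 1
        elif job["hours"] > 12:  # failed backups that also qualify as "large"
            stats["long_failures"] += 1
    return {org: {"success_rate": round(s["ok"] / s["total"], 3),
                  "long_failures": s["long_failures"]}
            for org, s in per_org.items()}

sample = [
    {"org": "acme", "hours": 14.5, "succeeded": False},
    {"org": "acme", "hours": 0.8,  "succeeded": True},
    {"org": "acme", "hours": 1.2,  "succeeded": True},
]
print(summarize_backups(sample))
# {'acme': {'success_rate': 0.667, 'long_failures': 1}}
```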
Think “Dell” and you may think “PCs,” “servers,” or, even more broadly, “computer hardware.” If so, you are missing one of the biggest transformations under way among technology providers today: over the last 5+ years, Dell has acquired multiple software companies and is using that intellectual property (IP) to drive its internal turnaround. In this sixth installment of my interview series with Brett Roscoe, General Manager, Data Protection for Dell Software, we discuss how these software acquisitions are fueling Dell’s transformation from a hardware provider into a solutions provider.
Deriving value from the plethora of unstructured data created by today’s multiple sources of Big Data hinges on analyzing and acting on it in real time. To do so, enterprises must employ a solution that analyzes Big Data streams as they flow in. Using TIBCO Software’s Event Processing platform, enterprises can process Big Data streams while they are still in motion, gaining real-time operational intelligence so they may take the appropriate action while that action still has meaningful value.
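To make the idea of processing data “in motion” concrete, here is a minimal, generic sketch (it does not use TIBCO’s Event Processing platform or its APIs; the event format and threshold are assumptions) of evaluating each event as it arrives rather than analyzing the data after it comes to rest:

```python
# Hypothetical sketch of stream processing: events are evaluated as they arrive,
# so an alert can be raised while acting on it still has value.
import time
from typing import Dict, Iterator

def event_stream() -> Iterator[Dict]:
    """Stand-in for a live feed (sensor readings, transactions, clickstreams)."""
    for reading in [72, 75, 97, 74, 101, 73]:
        yield {"ts": time.time(), "value": reading}

def detect_anomalies(stream: Iterator[Dict], threshold: float = 95.0) -> Iterator[Dict]:
    """Flag events that exceed the threshold the moment they appear in the stream."""
    for event in stream:
        if event["value"] > threshold:
            yield event

for alert in detect_anomalies(event_stream()):
    print(f"act now: value {alert['value']} exceeded the threshold at {alert['ts']:.0f}")
```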
Around two years ago the DCIG 2011 Enterprise Scale-Out Storage Buyer’s Guide was released. At the time we mentioned that scale-out systems were being used to store “Big Data” and create private storage clouds. Since then scale-out storage systems have become the foundation for building out private storage clouds, which prompted DCIG to change the name of its refreshed Buyer’s Guide to better reflect the intended use case for these storage arrays.
Mention the year 2008 or 2009 to almost any person and it almost inevitably elicits a negative reaction in terms of how those years were from a business perspective. However, as DCIG renews its annual tradition of reflecting on which blog entries were most read on its website during 2012, 2008 and 2009 emerge as very good years: they produced content that is still relevant and was still frequently read in 2012. Today and over the next four (4) business days, I will share which blog entries garnered the most attention on DCIG’s website in 2012.
The factors that influence which tape library to use in your environment used to be much simpler when tape served primarily as a backup target. But as disk has evolved to assume that role, tape libraries have added new features so they may assume a much more strategic position within organizations, supporting their Big Data and Cloud initiatives. In this webcast, I look at how to choose the right tape library for your environment in light of these new forces impacting the use of tape libraries within organizations.
An integrated and centralized data store model that enables stakeholders from throughout an organization to harvest and analyze data on the same platform continues to be the goal of many Global 2000 organizations as they strive to address the requirements of Big Data. However, with today’s decentralized, cloud-based storage systems and with data storage requirements for the Global 2000 reaching petabyte and even exabyte levels, massive centralized single data store infrastructures with single points of failure, such as Apache Hadoop, may not be the most effective long-term solution.
The accelerating growth of unstructured Electronically Stored Information (ESI) is leaving IT organizations struggling with how to store and manage all of this new information. Aside from needing to provide the underlying storage infrastructure to host this amount of data, companies are also faced with the task of properly managing their Big Data file stores to meet both existing and emerging governance, risk and compliance (GRC) obligations. There are five initial steps they can take now to get their organization in front of these demands.
To say with any degree of certainty what technologies will be hot in the next 6 – 12 months generally takes equal parts smarts and industry insight, with a little bit of luck sprinkled in. As I compare what I forecast earlier this year to what I see taking place now, I was certainly right on some points but premature in predicting others. So today, with the midpoint of 2012 upon us, I take a look at five specific technology trends impacting organizations right now.
The deployment of flash memory as either storage or memory almost inevitably results in increased application performance. However, to get the real ‘kick’ in performance that today’s transactional applications need and that flash can provide, a more elegant approach to deploying flash is needed. Today I continue my discussion with Fusion-io Senior Director of Product Management, Brent Compton, who elaborates on the APIs that the Fusion ioMemory SDK exposes to make this boost in transactional performance possible.
Last week developers of enterprise applications got some new toys to play with in the storage memory realm. The newly released ioMemory SDK gives developers the ability to better exploit the potential of Fusion-io’s line of enterprise flash memory storage. Fusion-io expects the SDK will simplify code bases while providing a sizable performance boost. We begin our discussion with Brent Compton, Senior Director of Product Management at Fusion-io.
We live in the information age, where data is being produced at rates that almost boggle the mind. But living in the age of Big Data does not mean this data is easily available and digestible. The new DCIG Interactive Buyer’s Guide (IBG) addresses this basic organizational need, making information about enterprise technologies easily available and digestible by delivering it in the form of Research as a Service.
Today’s expectations for always-on environments, coupled with the introduction of Big Data into enterprise environments, are stretching today’s backup software well beyond the problems it was ever intended to solve. As such, enterprises can no longer treat backup software as a ‘one size fits all’ solution.
Amazon announced its Storage Gateway (beta) on January 25th, about two days before my article on VMware and Citrix squaring off in the “Dropbox for Enterprise” market. In that article I noted that VMware and Citrix are exploiting a basic limitation of Dropbox, Evernote and Box, one that stems from their nature as Consumerization of IT (CoIT) products: consumer-based file-share-and-synch applications cannot be installed in a company’s data center. As file-share-and-synch drives cloud adoption in the enterprise, vendors are emerging from all corners.
Network security monitoring is a constantly changing landscape of both tools and methodologies. Most tools today, however, take a lone “cowboy” approach in which data center solutions operate independently of one another. MetaFlows is changing that. Today, I am continuing my interview with MetaFlows CEO Livio Ricciulli, discussing how its product optimizes network security monitoring and performance.
Enterprise organizations face the daily challenge of ever-growing threats to their network and IT infrastructure. Not only are these threats growing, but they are constantly changing as well, forcing companies to adapt by changing not only their tools but also their training. Today, I’m talking with MetaFlows CEO Livio Ricciulli about how MetaFlows is addressing these problems by delivering network security monitoring using the “Software as a Service” model.