Ever since I got involved with IT in general, and data storage specifically, the predominant way that organizations have managed their data growth has been to throw more storage at the problem. Sure, they pay homage to technologies like archiving, data lifecycle management and storage resource management (SRM), but at the end of the day the “just buy more” principle prevails. Yet as we enter 2013, data management is finally poised to become a data center priority.
It’s no secret that ‘Big Data’ is becoming a ‘Big Problem’ for organizations from a data and storage management perspective. However, what organizations may fail to realize is that the best way to solve their Big Data problems is NOT to mindlessly throw more resources at them. Rather, it is to look at Big Data more strategically and then tackle the data management problems it creates in one fell swoop using software like CommVault® Simpana® and its OnePass technology.
Information managers can expect data storage companies to drive significant marketing campaigns around Big Data as we enter 2012. Yet storage itself is the least of anyone’s concerns, according to The Economist Intelligence Unit (EIU) report, Big Data: Harnessing a game-changing asset. Information governance in 2012 will instead require that a Data Science strategy, and its practitioners, be added to every business team.
Last week, while doing some research, I ran across the term Data Center Infrastructure Management (DCIM) software for the first time. Intrigued, I spent a little time investigating what it was, only to discover that for the most part DCIM is a new name for an old friend (or nemesis, depending on your experiences): Storage Resource Management (SRM). But this is one of the few times where a change of name may stand to do everyone a lot of good, from the vendors who are providing DCIM software to the organizations who are buying it.
This is one of my favorite blogs of the year to write. Even though this is only the second time since DCIG launched its blogging site two years ago that I have had the opportunity to write a blog in this format, I have been looking forward to looking back all year. In case you have not yet figured it out, today I take a look back at the top 10 most read blogs in 2009 on the DCIG site. However, this year I am doing a two-part series, with today’s blog examining the 10 most read blogs in 2009 that were also written in 2009.
The big news in the industry this past week was the private cloud announcement made by and between EMC Corp., Cisco Systems Inc. and VMware. In brief, these three companies are aligning to provide integrated virtualization product bundles, referred to as Vblocks, for midsize, large and enterprise organizations. Conceptually and practically, this is a smart move on the part of these three companies. Though some reports cited fear of user lock-in if this configuration is deployed, one would think concerns about deploying a private cloud that does not work as expected would be far greater.
Gone are the days when the sole purpose of storage resource management (SRM) software was to report on file ages and sizes, storage utilization and server-to-storage LUN assignments. Those functions are still important but not nearly enough to meet the demands of today’s progressive enterprise data centers. These organizations are demanding faster, easier deployments of SRM software that grant them more insight into their increasingly virtualized environments as well as better reporting and management of their replication processes that are becoming so critical in today’s data centers. It is this void that APTARE StorageConsole 7.0 seeks to fill.
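To make the “classic” SRM functions mentioned above concrete, here is a minimal sketch of the kind of file age and size inventory that first-generation SRM tools produced. This is purely illustrative, not code from APTARE StorageConsole or any other product; the function name and report layout are my own.

```python
import time
from pathlib import Path


def file_age_report(root: str):
    """Walk a directory tree and report each file's size (bytes) and age
    (days since last modification) -- the basic inventory a classic SRM
    tool generates to find stale data and reclaim capacity."""
    now = time.time()
    rows = []
    for path in Path(root).rglob("*"):
        if path.is_file():
            stat = path.stat()
            age_days = (now - stat.st_mtime) / 86400
            rows.append((str(path), stat.st_size, round(age_days, 1)))
    # Largest files first -- the prime candidates for archiving or cleanup.
    return sorted(rows, key=lambda r: r[1], reverse=True)
```

A report like this is easy to produce; as the paragraph above notes, the hard part today is extending that visibility into virtualized environments and replication processes.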
The point is that to succeed in the SRM space as an independent software vendor that does not tie the software purchase to the hardware, you need to deliver three things: (1) a great product; (2) great value; and (3) a genuine commitment to develop and evolve the product to meet customers’ needs. One would think those points would be obvious, but I believe a major reason that many SRM products failed on their first go-round was that vendors seemed more interested in selling half-baked software and getting bought out by larger vendors than in providing products that worked, provided value out of the box and delivered value to customers on a long-term basis.
The challenge that APTARE faces, however, is the same challenge that every other SRM vendor faces: keeping SRM software relevant in the face of declining storage capacity prices. This factor alone often makes it far too easy for companies to throw more storage capacity at the problem rather than trying to monitor and proactively manage it. Regardless of whether or not APTARE has the right architecture, it needs to help break users of their storage consumption habit.
Can APTARE’s StorageConsole remain relevant in 2008 and beyond? That was a question that weighed on my mind as I met with Rick Clark, APTARE’s President and CEO, a couple of weeks ago. The purpose of the briefing: receive an update on what steps APTARE is taking to keep its StorageConsole 6.5 product alive and growing as the data protection space evolves. Of course, the particular challenge that StorageConsole must address now and in the coming years is managing the growing use of disk in data protection while weaning itself off of managing tape-based backup.