February 27, 2009

Like everyone else these days, I’ve been watching the economic news with a feeling of dread combined with a strong sense of outrage. How did those idiots – and by “those idiots” I mean “them” and not “us” – screw this all up for the rest of us? Luckily, the talking heads on the news shows all assure us things will get better – just as soon as we hit absolute rock bottom.

In the meantime, life goes on for all of those who have to keep the lights on and the IT systems running. The only difference is that we’re now asked to do more with less, or in some cases the same with a lot less. But all is not lost. There are solutions that can help businesses do more without breaking the already reduced budget.

The old mantra was ROI – return on investment. Executives needed to see that the cost of new IT purchases was justified by what they could return. But that was in the good times. These days, the goal is ROA – return on assets: showing that the business can make better use of the data center equipment it already has.

Virtualization is the first solution that comes to mind. Server virtualization can make full use of compute power by hosting multiple application servers in a single box. Storage virtualization offers similar efficiencies through consolidation and thin provisioning to reduce the cases of under-utilized equipment. Why purchase another 10-20TB of storage when there may already be 20TB of wasted space assigned to servers that are not using it? And while this is not a new idea, the goal should be to achieve these efficiencies with the existing storage subsystems the company already owns, not to go out and buy a new storage system that provides this. Remember, the goal is to do more with the assets you already have.

The way to do this is with software. Software that works with any type of storage. Software that works with your existing Fibre Channel SAN (if you have one) or with your existing LAN (if you don’t). Software that works with your servers – whether they are physical or virtual. But most importantly, software that works with the applications your business relies on. Because in the end, the whole reason for all the hardware, and now the software, is to run these applications.

Beyond storage virtualization, this software would also provide other advanced services that let users do more with less. Data protection services would let users keep multiple versions of important application data without requiring special hardware, using the existing storage in the most efficient manner. Rather than keeping multiple full copies of important data, it would use space-efficient delta snapshots. This means users could keep 20 to 30 delta snapshots in less space than is currently used to keep 2 or 3 full copies (the quick sketch at the end of this post shows the arithmetic). More copies with less space. And more efficient copies can also mean more frequent copies.

Data protection with remote replication gives this software the ability to provide disaster recovery, again without requiring special hardware to link the two sites. Having these features in software rather than in hardware means users are free to select different storage subsystems at the different sites. So if the business does need to purchase any new storage, it is free to choose the one that offers the best price rather than being forced to purchase one that’s compatible with the storage it already uses. Advanced software also means the replication features are optimized to analyze the data, ensuring that the amount transmitted is the minimum needed. This allows replication to operate over long distances using the least amount of bandwidth, letting the business use its existing WAN connections rather than being forced to upgrade to larger, more expensive ones. Again, do more with less.

So in these tough economic times, the way to do more with less is to use the right software rather than investing in more hardware. At least in my humble opinion. What do you think?
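For those who want to check the snapshot arithmetic, here is a minimal back-of-the-envelope sketch. The volume size and change rate are hypothetical assumptions, not figures from any particular product.

```python
# Compare the space consumed by full copies vs. delta snapshots.
# Assumptions (hypothetical): a 1TB volume, 3% of its data changing
# between snapshots, and delta snapshots storing only changed blocks.

volume_gb = 1000     # production volume size (GB)
change_rate = 0.03   # fraction of data changed per snapshot interval

full_copy_space = 3 * volume_gb             # 3 full copies
delta_space = 30 * volume_gb * change_rate  # 30 delta snapshots

print(f"3 full copies:      {full_copy_space:,} GB")  # 3,000 GB
print(f"30 delta snapshots: {delta_space:,.0f} GB")   # 900 GB
```

With these assumptions, 30 delta snapshots fit in less than a third of the space consumed by 3 full copies, which is where the “more copies with less space” claim comes from.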

February 27, 2009

Direct attached storage still predominates in small businesses, but as networked storage becomes more affordable and simpler to manage, network hard drives and network attached storage (NAS) appliances are poised to become much more pervasive. Recently Jerome Wendt, DCIG’s Lead Analyst and President, met with Jonathan Huberman, President of Iomega and of EMC’s Consumer and Small Business Products Division, to discuss Iomega’s growing role in networked storage for small businesses and similarly sized work groups. In this first of a 3-part series, Jonathan examines current trends in networked storage for small businesses, how Iomega is differentiating itself from competitors and what advantages being part of EMC brings to Iomega.

February 26, 2009

Over the past year there has been a lot of talk and speculation about Electronic Health Records (EHR). The topic started making headlines last year as President Obama and Senator McCain sparred over how best to fix health care, with EHR touted as the single best way to control the ever-increasing costs of medical treatment. Although it remains to be seen whether this is actually the case, the recent stimulus bill passed by Congress on February 13th, 2009, has ensured that EHR projects will be funded.

February 25, 2009

In looking at the tape market and what tape libraries must provide to meet today’s organizational needs, it is refreshes, not overhauls, that are required. Because tape libraries are becoming a secondary, as opposed to a primary, backup target in customer environments, tape library providers need to re-prioritize and even scale back the number of changes they make: if users do not want or use specific features, they will not pay for them.

February 24, 2009

Should I Archive Today? Tick…tick…tick… Data – do I need to save it? For how long do I need to save it? Do I need to save it in an immutable format? Do I have to comply with an existing regulatory requirement? Will the new Obama administration create new regulations with which I’ll have to […]

February 24, 2009

Over the last few months DCIG has spent a fair amount of time researching and documenting specific reasons why tape will not die. Green IT is the reason we most often hear cited for retaining tape, though new disk-based deduplication and replication technologies, coupled with new disk storage system designs based on grid storage architectures, can offset some of those concerns. So before organizations conclude that after 30, 90 or 180 days they should immediately move their archival and backup data, deduplicated or otherwise, from disk to tape just to save money, they should weigh the intangible savings from an eDiscovery perspective that keeping data on disk provides and that are not always feasible with tape.

February 23, 2009

Data protection is top of mind for more enterprise organizations today as they look to redesign how they protect their data. Rapidly changing economic forces, new technologies and steadily growing volumes of data are prompting enterprises to rethink how they can best protect, manage and recover their data by leveraging these new technologies without adding new staff or extraordinary costs. To get Symantec’s take on these new challenges facing organizations, DCIG lead analyst Jerome Wendt recently met with Deepak Mohan, Symantec’s senior vice president of the Data Protection Group, to discuss these topics.

February 20, 2009

If you have followed the news lately, it would appear that the media and President Obama feel the economy is firmly entrenched somewhere between disaster and Armageddon, which has framed much of the debate surrounding the stimulus bills in both houses of Congress. When the Senate passed its version of the bill on February 9th, it promised $838 billion for spending projects designed to jump-start the economy. But as with most things in government, there is a lot more in the details than in the headlines. Now that the stimulus bill is out in the open, DCIG has a clearer view of where health care regulation is going and how IT will be affected.

February 19, 2009

Oracle’s ASM and 3PAR’s Thin Provisioning could be combined to offer a complete, end-to-end storage solution. Oracle’s ASM feature would create, allocate, place and rebalance data files for performance, while 3PAR’s Thin Provisioning would dedicate disk space on the fly, only when needed.
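To make the “on the fly, only when needed” behavior concrete, here is a minimal toy model of thin provisioning. The class, chunk size and numbers are illustrative assumptions, not a description of how 3PAR or Oracle ASM actually implements this.

```python
# Toy model of thin provisioning: physical chunks are dedicated only
# when a block is first written, not when the volume is created.
# (Illustrative sketch; not 3PAR's or Oracle ASM's actual implementation.)

CHUNK_MB = 256  # hypothetical allocation unit

class ThinVolume:
    def __init__(self, virtual_size_mb):
        self.virtual_size_mb = virtual_size_mb  # capacity the host sees
        self.allocated = set()                  # chunks actually backed by disk

    def write(self, offset_mb):
        """Dedicate physical space to the chunk containing this offset."""
        self.allocated.add(offset_mb // CHUNK_MB)

    def physical_mb(self):
        return len(self.allocated) * CHUNK_MB

vol = ThinVolume(virtual_size_mb=100_000)  # host sees ~100GB
vol.write(0)      # e.g., ASM lays down a data file header...
vol.write(600)    # ...and writes its first extents
print(vol.physical_mb())  # 512MB dedicated so far, not 100GB
```

The point of the pairing: ASM can create data files sized for future growth while the array only dedicates disk behind the blocks ASM actually writes.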

February 18, 2009

In listening to all the market rumblings, I realize there’s more said and written about backup than any other topic. A quick tour of Delicious, where thousands of IT and storage professionals share knowledge and ideas, underscores this backup fixation. Of the 369,960 links tagged “backup,” only 11,274 are tagged “backup+recovery.” I’m worried that the […]

February 18, 2009

Video surveillance is shaping up as the next big thing in enterprise security. IP-based cameras from Mobotix and the continued growth of high-capacity network attached storage systems from Overland Storage make it possible for almost any size and type of organization to inexpensively deploy a video surveillance solution. But what was still missing until recently was a comprehensive backend support structure for implementing these solutions and then supporting them long-term.

February 17, 2009

The 64 Oz. Cherry Coke. Enterprises have been super-sized with primary disk for too long. It’s like buying a 64 oz. Cherry Coke at the local 7-Eleven day after day when you really aren’t that thirsty. Enterprises continue to buy more primary capacity than they need or can afford. I’ve visited a lot of […]

February 12, 2009

A recent DCIG blog entry called into question the value of Bear Stearns’ selection of Orchestria, given its inability to detect the alleged illegal activities of two of its Asset Management portfolio managers. More specifically, it asked why Orchestria did not detect the illegal activities of these individuals and why Bear Stearns did not configure it to monitor for these activities in the first place. The posting prompted a comment and phone call from Alan Morley, one of the individuals formerly responsible for implementing and managing Orchestria at Bear Stearns, who explained why monitoring, detecting and preventing this activity is not as easy as it sounds.

February 11, 2009

If you happened to attend any recent conferences or trade shows, then you know that most of the discussions center on driving costs out of storage environments. In the current yo-yo economy, most IT Directors are looking for new and unique ways to solve their storage dilemma as storage capacity continues to […]

February 10, 2009

To kick off the new year, we recently released our Storage Predictions for 2009. We’ve received a lot of interest in this list since we released it, and I personally have been asked about prediction number 3, “RAID will Hit a Data Dead End”. Allow me to explain. For this prediction, we say: RAID Nears Retirement. […]

February 10, 2009

Organizations have learned that the benefits of peace of mind, simplified operations and lower TCO that MSPs can offer are too good to pass up. By taking much of the burden of application maintenance and management off of internal IT resources, organizations can focus on more strategic initiatives that will help them respond more quickly to market opportunities and grow the business.

February 9, 2009

Recently, I had a passing conversation with an attorney about FRCP and as we were talking, he kept bringing up areas that concerned him. So I asked him, “What is your biggest eDiscovery concern?” Without hesitation he replied, “Having a judge issue ‘Death Instructions’.”

February 6, 2009

One of the more critical pieces of information that organizations need as they put together a disaster recovery plan is how much data they have in their environment and how quickly it is changing. This information is so important because without it, organizations often have no way to effectively size how much or what type of capacity they need to protect and recover their production data. In fact, I was astonished at how little information was available about this topic and how few good articles there were on the subject.
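As a hypothetical illustration of why these two numbers matter, here is a minimal sizing sketch. All of the inputs (total data, daily change rate, replication window) are assumed values, and a real plan would also account for peaks, growth, compression and deduplication.

```python
# Rough DR sizing from the two inputs the post describes:
# how much data exists and how quickly it changes.
# (All numbers are hypothetical assumptions.)

total_tb = 10        # data in the environment (TB)
daily_change = 0.05  # 5% of the data changes each day
window_hours = 8     # nightly window available to replicate the changes

changed_gb = total_tb * 1024 * daily_change
required_mbps = changed_gb * 8 * 1024 / (window_hours * 3600)

print(f"Changed data per day: {changed_gb:,.0f} GB")        # 512 GB
print(f"WAN speed to keep up: {required_mbps:,.0f} Mbit/s") # ~146 Mbit/s
```

Without the change rate, there is no way to know whether an existing WAN link can keep the recovery site current or how much capacity the protection copies will consume.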

February 5, 2009

Enterprise data protection software is experiencing a fundamental shift in terms of what organizations expect it to deliver and the amount of distributed structured and unstructured data it needs to protect. As recently as a few years ago, the expectations of enterprise organizations were relatively modest by today’s standards: support for most major operating systems, integration with major applications (MS Exchange, Oracle, etc.) and tape library support. While some of those requirements still hold true today, more has changed than has stayed the same, and that is putting a great deal of pressure on data protection products to evolve swiftly.

February 4, 2009

A clustered server environment is only as reliable as the system administrators who maintain it. The challenge they encounter after they configure and deploy the hardware and software that make up a clustered environment is how to maintain it. Once a mission-critical application is deployed, most system administrators leave the configuration alone for fear of disrupting it, so crucial tasks such as applying patches and making configuration changes go uncompleted simply because of the nature of the system itself. What catches organizations off-guard is that at some point down the road, when an event does prompt a failover from one server to another, the failover fails because small changes have accumulated in the environment that now preclude it from successfully taking place.
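One hypothetical way to catch this kind of drift before a failover exposes it is to periodically compare configuration fingerprints across the cluster nodes. The sketch below is illustrative only; the watched file list is an assumption, not a reference to any particular clustering product.

```python
# Detect configuration drift between cluster nodes by comparing
# checksums of files that must stay in sync for failover to succeed.
# (Minimal sketch; the watched files are hypothetical examples.)
import hashlib
from pathlib import Path

WATCHED = ["/etc/hosts", "/etc/services", "/etc/cluster/app.conf"]

def fingerprint(paths):
    """Map each file to the SHA-256 of its contents (None if missing)."""
    return {
        p: hashlib.sha256(Path(p).read_bytes()).hexdigest()
           if Path(p).exists() else None
        for p in paths
    }

def drift(node_a, node_b):
    """Return the files whose fingerprints differ between two nodes."""
    return [p for p in node_a if node_a[p] != node_b.get(p)]

# In practice each node would run fingerprint() locally and report in;
# the comparison is the part that matters here.
active = fingerprint(WATCHED)
standby = fingerprint(WATCHED)  # imagine this arrived from the standby node
print(drift(active, standby) or "nodes are in sync")
```

Run regularly, and after every change, a check like this turns a silent failover failure into a routine maintenance ticket.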