Like everyone else these days, I’ve been watching the economic news with a feeling of dread combined with a strong sense of outrage. How did those idiots – and by “those idiots” I mean “them” and not “us” – screw this all up for the rest of us? Luckily, the talking heads on the news shows all assure us things will get better – just as soon as we hit absolute rock bottom. In the meantime, life goes on for all of those who have to keep the lights on and the IT systems running. The only difference is that we’re now asked to do more with less, or in some cases the same with a lot less. But all is not lost. There are solutions that can help businesses do more without breaking the already reduced budget. The old mantra was ROI – return on investment. Executives needed to see that the cost of new IT purchases was…
Monthly Archives: February 2009
Direct attached storage still predominates in small businesses, but as networked storage becomes more affordable and easier to manage, network hard drives and network attached storage (NAS) appliances are poised to become much more pervasive. Recently Jerome Wendt, DCIG’s Lead Analyst and President, met with Jonathan Huberman, President of Iomega as well as the Consumer and Small Business Products Division of EMC, to discuss Iomega’s growing role in networked storage for small businesses and other similarly sized work groups. In this first of a 3-part series, Jonathan examines current trends in networked storage for small businesses, how Iomega is differentiating itself from competitors and what advantages being a part of EMC brings to Iomega.
Over the past year there has been a lot of talk and speculation about Electronic Health Records (EHR). The topic started making headlines last year as President Obama and Senator McCain sparred over how best to fix health care, with EHR touted as the single best way to control the ever-increasing costs of medical treatment. Although it remains to be seen if this is actually the case, the recent stimulus bill passed by Congress on February 13th, 2009, has ensured EHR projects will be funded.
Looking at the tape market and what tape libraries need to provide to meet today’s organizational needs, it is refreshes, not overhauls, that are required. Because tape libraries are becoming a secondary, rather than a primary, backup target in customer environments, tape library providers need to re-prioritize and even scale back the number of changes they make: if users do not want or use specific features, they will not pay for them.
Should I Archive Today? Tick…tick…tick… Data – do I need to save it? For how long do I need to save it? Do I need to save it in an immutable format? Do I have to comply with an existing regulatory requirement? Will the new Obama administration create new regulations with which I’ll have to comply? Will my company be around next year or will we be acquired by someone else? How could the new company’s corporate policies change my retention policies? These are the types of questions we constantly hear organizations struggle with. These ultimately lead to archiving decisions being delayed for months or years. Meanwhile, the clock keeps ticking towards that inevitable lawsuit (think of the implications of the Federal Rules of Civil Procedure). It’s not a matter of if, but when. If you don’t come up with a strategy on how to tackle current or future policies and procedures, it will cost you more than what the…
Over the last few months DCIG has spent a fair amount of time researching and documenting specific reasons why tape will not die. Green IT is the reason we most often hear cited for retaining tape, though new disk-based deduplication and replication technologies, coupled with new disk storage system designs based on grid storage architectures, can offset some of those concerns. So before organizations conclude that after 30, 90 or 180 days they should immediately move their archival and backup data, deduplicated or otherwise, from disk to tape just to save money, they should weigh the intangible eDiscovery savings that keeping data on disk provides and that tape cannot always match.
Data protection is top of mind for more enterprise organizations today as they look to redesign how they protect their data. Rapidly changing economic forces, new technologies and steadily growing volumes of data are prompting enterprises to rethink how they can best protect, manage and recover their data by leveraging these new technologies without adding new people or extraordinary costs to accomplish these objectives. To get Symantec’s take on these new challenges facing organizations, DCIG lead analyst Jerome Wendt recently met with Deepak Mohan, Symantec’s senior vice president of the Data Protection Group, to discuss these topics.
If you have followed the news lately it would appear that the media and President Obama feel the economy is firmly entrenched somewhere between disaster and Armageddon, which has framed much of the debate surrounding the stimulus bills that are in both houses of Congress. When the Senate passed its version of the bill on February 9th, it promised $838 billion for spending projects designed to jump-start the economy. But like most things in government there is a lot more in the details than the headlines. Now that the stimulus bill is out in the open, DCIG has a clearer view of where health care regulation is going and how IT will be affected.
Oracle’s ASM and 3PAR’s Thin Provisioning could be combined to offer a complete, end-to-end, storage solution. Oracle’s ASM feature would create, allocate, place, and rebalance data files for performance and 3PAR’s Thin Provisioning would dedicate disk space on the fly and only when needed.
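The allocate-on-write behavior that makes thin provisioning complement ASM can be illustrated with a toy model: capacity is advertised up front, but physical blocks are consumed only when data is actually written. This is a minimal sketch under that assumption, not 3PAR’s implementation or any Oracle API; the class and names here are hypothetical.

```python
class ThinVolume:
    """Toy thin-provisioned volume: physical blocks are allocated only on
    first write. Real arrays allocate from a shared pool in larger chunks;
    this class is purely illustrative."""

    BLOCK_SIZE = 4096  # bytes per block (illustrative value)

    def __init__(self, virtual_blocks):
        self.virtual_blocks = virtual_blocks  # advertised (virtual) capacity
        self.blocks = {}                      # physical blocks, allocated lazily

    def write(self, block_no, data):
        if not 0 <= block_no < self.virtual_blocks:
            raise IndexError("write past advertised capacity")
        self.blocks[block_no] = data          # physical allocation happens here

    def read(self, block_no):
        # unwritten blocks read back as zeros, as on a thin LUN
        return self.blocks.get(block_no, b"\x00" * self.BLOCK_SIZE)

    def allocated_bytes(self):
        return len(self.blocks) * self.BLOCK_SIZE


vol = ThinVolume(virtual_blocks=1_000_000)   # ~4 GB advertised
vol.write(0, b"x" * 4096)
vol.write(42, b"y" * 4096)
print(vol.allocated_bytes())                 # only 8192 bytes actually consumed
```

The point of the pairing in the post is visible even in this sketch: ASM can lay out data files across the full advertised capacity while physical space is drawn down only as blocks are written.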
In listening to all the market rumblings, I realize there’s more said and written about backup than any other topic. A quick tour of Delicious, where thousands of IT and storage professionals share knowledge and ideas, underscores this backup fixation. Of the 369,960 links tagged “backup,” only 11,274 are tagged “backup+recovery.” I’m worried that the market is so focused on backup that we fail to recognize what’s really important… the “AND RECOVERY” part. Why is recovery so often overlooked in the endless dialogue on backup? Is it because backups are more top-of-mind since they are performed regularly throughout the day and companies are constantly struggling to meet their operational backup windows? Or, is it because recoveries can be even more complex, costly and cumbersome than most backup and archive operations? We need to strike a better balance between backup and recovery so IT managers don’t have to choose between meeting shrinking backup windows and supporting multiple recovery points. That’s why…
Video surveillance is shaping up as the next big thing in enterprise security. IP-based cameras from Mobotix and the continued growth of high-capacity network attached storage systems from Overland Storage make it possible for almost any size and type of organization to inexpensively deploy a video surveillance solution. But what was still missing until recently was a comprehensive backend support structure for implementing these solutions and then supporting them long-term.
The 64 Oz. Cherry Coke. Enterprises have been supersized with primary disk for too long. It’s like buying a 64 oz. Cherry Coke at the local 7-Eleven day after day when you really aren’t that thirsty. Enterprises continue to buy more primary capacity than they need or can afford. I’ve visited a lot of companies over the last several months and I have yet to meet one employed IT leader who was looking to spend more than they had to for their company’s storage infrastructure. Just like 7-Eleven with Cherry Coke, the largest storage vendors have a vested interest in continuing to supersize customers. EMC, HDS and IBM generate over half of their storage revenue from high performance (primary) storage. Yet less than 20% of information in a typical enterprise is transactional. And just as a serving of Cherry Coke costs 7-Eleven mere pennies, the cost of disk drives is a small component of primary storage price, the…
A recent DCIG blog entry called into question the value of Bear Stearns’ selection of Orchestria and its inability to detect the alleged illegal activities of two of its Asset Management portfolio managers. More specifically, it asked why Orchestria did not detect the illegal activities of these individuals and why Bear Stearns did not configure it to monitor for these activities in the first place. The blog posting prompted a comment and phone call from Alan Morley, one of the individuals formerly responsible for implementing and managing Orchestria at Bear Stearns, who explained why monitoring, detecting and preventing this activity is not as easy as it sounds.
If you happened to attend any recent conferences or trade shows then you know that most of the discussions center on driving costs out of storage environments. In the current yo-yo economy we live in, most IT Directors are looking for new and unique ways to solve their storage dilemma as storage capacity continues to grow. One way enterprise IT organizations are tackling this problem is through deduplication using a disk-based backup solution. Though this is definitely a good approach to tackling data growth and cost savings in the backup space, it does nothing to alleviate the burden of data growth on primary storage since backup solutions do not remove and archive aging production data. This is where a robust archival system like the Permabit Enterprise Archive can help enterprises. The Enterprise Archive offers a grid-based architecture that organizations can add to existing production environments and begin realizing reductions in their primary storage from day one. The Enterprise Archive integrates deduplication and compression for storage space…
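The deduplication idea referenced above can be sketched in a few lines: store each unique block once, keyed by a hash of its contents, and keep an ordered list of hashes as the recipe for reconstructing the original stream. This is a minimal content-addressed illustration under stated assumptions, not Permabit’s engine; real systems also compress, handle hash collisions defensively, and operate at much larger scale.

```python
import hashlib

def dedupe(blocks):
    """Content-addressed deduplication sketch: each unique block is stored
    once; the recipe lists hashes in order so the stream can be rebuilt."""
    store = {}     # content hash -> block contents (unique blocks only)
    recipe = []    # ordered hashes referencing blocks in the store
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # stored only on first sight
        recipe.append(digest)
    return store, recipe

# ten logical blocks, but only two distinct contents
blocks = [b"hello"] * 8 + [b"world"] * 2
store, recipe = dedupe(blocks)
print(len(store), len(recipe))  # 2 unique blocks backing 10 references
```

Reconstructing the stream is just a lookup per recipe entry, which is why deduplicated capacity savings come essentially for free on read.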
To kick off the new year, we recently released our Storage Predictions for 2009. We’ve received a lot of interest in this list since we released it, and I personally have been asked about prediction number 3, “RAID will Hit a Data Dead End”. Allow me to explain. For this prediction, we say: RAID Nears Retirement. As multi-tiered storage continues to evolve, SANs will become more complex, unified networks will emerge, and as newer and larger drive technologies such as 1 TB drives take root, RAID as a data protection technology will become irrelevant. Advanced data protection schemes based on Erasure Coding technology for long term reliable data storage will take hold, putting additional pressure on legacy solutions depending on RAID. RAID is a technology that has served us well, but there are two ways in which it fails to scale going forward. Most importantly, RAID technologies today have serious problems with large capacity drives, like the 1 and 1.5 TB…
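The scaling problem with single-parity RAID can be seen concretely: RAID 5 keeps one XOR parity stripe, which is enough to rebuild exactly one lost drive. Lose a second drive mid-rebuild (increasingly likely as rebuild times grow with 1 TB+ drives) and the surviving data plus parity no longer determine either missing drive. Here is a minimal sketch of that single-parity math; it is illustrative only, not any vendor’s RAID implementation.

```python
from functools import reduce

def parity(stripes):
    """XOR parity across data drives, byte by byte, as in RAID 5."""
    return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*stripes))

def rebuild(surviving, par):
    """Reconstruct one missing drive: XOR of the survivors and the parity."""
    return parity(surviving + [par])

drives = [b"\x01\x02", b"\x04\x08", b"\x10\x20"]  # three toy "drives"
p = parity(drives)

# one drive lost: fully recoverable from the other two plus parity
assert rebuild([drives[0], drives[2]], p) == drives[1]
# two drives lost: one parity stripe cannot determine two unknowns --
# the gap that multi-failure erasure codes are designed to close
```

Erasure-coding schemes generalize this by computing m independent parity stripes over k data stripes, tolerating any m simultaneous failures instead of just one.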
Organizations have learned that the benefits of peace of mind, simplified operations and lower TCO that MSPs can offer are too good to pass up. By taking much of the burden of application maintenance and management off of internal IT resources, organizations can focus on more strategic initiatives that will help them respond more quickly to market opportunities and grow the business.
Recently, I had a passing conversation with an attorney about FRCP and as we were talking, he kept bringing up areas that concerned him. So I asked him, “What is your biggest eDiscovery concern?” Without hesitation he replied, “Having a judge issue ‘Death Instructions’.”
One of the more critical pieces of information that organizations need as they put together a disaster recovery plan is how much data they have in their environment and how quickly it is changing. The reason this information is so important is that without it, organizations often have no way to effectively size how much or what type of capacity they need to protect and recover their production data. In fact, I was astonished at how little information was available about this topic and how few good articles there were on the subject.
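A back-of-envelope version of the sizing exercise described above takes just two inputs: total data under protection and the fraction of it that changes per day. The model and the numbers below are hypothetical, offered only to show the shape of the calculation, not a vendor formula.

```python
def dr_capacity_gb(total_gb, daily_change_rate, retention_days, growth_factor=1.0):
    """Rough DR target sizing: one full copy plus the retained daily changes.

    daily_change_rate is the fraction of total data that changes per day
    (e.g. 0.05 for 5%). growth_factor pads for expected data growth.
    Hypothetical model for illustration only.
    """
    changed = total_gb * daily_change_rate * retention_days
    return (total_gb + changed) * growth_factor

# 10 TB of production data, 5% changing daily, 30 days of changes retained
print(dr_capacity_gb(10_000, 0.05, 30))  # 25000.0 GB
```

Even this crude model shows why the change rate matters as much as the raw capacity: at a 5% daily change rate, thirty days of retained changes exceed the size of the original data set.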
Enterprise data protection software is experiencing a fundamental shift in terms of what organizations expect it to deliver and the amount of distributed structured and unstructured data that it needs to protect. As recently as a few years ago, the expectations of enterprise organizations were relatively modest – support for most major operating systems, integration with major applications (MS Exchange, Oracle, etc.) and tape library support – as compared to today’s standards. While some of those requirements still hold true today, more has changed than has stayed the same. This is putting a great deal of pressure on data protection products to swiftly evolve.
A clustered server environment is only as reliable as the system administrators who maintain it. The challenge they encounter after they configure and deploy the hardware and software that make up a clustered environment is how to maintain it. Once a mission critical application is initially deployed, most system administrators leave the configuration alone for fear of disrupting it. Crucial maintenance such as patches and configuration changes goes uncompleted simply due to the nature of the system itself. But what catches organizations off-guard is that at some point down the road, when an event does prompt a failover from one server to another, the failover fails to occur because smaller changes have accumulated in the environment that now preclude the failover from successfully taking place.
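One way to catch the kind of quiet drift described above before a failover exposes it is to periodically compare the software and configuration state of the cluster nodes. This is a minimal sketch of such a comparison; the node inventories below are hypothetical, and a real check would pull them from an actual package or configuration inventory rather than hard-coded maps.

```python
def config_drift(node_a, node_b):
    """Report every setting that differs between two cluster nodes.

    Inputs are name -> version/value maps; returns a map of differing keys
    to their (node_a, node_b) value pairs. Illustrative only.
    """
    keys = set(node_a) | set(node_b)
    return {k: (node_a.get(k), node_b.get(k))
            for k in keys if node_a.get(k) != node_b.get(k)}

# hypothetical inventories from a primary and its failover target
primary   = {"kernel": "2.6.18-92", "cluster-agent": "3.1", "app": "5.0.2"}
secondary = {"kernel": "2.6.18-53", "cluster-agent": "3.1", "app": "5.0.1"}
print(config_drift(primary, secondary))
```

Run on a schedule, a report like this turns an invisible failover risk into an actionable to-do list: every non-empty entry is a change that was applied to one node but not its partner.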