Reflections on SNW 2012: Time to Revisit Assumptions About Storage

SNW 2012 revealed a dynamic industry that is innovating across all storage tiers. From incorporating super-low-latency flash memory into the data center to new tape formats that essentially turn tape libraries into high-latency disk drives, lots of talent is being applied to meet the growing demands that enterprises have for their storage systems.

As an IT Director at three different universities over the last 24 years, I have researched and purchased multiple storage systems and taken two universities through the data center virtualization process, including establishing off-site disaster recovery capabilities. SNW 2012 was my first opportunity to sit down and talk with some of the people who envision and create these systems.

As my father-in-law advised my son, who was about to start his first job, “There is a knack to everything.” In other words, there is specialized knowledge or skill in every endeavor that makes a real difference in the time and energy required to produce a given amount of work as well as the quality of the results achieved. This specialized knowledge and skill distinguishes the novice from the amateur, and the mere wage earner from the expert.

When it comes to storage systems, engineering still matters. There is specialized storage knowledge that applies across time, whether one is designing and implementing a file system, writing software that implements storage protocols, or qualifying hard drives for use in storage arrays. On the other hand, flash memory presents new opportunities and challenges as engineers seek to leverage its strengths and mitigate its weaknesses while introducing it into enterprise storage systems.

Not surprisingly, flash storage and SSDs were the primary focus of storage system innovation. Multiple permutations of where flash belongs in the data center storage infrastructure were in evidence here, including all-flash PCI cards, SSD-based arrays and appliances, and SSD as one layer in multi-layer storage systems. In some cases this flash memory is intended to serve as primary storage, in other cases just as a cache in front of primary storage.

Somewhat surprisingly, tape seems to be experiencing a resurgence. Tape lives at the opposite end of the storage tiering spectrum from SSD. The renewed interest in tape is based largely on the Linear Tape File System (LTFS), an open format released by IBM in 2010 that makes accessing files stored on LTFS-formatted tape similar to accessing files stored on disk.

LTFS is being used to make archive data much more readily accessible since files can be retrieved and used directly from the tape media without having to be restored to disk first. Although there are many applications of this technology, there has been particular interest among broadcasters and others who need to make available large amounts of archived video content. Although latency is high, once a file is found it can in many cases be retrieved from tape as rapidly as from a disk array.
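The practical point of LTFS is that, once a tape is mounted as a filesystem, applications retrieve archived files with ordinary file I/O rather than a separate restore step. The sketch below illustrates that idea; since no tape hardware can be assumed here, a temporary directory stands in for a hypothetical LTFS mount point (on a real system the mount would be provided by the LTFS software, and the file names are invented for illustration):

```python
import os
import tempfile

# Stand-in for an LTFS mount point (a real one would be a mounted tape
# volume; here a temp directory simulates it, since no drive is present).
mount = tempfile.mkdtemp(prefix="ltfs_sim_")

# An "archived" video clip sitting on the tape filesystem.
clip = os.path.join(mount, "broadcast_archive.ts")
with open(clip, "wb") as f:
    f.write(b"\x47" * 188)  # one MPEG-TS packet's worth of placeholder bytes

# Retrieval is plain file I/O -- no restore-to-disk step required.
# (Latency to first byte is high on real tape; throughput thereafter
# can rival disk, as noted above.)
with open(clip, "rb") as f:
    data = f.read()

print(len(data))  # 188
```

The design consequence is that any tool that can read files from a directory tree can read directly from an LTFS tape, which is why broadcasters with large video archives have taken particular interest.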

Another dynamic I observed at last week's SNW 2012 is that there are experts, not only CxOs but also engineers, who have worked at storage companies from startup through acquisition (plus the required vesting period before stock options can be exercised) and then repeated the cycle. Related to this dynamic is how easily the process of being acquired can slow the pace of innovation in a given product as the product and people from the startup are integrated into the larger entity.

From a customer perspective this dynamic may drive some of the risk out of adopting a startup's technology, provided the startup has a solid team of storage experts engaged in creating its products. It also suggests that the acquisition of a startup by a major player entails its own set of risks for current customers. The bottom line is that customers must still perform due diligence, going beyond check-boxes on a requirements list to understand not only the present capabilities but also the likely future path of a given solution.

Cloud computing and storage, including file-sync-and-share technologies, were also topics of significant interest among attendees. Intuit's CIO shared an insider's view of how Intuit has labored to move IT from a primarily compliance focus to being a provider of Global Enterprise Solutions that owns business outcomes, even as its own private cloud has become key to meeting customer requirements for mobile, social, and global service capabilities. It is a testament to the rapid maturation of public cloud offerings that Intuit is now seriously evaluating a transition to the public cloud even for sensitive financial data.

There are new dynamics in data storage and retrieval, especially the demands that Big Data puts on storage systems and the emergence of flash memory, that mean it is time to revisit assumptions about storage systems. The need for fresh thinking about storage is as true for the businesses that purchase storage systems as it is for the people who create them.

Ken Clipperton

About Ken Clipperton

Ken Clipperton is the Lead Analyst for Storage at DCIG, a group of analysts with IT industry expertise who provide informed, insightful, third party analysis and commentary on IT hardware, software and services. Within the data center, DCIG has a special focus on the enterprise data storage and electronically stored information (ESI) industries.
