One of the most common initial use cases for cloud storage is the storage of archival data. However, that does not mean every organization is ready to move all of its archival data to the cloud or, for the data it does move, to trust that the cloud will be available to provide access when the data is needed. In this fifth blog entry in my interview series with C2C Systems’ CTO Ken Hughes, he talks about the importance of having access to cloud storage repositories for archival data and the advantages of keeping on-premises data and cloud data synchronized.
Searching across unstructured data stores and understanding who owns that data are emerging as higher priorities in today’s Big Data era. However, archiving software can vary greatly in how it performs the tasks of searching and assigning data ownership. In this fourth blog entry in my interview series with C2C Systems’ CTO Ken Hughes, he examines how C2C performs search across distributed email and file systems and what techniques it employs to establish data ownership.
Faced with the accelerating growth in the volume of Electronically Stored Information (ESI) and the emergence of the concept of Big Data, enterprises worldwide need next-generation IT systems to fulfill their corporate compliance, information governance and eDiscovery requirements by processing and analyzing all of this data. It is in response to this demand, and as a result of recent legal precedents, that Technology Assisted Review (TAR), also known as Predictive Coding or Computer Assisted eDiscovery, is emerging as a legally viable and court-recognized option.
Server virtualization has effectively broken the one-to-one relationship between servers and applications, enabling more efficient use of the host’s physical resources. But this is not without its drawbacks, as applications like backup software that took advantage of these idle resources no longer have access to them.
Companies that execute Information Governance plans are looking for eDiscovery products that support Early Case Assessment (ECA). ECA combines search, workflow management, information processing, and multilingual user interfaces. Selecting ECA products requires a cohesive set of technology, business and data science stakeholders.
ECA is a powerful business process, but identifying ECA products is a daunting task. ECA mashes together eDiscovery and technology requirements, and that “mashing of requirements” creates a broad matrix of products and functionality. Without question, eDiscovery has evolved significantly within the last few years.
This week I am going to hearken back to a conference call that took place a couple of weeks ago on the morning of November 3, 2009. This is a new quarterly conference call that CommVault is sponsoring. This particular call was hosted by its Vice President of Marketing and Business Development, David West, and was intended to provide some insight into CommVault’s Q209 successes. But, to my surprise, Tyco Electronics’ Scott Zeiders, who heads its UNIX Tech Support, also joined the call and commented on Tyco’s experiences implementing CommVault® Simpana®.
Business processes like electronic discovery offer defined metrics and quantitative impacts on organizations. Historically, electronic discovery review budgets have been rising steadily, creating the need either to improve review (better crushing power) or to reduce the data going into review (a refined selection process). Moreover, the team at KVS/Symantec knew in 2005 that “Discovery Accelerator 1.0” was a stifled product, primarily designed to return email results for people according to dates and keywords. At the time all the talk was about “improving review,” but the market has been saying “early case assessment” since early 2007.
Carl Frappaolo, AIIM Vice President, Market Intelligence, says “Unstructured information drives numerous business processes…” The logical option here is to deploy a business process management suite (BPMS) of tools. Step one: identify which departments, project groups and individuals are involved in the business processes. Step two: identify the information that results from those individuals, groups and departments. Step three: once the business process is mapped to the information, associate it with a retention management product and policies.
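The three steps above can be sketched as a minimal data model. This is purely an illustration of the mapping, not any particular BPMS or retention product; all of the names (departments, information types, policy fields, “ArchiveManagerX”) are hypothetical.

```python
# Hypothetical sketch of the three-step mapping described above:
# business process -> participating groups -> information produced -> retention policy.

# Retention policies keyed by information type (illustrative values only).
retention_policies = {
    "email": {"retain_years": 7, "product": "ArchiveManagerX"},
    "contracts": {"retain_years": 10, "product": "ArchiveManagerX"},
}

# Step one: identify who is involved in the business process.
process_participants = {"order-to-cash": ["Sales", "Finance", "Legal"]}

# Step two: identify the information that results from those groups.
information_by_group = {
    "Sales": ["email"],
    "Finance": ["email", "contracts"],
    "Legal": ["contracts"],
}

# Step three: associate each process with the retention policies that
# cover the information its participants generate.
def policies_for_process(process: str) -> dict:
    kinds = {
        kind
        for group in process_participants[process]
        for kind in information_by_group[group]
    }
    return {kind: retention_policies[kind] for kind in kinds}

print(policies_for_process("order-to-cash"))
```

In practice the mapping would live in the BPMS and retention product themselves; the point is simply that once steps one and two are captured as data, step three reduces to a lookup.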
Microsoft (NASDAQ:MSFT) and Sun (NASDAQ:JAVA) collaborating on a charity picnic would have been a shock this time last year. Now these companies are coming together to provide a combined software and hardware solution, going forward arm-in-arm with CommVault (NASDAQ:CVLT) to help companies address their most pressing application data management needs. Nowhere is this challenge more acutely felt by businesses than in supporting Microsoft Exchange Server 2007 and Microsoft Office SharePoint Server 2007.
Putting the right infrastructure in place to meet the needs of these mission-critical applications requires a new generation of hardware and software support. Organizations must do more than select the right combination of hardware and software; they must also deploy and support it. “Turnkey” is the new operative word: no matter their size, companies need turnkey solutions when implementing and supporting today’s mission-critical applications.
Today’s announcement that CommVault® has entered into a collaborative alliance with Microsoft and Sun is most significant for companies introducing 64-bit processing. CommVault® cinches the 64-bit data management angle for companies while Microsoft and Sun provide a proven, underlying 64-bit operating system and server/storage hardware environment.
This new collaborative agreement is the outcome of conversations that started back in September 2007, when Microsoft and Sun announced their strategic relationship. The missing component in this relationship, until now, was a 64-bit unified data and information management platform, which CommVault’s Simpana® Suite now provides. The Simpana Suite unifies the data management interface for backup, recovery, archiving, search and classification while creating storage efficiencies through its Single Instance Storage (SIS) technology.
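To make the storage efficiency claim concrete, here is a toy sketch of the general single-instance storage idea: keep one copy of each unique content blob, keyed by a content hash, and let duplicates add only a reference. This illustrates the concept in general, not CommVault’s actual SIS implementation; the class and field names are invented for the example.

```python
import hashlib

class SingleInstanceStore:
    """Toy single-instance store: each unique blob is kept once,
    keyed by its SHA-256 digest; duplicates only bump a ref count."""

    def __init__(self):
        self.blobs = {}  # digest -> content (stored once)
        self.refs = {}   # digest -> reference count

    def put(self, content: bytes) -> str:
        digest = hashlib.sha256(content).hexdigest()
        if digest not in self.blobs:
            self.blobs[digest] = content  # first copy is stored
        self.refs[digest] = self.refs.get(digest, 0) + 1
        return digest

    def stored_bytes(self) -> int:
        return sum(len(c) for c in self.blobs.values())

store = SingleInstanceStore()
attachment = b"quarterly-report.pdf contents" * 100
# The same attachment mailed to 50 recipients consumes the space of one copy.
for _ in range(50):
    store.put(attachment)
print(store.stored_bytes())  # bytes for one copy, not fifty
```

The win is largest for workloads like Exchange, where identical attachments fan out to many mailboxes; the store pays for one copy plus bookkeeping rather than fifty copies.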
For Microsoft Exchange, SIS is one part of CommVault’s Common Technology Engine (CTE), which delivers release independence on a couple of fronts. CommVault’s CTE enables corporations to back up data on a 32-bit Microsoft Exchange server and then recover that same data to a 64-bit Microsoft Exchange system. Further, support and licensing within CommVault’s CTE and Simpana Suite are not tied to a specific version or chipset. Companies can use their existing CommVault licenses on either 32-bit or 64-bit Sun and Microsoft platforms.
This announcement sets the stage for CommVault, Microsoft and Sun to provide an integrated 64-bit turnkey solution for enterprise Linux, Unix and Windows shops. By moving to a complete 64-bit hardware and software architecture, companies can consolidate servers while improving application and server management.
However, on a larger scale, this reflects a huge step forward for CommVault and for how enterprises should perceive the Simpana software suite. Dave West, CommVault’s VP of Marketing and Business Development, says that CommVault started the move to a 64-bit software architecture three years ago in anticipation of corporate needs. CommVault realized that 64-bit was a market differentiator and would provide vertical product scalability. “This relationship is a pay-off for us making that investment in 64-bit architecture,” says West.
The most immediate benefits users should expect from this relationship are improved data management and support for their Microsoft Exchange Server 2007 and SharePoint Server 2007 applications. Using an end-to-end 64-bit computing platform, companies can scale vertically with Exchange and SharePoint. They can be assured that their growing data backup, recovery and archiving requirements will be addressed by a native 64-bit architecture rather than by 32-bit software running in emulation mode.