The closer a new solution comes to being introduced non-disruptively into an organization's existing backup infrastructure, the greater the odds that it will succeed and be adopted more broadly. By including FIPS 140-2 compliant 256-bit AES encryption and VTL features at no charge in the 3.2 OS release for its existing and new DR Series backup appliances, Dell gives organizations new options for introducing the DR Series appliances without disrupting their existing backup processes.
One of the most exciting and terrifying times in the lifecycle of a company is the transition from small to mid-range, or from mid-range to enterprise size. Well-led companies that survive those transitions have often been planning for the occasion for some time. The longer they have been planning, the more likely they have become aware of the need for long-term archiving. Of everything.
DCIG is pleased to announce the release of its 2014 Mobile Data Management (MDM) Buyer's Guide, which weights, scores and ranks over 100 features. Like previous Buyer's Guides, this Buyer's Guide provides the critical information that organizations need when selecting Mobile Data Management software to help meet the security, compliance and Bring-Your-Own-Device (BYOD) challenges of increasingly mobile enterprises.
Everyone frequently talks about archiving data when they know what the make-up of the data is and where it is located. But what no one wants to discuss is the more common real-world problem of not even knowing where data is so it may be archived, especially as it pertains to Outlook PST files. In this sixth and final blog entry in my interview series with C2C Systems' CTO Ken Hughes, he talks about the real-world problem of finding and archiving PST files in organizations and how ArchiveOne takes that into account in its architecture.
Over the past several years, the concept and associated technology to support Do-It-Yourself (DIY) eDiscovery has emerged within the litigation services and technology market as an approach that can increase productivity, provide users with more control over the process and ultimately reduce the overall cost of eDiscovery. Although there are use cases that prove the value of DIY eDiscovery, some people contend that DIY eDiscovery is not legally defensible. Others point to “headline” cases that suggest DIY eDiscovery can turn into a train wreck.
Performing searches across unstructured data stores and understanding who owns that data are emerging as higher priorities in today's Big Data era. However, archiving software can vary greatly in how it performs these tasks of searching and assigning data ownership. In this fourth blog entry in my interview series with C2C Systems' CTO Ken Hughes, he examines how C2C performs search across distributed email and file systems and what techniques it employs to establish data ownership.
As today is the last business day of 2012, it is time for DCIG to unveil its most read blog entries of 2012. While a few long-time reader favorites remain in this year's Top 5, a couple of newcomers also made first-time appearances on this year's list, driven by what is likely growing user interest in, and concern about, managing Big Data and doing eDiscovery across unstructured data stores.
I have disclosed the blog entries that earned an honorable mention on DCIG's website for the number of page views they received in 2012. I have also revealed the Top 5 blog entries written in 2012 that were the most frequently read that year. So today it is time to begin revealing the Top 10 most frequently viewed blog entries on DCIG's website in 2012, regardless of what year they were published, starting with numbers 6 through 10.
One of the unique aspects about running a blog site that primarily does analysis as opposed to commenting and covering today’s news is that the most read blog entries on DCIG’s site each year are rarely from the current year. This year was no exception as only one of the Top 5 blog entries written in 2012 made it into the Top 10 of DCIG’s most read blog entries of 2012 that I will start to reveal in tomorrow’s blog entry.
Most companies recognize the benefits of deleting data when it no longer serves any business purpose or when legal requirements to retain it have been met. However, the act of deleting data still gives many organizations pause. In this third blog entry in my interview series with C2C Systems' CTO Ken Hughes, he discusses C2C's policy management features and the granular ways in which users may manage deletion in their data stores.
The accelerating increase in the volume of Electronically Stored Information (ESI) is resulting in knowledge workers reaching a point where they may not be able to utilize traditional data management and analytic technology and processes to keep pace. However, the increases in knowledge worker productivity and decreases in eDiscovery costs made possible by predictive analytic technology are coming to the point where they are applicable to other knowledge management tasks within the enterprise.
The purpose of archiving is becoming more than simply facilitating smaller email stores, faster response times or better use of expensive storage capacity. The growing driver behind archiving is to enable organizations to implement information governance. In this second blog entry in my interview series with C2C Systems' CTO Ken Hughes, Ken explains how eDiscovery and retention management are becoming the new driving forces behind archiving and why C2C's ArchiveOne is so well positioned to respond to that trend.
Archiving is emerging as one of the hot trends of the next decade as organizations look for better ways to manage their Big Data stores. Perhaps nowhere is data growth more rampant, and the need for better ways to manage it more evident, than in corporate email stores. In this blog entry, I begin an interview series with C2C Systems' CTO Ken Hughes in which we initially discuss C2C's focus on Microsoft Exchange and which size environments C2C's products are best positioned to handle.
Faced with the accelerating increase in the volume of Electronically Stored Information (ESI) and the emergence of the concept of Big Data, enterprises worldwide need next-generation IT systems to fulfill their corporate compliance, information governance and eDiscovery requirements to process and analyze all of this data. It is in response to this demand, and as a result of recent legal precedents, that Technology Assisted Review (TAR), also known as Predictive Coding or Computer Assisted eDiscovery, is emerging as a legally viable and court-recognized option.
I spent the last three days walking the exhibit hall and meeting with technology vendors at the 2012 International Legal Technology Association (ILTA) Conference, held August 26-30, 2012, at the Gaylord National Resort and Conference Center in Washington, D.C. ILTA is the premier peer networking organization, providing information to members to maximize the value of technology in support of the legal profession. As such, ILTA's annual conference is a great place to talk to most of the technology vendors targeting the legal market, see demos of their technology offerings, listen to presentations from industry experts and judge the current status of technology adoption within the legal community.
An integrated and centralized data store model that enables stakeholders throughout an organization to harvest and analyze data on the same platform continues to be the goal of many Global 2000 organizations as they strive to address the requirements of Big Data. However, with today's decentralized and cloud-based storage systems, and with data storage requirements for the Global 2000 reaching the petabyte and even exabyte levels, massive centralized single data store infrastructures with single points of failure, such as Apache Hadoop, may not be the most effective long-term solution.
Earlier this week I attended the 2nd Annual Carmel Valley eDiscovery Retreat (CVeDR) in Monterey, California. CVeDR, founded by Chris La Cour, brings together well known Litigation and eDiscovery experts and industry thought leaders for 3 days of speeches, panel discussions and debate on the important topics facing the industry.
Today DCIG and eDiscovery Solutions Group are pleased to jointly announce the availability of a new DCIG Buyer's Guide, the DCIG 2012 Early Case Assessment Buyer's Guide, which weights, scores and ranks 25 software tools that help law firms, litigators and organizations reduce their overall cost of eDiscovery. This Buyer's Guide gives them the resources they need to quickly complete what is typically a very arduous and time-consuming process: selecting the right ECA software that matches their specific requirements.
DCIG expects to unveil its DCIG 2012 Early Case Assessment (ECA) Buyer's Guide in Q2CY12. As prior Buyer's Guides have done, it puts at organizations' fingertips a comprehensive list of ECA software that can assist them in this all-important buying decision, while removing much of the mystery around how ECA software is configured and which products are suitable for which purposes.
Later this month DCIG will be unveiling a new product designed to aid end users, value added resellers and sales teams within our coverage community. The product is based on a successful line of analysis that DCIG has been producing since 2010 – the DCIG Buyer’s Guides.