Blockchain technology holds the potential to dramatically enhance global commerce and every supply chain. Unfortunately, the first real-world experience many organizations have had with it is using it, by way of Bitcoin, to pay a ransom to cybercriminals who have encrypted their company’s files. The good news is that vendors like Nexsan see the upside of blockchain and are using it for more noble purposes: protecting files stored on its Unity Active Archive appliances.
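To make the idea concrete, below is a minimal sketch of blockchain-style file authentication: each archived file's hash is chained to the previous record, so altering any stored file (or any earlier record) invalidates every hash that follows. This illustrates the general technique only; the function names are hypothetical and it is not a description of Nexsan's actual implementation.

```python
import hashlib
import json

def _sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def append_record(chain: list, filename: str, data: bytes) -> dict:
    """Add a file's hash to the chain, linking it to the previous record."""
    prev_hash = chain[-1]["record_hash"] if chain else "0" * 64
    record = {"filename": filename,
              "file_hash": _sha256(data),
              "prev_hash": prev_hash}
    # The record's own hash covers the file hash AND the previous record's
    # hash, which is what chains the records together.
    record["record_hash"] = _sha256(json.dumps(record, sort_keys=True).encode())
    chain.append(record)
    return record

def verify_chain(chain: list) -> bool:
    """Recompute every hash; any tampering anywhere breaks verification."""
    prev = "0" * 64
    for record in chain:
        if record["prev_hash"] != prev:
            return False
        body = {k: v for k, v in record.items() if k != "record_hash"}
        if _sha256(json.dumps(body, sort_keys=True).encode()) != record["record_hash"]:
            return False
        prev = record["record_hash"]
    return True
```

Because each record hash folds in its predecessor, an attacker who encrypts or modifies an archived file cannot fix up the chain without recomputing every subsequent record.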
It’s summertime, and nothing typifies it more in the United States than a parade on one of its summer holidays. Keeping with this tradition, the Acronis Backup 12.5 release rolls out a parade of new features that help differentiate it in a crowded market. Leading its feature parade are the introduction of security software to authenticate preexisting backups, the flexibility to customize the names of archived backups, and event-based backup scheduling. All three caught my eye as features that few other backup software products currently offer.
Each passing week seems to bring new use cases for solid state drives (SSDs) further to the forefront and to call into question the viability of disk and tape for those workloads. This week was no exception. The announcement of NGD Systems’ 24TB Catalina SSD directly targets use cases such as active archive, where tape predominates but for which the 24TB Catalina SSD emerges as a potential replacement.
Perhaps nowhere does the complexity of the IT infrastructure within today’s organizations come more clearly into focus than when viewed from the perspective of data protection. Backup and recovery software sees firsthand all of the applications and operating systems in an enterprise’s environment. Yet, at the same time, it is expected to account for this complexity by centralizing management, holding the line on costs, and simplifying these tasks even as it meets heightened end-user demands for faster backups and recoveries. To break through this complexity, there are three tips that any organization can follow to help both accelerate and simplify the protection and recovery of data in its environment.
One of the most exciting and terrifying times in the lifecycle of a company is the transition from a small to a mid-range company, or from a mid-range company to an enterprise. Well-led companies that survive those transitions have often been planning for the occasion for some time. The longer they have been planning, the more likely they are to have become aware of the need for long-term archiving. Of everything.
Anyone involved with managing any serious amount of data (and when I say “serious amount of data,” I mean multiple PBs of data) knows that today’s disk-based storage solutions are, for the most part, not equipped to meet the diverse requirements of storing this amount of data. While managing PBs of data remains an extreme use case, a growing number of organizations now face it.
DCIG is pleased to announce the availability of its 2013 Private Cloud Storage Array Buyer’s Guide, which weights, scores and ranks over 150 features on 25 different cloud storage arrays from 15 different providers. This Buyer’s Guide provides the critical information that organizations of all sizes need when selecting a private cloud storage array that provides the availability, ease-of-use, flexibility and scalability features to meet the demands of their most data-intensive applications.
Digital archiving suffers from a perception problem, though one that is probably well-deserved. Perceived as difficult to cost-justify, hard to implement, and offering benefits that can often be achieved by simply throwing more disk at the problem, digital archiving has been hard for most companies to justify deploying. However, a wave of fundamental changes in the storage industry as a whole, and in digital archiving technology itself, is setting this technology up to be one of the hottest technologies in the months and years to come.
Everyone frequently talks about archiving data when they know what the make-up of the data is and where it is located. But what no one wants to discuss is the more common real-world problem of not even knowing where data resides so it may be archived – especially as it pertains to Outlook PST files. In this sixth and final blog entry in my interview series with C2C Systems’ CTO Ken Hughes, he talks about the real-world problem of finding and archiving PST files in organizations and how ArchiveOne takes that into account in its architecture.
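The discovery half of the problem can be sketched in a few lines: before PST files can be archived, someone has to find them. The walk below reports every `.pst` file under a root path along with its size. This is only an illustration of the problem space, assuming a simple filesystem scan; it says nothing about how ArchiveOne actually performs discovery.

```python
import os

def find_pst_files(root: str):
    """Yield (path, size_in_bytes) for every PST file under root.

    Real discovery tools must also handle network shares, permissions
    errors, and open/locked files; this sketch ignores those concerns.
    """
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(".pst"):
                full = os.path.join(dirpath, name)
                yield full, os.path.getsize(full)
```

In practice this scan would run across every desktop, laptop, and file share in the organization, which is exactly why PST discovery is harder than it first appears.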
Bad news is only bad until you hear it; then it’s just information, followed by opportunity. Information may arrive in political, personal, technological and economic forms. It creates opportunity, which brings people, vision, ideas and investment together. When thinking about a future history of 2013, three opportunities come to mind.
One of the most common initial use cases for cloud storage is the storage of archival data. However, that does not mean every organization is quite ready to move all of its archival data to the cloud or, for the data it does move, to trust that the cloud will be available to provide access when needed. In this fifth blog entry in my interview series with C2C Systems’ CTO Ken Hughes, he talks about the importance of having access to cloud storage repositories for archival data and the advantages of keeping on-premises data and data in the cloud synchronized.
Doing searches across unstructured data stores and understanding who owns this data are emerging as higher priorities in today’s Big Data era. However, archiving software can vary greatly in how it performs the tasks of search and assigning data ownership. In this fourth blog entry in my interview series with C2C Systems’ CTO Ken Hughes, he examines how C2C performs search across distributed email and file systems and what techniques it employs to establish data ownership.
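One common first cut at establishing data ownership uses filesystem metadata: on POSIX systems every file records an owner uid, which can be mapped back to a user. The sketch below groups files by that recorded owner. It is a generic, hedged illustration of the technique, not a description of how C2C assigns ownership.

```python
import os

def file_owner_uid(path: str) -> int:
    """Return the numeric owner uid recorded in the file's metadata."""
    return os.stat(path).st_uid

def group_files_by_owner(paths):
    """Map owner uid -> list of file paths: a first cut at data ownership.

    Metadata-based ownership is only a starting point; files copied by
    administrators or service accounts often carry the wrong owner, which
    is why archiving products layer additional techniques on top.
    """
    owners = {}
    for path in paths:
        owners.setdefault(file_owner_uid(path), []).append(path)
    return owners
```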
Ever since I got involved with IT in general and data storage specifically, the predominant way that organizations have managed their data growth is by throwing more storage at the problem. Sure, they pay homage to technologies like archiving, data lifecycle management and storage resource management (SRM), but at the end of the day the “just buy more” principle prevails. Yet as we enter 2013, data management is finally poised to become a data center priority.
As today is the last business day of 2012, it is time for DCIG to unveil its most-read blog entries of 2012. While a few long-time reader favorites remain in this year’s Top 5, a couple of newcomers also made first-time appearances on the list, driven by what is likely growing user interest in, and concern about, managing Big Data and doing eDiscovery across unstructured data stores.
The purpose of archiving is becoming more than simply facilitating smaller email stores, faster response times or better use of expensive storage capacity. The growing driver behind archiving is to enable organizations to implement information governance. In this second blog entry in my interview series with C2C Systems’ CTO Ken Hughes, Ken explains how eDiscovery and retention management are becoming the new driving forces behind archiving and why C2C’s ArchiveOne is so well positioned to respond to that trend.
Archiving is emerging as one of the hot new trends of the next decade as organizations look for better ways to manage their Big Data stores. Perhaps nowhere is data growth more rampant – and the need for better ways to manage it more evident – than with corporate email stores. In this blog entry, I begin an interview series with C2C Systems’ CTO Ken Hughes in which we initially discuss C2C’s focus on Microsoft Exchange and which size environments C2C’s products are best positioned to handle.
SNW 2012 revealed a dynamic industry that is innovating across all storage tiers. From incorporating super-low-latency flash memory into the data center to new tape formats that essentially turn tape libraries into high-latency disk drives, lots of talent is being applied to meet the growing demands that enterprises have for their storage systems.
In today’s information age our focus always tends to be on the here and now and how quickly we can access information that was created sometimes just seconds ago. But in terms of the total amount of data in the digital universe, that is just the tip of the iceberg, with possibly as much as 90% of today’s data existing as archival data. Ensuring the integrity of that data and making sure it is stored cost-effectively for decades is the responsibility of today’s new generation of tape libraries. In part 3 of my interview series with Spectra Logic’s CEO Nathan Thompson, we discuss how tape libraries have continued to mature to meet today’s new business demands for retaining archival data for even longer periods of time.
Crawl. Walk. Run. That progression pretty well summarizes how most people look to take advantage of cloud service providers over time though, in cloud services terminology, the progression may be better summed up as: Archive, Replicate, Recover. Today I conclude my conversation with American Internet Service’s VP of Network Engineering, Steve Wallace, as we examine how many of AIS’ clients initially get their data into the AIS cloud and then expand their use of AIS cloud services over time.
DCIG is very excited to announce the availability of its inaugural DCIG 2012 Big Data Tape Library Buyer’s Guide, which weights, scores and ranks over 140 features on more than 60 tape libraries from 8 different storage providers. Driven by the explosion of storage requirements to address “Big Data” and the “Cloud,” organizations are now more than ever looking for cost-effective, viable storage media on which to store this data. This is why DCIG believes tape libraries are poised to be one of the big beneficiaries of these growing storage demands, which prompted DCIG to produce its first-ever Tape Library Buyer’s Guide to help enterprises choose the right solution for their environment.