Category Archives: Physical Tape

One industry where the linear tape file system (LTFS) has seen the most rapid uptick in adoption is media and entertainment. However, there are three cautionary notes that organizations should still keep in mind if they opt to go down the path of using LTFS to access data stored on tape.
One of the more difficult tasks for anyone deeply involved in technology is seeing the forest for the trees. Those responsible for supporting the technical components that make up today’s enterprise infrastructures often find it even harder to step back and recommend which technologies are the right choices for their organization going forward. While there is no one right answer that applies to all organizations, five technologies – some new, and some old technologies getting a refresh – merit being prioritized by organizations in the coming months and years.
Ever since using disk as a preferred backup target gained momentum in the late 2000s, there have been those who opine that disk’s life in this role would be short-lived. But those providers who deliver disk-based backup solutions and are betting their future on them see no slowdown in their adoption. In this first interview with Sepaton’s Director of Product Management, Peter Quick, we discuss how databases and virtual machines (VMs) are just beginning to take full advantage of the benefits that disk offers as a backup target.
Archiving or backing up large amounts of data to the cloud is very appealing until one starts to examine the mechanics of actually sending data to or retrieving it from the cloud. Waiting minutes or hours to send or retrieve data is no longer acceptable to today’s end-users, who are rapidly becoming accustomed to near-instant response times. In this fifth and final part of my interview series with BridgeSTOR’s CEO John Matze, he explains how sending and retrieving data to and from the cloud in a piecemeal fashion is the fastest and most effective way to do so.
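The interview does not spell out the mechanics, but the core idea of piecemeal transfer can be sketched as splitting a file into fixed-size pieces and sending each piece independently, so users never wait on one monolithic transfer. Below is a minimal illustrative sketch (not BridgeSTOR's implementation); `send_chunk` is a hypothetical stand-in for a cloud provider's multipart-upload call.

```python
CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB per piece; a tunable, illustrative size


def upload_in_pieces(path, send_chunk, chunk_size=CHUNK_SIZE):
    """Send a file to cloud storage piece by piece.

    `send_chunk(index, data)` is a placeholder for whatever multipart
    or ranged upload call a given cloud API provides. Returns the
    number of pieces sent.
    """
    index = 0
    with open(path, "rb") as f:
        while True:
            piece = f.read(chunk_size)
            if not piece:
                break
            send_chunk(index, piece)  # each piece can be sent (or retried) independently
            index += 1
    return index
```

Because each piece is independent, retrieval can likewise fetch only the pieces a user actually needs, rather than pulling the whole object back from the cloud.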
Storing archival and backup data in the cloud is high on the list of priorities of many organizations, if for no other reason than that the data remains accessible and available without organizations having to bear the burden of managing the data locally long term. But as more organizations use cloud storage gateways to store this data, they will find distinct differences in how these appliances manage data in the cloud, with differences sometimes existing even between appliances from the same vendor. In this fourth part of my interview series with BridgeSTOR’s CEO John Matze, he reveals the various methods by which the BridgeSTOR NAS and VTL cloud gateway appliances store, access and manage this data locally and in the cloud.
Ask any large organization how many tapes they have sitting around in local storage, at Iron Mountain or at some other third party storage facility, and odds are they have more tapes – and are likely spending more money storing these tapes – than they would like to admit. This opens up a unique opportunity for a third party provider to solve this dilemma. In this third part of my interview series with BridgeSTOR’s CEO John Matze, we discuss how organizations can use the BridgeSTOR VTL cloud gateway appliance to move their tape museums into the cloud.
NAS gateway appliances that connect to backend public storage clouds are still not a “dime a dozen,” but they are definitely more prevalent than they were even a few years ago. However, a new class of gateway appliances that provides a virtual tape library (VTL) interface is now available from BridgeSTOR. In this second part of my interview series with BridgeSTOR’s CEO John Matze, we discuss the inner workings of the VTL interface it is making available this month on its cloud gateway appliance.
Over the last few months I have been talking to a number of end-users about their implementations of deduplication technology. In the process of doing so, they have provided me with valuable insight into how they are implementing deduplication when using disk-based targets that deduplicate data. Based upon that feedback it appears that most are adhering to the following five guidelines as they implement deduplication in their environments.
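For readers less familiar with what a disk-based target that deduplicates data actually does, the core mechanism can be sketched in a few lines: each incoming block is fingerprinted, unique blocks are stored once, and repeats cost only a reference. This is a toy illustration of the general technique, not any specific vendor's implementation; real products add collision handling, compression and variable-length chunking.

```python
import hashlib


def dedupe_store(blocks, store=None):
    """Toy hash-based deduplication.

    `store` maps a SHA-256 digest to the one stored copy of that block;
    `refs` records, in order, which stored block each input refers to.
    Duplicate blocks add a reference but consume no new storage.
    """
    store = {} if store is None else store
    refs = []
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:
            store[digest] = block  # first occurrence: store the block itself
        refs.append(digest)        # every occurrence: record only a reference
    return refs, store
```

The ratio of input blocks to entries in `store` is, in essence, the deduplication ratio that these guidelines help organizations maximize.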
To say that FalconStor has had some struggles over the past few weeks would probably be a bit of an understatement. Any time that a company’s CEO abruptly resigns with “certain improper payments” cited as the reason for his departure, it can leave a company floundering and seeking direction. However, having had an opportunity to chat with FalconStor’s new CEO, Jim McNeil, over dinner at SNW this past week, I can see he is already helping FalconStor move past his predecessor’s departure and regroup and refocus under his leadership.
Some seem to think that virtualization for the sake of virtualization is the proper business objective because of how it helps reduce server and storage footprints, utilize physical resources more effectively or ultimately lower costs. Certainly these are proper short term goals but the real end game of virtualization is not simply to create a virtualized data center environment. It is to create one that fully automates IT operations.
I am playing the role of road warrior this week by attending two conferences. I spent the first two days of this week at Storage Networking World (SNW) 2010 in Orlando, FL, and today I hopped on a direct flight to Las Vegas to catch one day of the Symantec Vision conference.
Maybe it is just me, but 2010 has, up until now, seemed pretty slow on the news front. Or maybe it is just that much of the news released did not really pique my interest. Regardless, over the last two weeks a number of news items jumped out at me that I want to spend a little time commenting on today in my weekly Friday recap blog.
It is hard to believe it is approaching the end of 2009 already, but what a year it has been. While 2009 has arguably been one of the toughest economic years in anyone’s recent memory (and I for one am not convinced the economic slump is by any means approaching an end), from a storage technology perspective it has been one of the most innovative and exciting in recent memory. Deduplication has gone mainstream, cloud storage is on every organization’s radar screen and all organizations (storage end-users and providers alike) are beginning to grasp just how disruptive solid state drives (SSDs) are going to be.
As 2009 approaches, the traditional benchmarks for enterprise backup software – the management of physical tape libraries, support for multiple operating systems and SAN backups – are yesterday’s news. Instead, support for backup to disk, continuous data protection (CDP), protection for laptops and desktops, and a common repository where protected data is stored, deduplicated and available for rapid access and search is how enterprise data protection software is now defined and measured. Yet even when one factors in these new benchmarks for enterprise data protection, how products such as Atempo Time Navigator play in this rapidly evolving space, and in which verticals they play best, are less than intuitive to the untrained eye.
Keep up to date with Jerome’s key publications: last month, on the 28th, Jerome wrote about how value added resellers and professional services firms can leverage VTLs when selling services like recovery management, eDiscovery and email archiving.
Read More at SearchStorageChannel.com
Jerome sits down with Agite Software to discuss a case study in which implementing a virtual tape library (VTL) from Sepaton solved some speeds-and-feeds issues. A VTL cannot, however, fix administrator decisions to disable or turn off some tape libraries. Data protection management discovers your environment and finds things that are not top of mind, perhaps stemming from decisions made months or even years ago. Agite Software’s backupVISUAL takes center stage in “VTL” Does Not Mean “PNP”.