Open source has done a great deal to give individuals with strong technical skills access to the same code bases and professional peers that were once the exclusive domain of high tech companies. But the concept of meritocracy goes well beyond technical skills. This sixth and final entry in our interview series discusses Jordan’s thoughts on how meritocracy really works and how it has helped elevate people who live and work even in remote parts of the world to the same status as those who work for large companies.
Establishing a standard for how to use proprietary and open source code is difficult at best for most organizations. But iXsystems has essentially bet its future on the continued use of open source code in its product line, which makes it imperative that it get this decision right in order to keep fostering support for its products in the open source community. This fifth entry in my interview series with iXsystems’ CTO Jordan Hubbard discusses his thoughts on iXsystems’ responsibility toward the open source community for its contributions and how the company draws the line between proprietary and open source code.
Most businesses fail to grasp, or cannot map, how they benefit when their IT staff does development for the open source community or even at home. That is not an issue for iXsystems, which openly encourages its developers to engage in storage development both at home and in the open source community. In this fourth installment in my interview series with iXsystems CTO Jordan Hubbard, he shares how companies in general, and iXsystems specifically, benefit in both the short and long term from developers doing work at home and in the FreeBSD kernel community. Ben: Do you see developers coming from a “hobbyist” category, the “enterprise” category, or both? Jordan: Both. A lot of our external developers, people who actually send us contributions, have a foot in both worlds. They have some storage gear at home that they use for their own purposes. They have backed up all their family photos and all their Macs and Windows desktops and laptops onto the NAS,…
Some claim that all-flash memory and/or solid state drive (SSD) storage arrays will become the silver bullet that solves all of today’s challenges with enterprise storage arrays. Those closer to the manufacturing of storage arrays have a much different viewpoint, as they see a long and healthy life ahead for traditional spinning media. In this third installment in DCIG’s interview series with iXsystems’ CTO Jordan Hubbard, he discusses how hybrid storage arrays hit the sweet spot for the storage needs of most organizations.
One of the most common requests that DCIG gets from its readers is to include the actual cost of storage systems in its Buyer’s Guides. The reason DCIG continues to decline that request and only includes starting list prices is that most storage systems may be configured in many different ways, which makes it impossible to arrive at a definitive price point. The second part in DCIG’s interview series with iXsystems’ Jordan Hubbard illustrates this point, as he discusses how the availability of multiple storage configurations and services trumps a cookie-cutter approach to buying storage every time.
A veteran of the operating systems industry, Jordan Hubbard has spent time working on a far-ranging array of products, from open source, grassroots efforts to one of the world’s largest consumer electronics companies. In this first blog entry of my interview series with iXsystems’ CTO Jordan Hubbard, we take a look at some of the ways in which iX’s value propositions set it apart from competitors. Ben: Today I’m interviewing Jordan Hubbard of iXsystems, the company behind much of the open source FreeNAS project. Jordan is one of the co-founders of FreeBSD and more recently comes from working on OS X and iOS at Apple. Jordan, thanks for chatting with me today. Tell me a little bit about yourself, as well as your new role as CTO at iXsystems. Jordan: Thanks, Ben. I’ve done a lot of open source projects over the years, from code development libraries to the Ardent Window Manager for the X Window System. The FreeBSD project…
Though no one would claim that tape will ever again leapfrog disk as the preferred method of data storage, it can be said with confidence that one of the oldest computer storage media is holding steady in its current niche and is here to stay, at least for the foreseeable future. Expansive tape libraries have remained a necessity as the Big Data market grows larger each year. An interesting illustration of this growth is that while tape sales dropped by 14% overall in 2012, sales actually rose by 1% in the third quarter of 2012, and some analysts expect them to increase again by at least 3% in calendar year 2013. Data growth is accelerating, with small, medium and large enterprise organizations alike generating much more data and storing more of it to tape than ever before. The benefits of tape over disk for long-term storage are well-documented, but…
One of the most exciting and terrifying times in the lifecycle of a company is the transition from small to mid-range, or from mid-range to enterprise. Well-led companies that survive those transitions have often been planning for the occasion for some time. The longer they have been planning, the more likely they are to have recognized the need for long-term archiving. Of everything.
If the preliminary survey data for the 2014 DCIG Big Data Tape Library Buyer’s Guide is any indication, the tape industry is still alive and is adapting to its evolving role.
In the dark and not-so-distant past there was a saying: “Nobody ever lost their job buying Big Blue.” The idea that buying IBM was a sure thing now sounds pretty strange to those of us who cut our sysadmin teeth during the heady days of the late ’90s and early 2000s.
Over the years both Microsoft Exchange and SQL Server have gotten the reputation (well-earned, I might add) of being difficult to back up and recover. Yet with the vast majority of companies of all sizes running one if not both of these applications in their environments, using backup software that can effectively protect and recover these applications is no longer optional; it is a prerequisite. To ensure a company is effectively performing these tasks, here are five features backup software must offer to protect and recover Microsoft Exchange and/or SQL Server.
Around two years ago, the DCIG 2011 Enterprise Scale-Out Storage Buyer’s Guide was released. At the time we mentioned that scale-out systems were being used to store “Big Data” and create private storage clouds. Since then scale-out storage systems have become the foundation for building out private storage clouds, which prompted DCIG to change the name of its refreshed Buyer’s Guide to better reflect the intended use case for these storage arrays.
Backup appliances are HOT right now, with organizations of all sizes loving the flexibility and ease with which these appliances enable them to get backups up and running in their environments. But as DCIG’s research into backup appliances uncovers, they are not all created equal, with features like deduplication, SSD support and application integration emerging as key differentiators. It is these features and more that Eversync bundles in its new, recently announced line of data protection appliances.
Few IT administrators willingly refer to themselves as “backup gurus” under the best of circumstances. But as organizations virtualize their environments, even grizzled veterans who were previously comfortable with their backups are now unsure of the best way to proceed so that their backups are completed quickly, easily and within designated backup windows.
Fusion-io has taken some impressive steps to lure developers to its ioMemory platform with the features in its SDK. As we have revealed in prior blog entries, its new APIs for block IO and key-value stores grant developers access to flash in ways that its competitors simply do not. In this final blog entry in my interview series with Brent Compton, Senior Director of Product Management at Fusion-io, he describes the new memory-access semantics exposed in the SDK.
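For readers who want a concrete feel for what “memory-access semantics” means before reading the interview, the sketch below uses Python’s standard mmap module to contrast block-style access (explicit read and write calls through the I/O stack) with memory-style access (ordinary loads and stores against a mapped region). It is a generic illustration of the concept only; the file name and sizes are arbitrary, and this is not the ioMemory SDK’s actual interface.

```python
import mmap
import os

PATH = "demo.dat"          # stand-in for a flash-backed device or file
SIZE = 4096                # one page, for illustration only

# Create a small file to map (in place of a real flash device).
with open(PATH, "wb") as f:
    f.write(b"\x00" * SIZE)

# Block-style semantics: explicit read()/write() calls, each a round trip
# through the I/O stack.
with open(PATH, "r+b") as f:
    f.seek(128)
    f.write(b"block-style update")
    f.seek(128)
    print(f.read(18))

# Memory-style semantics: map the region once, then treat it like a
# byte array; updates are plain stores into the mapping.
with open(PATH, "r+b") as f:
    with mmap.mmap(f.fileno(), SIZE) as mem:
        mem[256:256 + 19] = b"memory-style update"
        print(bytes(mem[256:256 + 19]))
        mem.flush()        # make the stores durable

os.remove(PATH)
```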
Fusion-io is in the process of luring developers to its ioMemory platform with some impressive new features. Already we have looked at the new APIs the Fusion-io ioMemory SDK offers for block IO and key-value stores. In today’s conversation, Brent Compton, Fusion-io’s Senior Director of Product Management, describes the new native filesystem service that Fusion-io exposes in its ioMemory SDK and some of the technical aspects of how it works.
The deployment of flash memory as either storage or memory almost inevitably results in increases in application performance. However, to get the real ‘kick’ in performance that today’s transactional applications need and that flash can provide, a more elegant approach to flash’s deployment is needed. Today I continue my discussion with Fusion-io Senior Director of Product Management Brent Compton, who elaborates on the APIs the Fusion-io ioMemory SDK exposes to make this boost in transactional performance possible.
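As a rough sketch of why an application-level API can cut I/O for transactional workloads, the example below contrasts a conventional read-modify-rewrite of a file-backed record with a direct key-value put. The HypotheticalFlashKV class and its kv_put/kv_get names are placeholders invented for illustration, not the Fusion-io ioMemory SDK’s actual calls; a real implementation would hand the operation to the flash translation layer rather than a Python dictionary.

```python
import json

# Hypothetical key-value interface to flash. This is a stand-in for
# illustration only, not the actual ioMemory SDK API; a real version
# would push the update to the flash translation layer in one operation.
class HypotheticalFlashKV:
    def __init__(self):
        self._store = {}                # placeholder for flash-resident data

    def kv_put(self, key: bytes, value: bytes) -> None:
        self._store[key] = value        # one logical operation, no read-modify-write

    def kv_get(self, key: bytes) -> bytes:
        return self._store[key]

# Conventional path: updating one small record in a file-backed store
# typically means reading the structure, modifying it in memory, and
# rewriting it, plus whatever logging the application layers on top.
def update_via_file(path, key, value):
    with open(path, "r") as f:          # 1. read
        data = json.load(f)
    data[key] = value                   # 2. modify
    with open(path, "w") as f:          # 3. rewrite
        json.dump(data, f)

with open("orders.json", "w") as f:
    json.dump({}, f)
update_via_file("orders.json", "order:1001", "shipped")

# Key-value path: the application states its intent directly.
kv = HypotheticalFlashKV()
kv.kv_put(b"order:1001", b'{"status": "shipped"}')
print(kv.kv_get(b"order:1001"))
```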
Since the advent of the TCP/IP protocol, network administrators have had a major blind spot: the inability to reliably determine the identity of an individual device or user. BlackRidge’s new Eclipse™ solution, built on BlackRidge’s patented Transport Access Control (TAC), uses client drivers or gateway appliances to insert unique identity information into every TCP packet. In this third and final post in our blog interview series, BlackRidge Technology CTO John Hayes and I discuss where BlackRidge is heading and the challenge of managing infrastructures from the perspective of devices rather than networks.
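To give a sense of what identity insertion at the transport layer implies, here is a minimal conceptual sketch in Python: the client derives a short token from its identity and a shared secret, and a gateway verifies the token before it will admit the connection attempt. The token size, the HMAC construction, and the identity names are all assumptions made for illustration; BlackRidge’s actual token format and its placement within the TCP packet are proprietary and are not reproduced here.

```python
import hmac
import hashlib
import os

SHARED_SECRET = os.urandom(32)        # provisioned out of band in a real deployment

def make_token(identity, nonce):
    """Derive a short per-connection token from a device identity.

    Conceptual only: a transport-layer scheme has to squeeze the token
    into the very limited space available in a TCP SYN.
    """
    mac = hmac.new(SHARED_SECRET, identity.encode() + nonce, hashlib.sha256)
    return mac.digest()[:4]           # truncated to a few bytes for illustration

def gateway_admits(token, nonce, known_identities):
    """Return the matching identity if the token verifies, else None."""
    for identity in known_identities:
        if hmac.compare_digest(token, make_token(identity, nonce)):
            return identity
    return None

# Client side: tag the connection attempt with an identity token.
nonce = os.urandom(8)
token = make_token("laptop-0042", nonce)

# Gateway side: admit or silently drop based on the token alone,
# before any application data is ever exchanged.
print(gateway_admits(token, nonce, ["laptop-0042", "printer-07"]))        # laptop-0042
print(gateway_admits(b"\x00" * 4, nonce, ["laptop-0042", "printer-07"]))  # None
```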
As small and midsize businesses (SMBs) virtualize their servers at an increasing pace, many fail to consider the impact this change has on how they do backups – or that it impacts their backups at all. However, since many IT administrators who are responsible for backups in these environments would freely admit to not being backup gurus, here are some tips on what features to look for in backup software in order to properly protect and recover your newly virtualized environment.
As small and midsize businesses (SMBs) virtualize their servers at an increasing pace, many fail to consider the impact this change has on how they do backups – or that it impacts their backups at all. However, since many IT administrators who are responsible for backups in these environments would freely admit to not being backup gurus, here is some insight into how server virtualization changes backup and what SMBs need to know about backup as they implement virtualization in their environment.