To help organizations evaluate available enterprise storage arrays and make informed decisions about the most appropriate array for their needs, DCIG is pleased to announce the availability of its body of research into enterprise storage arrays. Presented and made available through DCIG’s Analysis Portal, this body of research directly addresses the challenges that organizations routinely encounter when buying storage arrays.
Anyone who attended VMworld last week in Las Vegas and walked through the exhibit hall where all of the vendors showcased their wares could hardly miss the vast number of hyper-converged infrastructure and hyper-converged-like vendors in attendance: Cisco, Dell, EMC, HPE, Nutanix, Maxta, Cohesity, Pivot3, Rubrik, SimpliVity and Datrium, just to name a few, and I am sure there were others. Yet what caught my attention while speaking with their representatives and some of their users is how storage remains a factor in the architecture of hyper-converged infrastructure (HCI) solutions.
Hyper-converged infrastructure solutions stand poised to disrupt traditional IT architectures in every way possible. By combining compute, data protection, networking, memory, scale-out, storage, and virtualization on a single platform, they deliver the benefits of traditional IT infrastructures without their associated complexities. But as organizations look to consolidate on hyper-converged infrastructure solutions, they need data protection services such as the Quality of Service (QoS) feature now found on Pivot3’s vSTAC SLX hyper-converged product, which enables organizations to better protect their applications.
Each Buyer’s Guide Edition released by DCIG generates a tremendous amount of interest across the technology industry. However, many who consider each Buyer’s Guide Edition that DCIG puts out assume that DCIG seeks to make each Guide Edition “all-inclusive.” This mistaken assumption leads some who claim or position themselves as “in the know” to jump to misleading and erroneous conclusions.
DCIG is pleased to announce the availability of the DCIG 2016-17 Midmarket Enterprise Storage Array Buyer’s Guide as the first Buyer’s Guide Edition developed from this body of research. Other Buyer’s Guides based on this body of research will be published in the coming weeks and months, including the 2016-17 Midrange Unified Storage Array Buyer’s Guide and the 2016-17 High End Storage Array Buyer’s Guide.
Integrating backup software, cloud services support, deduplication, and virtualization into a single hardware appliance remains a moving target. Even as backup appliance providers merge these technologies into their respective appliances, the methodologies they employ to do so can differ significantly between them. This becomes very apparent when one looks at the growing number of backup appliances from the providers in the market today.
Whether companies like it or not, individuals within their organizations over the last few decades have adopted the technologies that they need in order to more effectively do their jobs. One such adoption has been the use of public file sync-n-share technologies that put data – and the control of it – outside of the purview of corporate IT. In this third and final installment in my interview series with Nexsan’s CEO Robert Fernander, he explains how Nexsan’s UNITY empowers organizations to bring this part of the world of shadow IT back under corporate control.
Yesterday I broke away from my normal routine of analyzing enterprise data protection and data storage technologies to take a closer look at enterprise security. To do so, I stopped by the Omaha Tech Security Conference held at the local Hilton Omaha conference center and visited some of the vendors’ booths to learn more about their respective technologies. From my conversations with a number of security providers, it quickly became evident that they recognize the need to introduce Big Data analytics into their products to convert the data, events, and incidents that they record and log into meaningful analysis that organizations can consume and act upon.
Organizations may view true software-defined storage (SDS) software as appropriate only for hosting their tier-two and tier-three applications. However, many well-known, named accounts now use SDS software to host their tier-one applications. In this third and final installment of my interview series with Nexenta’s Chairman and CEO, Tarkan Maner, he explains where SDS software initially gets a foothold in organizations and why it rapidly gains traction and moves up to host tier-one applications.
More data to back up, less time to recover it, heightened recovery expectations, and limited time to manage these tasks: these are the dilemmas that every mid-market business faces when backing up and recovering its data. The good news is that the DL1300 Backup and Recovery Appliance offers the specific features that mid-market companies need to address these issues. Delivered as a turnkey, easy-to-deploy solution, the DL1300 offers the comprehensive set of features that mid-market companies need to reduce the time they spend on backups, replication, and archiving data to low-cost third-party cloud locations.
In the last 12-18 months, software-only software-defined storage (SDS) seems to be on the tip of everyone’s tongue as the “next big thing” in storage. However, reaching agreement on which features constitute SDS software, who offers it, and even who competes against whom can be difficult, as provider allegiances and partnerships quickly evolve. In this second installment of my interview series with Nexenta’s Chairman and CEO, Tarkan Maner, he provides his views on how SDS software is impacting the competitive landscape and how Nexenta seeks to differentiate itself.
DCIG is pleased to announce the availability of its 2016-17 Unified Utility Storage Array Buyer’s Guide Edition. This Buyer’s Guide, one of six Utility Storage Array Buyer’s Guide Editions produced by DCIG, reviews and ranks 29 unified utility storage arrays from eight providers. This Buyer’s Guide provides much of the information that organizations need to make an informed decision about unified utility storage arrays intended for use in enterprise environments. These arrays are highly available, support both block and file protocols, scale to at least 75 TB, and are available for less than $1,000/TB.
It was just a couple of months ago that I became aware that enterprise file sync-and-share capabilities were available for the first time behind corporate firewalls with the introduction of Nexsan’s UNITY product. While at the time I could not find another storage system that offered similar capabilities, that all changed this week when HGST, a subsidiary of Western Digital, announced that it had partnered with CTERA to offer a competitive product in the private enterprise file sync-and-share space.
Change. Digital transformation. Disrupt. Eat your own young. These were just some of the terms and phrases uttered at this past week’s HPE Discover event in Las Vegas by HPE executives at all levels of the organization. Yet in the face of the changes that are about to sweep through the technology industry, a technology provider that touches as many organizations around the world as HPE does needs to have more than this type of mindset. It needs to have the products and strategy in place to back it up. Based upon what I saw at HPE Discover last week, HPE is executing upon these requirements.
Every now and then a technology comes along that prompts enterprises to completely redo their existing data center infrastructures. This type of dramatic change is already occurring within organizations of all sizes as they adopt and implement SimpliVity.
In today’s enterprise data centers, when one thinks performance, one thinks flash. That’s great. But that thought process can lead organizations to believe that all-flash arrays are the only option they have for delivering high levels of performance to their applications. That thinking is now outdated. The latest server-based storage solution from Datrium illustrates how insanely easy accelerating application performance has become: simply click a button rather than upgrade hardware in the environment.
Formally or informally, organizations of almost all sizes currently implement file sync and share in some capacity. However, almost all organizations have reservations about its implementation, especially when using public cloud file sync and share solutions such as Dropbox. Nexsan’s UNITY™ represents the first storage platform in the mid-to-enterprise market to introduce enterprise file sync and share that operates inside the corporate firewall. In part 2 of my interview series with Nexsan’s CEO, Robert (Bob) Fernander, he explains how this works and what benefits early Nexsan customers are seeing from it.
Any organization that looks at the cost of networked storage for the first time may suffer from sticker shock as they look to deploy a solution. Conversely, those who already have a networked storage solution in place may feel bound to keep using the same provider going forward. Nexenta’s Chairman and CEO, Tarkan Maner, unabashedly addresses these concerns in this first part of my interview series with him as he first defines Software-Defined Storage (SDS) and then calls out storage providers for holding their customers hostage with overpriced and inflexible storage solutions.
What is old is new again, and perhaps nowhere does that adage hold more true than with Nexsan. Having once been a standalone company before being acquired by Imation a few years ago, Nexsan is now back as a storage company with Imation operating in the background as a holding company. In this first installment of my interview series with him, Nexsan CEO Robert Fernander provides some details on the “new” Nexsan as well as an overview of the products, new and old, that organizations can now expect to find in its portfolio.
Anyone familiar with the Internet of Things (IoT) recognizes its potential value: the ability to capture and assimilate tons of data from devices that enable one to make better decisions. The main problem with IoT is that, outside of billion-dollar enterprises, most organizations struggle even to deploy IoT, much less effectively capture and analyze the information that IoT devices collect. Recognizing this deficiency, Fujitsu recently brought together cloud analytics, cloud storage, and IoT to enable organizations of nearly any size to reap the benefits that IoT has to offer.