Last week HPE announced its acquisition of SimpliVity, a provider of enterprise hyper-converged infrastructure solutions. While the announcement itself made news in the IT industry, its broader implication is that enterprise IT providers such as HPE can no longer sit on the sidelines, content merely to partner with providers such as SimpliVity, as hyper-converged solutions become a growing percentage of enterprise IT. If HPE wanted its fair share of this market, it had to act sooner rather than later to remain a leading player.
Change. Digital transformation. Disrupt. Eat your own young. These were just some of the terms and phrases uttered at this past week’s HPE Discover event in Las Vegas by HPE executives at all levels of the organization. Yet in the face of the changes about to sweep through the technology industry, a technology provider that touches as many organizations around the world as HPE does needs more than this mindset: it needs the products and strategy in place to back it up. Based upon what I saw at HPE Discover last week, HPE is executing on both fronts.
In today’s enterprise data centers, when one thinks performance, one thinks flash. That’s fine as far as it goes. But that thought process can lead organizations to conclude that “all-flash arrays” are their only option for delivering high levels of application performance. That thinking is now outdated. The latest server-based storage solution from Datrium illustrates how organizations can accelerate application performance by simply clicking a button rather than upgrading hardware in their environment.
As the whole technology world (or at least those intimately involved with the enterprise data center space) takes a breath before diving headfirst into VMworld next week, a few vendors are jumping the gun and making product announcements in advance of it. One of those is SimpliVity, which announced its latest hyper-converged offering, OmniStack 3.0, this past Wednesday. In so doing, it continues to put a spotlight on why hyper-converged infrastructures and the companies delivering them are experiencing hyper-growth even in a time of relative market and technology uncertainty.
Hyper-converged infrastructures are quickly capturing the fancy of end-user organizations everywhere. They bundle hypervisor, server and storage in a single node and provide the flexibility to scale out, with multiple nodes forming a single logical entity. In this configuration, they offer a very real opportunity for organizations to economically and practically collapse their existing infrastructure of servers and storage arrays into one that is much easier to implement, manage and upgrade over time.
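To make the scale-out idea concrete, here is a minimal illustrative sketch in Python. The class and attribute names are hypothetical (they do not reflect any vendor's actual API); the point is simply that each node contributes compute and storage together, and the cluster presents the sum as one logical pool.

```python
# Hypothetical model of hyper-converged scale-out: each node bundles
# compute and storage, and the cluster aggregates them into a single
# logical entity. Names are illustrative, not a real vendor API.
from dataclasses import dataclass


@dataclass
class Node:
    cpu_cores: int
    storage_tb: float


class HyperConvergedCluster:
    """A single logical pool formed from identical building-block nodes."""

    def __init__(self):
        self.nodes = []

    def scale_out(self, node: Node):
        # Adding a node grows compute and storage capacity in lockstep.
        self.nodes.append(node)

    @property
    def total_cores(self) -> int:
        return sum(n.cpu_cores for n in self.nodes)

    @property
    def total_storage_tb(self) -> float:
        return sum(n.storage_tb for n in self.nodes)


cluster = HyperConvergedCluster()
for _ in range(3):
    cluster.scale_out(Node(cpu_cores=32, storage_tb=20.0))

print(cluster.total_cores, cluster.total_storage_tb)  # 96 60.0
```

The design choice this models is the one the excerpt describes: capacity is added in uniform building blocks rather than by upgrading a monolithic server or storage array independently.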
Facebook is turning to a disaggregated rack strategy to create a next-generation cloud computing data center infrastructure.
Dell has had all of the pieces for a number of years to be a next-generation technology company, one that does more than just sell products but actually integrates them to solve the broader, real-world problems that enterprises face. To date, however, Dell has been trapped in a world of “1+1+1=1,” where organizations get only the individual value each Dell product has to offer but none of the broader synergistic value that using all of its products together could deliver. Yet at this year’s Dell World 2014, I saw more tangible evidence that the bigger value proposition Dell has the potential and the technologies to deliver is getting much closer to becoming a reality.
As the role of IT changes from specialist to generalist, many IT staff members find themselves in the role of a Business Technologist. In this new role, they serve a two-fold purpose. First, they must understand and document the specific needs and requirements of the business by interfacing with key end-users and product managers. Once they document these needs, they then map those requirements to a specific technology solution that addresses them.
As I attended sessions at Microsoft TechEd 2014 last week and talked with people in the exhibit hall, a number of themes emerged, including “mobile first, cloud first,” hybrid cloud, migration to the cloud, disaster recovery as a service, and flash memory storage as a game-changer in the data center. But as I reflect on the entire experience, a statement made by John Loveall, Principal Program Manager for Microsoft Windows Server, during one of his presentations sums up the overall message of the conference: “Today it is really all about the integrated solution.”
One of the more difficult tasks for anyone deeply involved in technology is seeing the forest for the trees. Because they are often responsible for supporting the technical components that make up today’s enterprise infrastructures, it is harder for them to step back and recommend which technologies are the right choices for their organization going forward. While there is no one right answer that applies to all organizations, five (5) technologies – some new as well as some old technologies that are getting a refresh – merit prioritization by organizations in the coming months and years.
DCIG is pleased to announce the availability of its DCIG 2014-15 $50K and Under Converged Infrastructure Buyer’s Guide. In this Buyer’s Guide, DCIG weights, scores and ranks 10 converged infrastructure solutions from six (6) different providers. Like previous DCIG Buyer’s Guides, this Buyer’s Guide provides the critical information that organizations of all sizes need when selecting a converged infrastructure solution to help expedite application deployments and then simplify their long-term management.