President and Founder, DCIG, LLC.
Jerome Wendt founded DCIG, LLC in 2007 and currently serves as its President. An avid writer, he has authored thousands of articles that have appeared in multiple magazines, online publications, and websites, and he is recognized as one of the foremost technology analysts in the enterprise data storage and data protection industries. Mr. Wendt covers topics related to enterprise and cloud infrastructures, including all-flash and hybrid arrays, cloud computing, cloud storage, data protection, hyperconverged infrastructures, and software-defined storage (SDS).
Since founding DCIG, Mr. Wendt has originated and developed the processes and methodologies behind the DCIG Buyer’s Guides. The first DCIG Midrange Array Buyer’s Guide was released in 2010, and millions of copies of the DCIG Buyer’s Guides have since been distributed worldwide. These Buyer’s Guides have helped decision makers properly evaluate and classify key enterprise data center technologies. They are widely recognized and used by information technology professionals, who view them as the “go-to” source for understanding where a product best fits in their enterprise infrastructure.
Prior to founding DCIG, Mr. Wendt served as a storage engineer at First Data Corp. He has also written for and contributed to leading publications including ComputerWorld, InfoStor, IT Central Station, SearchStorage.com, and Storage Magazine, among others. He earned a bachelor’s degree in Computer Information Systems from Washburn University (Topeka, KS) in 1995 and a bachelor’s degree in Theology from Ambassador University (now merged with Azusa Pacific University) in Pasadena, CA, in 1990. More recently, he was certified as an Amazon Cloud Solutions Architect. When away from work, he enjoys bowling, camping, fishing, and playing Sudoku.
In 2019, the level of interest that companies expressed in using artificial intelligence (AI) and machine learning (ML) exploded. Their interest is justifiable: these technologies gather the almost endless streams of data coming out of the scads of devices that companies deploy everywhere, analyze it, and turn it into useful information. But time is the secret ingredient that companies must look for as they select an effective AI or ML product.
Vendors are finding multiple ways to enter the scale-out hyperconverged infrastructure (HCI) backup conversation. Some acquire other companies, as StorageCraft did in early 2017 when it acquired ExaBlox. Others build their own, as Cohesity and Commvault have done. Yet among these many iterations of scale-out, HCI-based backup systems, HYCU’s decision to piggyback its new HYCU-X on top of existing HCI offerings, starting with Nutanix’s AHV HCI platform, represents one of the better and more insightful ways to deliver backup using a scale-out architecture.
Ensuring that an application’s migration to the cloud goes well, or even deciding whether a company should migrate a specific application at all, requires a thorough understanding of each application. This understanding should encompass both the resources the application currently uses and how it behaves over time. The following best practices help a company gather the information it needs about each of its on-premises applications before it moves any of them to the cloud.
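Profiling an application’s resource usage over time, as described above, can be as simple as sampling a metric on a schedule and summarizing the results. The sketch below is a minimal, hypothetical illustration, not a DCIG tool: it assumes the caller supplies a `read_metric` callable (for example, a function that reads CPU or memory usage from the operating system) and records timestamped samples for later analysis.

```python
import time
from dataclasses import dataclass, field
from statistics import mean
from typing import Callable, List, Tuple


@dataclass
class UsageSampler:
    """Record a resource metric over time (hypothetical sketch).

    `read_metric` is an assumed hook supplied by the caller; it might
    return CPU percent, memory in MB, or IOPS, depending on what the
    application profile needs to capture before a cloud migration.
    """
    read_metric: Callable[[], float]
    samples: List[Tuple[float, float]] = field(default_factory=list)

    def sample(self) -> None:
        # Store (timestamp, reading) so behavior over time is preserved.
        self.samples.append((time.time(), self.read_metric()))

    def summary(self) -> dict:
        # Min/max/average give a first cut at sizing a cloud instance.
        values = [v for _, v in self.samples]
        return {"min": min(values), "max": max(values), "avg": mean(values)}
```

In practice, such a sampler would run at intervals (hourly or daily) over weeks, since an application that looks idle in one snapshot may spike at month end; the summary then informs both the migrate-or-not decision and the target instance size.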
There is little dispute that tomorrow’s data center will become software-defined, for reasons almost no one anticipated even a few years ago. While companies have long understood the benefits of virtualizing their data center infrastructure, the complexities and costs of integrating and managing data center hardware long exceeded the benefits that virtualization delivered. Now, thanks to technologies such as the Internet of Things (IoT), machine intelligence, and analytics, among others, companies can pursue software-defined strategies more aggressively.