Delivering software-specific solutions in the form of appliances has turned niche software applications such as deduplication into some of today’s hottest mainstream technologies. But independent software vendors (ISVs) can still fear that offering their software as an appliance will cannibalize existing revenue streams and create new support costs. In this final segment of a 3-part series, Tom Baylark, an independent consultant to Bell Micro, discusses how offering software on an appliance can broaden the software’s appeal without increasing ISV costs, and may even lower them.
Resellers and their customers increasingly expect that the software they use for specific applications such as video surveillance and CAT scans be delivered to them in the form of appliances for faster, turnkey deployments. But just because the software is bundled with hardware as an appliance does not mean all solutions are the same, even among vendors delivering branded appliance solutions.
Independent software vendors (ISVs) that sell software based on x86 hardware platforms face a new type of challenge in today’s economic environment. While their software can run on any vendor’s hardware platform, the time it takes to install, configure and support that software on each platform gives pause, and is prompting resellers and customers alike to look for the ISV’s software in the form of appliance-based solutions.
I was not the only one from DCIG attending SNW last week: Kelly Polanski, a Contributing Analyst with DCIG, was also in the SNW mix. While I took briefings and caught up on the latest advancements in specific products, Kelly attended the afternoon SNW Summits. Following these Summits, she provided me with some of her notes. I found some of the information particularly compelling, so I am sharing it now with DCIG’s readership. Today’s blog shares some of the insights that Kelly gained from the Cloud Computing Summit on April 6.
Infrastructure management remains one of the nagging, unresolved issues of the information age. Companies bring more computer equipment in every size, shape and form into their data centers and offices. Getting this equipment installed and configured is rarely a problem. But tracking which pieces of equipment are under warranty and when those warranties expire, and keeping that information easily accessible when it is needed, is a rarity. Add the software maintenance contracts for each OS and application, each with its own expiration date, and the burden on already stressed IT teams is enormous.
Twenty years ago, Michael Dell envisioned a future where businesses no longer just bought servers and storage from resellers but instead bought these products over the Internet. Fast forward to today. Michael Dell envisions a new type of future as Dell puts in place a channel program with equally lofty ambitions: selling as much gear through the channel as it does directly and pulling in billions of dollars from it. The question is, “Did Dell succeed so thoroughly in going direct that, in the process, it forever alienated the very resellers it now needs to partner with in order to succeed in the channel?”
The details of the service contract are usually the last thing anyone thinks about when purchasing a new product. At a high level, companies may know they are signing up for next-day or 4-hour break/fix support. But in practice, the contract offers no guarantee of when they will actually get their product repaired and their application back online. All that the 4-hour service level guarantees is that a qualified technician will be on-site within 4 hours.
As previously discussed in a DCIG blog entry on the real impact of losing 1 bit in 100 trillion, non-recoverable bit errors on SATA disk drives have the potential to become an all-too-frequent problem as organizations dramatically increase the scale of their disk stores. Imagine what will happen when data stores expand to petabyte sizes with more frequent access. Right now commercial data stores are on track to reach the petascale range sooner rather than later. According to multiple sources, the data collected and stored by most businesses is doubling every year, a rate of growth that has held fairly constant over time. In the 1990s, a 100 GB database was large enough to stress most systems – back when disk scanning speeds were 30 MB/s and database tools were relatively immature. In the current decade, terascale data stores are already common – and managing 100 GB is now considered somewhat trivial. In the coming decade, truly massive petascale systems can…
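To see why that 1-in-100-trillion figure matters at petabyte scale, a back-of-the-envelope calculation helps. The sketch below is illustrative only: it assumes a SATA unrecoverable read error (URE) rate of one error per 10^14 bits read (the "1 bit in 100 trillion" cited above) and computes the expected number of UREs encountered in a single full scan of a data store of a given size.

```python
# Illustrative arithmetic: expected unrecoverable read errors (UREs)
# for one full scan of a data store, assuming the commonly quoted
# SATA URE rate of 1 error per 1e14 bits read (1 bit in 100 trillion).

URE_RATE_BITS = 1e14  # assumed: one unrecoverable error per 1e14 bits read


def expected_ures(bytes_read: float) -> float:
    """Expected number of UREs when reading `bytes_read` bytes once."""
    return (bytes_read * 8) / URE_RATE_BITS


for label, size in [("100 GB", 100e9), ("1 TB", 1e12), ("1 PB", 1e15)]:
    print(f"{label}: ~{expected_ures(size):.3f} expected UREs per full read")
```

Under this assumption, a full read of a 100 GB database almost never hits a URE, but a single pass over a 1 PB store expects roughly 80 of them, which is why error rates that were once ignorable become a mainstream concern at petascale.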
The yardstick for measuring the effectiveness of technology distributors has become exceedingly narrow. Most would agree that distribution competitiveness is currently measured as a function of component price and time-to-delivery for reseller partners. Competing for new and expanded business opportunities using these criteria is tough, because distribution models are mature and distributor practices look largely the same from one distributor to the next.