Ransomware gets a lot of press – and for good reason. When hackers break through your firewalls, encrypt your data, and demand payment to get it back, it rightfully gets people’s attention. But hackers probably have less desire than most to be in the public eye, and sensationalized ransomware headlines bring them unwanted attention. That’s why some hackers have said goodbye to the uncertainty of whether a ransom will ever be paid and instead look to access your servers to mine bitcoin using your CPUs.
Organizations of almost every size now view flash as a means to accelerate application performance in their infrastructure – and for good reason. Organizations that deploy flash typically see performance increase by a factor of up to 10x. But while many all-flash storage arrays can deliver these performance gains, savvy organizations must prepare to do more than simply accelerate workloads. They need to identify solutions that help them better troubleshoot their emerging flash infrastructure, as well as future-proof their investment in flash by modeling anticipated application workloads on the all-flash arrays under evaluation before they are acquired.
On March 17, 2015, the Storage Performance Council (SPC) updated its “Top Ten” list of SPC-2 results, which includes performance metrics going back almost three years to May 2012. Noteworthy in these updated results is that the three top-ranked storage arrays are, in order, a high-end, mainframe-centric, monolithic storage array (the HP XP7, OEMed from Hitachi); an all-flash storage array (the K2, from startup Kaminario); and a hybrid storage array (the Oracle ZFS Storage ZS4-4 Appliance). What makes these results particularly interesting is that the hybrid array, the Oracle ZFS Storage ZS4-4 Appliance, can essentially go toe-to-toe on performance with both the million-dollar HP XP7 and the Kaminario K2 while costing approximately half as much.
One of the main objectives of every DCIG Buyer’s Guide is to help enterprises and their buyers create a short list of technology products that align with their specific business or technical needs. But once they have a short list of products that meet those needs, they still need criteria to help them make the right choice from among those products. While there is no silver bullet that guarantees the “best” selection, performance testing is an option organizations often turn to in order to validate a technology choice – though this method is as complex as it sounds.