Enterprises view in-server flash with a wary eye. On the one hand, they see how it tremendously accelerates application performance at a fraction of what flash on storage arrays costs. On the other hand, data on in-server flash is no longer centrally stored, so it must be managed as a “one-off,” which detracts from its appeal. Leveraging the new SmartIO feature in Storage Foundation 6.1, organizations can begin to realize the best of what in-server flash offers while eliminating this drawback.
Many enterprises have been watching the development of flash with a high level of interest, though they have deployed it cautiously as a storage tier because of its high cost. Symantec Storage Foundation 6.1 takes these concerns head-on by delivering functionality that will inherently change the way enterprises design, implement, and maintain shared storage environments. In particular, Storage Foundation’s new Flexible Storage Sharing (FSS) feature gives organizations the flexibility to non-disruptively place capacity, performance, or both in their servers while still making it accessible to all of the applications in that cluster.
It is no secret that almost every organization regardless of its size has growing amounts of unstructured data that reside almost everywhere. The BIG unknown is what useful information, if any, these data repositories contain and what value, cost or risk they present. Using Symantec Data Insight 4.0, organizations can better understand the data that resides within their Dark Data repositories, the context in which it is being used, and then take informed actions to better manage and secure this data.
The storm season is once again upon us and it looks like it will be another one for the record books, as evidenced by the tornadoes that have already hit Oklahoma and many other states. In fact, if you live in the Midwest, and particularly in eastern and southeastern Nebraska, this picture probably has an all too familiar look to it. In this particular case, it’s time to batten down the hatches and get ready for a rough ride.
How fully virtualized organizations are is often a calculated guess based on anecdotal evidence or on surveys conducted by virtualization providers that sample only organizations already using virtualization. King Research’s Windows Server 2012 Migration/Virtualization Survey, commissioned by Symantec, eliminates much of this guesswork and built-in bias. More importantly, it provides key insights into just how virtualized organizations are right now, how quickly they plan to virtualize their environments in the next few years, and how they would prefer to protect those environments once virtualized.
A reality check is going on in enterprises when it comes to cloud backup. While the vast majority recognize its value and are aggressively adopting the cloud at many levels, the intangible issues of recovery and support tend to rear their heads and have to date precluded these enterprises from adopting a core cloud offering: cloud backup. It is these concerns that IBM and Symantec are teaming up to tackle so that enterprises may confidently do more than back up to the cloud – they can recover their data once it is in the cloud with a process that is supported end-to-end.
Using cluster file system software on virtual machines (VMs) in VMware environments has always been problematic at best. While it could be done with techniques like Raw Disk Mappings (RDMs) and third-party cluster file system software, organizations needed to sacrifice “desirable” virtualization features like vMotion to achieve it.
Virtualizing applications so that fewer servers are needed makes great sense. Applications are centralized. Hardware is used more efficiently. Data center floor space is freed up. Virtual machine (VM) loads may be more efficiently and non-disruptively redistributed between physical systems. But then the realization hits. You have put all of your proverbial eggs in one basket and, unless you have a real or near real-time copy of this data off-site, should a major disaster hit, your goose is cooked. The question then becomes, “What is the best way to get this data off-site?”
Everyone hates to deal with clutter, and perhaps nowhere is this truer than when it comes to managing data. Knowingly or otherwise, enterprises tend to sweep data management tasks under the proverbial rug since they rarely see their true cost or feel their impact. That perception began to change in 2012 as more organizations started to feel the sting of being unable to find the information they need, simply because they have too much data in too many places to search it effectively.
Everyone anecdotally knows that solid state disks (SSDs) are fast – like really fast when compared to hard disk drives (HDDs). It is just that proof points from independent sources that conclusively demonstrate their performance advantage have been in short supply. Now proof points appearing on the SPECsfs website are confirming what people already suspect to be true: the performance of SSD-based systems is smoking fast, with off-the-shelf SSD-based storage systems leaving their enterprise counterparts in the dust.
It was just a few years ago that “mobile devices” and “the cloud” were blips on corporate radar screens. Fast forward to today and those blips are fast taking shape as major forces for which enterprises must account. As this occurs, organizations need to re-think the steps they take to control and manage information sprawl going forward.
It is no secret that almost any enterprise with performance intensive applications wants to host them on flash memory storage sooner rather than later. Yet what precludes some enterprises from hosting these applications on flash memory storage are concerns about flash memory’s cost, application disruption and even how the data is protected once it is placed there. Using Symantec Storage Foundation in conjunction with flash memory solutions such as the Fusion ioDrive helps to put these concerns to rest.
Delivering high availability (HA) to applications classified as “business critical” has in recent years been as much a financial obstacle as a technical one for organizations to overcome. The latest version of Symantec’s Veritas Cluster Server addresses these concerns. Now any application running on either a physical or virtual machine may recover almost immediately to a virtual machine (VM), giving enterprises the HA they have sought without the hardware costs or VM reboot wait times.
In the last few years VMware has added a number of features to its core vSphere platform to address organizational concerns about the availability and uptime of their virtualized applications, including High Availability (HA), vCenter Site Recovery Manager (SRM) and vMotion. Yet there are still other aspects of delivering on the uptime requirements of mission critical applications that enterprises want and that VMware does not offer. It is these gaps that Symantec’s upcoming release of Veritas Cluster Server fills.
The ramifications of organizations not getting data under control are significant. Recent analyst studies find that structured data stores may grow by as much as 60% annually and unstructured data stores by as much as 80% annually. Aggravating this situation, once all of this data is consolidated, the hardware costs associated with scaling the storage infrastructure to accommodate this data growth may be up to 10x what they were prior to consolidation.
The IT infrastructure that most enterprises want is pretty obvious: it is a private cloud. Less intuitive, however, is how enterprises will transition from simply hosting file, print and web servers in their private clouds today to hosting business critical applications tomorrow. Successfully navigating this transition requires that enterprises introduce a new set of proven technologies that deliver the agility and cost savings they have come to expect from private clouds, along with the availability, performance, manageability and visibility they need for their business critical applications and data.
As businesses and enterprises of all sizes adopt and implement virtualization and the cloud, they expect the future of their IT data centers to be much brighter in terms of driving down infrastructure costs while improving agility. Yet the global findings coming out of Symantec’s September 2012 State of the Data Center Survey suggest that complexity is not just surviving but thriving in today’s virtualized data centers, and that the promised benefits of virtualization and the cloud are failing to fully materialize.
Flash memory arrays have already earned a reputation as the highest performing and most energy efficient storage systems available. However, enterprises are still building trust in the storage management software on these arrays, which leaves them reluctant to use flash memory arrays for hosting anything but a few performance-intensive applications. By integrating key Symantec Veritas Storage Foundation storage efficiency and storage management technologies into its flash memory arrays, Violin Memory gives enterprises the performance and energy efficiencies they want with the proven software stability and reliability they need.
The gap between fantasy and reality is still pretty wide in terms of what enterprises hope to achieve with Big Data analytics. Bridging this gap requires that organizations follow some best practices when implementing Big Data analytics tools and take into account some of the shortcomings of Hadoop. Using Symantec’s forthcoming Veritas Cluster File System Connector for Hadoop, they may implement Hadoop and gain the Big Data analytics benefits it provides, coupled with the stability and reliability of the Veritas Cluster File System.
People who are on the outside looking in at a data center often assume that it runs like clockwork, with everyone knowing exactly what hardware is located where, what applications are running on that hardware and whether the hardware is properly utilized. Yet as anyone who actually works in a data center knows, making that perception a reality requires the right software, a lot of hard work and more than a little bit of luck. The release of Veritas Operations Manager Advanced 5.0 from Symantec gives enterprises the software platform they need to make this perception a reality while taking some of the hard work and chance out of the management equation.