TIBCO Event Processing: Relevant, Real-time Operational Intelligence

Deriving value from the plethora of unstructured data created by today’s multiple sources of Big Data hinges on analyzing and acting on it in real time. To do so, enterprises must employ a solution that analyzes Big Data streams as they flow in. Using TIBCO Software’s Event Processing platform, enterprises can process Big Data streams while they are still in motion, providing real-time operational intelligence so they can take the appropriate action while that action still has meaningful value.

Streams of Big Data Flowing In

Enterprises have more opportunities – and more reasons – than ever to capture multiple streams of Big Data coming in from numerous sources. Device sensors, Internet of Things (IoT) devices, log files, RFID tags, and social media platforms such as blogs, Facebook, Google+, LinkedIn and Twitter all generate raw data that enterprises can use to make real-time assessments.

In this new Digital Economy the advantage goes to enterprises that can capture data, analyze it, and then quickly and appropriately respond to it as events occur. Data’s value is increasingly greatest at the moment it is created, or within a short period (seconds, minutes or hours) thereafter. This makes it essential for enterprises to have a solution that can ingest and analyze this data and, in a timely manner, produce the information they need to act appropriately to save money or turn a profit.

The Challenges of Extracting Big Data’s Real-time Value

The ease and speed with which machines and human activities generate large volumes of data are offset by the multiple challenges associated with quickly and effectively deriving value from that data. Specific challenges associated with extracting Big Data’s value include:

  • Ingesting data from numerous, different devices. Multiple bespoke protocols and industrial standards mean that little commonality exists in how device sensors, IoT devices, RFID tags and social media platforms transmit and receive data. This puts the onus on any data processing solution to account for how each of these devices or platforms transmits data so it can ingest the data appropriately and in the correct sequence (see the sketch after this list).
  • Storing and expeditiously processing the data. It is estimated that 50 billion devices will be connected to the internet by 2020. Twitter already averages 500 million tweets per day, while Facebook collects approximately 500 terabytes of data per day. Ingesting and analyzing the data from these sources in real time to derive value requires that any solution have the architecture and efficiency to keep up with these data rates.
  • Establishing the data’s context. Data arriving from each of these sources does not map into the traditional “name,” “address,” “email,” and “phone number” fields used by relational databases. Rather, data is created and stored in an unstructured format. This leaves it to the solution to establish the data’s content and context, as its meaning may change depending upon when and under what conditions the data was created.
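
To make the ingestion and context challenges above concrete, the following minimal sketch (hypothetical names, not TIBCO’s API) normalizes events that arrive in different source-specific formats into one common record, so that downstream analysis can treat sensor readings, RFID reads and social posts uniformly.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Event:
    """Common shape every ingested record is normalized into."""
    source: str           # e.g. "rfid", "sensor", "twitter" (illustrative)
    device_id: str
    timestamp: datetime   # when the data was created, not when it arrived
    payload: dict         # remaining, still-unstructured attributes

def normalize(source: str, raw: dict) -> Event:
    """Map each source's own field names onto the common Event record."""
    if source == "rfid":
        return Event("rfid", raw["tag_id"],
                     datetime.fromtimestamp(raw["read_time"], tz=timezone.utc),
                     {"reader": raw["reader_id"]})
    if source == "sensor":
        return Event("sensor", raw["serial"],
                     datetime.fromisoformat(raw["ts"]),
                     {"metric": raw["metric"], "value": raw["value"]})
    # Social posts, log lines, etc. each get their own mapping.
    return Event(source, str(raw.get("id", "unknown")),
                 datetime.now(timezone.utc), raw)
```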

Fast Data Architecture Delivers Real-time Operational Intelligence

Achieving operational intelligence requires a Fast Data architecture that analyzes Big Data in real time as it happens. Big Data analytics were designed to look at historical information and produce analysis after the benefits associated with collecting the data have passed. This “Too Late” approach makes it difficult, if not impossible, to reap the benefits of Big Data analysis for firms wishing to use and iterate on that analysis with live streaming data.

The TIBCO Fast Data architecture provides this missing link to realizing Big Data’s benefits. The architecture is designed around processing, analyzing and deriving immediate insight from data in real time. To accomplish this, it ingests Big Data streams as they arrive and holds them in memory for a specified period of time. Holding this data in memory expedites its analysis while also providing defined parameters for evaluating the data’s context.
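
A minimal sketch of the “hold in memory for a specified period” idea: a time window that retains only the events created within the configured retention period and discards older ones as new data arrives. The class and parameter names are illustrative assumptions, not TIBCO’s implementation.

```python
from collections import deque
from datetime import datetime, timedelta, timezone

class TimeWindow:
    """Keep events in memory only for a fixed retention period."""

    def __init__(self, retention_seconds: int):
        self.retention = timedelta(seconds=retention_seconds)
        self._events = deque()            # (timestamp, event) in arrival order

    def add(self, event, timestamp=None):
        ts = timestamp or datetime.now(timezone.utc)
        self._events.append((ts, event))
        self._expire(ts)

    def _expire(self, now):
        # Drop events older than the retention period as new data arrives.
        while self._events and now - self._events[0][0] > self.retention:
            self._events.popleft()

    def snapshot(self):
        """Current in-memory contents, ready to be queried."""
        return [event for _, event in self._events]
```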

Data held in memory is analyzed based upon one or more criteria to identify patterns so decisions and actions may occur promptly and while there is still value, avoiding the Too Late architecture of existing systems and more recent Big Data models. For instance, a Fast Data architecture will (as illustrated in the sketch after this list):

  • Continuously run queries against the incoming streams of data to determine if matching conditions exist to take action.
  • Perform thousands or even millions of queries per second. Since new data is constantly arriving as old data ages, the conditions for whether or not to perform an action may change quickly.
  • Provide the flexibility to add, change or remove queries, as well as adjust how frequently queries are run across the data in memory.
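
The sketch below illustrates the continuous-query behavior in the list above, again with hypothetical names rather than TIBCO’s API. Each registered query is a predicate plus an action; all registered queries are re-evaluated against the in-memory window (such as the TimeWindow sketched earlier) whenever new data arrives, and queries can be added or removed without stopping the stream.

```python
class ContinuousQueryEngine:
    """Re-evaluate registered queries against an in-memory window on every event."""

    def __init__(self, window):
        self.window = window              # e.g. the TimeWindow sketched earlier
        self.queries = {}                 # name -> (predicate, action)

    def register(self, name, predicate, action):
        """Add or replace a query without stopping the stream."""
        self.queries[name] = (predicate, action)

    def remove(self, name):
        self.queries.pop(name, None)

    def on_event(self, event):
        self.window.add(event)
        current = self.window.snapshot()
        for name, (predicate, action) in self.queries.items():
            matches = [e for e in current if predicate(e)]
            if matches:
                action(name, matches)     # act while the data still has value

# Illustrative usage: flag any sensor reading above a threshold in the window.
# engine = ContinuousQueryEngine(TimeWindow(retention_seconds=300))
# engine.register("overheat",
#                 lambda e: e.source == "sensor" and e.payload.get("value", 0) > 90,
#                 lambda name, hits: print(f"{name}: {len(hits)} matching events"))
```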

By continuously running queries against an ever-changing set of data and then matching results to real-time actions, solutions based on the TIBCO Fast Data architecture can finally deliver the operational intelligence that enterprises need to take action while it still matters, which saves money, improves customer satisfaction and drives profitability.

Fast Data’s Real World Business Ramifications

Identifying and creating new revenue opportunities, improving operational efficiencies and driving down CAPEX and/or OPEX costs are just some of the possibilities that result from implementing a solution based on the TIBCO Fast Data architecture. By analyzing large amounts of data created within a defined period of time, then creating and executing queries based on business rules against that data in real time, enterprises can drive customer satisfaction and revenue in new and innovative ways.

For example, more and more people carry Internet-connected mobile devices. By associating the device with the customer and their location, a TIBCO Fast Data enabled solution can determine whether the individual is a new or returning customer and potentially even pull up that individual’s past purchases. Using that information, an email or text may be sent to the individual’s device that contains a coupon, offers a deal that is only valid while they are in the store, or recommends an item to buy based upon a prior purchase. This is context-aware marketing and customer service that adds value.
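
A toy sketch of the context-aware offer described above. The lookups (device-to-customer mapping, purchase history) are hypothetical stand-ins for whatever systems of record an enterprise already has.

```python
def make_offer(device_id: str, store_id: str,
               customers: dict, purchase_history: dict) -> dict:
    """Decide what to send to a device detected inside a store.

    customers: maps device_id -> customer_id (hypothetical lookup)
    purchase_history: maps customer_id -> list of past purchases
    """
    customer_id = customers.get(device_id)
    if customer_id is None:
        # Unknown device: treat as a new customer and send a welcome coupon.
        return {"type": "coupon",
                "text": f"Welcome! 10% off today at store {store_id}"}

    past = purchase_history.get(customer_id, [])
    if past:
        # Returning customer with history: recommend based on a prior purchase.
        return {"type": "recommendation",
                "text": f"Since you bought {past[-1]}, you may also like our new arrivals"}

    # Returning customer with no recorded purchases: in-store-only deal.
    return {"type": "deal",
            "text": "This offer is valid only while you are in the store"}
```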

Enterprises may similarly leverage a TIBCO Fast Data solution to improve operational efficiencies and/or drive down costs. For example, delivery services can improve efficiency with effective rerouting and minimize staffing by predicting and updating package volumes in real time. Real-time information can also enhance partner relationships and uncover new business opportunities. And the same Fast Data platform can simultaneously improve the customer experience by exposing more information about the real-time locations of packages and predictions for when they will be delivered.
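
As a rough illustration of the delivery example, the sketch below recomputes a customer-facing delivery estimate each time a new package-scan event streams in. The average-minutes-per-stop figure and the field names are assumptions for illustration only.

```python
from datetime import datetime, timedelta, timezone

def update_eta(scan_event: dict, remaining_stops: int,
               avg_minutes_per_stop: int = 12) -> dict:
    """Recompute a delivery estimate from the latest scan of a package.

    scan_event: dict with the package id and the time of the latest scan
    remaining_stops: stops left on the route before this package is delivered
    avg_minutes_per_stop: assumed average time per stop (hypothetical figure)
    """
    scanned_at = scan_event.get("time", datetime.now(timezone.utc))
    eta = scanned_at + timedelta(minutes=avg_minutes_per_stop * remaining_stops)
    return {"package_id": scan_event["package_id"],
            "last_seen": scanned_at.isoformat(),
            "estimated_delivery": eta.isoformat()}
```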

TIBCO Event Processing: Relevant, Real-time Operational Intelligence

Enterprises have long had access to multiple streams of Big Data generated by external sources such as social media platforms, mobile devices and the IoT, as well as internal sources such as device sensors and RFID tags. Yet maximizing the value of both historical analysis and live streaming data typically requires analyzing and acting upon the data within minutes or even seconds of its creation.

TIBCO Event Processing provides enterprises the TIBCO Fast Data architecture that they need to do real-time processing and analytics. By quickly ingesting and analyzing data and then making real-time decisions based upon it, TIBCO Event Processing gives enterprises the operational intelligence they need to make informed business decisions based upon the best data available in the environment in which they operate.


About Jerome M. Wendt

Jerome M. Wendt is the President and Lead Analyst of DCIG, Inc., an independent storage analyst and consulting firm. Mr. Wendt founded the company in September 2006.
