Complex Event Processing (CEP, not to be confused with Circular Error Probable) is defined as processing the many events happening across all layers of an organization, identifying the most meaningful events within the event cloud, analyzing their impact, and taking subsequent action in real time.
Oracle CEP is a Java application server for the development and deployment of high-performance event-driven applications. It can detect patterns in the flow of events and message payloads, often based on filtering, correlation, and aggregation across event sources, and includes industry-leading temporal and ordering capabilities. It supports ultra-high throughput (over 1 million events per second) and microsecond latency.
Tibco is also trying to get into this market (it claims a 40% share of the public CEP market 😉, though it probably has not counted the DoE and DoD as part of that market).
What it is: Methods 1 through 3 look at historical data in traditional architectures, with information stored in the warehouse. In this environment, it often takes months of data cleansing and preparation to get the data ready to analyze. Now, what if you want to make a decision or determine the effect of an action in real time, as a sale is made, for instance, or at a specific step in the manufacturing process? With streaming data architectures, you can look at data in the present and make immediate decisions. The larger flood of data coming from smartphones, online transactions, and smart-grid houses will continue to increase the amount of data that you might want to analyze but not keep. Real-time streaming, complex event processing (CEP), and analytics will all come together here to let you decide on the fly which data is worth keeping and which data to analyze in real time and then discard.
When you use it: Radio-frequency identification (RFID) offers a good use case for this type of architecture. RFID tags provide a lot of information, but unless the state of the item changes, you don’t need to keep warehousing the data about that object every day. You only keep data when the item moves through the door and out of the warehouse.
The same concept applies to a customer who does the same thing over and over. You don’t need to keep storing data for analysis of a regular pattern, but if the customer changes that pattern, you might want to start paying attention.
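The "keep only what changed" idea above can be sketched in a few lines of Python; this is a hypothetical illustration, not any vendor's engine, and the tag and location names are made up:

```python
# Hypothetical sketch of the "keep only changes" filter described above:
# stream events through a filter that discards readings whose state
# matches the last state seen for that item, keeping only transitions.

def changed_events(events):
    """Yield only events whose state differs from the item's last state."""
    last_state = {}  # item_id -> last seen state
    for item_id, state in events:
        if last_state.get(item_id) != state:
            last_state[item_id] = state
            yield (item_id, state)

# An RFID-style feed: most readings repeat the same location.
feed = [
    ("tag1", "warehouse"),
    ("tag1", "warehouse"),      # unchanged -> discarded
    ("tag1", "loading dock"),   # state change -> kept
    ("tag2", "warehouse"),
    ("tag1", "loading dock"),   # unchanged -> discarded
]
print(list(changed_events(feed)))
# -> [('tag1', 'warehouse'), ('tag1', 'loading dock'), ('tag2', 'warehouse')]
```

The same filter applies to the customer example: as long as behavior matches the stored pattern, the events are discarded rather than warehoused.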
Figure 4: Traditional architecture vs. streaming architecture
In academia there is something called the SASE language.
The query below retrieves the total trading volume of Google stock in the four-hour period after some bad news occurred.
PATTERN SEQ(News a, Stock+ b[])
WHERE [symbol] AND a.type = 'bad' AND b[i].symbol = 'GOOG'
WITHIN 4 hours
HAVING b[b.LEN].volume < 80% * b[1].volume
RETURN sum(b[].volume)
The next query reports a one-hour period in which the price of a stock increased from 10 to 20 and its trading volume stayed relatively stable.
PATTERN SEQ(Stock+ a[])
WHERE [symbol] AND a[1].price = 10 AND a[i].price > a[i-1].price AND a[a.LEN].price = 20
WITHIN 1 hour
HAVING avg(a[].volume) ≥ a[1].volume
RETURN a[1].symbol, a[].price
The third query detects a more complex trend: in an hour, the volume of a stock started high, but after a period of price increasing or staying relatively stable, the volume plummeted.
PATTERN SEQ(Stock+ a[], Stock b)
WHERE [symbol] AND a[1].volume > 1000 AND a[i].price > avg(a[..i-1].price) AND b.volume < 80% * a[a.LEN].volume
WITHIN 1 hour
RETURN a[1].symbol, a[].(price,volume), b.(price,volume)
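The logic of that third query can be sketched procedurally in Python. This is not SASE itself, just a rough check over one symbol's window of events, with made-up prices and volumes:

```python
# Rough procedural sketch of the third SASE query's conditions:
# within a window of (price, volume) events for one symbol, the first
# volume is high (> 1000), each price beats the running average of the
# earlier prices, and the final event's volume falls below 80% of the
# last matched event's volume.

def volume_plummet(window):
    """window: list of (price, volume) tuples for one symbol, in time order."""
    if len(window) < 2:
        return False
    *a, b = window                      # a[] plays Stock+ a[], b the final Stock b
    if a[0][1] <= 1000:                 # a[1].volume > 1000
        return False
    for i in range(1, len(a)):          # a[i].price > avg(a[..i-1].price)
        prior_avg = sum(p for p, _ in a[:i]) / i
        if a[i][0] <= prior_avg:
            return False
    return b[1] < 0.8 * a[-1][1]        # b.volume < 80% * a[a.LEN].volume

window = [(10.0, 2000), (10.5, 1900), (11.0, 1850), (11.2, 900)]
print(volume_plummet(window))  # price climbs, then volume plummets -> True
```

A real CEP engine would evaluate this incrementally over the stream rather than on a materialized window, but the predicate being tested is the same.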
(note from Ajay-
I was not really happy with the depth of resources on CEP available online; there seem to be missing bits and pieces in open-source, academic, and corporate information alike. One reason for this is the obvious military dual use of this technology, like feeds from satellites, audio scans, etc.)
On Demand entertainment I need to hear
On Demand information of webcasts, white papers dear
On demand downloads of information I am told I really need
Sometimes it is tough to keep which is shallow what is deep
Is it really on demand or were you overwhelmed and manipulated by the supply
On Demand Supply and estimates of forecasts of influencer of the demand
Friendship is also on demand
How many Fans, Followers, Likes can you get
Before your critical mass makes you Viral
Like a Video Bieber whose clothes are torn by crowds
Searching for your 900 seconds of On Demand fame
You want to be paid on demand but work only on a creative fancy
Your on demand laziness is too demanding now
Ceteris Paribus, On demand is too much to demand
and much too always on, 24 7
Give me a book a friend and some peace and quiet
Bet you things aren’t there on supply but always on demand
Or are they?
From the good folks at AsterData, a webcast on a slightly interesting analytics topic
Enterprises and government agencies can become overwhelmed with information. The value of all that data lies in the insights it can reveal. To get the maximum value, you need an analytic platform that lets you analyze terabytes of information rapidly for immediate actionable insights.
Aster Data’s massively parallel database with an integrated analytics engine can quickly reveal hard-to-recognize trends on huge datasets which other systems miss. The secret? A patent-pending SQL-MapReduce framework that enables business analysts and business intelligence (BI) tools to iteratively analyze big data more quickly. This allows you to find anomalies more quickly and stop disasters before they happen.
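The SQL-MapReduce framework itself is Aster Data's proprietary technology, but the map/reduce pattern it builds on can be sketched in plain Python: map each row to a key/value pair, then reduce the values per key. The table shape and column names here are made up for illustration:

```python
# Sketch of the generic map/reduce pattern underlying frameworks like
# SQL-MapReduce: a map phase emits (key, value) pairs per row, and a
# reduce phase aggregates values per key -- here, total transaction
# amount per account, the kind of per-group aggregate a fraud screen
# might compute. (Column names are hypothetical.)

from collections import defaultdict

def map_phase(rows):
    """Emit one (account, amount) pair per transaction row."""
    for row in rows:
        yield row["account"], row["amount"]

def reduce_phase(pairs):
    """Sum values per key, as a reducer would across partitions."""
    totals = defaultdict(float)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

rows = [
    {"account": "A", "amount": 120.0},
    {"account": "B", "amount": 40.0},
    {"account": "A", "amount": 75.5},
]
print(reduce_phase(map_phase(rows)))
# -> {'A': 195.5, 'B': 40.0}
```

In a massively parallel database, the map and reduce phases run distributed across nodes and are invocable from SQL; the single-process sketch only shows the shape of the computation.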
Discover how you can improve:
Network intelligence via graph analysis to understand connectivity among suspects, information propagation, and the flow of goods
Security analysis to prevent fraud, bot attacks, and other breaches
Geospatial analytics to quickly uncover details about regions and the communities within them
Visual analytics to derive deeper insights more quickly