Test drive a Chrome notebook.


Want to test out the new Chrome OS?

Go to https://services.google.com/fb/forms/cr48basic/

and fill out the form.

Test drive a Chrome notebook.

We have a limited number of Chrome notebooks to distribute, and we need to ensure that they find good homes. That’s where you come in. Everything is still very much a work in progress, and it’s users, like you, that often give us our best ideas about what feels clunky or what’s missing. So if you live in the United States, are at least 18 years old, and would like to be considered for our small Pilot program, please fill this out. We’ll review the requests that come in and contact you if you’ve been selected.

https://services.google.com/fb/forms/cr48basic/

 

EU files antitrust case against Google to reduce budget deficit

From the Gray Lady (The New York Times):

http://www.nytimes.com/2010/12/01/technology/01google.html?_r=1&hpw

Google’s dominance on the Internet has been a sore point in Europe, where it controls more than 80 percent of the online search market, compared with about 66 percent in the United States, according to comScore, a research firm.

and

If Google is found in violation of European competition law, the commission has the power to fine it up to 10 percent of its annual revenue, which totaled more than $23 billion last year.

Before settling last year, Microsoft had paid fines of about $2.4 billion over the past decade in a long-running antitrust case in Brussels that focused on the Windows operating system.

In another case, the commission fined Intel about $1.45 billion for abusing its dominance in the computer chip market.


——————————————————————————————

Maybe Google should ask the European Union to buy a Groupon for antitrust cases.


 

 

AsterData partners with Tableau


Tableau, which has been making waves recently with its great new data visualization tool, announced a partnership with my old friends at AsterData. It's a really cool piece of data vis and very, very fast on the desktop, so I can imagine what speed it can bring to AsterData's MPP Row and Column Zingbang AND Parallel Analytical Functions.

Tableau and AsterData also share a common Stanfordian connection (but it seems software is divided quite equally between Stanford, Harvard dropouts, and North Carolina).

It remains to be seen how much each company can leverage this partnership, whether it turns out like the SAS Institute-AsterData partnership of last year, or whether it is just an announcement of connectors so their software can talk to each other.

See a Tableau vis at

http://public.tableausoftware.com/views/geographyofdiabetes/Dashboard2?:embed=yes&:toolbar=yes

AsterData remains the team with potential, but I would be wrong to say MapReduce-SQL is as hot in December 2010 as it was in June 2009, and the elephant in the room would be Hadoop. That, and Google's continued shyness about cashing in on its principal competency of handling Big Data (but hush, I signed an NDA for the Google Prediction API, so things maaaay change very rapidly on, ahem, that cloud).

Disclaimer- AsterData was my internship sponsor during my winter training while at the Univ of Tenn.

 

Brief Interview with Timo Elliott

Here is a brief interview with Timo Elliott, a 19-year veteran of SAP BusinessObjects.

Ajay- What were the top 5 events in Business Intelligence and Data Visualization you saw in 2010, and what are the top three trends you see for 2011?


Timo-

Top five events in 2010:

(1) Back to strong market growth. IT spending plummeted last year (BI continued to grow, but more slowly than previous years). This year, organizations reopened their wallets and funded new analytics initiatives — all the signs indicate that BI market growth will be double that of 2009.

(2) The launch of the iPad. Mobile BI has been around for years, but the iPad opened the floodgates of organizations taking a serious look at mobile analytics — and the easy-to-use, executive-friendly iPad dashboards have considerably raised the profile of analytics projects inside organizations.

(3) Data warehousing got exciting again. Decades of incremental improvements (column databases, massively parallel processing, appliances, in-memory processing…) all came together with robust commercial offers that challenged existing data storage and calculation methods. And new “NoSQL” approaches, designed for the new problems of massive amounts of less-structured web data, started moving into the mainstream.

(4) The end of Google Wave, the start of social BI. Google Wave was launched as a rethink of how we could bring together email, instant messaging, and social networks. While Google decided to close down the technology this year, it has left its mark, notably by influencing the future of “social BI”, with several major vendors bringing out commercial products this year.

(5) The start of the big BI merge. While several small independent BI vendors reported strong growth, the major trend of the year was consolidation and integration: the BI megavendors (SAP, Oracle, IBM, Microsoft) increased their market share (sometimes by acquiring smaller vendors, e.g. IBM/SPSS and SAP/Sybase) and integrated analytics with their existing products, blurring the line between BI and other technology areas.

Top three trends next year:

(1) Analytics, reinvented. New DW techniques make it possible to do sub-second, interactive analytics directly against row-level operational data. Now BI processes and interfaces need to be rethought and redesigned to make best use of this — notably by blurring the distinctions between the “design” and “consumption” phases of BI.

(2) Corporate and personal BI come together. The ability to mix corporate and personal data for quick, pragmatic analysis is a common business need. The typical solution to the problem — extracting and combining the data into a local data store (either Excel or a departmental data mart) — pleases users, but introduces duplication and extra costs and makes a mockery of information governance. 2011 will see the rise of systems that let individuals and departments load their data into personal spaces in the corporate environment, allowing pragmatic analytic flexibility without compromising security and governance.

(3) The next generation of business applications. Where are the business applications designed to support what people really do all day, such as implementing this year’s strategy, launching new products, or acquiring another company? 2011 will see the first prototypes of people-focused, flexible, information-centric, and collaborative applications, bringing together the best of business intelligence, “enterprise 2.0”, and existing operational applications.

And one that should happen, but probably won’t:

(4) Intelligence = Information + PEOPLE. Successful analytics isn’t about technology — it’s about people, process, and culture. The biggest trend in 2011 should be organizations spending the majority of their efforts on user adoption rather than technical implementation.

About- http://timoelliott.com/blog/about

Timo Elliott is a 19-year veteran of SAP BusinessObjects, and has spent the last twenty years working with customers around the world on information strategy.

He works closely with SAP research and innovation centers around the world to evangelize new technology prototypes.

His popular Business Analytics and SAPWeb20 blogs track innovation in analytics and social media, including topics such as augmented corporate reality, collaborative decision-making, and social network analysis.

His PowerPoint Twitter Tools lets presenters see and react to tweets in real time, embedded directly within their slides.

A popular and engaging speaker, Elliott presents regularly to IT and business audiences at international conferences, on subjects such as why BI projects fail and what to do about it, and the intersection of BI and enterprise 2.0.

Prior to Business Objects, Elliott was a computer consultant in Hong Kong and led analytics projects for Shell in New Zealand. He holds a first-class honors degree in Economics with Statistics from Bristol University, England. He blogs at http://timoelliott.com/blog/ (one of the best designed blogs in BI). You can see more about him on his personal web site here and his photo/sketch blog here. You should follow Timo at http://twitter.com/timoelliott

Art Credit- Timo Elliott


Complex Event Processing- SASE Language


Complex Event Processing (CEP, not to be confused with circular error probable) is defined as processing the many events happening across all the layers of an organization, identifying the most meaningful events within the event cloud, analyzing their impact, and taking subsequent action in real time.

Software supporting CEP includes:

Oracle http://www.oracle.com/us/technologies/soa/service-oriented-architecture-066455.html

Oracle CEP is a Java application server for the development and deployment of high-performance event driven applications. It can detect patterns in the flow of events and message payloads, often based on filtering, correlation, and aggregation across event sources, and includes industry leading temporal and ordering capabilities. It supports ultra-high throughput (1 million/sec++) and microsecond latency.

TIBCO is also trying to get into this market (it claims a 40% share of the public CEP market 😉 though probably it has not counted the DoE and DoD as part of that market yet)

– see the webcast by TIBCO’s head here: http://www.tibco.com/products/business-optimization/complex-event-processing/default.jsp

and product info here: http://www.tibco.com/products/business-optimization/complex-event-processing/businessevents/default.jsp

TIBCO is the undisputed leader in complex event processing (CEP) software with over 40 percent market share, according to a recent IDC Study.

A good explanation of how social media itself can be used as an analogy for CEP is given in this SAS Global Forum paper:

http://support.sas.com/resources/papers/proceedings10/040-2010.pdf

You can see a report on Predictive Analytics and Data Mining in Q1 2010, also from SAS’s website, at http://www.sas.com/news/analysts/forresterwave-predictive-analytics-dm-104388-0210.pdf

A very good explanation of the architecture involved is given by SAS CTO Keith Collins on SAS’s Knowledge Exchange site,

http://www.sas.com/knowledge-exchange/risk/four-ways-divide-conquer.html

What it is: Methods 1 through 3 look at historical data and traditional architectures with information stored in the warehouse. In this environment, it often takes months of data cleansing and preparation to get the data ready to analyze. Now, what if you want to make a decision or determine the effect of an action in real time, as a sale is made, for instance, or at a specific step in the manufacturing process. With streaming data architectures, you can look at data in the present and make immediate decisions. The larger flood of data coming from smart phones, online transactions and smart-grid houses will continue to increase the amount of data that you might want to analyze but not keep. Real-time streaming, complex event processing (CEP) and analytics will all come together here to let you decide on the fly which data is worth keeping and which data to analyze in real time and then discard.

When you use it: Radio-frequency identification (RFID) offers a good user case for this type of architecture. RFID tags provide a lot of information, but unless the state of the item changes, you don’t need to keep warehousing the data about that object every day. You only keep data when it moves through the door and out of the warehouse.

The same concept applies to a customer who does the same thing over and over. You don’t need to keep storing data for analysis on a regular pattern, but if they change that pattern, you might want to start paying attention.
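The "keep data only when the state changes" idea above can be sketched in a few lines of Python. The tag IDs and states here are made up for illustration; no real RFID feed or API is assumed:

```python
def changes_only(readings):
    """Yield a reading only when a tag's state differs from its last
    known state, discarding the repetitive 'nothing changed' events."""
    last_state = {}  # tag_id -> last observed state
    for tag_id, state in readings:
        if last_state.get(tag_id) != state:
            last_state[tag_id] = state
            yield (tag_id, state)

readings = [
    ("tag1", "shelf"), ("tag1", "shelf"), ("tag1", "shelf"),
    ("tag1", "door"),  ("tag2", "shelf"), ("tag1", "shipped"),
]
# Only the four state transitions survive; the repeats are dropped.
print(list(changes_only(readings)))
```

The same filter works for the repeat-customer case: store nothing while the pattern holds, and emit an event the moment it breaks.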

Figure 4: Traditional architecture vs. streaming architecture

 

In academia, there is something called the SASE language:

  • A rich declarative event language
  • Formal semantics of the event language
  • Theoretical underpinnings of CEP
  • An efficient automata-based implementation

http://sase.cs.umass.edu/

and

http://avid.cs.umass.edu/sase/index.php?page=navleft_1col

Financial Services

The query below retrieves the total trading volume of Google stock in the 4-hour period after some bad news occurred.

PATTERN SEQ(News a, Stock+ b[ ])
WHERE   [symbol]
   AND  a.type = 'bad'
   AND  b[i].symbol = 'GOOG'
WITHIN  4 hours
HAVING  b[b.LEN].volume < 80% * b[1].volume
RETURN  sum(b[ ].volume)

The next query reports a one-hour period in which the price of a stock increased from 10 to 20 and its trading volume stayed relatively stable.

PATTERN SEQ(Stock+ a[])
WHERE   [symbol]
   AND  a[1].price = 10
   AND  a[i].price > a[i-1].price
   AND  a[a.LEN].price = 20
WITHIN  1 hour
HAVING  avg(a[].volume) ≥ a[1].volume
RETURN  a[1].symbol, a[].price

The third query detects a more complex trend: in an hour, the volume of a stock started high, but after a period of price increasing or staying relatively stable, the volume plummeted.

PATTERN SEQ(Stock+ a[], Stock b)
WHERE   [symbol]
   AND  a[1].volume > 1000
   AND  a[i].price > avg(a[…i-1].price)
   AND  b.volume < 80% * a[a.LEN].volume
WITHIN  1 hour
RETURN  a[1].symbol, a[].(price, volume), b.(price, volume)
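As a rough illustration, here is what the first query above computes, sketched in plain Python rather than in the SASE engine. The event layout is invented for the example, timestamps are plain numbers standing in for hours, and the HAVING clause on declining volume is ignored:

```python
def volume_after_bad_news(events, window=4):
    """Sum GOOG trade volume occurring within `window` time units
    after the most recent 'bad' news event.
    events: list of (time, kind, payload) tuples in time order."""
    total = 0.0
    bad_news_time = None
    for t, kind, payload in events:
        if kind == "news" and payload.get("type") == "bad":
            bad_news_time = t
        elif (kind == "stock" and payload.get("symbol") == "GOOG"
              and bad_news_time is not None
              and t - bad_news_time <= window):
            total += payload["volume"]
    return total

events = [
    (0, "news",  {"type": "bad"}),
    (1, "stock", {"symbol": "GOOG", "volume": 500}),
    (3, "stock", {"symbol": "AAPL", "volume": 900}),  # wrong symbol
    (5, "stock", {"symbol": "GOOG", "volume": 200}),  # outside 4-hour window
]
print(volume_after_bad_news(events))  # 500.0
```

A real CEP engine does this declaratively and incrementally over an unbounded stream, which is exactly what the PATTERN/WHERE/WITHIN clauses express.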

(Note from Ajay: I was not really happy with the depth of resources on CEP available online. There seem to be missing bits and pieces across open source, academic, and corporate information alike; one reason for this is the obvious military dual use of this technology, like feeds from satellites, audio scans, etc.)

Tale of Two Analytical Interfaces

Occam’s razor (or Ockham’s razor[1]) is often expressed in Latin as the lex parsimoniae (translating to the law of parsimony, law of economy, or law of succinctness). The principle is popularly summarized as “the simplest explanation is more likely the correct one.”

Using a simple screenshot, you can see that Facebook Analytics for a Facebook page is simpler at explaining who is coming to visit than the Google Analytics dashboard (which has not seen the attention of a visual UI or graphic redesign).

And if Facebook is going to take over the internet, well, it is definitely giving better analytics in the process. What do you think?

Which interface is simpler, and which gives you better targeting? Ignore the numbers and just look at the metrics measured and the way they are presented. Coincidentally, R is used a lot at Facebook (which has given us the jjplot package), while Google has NOT INVESTED MAJOR MONEY in creating premium R packages or Big Data packages. I am talking investment at the scale Google is known for, not measly meetups.

(the Summer of Code doesn't count; it is for students mostly)

(but thanks for the pizza, G Men, and maybe revise that GA interface by putting a razor to some metrics)

GA vs Facebook Analytics

 

Who searches for this Blog?


Using WP-Stats, I set about answering this question:

What search keywords lead here?

Clearly Michael Jackson is down this year

And R GUI, Data Mining is up.

How does that affect my writing, given that I get almost 250 visitors daily from search engines alone? Assume I write nothing on this blog from now on.

It doesn't. I still write whatever code or poem comes to my mind. So it is hurtful when people underestimate the effort in writing and jump to conclusions (especially if I write about a company: I am not on the payroll of that company, just as when I write about a poem, I am not a full-time poet).

Over to xkcd

All Time (for Decisionstats.Wordpress.com)

Search term / Views
libre office 818
facebook analytics 806
michael jackson history 240
wps sas lawsuit 180
r gui 168
wps sas 154
wordle.net 118
sas wps 116
decision stats 110
sas wps lawsuit 100
google maps jet ski 94
data mining 88
doug savage 72
hive tutorial 63
spss certification 63
hadley wickham 63
google maps jetski 62
sas sues wps 60
decisionstats 58
donald farmer microsoft 45
libreoffice 44
wps statistics 44
best statistics software 42
r gui ubuntu 41
rstat 37
tamilnadu advanced technical training institute tatti 37
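As an aside, several of the keywords above are word-order variants of the same query ("wps sas", "sas wps", and so on). A quick Python sketch, using the counts from the all-time table above, groups them by their sorted words to show the combined interest:

```python
from collections import Counter

# Counts copied from the all-time search table above.
counts = {
    "wps sas lawsuit": 180, "wps sas": 154, "sas wps": 116,
    "sas wps lawsuit": 100, "sas sues wps": 60,
}

# Group word-order variants under one key: the phrase's sorted words.
combined = Counter()
for phrase, n in counts.items():
    combined[" ".join(sorted(phrase.split()))] += n

print(combined)  # lawsuit variants and plain "sas wps" variants merged
```

Grouped this way, the SAS vs. WPS lawsuit is clearly the single biggest search driver after LibreOffice and Facebook Analytics.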

YTD

2009-11-24 to Today

Search term / Views
libre office 818
facebook analytics 781
wps sas lawsuit 170
r gui 164
wps sas 125
wordle.net 118
sas wps 101
sas wps lawsuit 95
google maps jet ski 94
data mining 86
decision stats 82
doug savage 63
hadley wickham 63
google maps jetski 62
hive tutorial 56
donald farmer microsoft 45