Using Code Snippets in Revolution R

So I am still testing Revo R on the 64-bit AMI I created over the weekend, and I really like the code snippets feature in Revolution R.

Code snippets work in a fairly simple way.

Right click, then click on Insert Code Snippet.

You get a drop-down of tasks (like Analysis). Selecting Analysis gives you another list of tasks (like Clustering).

Once you click on Clustering you get various options; clicking clara, for example, auto-inserts the code.
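For the curious, the inserted code looks broadly like a standard clara call. This is my own minimal sketch using the cluster package, with a placeholder dataset and cluster count; the snippet's actual defaults may differ:

# Sketch of the kind of code a clara snippet inserts
# (clara = Clustering LARge Applications, from the cluster package)
library(cluster)
fit <- clara(iris[, 1:4], k = 3)   # k = number of clusters, chosen arbitrarily here
print(fit$medoids)                 # the medoid (representative row) of each cluster
plot(fit)                          # cluster and silhouette plots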

Now even if you are averse to using a GUI, or the GUI's creators did not include your particular analysis, you can basically type in code at an extremely fast pace.

It is useful because you do not have to type in the entire code, and it is a boon to beginners, as the parameters of the function inserted by the code snippet are automatically highlighted in multiple colors.

Also, separately, if you are typing code for a function and hover over it, the various parameters for that particular function are shown.

It is quite possibly the fastest way to write R code, and it is unmatched by the other code editors I am testing, including Vim, Notepad++, Eclipse R, etc.

The RPE (R Productivity Environment for Windows; the horrible bureaucratic name is the only flaw here) thus helps, as it is quite thoughtfully designed. Interestingly, they even have a record-macro feature, which I am quite unsure about, but it looks like it automates some tasks. That's next 🙂

See screenshot –

It would be quite nice to see whether the new Revo R GUI, if and when it becomes available, is equally intuitively designed; considering the company now counts a founder of SPSS and one founder of R* among its members, it should be a keenly anticipated product. Again, Revolution could also try creating a paid Amazon AMI and renting the software by the hour, at least as a technology demonstrator, since the big analytics world seems unaware of the work they have been up to.

*(without getting much noise on how much the other founder of R loves Revo 😉)

Clustering Business Analysts and Industry Analysts

In my interactions with the world at large (mostly online) in the ways of data, statistics, and analytics, I come across people who like to call themselves analysts.

As I see it, there are principally four kinds of analysts:

1) Corporate Analysts- They work for a particular software company. According to them, their product is great and infallible, their code has no bugs, and the last zillion customer case studies all got a big benefit from buying their software.

They are very good at writing software code themselves; unfortunately, this expertise is restricted to Microsoft Outlook (emails) and MS PowerPoint (presentations). No, they are more like salesmen than analysts, but as Arthur Miller said, "All salesmen (persons) are dreamers. When the dream dies, the salesman (person) dies" (read: transfers to a bigger job at a rival company).

2) Third-Party Independent Analysts- The main reason they are third party is that they cannot be tolerated in a normal corporate culture, their spouse can barely stand them for more than two hours a day, and their intelligence is not matched by their emotional maturity. Alas, after turning independent analysts, they realize they are actually more dependent on people than before, and they quickly polish their behaviour to praise whoever is sponsoring their webinar, white paper, or newsletter, or flying them to junkets. They are more like boutique consultants, but they used to be quite nifty at writing code when younger, so they call themselves independent and "Noted Industry Analyst".

3) Researcher Analysts- They mostly scrape info from press releases, which are mostly written by a hapless, overworked communications team thrown at the task at the last moment. They get on a one-hour call with whoever the press or industry/analyst relations honcho is, turn the press release into bullet points, and publish it on the blog. They call this research and give it away for free (but actually couldn't get anyone to pay for it for the last 4 years). They couldn't write code if their life depended on it, but you will usually find "transformation" and "expert" somewhere in their resume/about-me web page. They may have co-authored a book, which would have gotten them an F for plagiarism had they submitted it as a thesis.

4) Analytical Analysts- They are mostly buried deep within organizational bureaucracies if corporate, or within partnerships if independent. They understand coding and innovation (or creativity). Not very aggressive at networking unless provoked by an absolute idiot belonging to the first three classes of industry analyst. They prefer reading Atlas Shrugged to arguing over business semantics.

Next time you see an industry expert, you know which cluster to classify them into 😉

Image Citation-

http://gapingvoidgallery.com/

Economics: Indian Caste System - A Simplification

I am often asked by Western and non-Indian people about the caste system. It trips me up a lot trying to explain its complexity, necessity, and current scenario given the history.

Here is an effort. The Indian/Hindu caste system was primarily an economic system to divide labor. In the original Manusmriti, named after the king Manu, it was flexible.

A son of a blue-collar worker could become a warrior if he was brave, and so on.

A couple of centuries later, the top castes, primarily the priests, decided to make it rigid. No more social intermingling or marriage between castes, and no more migration of occupation regardless of merit.

This led to a lot of lower caste people leaving Hinduism to join religions like Islam (post 1000 AD, with the Muslim invasions and Mughal rule) and Christianity (post the arrival of the English).


Post 1947, many of the "lower castes" preferred to remain within Hinduism but adopted Buddhism as their primary mode of worship. Also, India's leaders in the 1940s, many of whom were educated in the UK as lawyers (including Mahatma Gandhi, Subhash Chandra Bose, and Jawaharlal Nehru), decided this system had weakened the nation state and divided the energies of India, besides being obviously inhumane and degrading.

The Constitution of India was shepherded in 1950 by an assembly led by Dr. B. R. Ambedkar, one of the very first educated members of the lower castes (also called Harijan, after Mahatma Gandhi's name for them; literally Hari-jan, people of the Lord). That Constitution endures, as India remains the finest example of a democracy in the non-Western world.

The Indian constitution established a 15% reservation in government jobs and in educational institutes (at the college and masters level) for the lowest and most educationally backward castes (hence called Scheduled Castes), and a 7.5% reservation in government jobs for tribal people (hence called Scheduled Tribes). The provision is renewed every 10 years. Think of it as constitutionally bound affirmative action.

In 1990, another 27% of jobs and educational seats were reserved for castes that were socially okay but educationally backward. This caused some riots, delays, and political action, but it was finally implemented by 2007.

Opponents of the new affirmative action say that this is like doing two wrongs to make a right. Supporters say the data proves that reservation has led to social advancement (especially in the state of Tamil Nadu). Rollback of the new system is a political impossibility, thanks to unity among the hitherto repressed classes.

As an upper caste Hindu (embarrassingly enough, my caste is both a warrior and a kingly royal caste, which gives me zero benefit in 2010 AD)…

In God we trust. All others must bring data.

Unfortunately, when it comes to politics, the same data is either hidden, partially hidden, or interpreted in different ways, especially with regard to projecting sampling error onto decisions.

Phew! That was an analytical layman's definition of the Indian caste system over 2,000 years.

Note- The Indian soldier caste is Kshatriyas, not Kshatritas.

Interview: Dean Abbott, Abbott Analytics

Here is an interview with noted analytics consultant and trainer Dean Abbott. Dean is scheduled to give a workshop on predictive analytics at PAW (Predictive Analytics World) on Oct 18, 2010 in Washington, D.C.

Ajay- Describe your upcoming hands-on workshop at Predictive Analytics World and how it can help people learn more about predictive modeling.

Refer- http://www.predictiveanalyticsworld.com/dc/2010/handson_predictive_analytics.php

Dean- The hands-on workshop is geared toward individuals who know something about predictive analytics but would like to experience the process. It will help people in two regards. First, by going through the data assessment, preparation, modeling and model assessment stages in one day, the attendees will see how predictive analytics works in reality, including some of the pain associated with false starts and mistakes. At the same time, they will experience success with building reasonable models to solve a problem in a single day. I have found that for many, having to actually build the predictive analytics solution is an eye-opener. Demonstrations show the capabilities of a tool, but the greater value for an end-user is the development of intuition about what to do at each stage of the process, which makes the theory of predictive analytics real.

Second, they will gain experience using a top-tier predictive analytics software tool, Enterprise Miner (EM). This is especially helpful for those who are considering purchasing EM, but also for those who have used open source tools and have never experienced the additional power and efficiencies that come with a tool that is well thought out from a business solutions standpoint (as opposed to an algorithm workbench).

Ajay- You are an instructor with software ranging from SPSS, S-Plus, SAS Enterprise Miner, and Statistica to CART. Which features of each software package do you like best, and which are more suited for application in particular data cases?

Dean- I'll add Tibco Spotfire Miner, PolyAnalyst and Unica's Predictive Insight to the list of tools I've taught "hands-on" courses around, and there are at least a half dozen more I demonstrate in lecture courses (JMP, Matlab, WizWhy, R, GGobi, RapidMiner, Orange, Weka, RandomForests and TreeNet, to name a few). The development of software is a fascinating undertaking, and each tool has its own strengths and weaknesses.

I personally gravitate toward tools with a data flow / icon interface because I think more that way, and I've tired of learning more programming languages.

Since the predictive analytics algorithms are roughly the same (backprop is backprop no matter which tool you use), the key differentiators are

(1) how data can be loaded in and how tightly integrated can the tool be with the database,

(2) how well big data can be handled,

(3) how extensive are the data manipulation options,

(4) how flexible are the model reporting options, and

(5) how can you get the models and/or predictions out.

There are vast differences in the tools on these matters, so when I recommend tools for customers, I usually interview them quite extensively to understand better how they use data and how the models will be integrated into their business practice.

A final consideration is related to the efficiency of using the tool: how much automation can one introduce so that user-interaction is minimized once the analytics process has been defined. While I don’t like new programming languages, scripting and programming often helps here, though some tools have a way to run the visual programming data diagram itself without converting it to code.

Ajay- What are your views on the increasing trend of consolidation and mergers and acquisitions in the predictive analytics space? Does this increase the need for vendor-neutral analysts and consultants, as well as conferences?

Dean- When a company buys a predictive analytics software package, it's a mixed bag. SPSS's purchase of Clementine was ultimately good for predictive analytics, though it took several years for SPSS to figure out what they wanted to do with it. Darwin ultimately disappeared after being purchased by Oracle, but the newer Oracle data mining tool, ODM, integrates better with the database than Darwin did or even would have been able to.

The biggest trend and pressure for the commercial vendors is the improvement in open source and GNU tools. These are becoming more viable for enterprise-level customers with big data, though from what I've seen, they haven't caught up with the big commercial players yet. There is great value in bringing both commercial and open source tools to the attention of end-users in the context of solutions (rather than sales) in a conference setting, which I think is an advantage that Predictive Analytics World has.

As a vendor-neutral consultant, flux is always a good thing, because I have to be proficient in a variety of tools, and it is that breadth that brings value for customers entering the predictive analytics space. But it is very difficult to keep up with the rapidly changing market, and that is something I am weighing myself: how many tools should I keep in my active toolbox?

Ajay- Describe your career and how you came into the predictive analytics space. What are your views on the various MS Analytics degrees offered by universities?

Dean- After getting a masters degree in Applied Mathematics, my first job was at a small aerospace engineering company in Charlottesville, VA called Barron Associates, Inc. (BAI); it is still in existence and doing quite well! I was working on optimal guidance algorithms for some developmental missile systems, and statistical learning was a key part of the process, so I cut my teeth on pattern recognition techniques there, and frankly, that was the most interesting part of the job. In fact, most of us agreed that this was the most interesting part: John Elder (Elder Research) was the first employee at BAI and was there at that time. Gerry Montgomery and Paul Hess were there as well, and left to form a data mining company called AbTech; they are still in the analytics space.

After working at BAI, I had short stints at Martin Marietta Corp. and PAR Government Systems, where I worked on analytics solutions in DoD, primarily radar and sonar applications. It was while at Elder Research in the 90s that I began working more in the commercial space, in financial and risk modeling, and then in 1999 I began working as an independent consultant.

One thing I love about this field is that the same techniques can be applied broadly, and therefore I can work on CRM, web analytics, tax and financial risk, credit scoring, survey analysis, and many more applications, and cross-fertilize ideas from one domain into other domains.

Regarding MS degrees, let me first write that I am very encouraged that data mining and predictive analytics are being taught in specific classes and programs, rather than just as an add-on to an advanced statistics or business class. That stated, I have mixed feelings about analytics offerings at universities.

I find that most provide a good theoretical foundation in the algorithms but are weak in describing the entire process in a business context. For those building predictive models, the model-building stage nearly always takes much less time than getting the data ready for modeling and reporting results. These are cross-discipline tasks, requiring some understanding of the database world and the business world, so that we define the target variable(s) properly and clean up the data so that the predictive analytics algorithms work well.

The programs that have a practicum of some kind are the most useful, in my opinion. There are some certificate programs out there that have more of a business-oriented framework, and the NC State program builds an internship into the degree itself. These are positive steps in the field that I’m sure will continue as predictive analytics graduates become more in demand.

Biography-

DEAN ABBOTT is President of Abbott Analytics in San Diego, California. Mr. Abbott has over 21 years of experience applying advanced data mining, data preparation, and data visualization methods to real-world, data-intensive problems, including fraud detection, response modeling, survey analysis, planned giving, predictive toxicology, signal processing, and missile guidance. In addition, he has developed and evaluated algorithms for use in commercial data mining and pattern recognition products, including polynomial networks, neural networks, radial basis functions, and clustering algorithms, and has consulted with data mining software companies to provide critiques and assessments of their current features and future enhancements.

Mr. Abbott is a seasoned instructor, having taught a wide range of data mining tutorials and seminars for a decade to audiences of up to 400, including at DAMA, KDD, AAAI, and IEEE conferences. He is the instructor of well-regarded data mining courses, explaining concepts in language readily understood by a wide range of audiences, including analytics novices, data analysts, statisticians, and business professionals. Mr. Abbott has also taught both applied and hands-on data mining courses for major software vendors, including Clementine (SPSS, an IBM Company), Affinium Model (Unica Corporation), Statistica (StatSoft, Inc.), S-Plus and Insightful Miner (Insightful Corporation), Enterprise Miner (SAS), Tibco Spotfire Miner (Tibco), and CART (Salford Systems).

The Comic Water Games (aka Common Wealth Games)

We in Delhi, India are a tough people. With summer temperatures up to 46 degrees Celsius (114 degrees Fahrenheit) and winter temperatures down to 2-3 degrees Celsius (just above freezing), high pollution levels, and the worst traffic jams (and the highest per capita car ownership), there is very little that intimidates the average Delhiite.

But the return of the British Empire is scaring us, and it is called the Commonwealth Games. The Commonwealth is a group of countries that used to be colonized by Britain in her colonial days (the USA is not a member, though, as they probably kicked way too much British butt while gaining independence).

And every 4 years they have the Commonwealth Games (read: games for the non-US English-speaking world). So when our commie neighbors, the Chinese, went and got themselves an Olympics, we decided to get ourselves these CWG games too. Big deal, national pride, rising economic power and all that.

So far the Games have meant the following: lots of roads dug up, lots of stadiums in various stages of preparation, a total cost of 2 billion USD, and rampant allegations of corruption due to the tenfold increase in budget, including rather suspicious-looking documents procured by our local press (yes, the Indian press is free, as India is a democracy).

And add divine grace. Delhi is having its wettest monsoon since 1978; it has rained cats and dogs in September, and we now have a mini dengue-malaria epidemic. Four countries have declared the athletes' living quarters uninhabitable, some teams have walked out, and the inevitable terrorists injured two Taiwanese tourists this weekend (in a semi-ironic email they said they were as prepared as the government was; it isn't).

Today a bridge collapsed-

http://www.nytimes.com/2010/09/22/sports/22iht-GAMES.html?_r=1&hp

On Tuesday afternoon, a bridge next to Jawaharlal Nehru Stadium, the main Games venue, fell apart. The footbridge collapsed into three pieces, taking several workers with it and uprooting one side of the arch that supported it.

A police officer at the scene said that 27 people had been injured, four of them seriously, in the collapse.

“This will not affect the Games,” said Raj Kumar Chauhan, a Delhi minister for development, who spoke on the scene. “We can put the bridge up again, or make a new one.”

and

http://www.nytimes.com/2010/09/20/world/asia/20india.html?ref=sports

“We really need to learn how to plan,” said Vrinda Walavalkar, a public relations executive who is not connected to the Games.

“Maybe we feel we have so many lifetimes to achieve things” that it does not matter if it gets done this time, she said.

Mr. Gupta, the shopkeeper, found a metaphor in Hindu wedding tradition.

The groom’s party, known as the barat, traditionally marches to the bride’s house on horseback with his friends and family, he explained. When the barat appears, the bride has to come to the door, he said.

“If the bride is not ready, you patch her up and try to hide all her defects,” Mr. Gupta said, and then you send her outside.

————————————————————————————————————–

To some this may be shocking. To the average Delhiite battling traffic and rain, this is one more episode in the chaotic capital. As a small solace, Delhi still has the best and cheapest street food in this part of the world, with golgappas, tikki and chaat. If only you can beat the rain to get them!

Also see http://en.wikipedia.org/wiki/Delhi if you would like to know more.

JMP 9 releasing on Oct 12

JMP 9 releases on Oct 12. It is a very good, reliable data visualization and analytical tool (AND it is available on the Mac as well).

AND it is advertising R graphics as well (lol, I can visualize the look on the faces of some, ahem, SAS fans in the R Project).

Updated pricing- note I am not sure why they are charging US academics $495 when SAS On Demand is free for academics. Shouldn't JMP be free for students? Maybe John Sall and his people can do a tradeoff analysis for this, given JMP's graphics are better than Base SAS (which is under some pressure from WPS and R).

http://www.sas.com/govedu/edu/programs/soda-account-setup.html

and http://www.enterpriseinnovation.net/content/sas-delivers-free-data-management-and-analytics-solutions-academe

*Offer good in the U.S. only.

OFFER PRICING DETAILS

New Corporate Customer: $1,595 (save $300). No special requirements.

Corporate Upgrade: $795 (save $155). Complete the form below or call 1-877-594-6567. Requires a valid JMP® 8 serial number.

New Academic: $495 (save $100). Complete the form below or call 1-877-594-6567. Requires a campus street address and campus e-mail address.

Academic Upgrade: $250 (save $45). Complete the form below or call 1-877-594-6567. Requires a campus street address and campus e-mail address.

From the mailer:

Be First in Line for JMP® 9: Save up to $300 when you pre-order a single-user license by Oct. 11.


Make JMP your analytic hub for visual data discovery with this special offer, good through Oct. 11, 2010. Pre-order a single-user license of JMP 9 – for a discount of up to $300 – and get ready for a leap in data interactivity.

Order now and enjoy the compelling new features of JMP 9 when the software is released Oct. 12. New capabilities in JMP 9 let you:

  • Optimize and simulate using your Microsoft Excel spreadsheets.
  • Use maps to find patterns in your geographic data.
  • Enjoy the updated look and flexibility of JMP 9 on Microsoft Windows.
  • Create and share custom add-ins that extend JMP.
  • Leverage an expanded array of advanced statistical methodologies.
  • Display analytic results from R using interactive graphics.


What if I already have a JMP 8 single-user license?
Great news! You can upgrade to JMP 9 for less than half the regular price.

What if I’m an annual license customer?
Don’t worry, we’ve got you covered. Annual license customers enjoy priority access to all the latest JMP releases as soon as they become available. JMP 9 will be shipped to you automatically.

What if I work or study in the academic world?
Call 1-877-594-6567 to learn about significant discounts for students and professors through the JMP Academic Program.

Please feel free to forward this offer to interested colleagues.


Got two or more users?
A JMP® annual license is the way to go. Call for details.
1-877-594-6567

Remember: Act by Oct. 11!

JMP runs on Macintosh and Windows

Making NeW R

Tal G, in his excellent blog piece, talks of "Why R Developers should not be paid": http://www.r-statistics.com/2010/09/open-source-and-money-why-r-developers-shouldnt-be-paid/

His argument of love is not very original, though; it was first made by these four guys.

I am going to argue that "some" R developers should be paid, while the main focus should remain volunteer code. These R developers should be paid as per the usage of their packages.

Let me expand.

Imagine the following conversation between Ross Ihaka, Norman Nie and Peter Dalgaard.

Norman- Hey guys, can you give me some code? I got this new startup.

Ross Ihaka and Peter Dalgaard- Sure, dude. Here are 100,000 lines of code, 2,000 packages and two decades of effort.

Norman- Thanks guys.

Ross Ihaka- Hey, what are you gonna do with this code?

Norman- I will better it. Sell it. Finally beat Jim Goodnight and his **** Proc GLM and **** Proc Reg.

Ross- Okay, but what will you give us? Will you give us back some of the code you improve?

Norman – Uh, let me explain this open core …

Peter D- Well how about some royalty?

Norman- Sure, we will throw parties at all the conferences, and snacks, you know, at user groups.

Ross- Hmm. That does not sound fair. (Walks away in a huff, muttering)- He takes our code, sells it, and won't share the code.

Peter D- Doesn't sound fair. I am back to reading Hamlet, the great Dane, and writing the next edition of my book. I am glad I wrote a book- Ross didn't even write that.

Norman- Uh oh. (Picks up his phone)- Hey David Smith, we need to write some blog articles pronto- these open source guys, man…

I think that sums up what has been going on in the dynamics of R recently. If Ross Ihaka and R Gentleman had adopted an open core strategy, meaning you could create packages for R but not share the original source, where would we all be?

At this point, if he is reading this, David Smith, long-suffering veteran of open source flameouts, is rolling his eyes, while Tal G is wondering if he will publish this on R-Bloggers, and if so, when.

Let's bring in another R veteran: Hadley Wickham, who wrote a book on R and also created ggplot, the best-quality and most often used graphics package.

In terms of economic utility to the end user, the ggplot package may be as useful as, if not more useful than, the foreach package developed by Revolution Computing/Analytics.
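To make the comparison concrete, here is a minimal sketch of each package in action (both are on CRAN; the dataset and numbers are placeholders I picked):

# ggplot2: declarative graphics in a single line
library(ggplot2)
qplot(wt, mpg, data = mtcars, colour = factor(cyl))

# foreach: a looping construct that runs sequentially with %do%,
# or in parallel with %dopar% once a backend is registered
library(foreach)
squares <- foreach(i = 1:5, .combine = c) %do% i^2
print(squares)  # 1 4 9 16 25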

Now http://cran.r-project.org/web/packages/foreach/index.html says that foreach is licensed under http://www.apache.org/licenses/LICENSE-2.0

However, let's come to open core licensing (read about it here: http://alampitt.typepad.com/lampitt_or_leave_it/2008/08/open-core-licen.html ), which is where the debate is. Revolution takes the code and enhances it (in my opinion) substantially, with the new XDF format for better efficiency, a web services API, and, coming next year, a GUI (thanks in advance, Dr. Nie and guys),

and sells this advanced R code to businesses happy to pay (they are currently paying much more to Dr. Goodnight and HIS guys).

Why would any sane customer buy it from Revolution if he could download exactly the same thing from http://r-project.org ?

Hence the business need for Revolution Analytics to have an enhanced R: they are using a product-based software model, not a software-as-a-service model.

If Revolution gives away the source code of these enhancements to the R core team, how will Revolution protect the above-mentioned intellectual property, given that the R core team has two decades of experience of giving away free code, and of trading code back and forth?

Now Revolution also has a marketing budget, and that's how they sponsor some R Core events, conferences, and after-conference snacks.

How would people decide if they are being too generous or too stingy in their contribution (compared to the formidable generosity of SAS Institute to its employees, stakeholders, and even third-party analysts)?

Would it not be better if Revolution shifted that aspect of the relationship from its marketing budget to its research and development budget, and came up with some sort of incentive for "SOME" developers? Even researchers need grants, assistantships, and scholarships. Make a transparent royalty formula: say 17.5% of NEW R sales goes to an R PACKAGE Developers pool, which in turn examines usage rates of packages and need/merit before allocation. That would require Revolution to evolve from a startup into a more sophisticated corporation, and R Core could use this the same way as the John M Chambers software award/scholarship.
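To illustrate the proposed formula, here is a toy sketch in R; every number below, from the sales figure to the package usage counts, is made up purely for illustration:

# Toy sketch: split 17.5% of hypothetical NEW R sales among package
# developers in proportion to (entirely made-up) usage counts
sales <- 1e6                   # hypothetical annual NEW R sales in USD
pool  <- 0.175 * sales         # the proposed 17.5% developer pool

usage   <- c(ggplot2 = 52000, Hmisc = 31000, foreach = 17000)  # made-up counts
royalty <- pool * usage / sum(usage)
print(round(royalty, 2))       # each package's share of the pool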

Don't pay all developers; it would be an insult for many of them, say Prof. Harrell, creator of Hmisc, to accept. But Revolution can expand its dev base (and prospect for future employees) by even sponsoring some R scholarships.

And I am sure that if Revolution opens up some more code to the community, they would find the rest of the world and its help useful. If it can't trust people like R Gentleman with some source code, well, he is a board member.

——————————————————————————————–

Now, to sum up some technical discussions on NeW R:

1) An accepted way of benchmarking efficiencies.

2) Code review and incorporation of efficiencies.

3) Multi-threading and multi-core usage are trends to be incorporated (see the sketch after this list).

4) GUIs like R Commander (with its extension plugins for other packages), Rattle for data mining, or Deducer need focused development. This may involve hiring user interface designers (like from Apple 😉) who will work for love AND money (even the Beatles charge royalties for that song).

5) More support for cloud computing initiatives like Biocep and Elastic-R, or Amazon AMIs for using cloud computers. Note that efficiency arguments don't matter much if you just use a Chrome browser and pay 2 cents an hour for an Amazon instance. Probably R Core needs more direct involvement from Google (cloud OS makers) and Amazon, as well as even Salesforce.com (for creating Force.com apps). Note that even more corporates need to be involved here, as cloud computing does not have any free and open source infrastructure (YET).
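Touching on points 1 and 3 above, here is a minimal sketch of benchmarking serial against multi-core execution using the multicore package (a CRAN package whose mclapply forks work across cores on Linux/Mac; the timings in the comments are what one would expect, not measured results):

# Compare serial and multi-core execution of a deliberately slow function
library(multicore)

slow_square <- function(x) { Sys.sleep(1); x^2 }

system.time(lapply(1:8, slow_square))                  # serial: roughly 8 seconds
system.time(mclapply(1:8, slow_square, mc.cores = 4))  # roughly 2 seconds on 4 cores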

_______________________________________________________

Debates will come and go. This is an interesting intellectual debate, and someday the little guys will win the Revolution.

From Hugh M. of Gaping Void:

http://www.gapingvoid.com/Moveable_Type/archives/cat_microsoft_blue_monster_series.html

HOW DOES A SOFTWARE COMPANY MAKE MONEY, IF ALL SOFTWARE IS FREE?

“If something goes wrong with Microsoft, I can phone Microsoft up and have it fixed. With Open Source, I have to rely on the community.”

And the community, as much as we may love it, is unpredictable. It might care about your problem and want to fix it, then again, it may not. Anyone who has ever witnessed something online go “viral”, good or bad, will know what I’m talking about.

and especially-

http://gapingvoid.com/2007/04/16/how-well-does-open-source-currently-meet-the-needs-of-shareholders-and-ceos/

Source: http://gapingvoidgallery.com/

Kind of sums up what the open core licensing debate is all about.