Keynote Session by James Kobielus,
Senior Analyst at Forrester Research, Inc. and author
of “The Forrester Wave™: Predictive Analytics & Data Mining Solutions, Q1 2010” report
11:30-12:05 PM
Customer Case Study: The European Commission (Government)
12:05-12:50 PM
General Session: Teradata Advanced Analytics
12:50-02:00 PM
Lunch Break & Exhibition
02:00-02:35 PM
Customer Case Study: Virgin Media (Communications)
02:35-03:05 PM
General Session: Sponsor Presentation
03:05-03:40 PM
Coffee Break & Exhibition
03:40-04:40 PM
General Session: The Factory Approach to Compete on Analytics
04:40-05:25 PM
Customer Case Study: Insurance
05:30-06:30 PM
Cocktail & Exhibition
07:30 PM-12:00 AM
Gala Dinner
FRIDAY, OCTOBER 29, 2010
08:30-09:00 AM
Registration & Breakfast
09:00-10:00 AM
Keynote Presentation: The CTO Talk
10:00-10:30 AM
Customer Case Study: MonotaRO (Japan – Retail)
10:30-10:55 AM
Coffee Break & Exhibition
10:55-11:30 AM
General Session: Sponsor Presentation
11:30-12:05 PM
Customer Case Study: Aviva (Poland – Insurance)
12:05-01:00 PM
Lunch Break & Exhibition
01:00-01:45 PM
General Session: How Social Network Analysis Can Boost your Marketing Performance
01:45-02:20 PM
Customer Case Study: Financial Services
02:20-02:45 PM
Closing Remarks by John Ball, CEO, KXEN
02:45-03:00 PM
Coffee Break & Exhibition
Optional: Technical Training (Complimentary to all Attendees)
02:45-04:00 PM
Hands-On Training #1: Getting Started with KXEN Analytical Data Management (ADM)
04:00-04:15 PM
Coffee Break
04:15-05:30 PM
Hands-On Training #2: Getting Started with KXEN Modeling Factory (KMF)
From the good folks at Aster Data, a webcast on a slightly interesting analytics topic:
Enterprises and government agencies can become overwhelmed with information. The value of all that data lies in the insights it can reveal. To get the maximum value, you need an analytic platform that lets you analyze terabytes of information rapidly for immediate actionable insights.
Aster Data’s massively parallel database with an integrated analytics engine can quickly reveal hard-to-recognize trends in huge datasets that other systems miss. The secret? A patent-pending SQL-MapReduce framework that enables business analysts and business intelligence (BI) tools to iteratively analyze big data more quickly, letting you find anomalies sooner and stop disasters before they happen.
Discover how you can improve:
Network intelligence via graph analysis to understand connectivity among suspects, information propagation, and the flow of goods
Security analysis to prevent fraud, bot attacks, and other breaches
Geospatial analytics to quickly uncover details about regions and the communities within them
Visual analytics to derive deeper insights more quickly
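To make the MapReduce part of that pitch concrete, here is a toy sketch in Python (not Aster's actual SQL-MapReduce syntax, and with made-up field names) of the map/reduce pattern applied to a security-style question: counting events per source IP and flagging unusually chatty senders.

```python
from collections import defaultdict

# Toy log records; in Aster these rows would live in a database table
# (the field names here are hypothetical)
log_rows = [
    {"src_ip": "10.0.0.1", "bytes": 120},
    {"src_ip": "10.0.0.2", "bytes": 80},
    {"src_ip": "10.0.0.1", "bytes": 95},
    {"src_ip": "10.0.0.1", "bytes": 110},
]

def map_phase(rows):
    # Map: emit one (key, value) pair per record
    for row in rows:
        yield row["src_ip"], 1

def reduce_phase(pairs):
    # Reduce: sum the emitted values for each key
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

counts = reduce_phase(map_phase(log_rows))
# Flag sources with unusually many events (threshold chosen arbitrarily)
anomalies = [ip for ip, n in counts.items() if n >= 3]
print(counts)     # {'10.0.0.1': 3, '10.0.0.2': 1}
print(anomalies)  # ['10.0.0.1']
```

The point of pushing this pattern into SQL, as Aster does, is that the map and reduce steps run in parallel next to the data instead of shipping terabytes to a client.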
Revolution News
Every month, we’ll bring you the latest news about Revolution’s products and events in this section. Follow us on Twitter at @RevolutionR for up-to-the-minute news and updates from Revolution Analytics!
Revolution R Enterprise 4.0 for Windows now available. Based on the latest R 2.11.1 and including the RevoScaleR package for big-data analysis in R, Revolution R Enterprise is now available for download for Windows 32-bit and 64-bit systems. Click here to subscribe; it is available free to academia.
New! Integrate R with web applications, BI dashboards and more with web services. RevoDeployR is a new Web Services framework that integrates dynamic R-based computations into applications for business users. It will be available September 30 with Revolution R Enterprise Server on RHEL 5. Click here to learn more.
Inside-R: A new site for the R Community. At www.inside-R.org you’ll find the latest information about R from around the Web, searchable R documentation and packages, hints and tips about R, and more. You can even add a “Download R” badge to your own web-page to help spread the word about R.
R News, Tips and Tricks from the Revolutions blog
The Revolutions blog brings you daily news and tips about R, statistics and open source. Here are some highlights from Revolutions from the past month.
R’s key role in the oil spill response: Read how NIST’s Division Chief of Statistical Engineering used R to provide critical analysis in real time to the Secretaries of Energy and the Interior, and helped coordinate the government’s response.
Animating data with R and Google Earth: Learn how to use R to create animated visualizations of geographical data with Google Earth, such as this video showing how tuna migrations intersect with the location of the Gulf oil spill.
Are baseball games getting longer? Or is it just Red Sox games? Ryan Elmore uses nonparametric regression in R to find out.
Keynote presentations from useR! 2010: the worldwide R users’ conference was a great success, and there’s a wealth of useful tips and information in the presentations. Videos of the keynote presentations are available too: check out in particular Frank Harrell’s talk Information Allergy, and Friedrich Leisch’s talk on reproducible statistical research.
Looking for more R tips and tricks? Check out the monthly round-ups at the Revolutions blog.
Upcoming Events
Every month, we’ll highlight some upcoming events from the R Community Calendar.
September 23: The San Diego R User Group has a meetup on BioConductor and microarray data analysis.
September 28: The Sydney Users of R Forum has a meetup on building world-class predictive models in R (with dinner to follow).
September 28: The Los Angeles R User Group presents an introduction to statistical finance with R.
September 28: The Seattle R User Group meets to discuss, “What are you doing with R?”
The argument that developers should work for love alone is not very original, though: it was first made by these four guys.
I am going to argue that “some” R developers should be paid, while the main focus should remain volunteer code. These R developers should be paid in proportion to the usage of their packages.
Let me expand.
Imagine the following conversation between Ross Ihaka, Norman Nie and Peter Dalgaard.
Norman- Hey Guys, Can you give me some code- I got this new startup.
Ross Ihaka and Peter Dalgaard- Sure dude. Here is 100,000 lines of code, 2000 packages and 2 decades of effort.
Norman- Thanks guys.
Ross Ihaka- Hey, what are you gonna do with this code?
Norman- I will better it. Sell it. Finally beat Jim Goodnight and his **** Proc GLM and **** Proc Reg.
Ross- Okay, but what will you give us? Will you give us some code back of what you improve?
Norman – Uh, let me explain this open core …
Peter D- Well how about some royalty?
Norman- Sure, we will throw parties at all conferences, snacks you know at user groups.
Ross – Hmm. That does not sound fair. (walks away in a huff, muttering) He takes our code, sells it, and won’t share the code.
Peter D- Doesn’t sound fair. I am back to reading Hamlet, the great Dane, and writing the next edition of my book. I am glad I wrote a book; Ross didn’t even write that.
Norman- Uh oh. (picks up his phone) Hey David Smith, we need to write some blog articles pronto. These open source guys, man…
———–
I think that sums up what has been going on in the dynamics of R recently. If Ross Ihaka and R. Gentleman had adopted an open-core strategy, meaning you can create packages for R but not share the original source, where would we all be?
At this point, if he is reading this, David Smith, long-suffering veteran of open source flameouts, is rolling his eyes, while Tal G. is wondering whether he will publish this on R-Bloggers, and if so, when.
Let’s bring in another R veteran: Hadley Wickham, who wrote a book on R and also created ggplot, the best-quality and most often used graphics package.
In terms of economic utility to the end user, the ggplot package may be as useful as, if not more useful than, the foreach package developed by Revolution Computing/Analytics.
However, let’s come to open-core licensing (read about it here: http://alampitt.typepad.com/lampitt_or_leave_it/2008/08/open-core-licen.html), which is where the debate is. Revolution takes the code and enhances it, in my opinion substantially, with the new XDF format for better efficiency, a web services API, and, coming next year, a GUI (thanks in advance, Dr. Nie and guys), and sells this advanced R code to businesses happy to pay (they are currently paying much more to Dr. Goodnight and HIS guys).
Why would any sane customer buy it from Revolution if he could download exactly the same thing from http://r-project.org?
Hence the business need for Revolution Analytics to have an enhanced R: they are using a product-based software model, not a software-as-a-service model.
If Revolution gives away the source code of these enhancements to the R Core team, how will it protect the above-mentioned intellectual property? The R Core team, after all, has two decades of experience in giving code away for free.
Now, Revolution also has a marketing budget, and that’s how it sponsors some R Core events, conferences, and after-conference snacks.
How would people decide whether it is being too generous or too stingy in its contributions (compared with the formidable generosity of SAS Institute to its employees, stakeholders, and even third-party analysts)?
Would it not be better if Revolution shifted that aspect of the relationship from its marketing budget to its research and development budget, and came up with some sort of incentive for “SOME” developers? Even researchers need grants, assistantships, and scholarships. Revolution could set a transparent royalty formula, say 17.5% of NEW R sales going to an R package developers’ pool, which would in turn weigh package usage rates and need/merit before allocating funds. That would require Revolution to evolve from a startup into a more sophisticated corporation, and R Core could use the money the same way as the John M. Chambers software award/scholarship.
Don’t pay all developers; it would be an insult for many of them, say Prof. Harrell, creator of Hmisc, to accept. But Revolution can expand its developer base (and prospect for future employees) by sponsoring some R scholarships.
And I am sure that if Revolution opens up some more code to the community, it would find the rest of the world and its help useful. If it can’t trust people like R. Gentleman with some source code, well, he is a board member.
——————————————————————————————–
Now, to sum up some technical discussions on the new R:
1) An accepted way of benchmarking efficiencies.
2) Code review and incorporation of efficiencies.
3) Multi-threading and multi-core usage are trends to be incorporated.
4) GUIs like R Commander (and its plug-ins for other packages), Rattle for data mining, and Deducer need focused development. This may involve hiring user interface designers (like from Apple 😉) who will work for love AND money (even the Beatles charge royalty for that song).
5) More support for cloud computing initiatives like Biocep and Elastic-R, or Amazon AMIs for using cloud computers. Note that efficiency arguments don’t matter if you just use a Chrome browser and pay 2 cents an hour for an Amazon instance. R Core probably needs more direct involvement from Google (cloud OS makers) and Amazon, as well as even Salesforce.com (for creating Force.com apps). Even more corporates need to be involved here, as cloud computing does not have any free and open source infrastructure (YET).
“If something goes wrong with Microsoft, I can phone Microsoft up and have it fixed. With Open Source, I have to rely on the community.”
And the community, as much as we may love it, is unpredictable. It might care about your problem and want to fix it, then again, it may not. Anyone who has ever witnessed something online go “viral”, good or bad, will know what I’m talking about.
John Ball brings 20 years of experience in enterprise software, deep expertise in business intelligence and CRM applications, and a proven track record of success driving rapid growth at highly innovative companies.
Prior to joining KXEN, Mr. Ball served in several executive roles at salesforce.com, the leading provider of SaaS applications. Most recently, John served as VP & General Manager, Analytics and Reporting Products, where he spearheaded salesforce.com’s foray into CRM analytics and business intelligence. John also served as VP & General Manager, Service and Support Applications at salesforce.com, where he successfully grew the business to become the second largest and fastest growing product line at salesforce.com. Before salesforce.com, Ball was founder and CEO of Netonomy, the leading provider of customer self-service solutions for the telecommunications industry. Ball also held a number of executive roles at Business Objects, including General Manager, Web Products, where he delivered to market the first 3 versions of WebIntelligence. Ball has a master’s degree in electrical engineering from Georgia Tech and a master’s degree in electric
I hope John at least helps build a KXEN Force.com application; there are only 2 data mining apps on the AppExchange. Also on the wish list: more social media presence, a web SaaS/Amazon API for KXEN, greater presence at American and Asian conferences, and a solution for SMEs (which cannot afford the premium pricing of the flagship solution). An alliance with bigger BI vendors like Oracle, SAP or IBM for selling the great social network analysis would also help.
Bill Russell as Non-executive Chairman-
Russell joins as Non-executive Chairman of the Board, effective July 16, 2010. He has 30 years of operational experience in enterprise software, with a special focus on business intelligence, analytics, and databases. Russell held a number of senior-level positions in his more than 20 years at Hewlett-Packard, including Vice President and General Manager of the multi-billion dollar Enterprise Systems Group. He has served as Non-executive Chairman of the Board for Sylantro Systems Corporation, webMethods Inc., and Network Physics, Inc., and has served as a board director for Cognos Inc. In addition to KXEN, Russell currently serves on the boards of Saba, PROS Holdings Inc., Global 360, ParAccel Inc., and B.T. Mancini Company.
Xavier Haffreingue as senior vice president, worldwide professional services and solutions.
He has almost 20 years of international enterprise software experience gained in the CRM, BI, Web and database sectors. Haffreingue joins KXEN from software provider Axway where he was VP global support operations. Prior to Axway, he held various leadership roles in the software industry, including VP self service solutions at Comverse Technologies and VP professional services and support at Netonomy, where he successfully delivered multi-million dollar projects across Europe, Asia-Pacific and Africa. Before that he was with Business Objects and Sybase, where he ran support and services in southern Europe managing over 2,500 customers in more than 20 countries.
David Guercio as senior vice president, Americas field operations. Guercio brings to the role more than 25 years experience of building and managing high-achieving sales teams in the data mining, business intelligence and CRM markets. Guercio comes to KXEN from product lifecycle management vendor Centric Software, where he was EVP sales and client services. Prior to Centric, he was SVP worldwide sales and client services at Inxight Software, where he was also Chairman and CEO of the company’s Federal Systems Group, a subsidiary of Inxight that saw success in the US Federal Government intelligence market. The success in sales growth and penetration into the federal government led to the acquisition of Inxight by Business Objects in 2007, where Guercio then led the Inxight sales organization until Business Objects was acquired by SAP. Guercio was also a key member of the management team and a co-founder at Neovista, an early pioneer in data mining and predictive analytics. Additionally, he held the positions of director of sales and VP of professional services at Metaphor Computer Systems, one of the first data extraction solutions companies, which was acquired by IBM. During his career, Guercio also held executive positions at Resonate and SiGen.
3) Venture Capital funding to fund expansion-
KXEN has closed $8 million in Series D funding to further accelerate its growth and international expansion. The round was led by NextStage and included participation from existing investors XAnge Capital, Sofinnova Ventures, Saints Capital and Motorola Ventures.
This was done after John Ball had joined as CEO.
4) Continued kudos from analysts and customers for its technical excellence.
KXEN was named a leader in predictive analytics and data mining by Forrester Research (1) and was rated highest for commercial deployments of social network analytics by Frost & Sullivan (2)
It also became an alliance partner of Accenture, which is itself a prominent SAS partner.
In-Database Optimization-
In KXEN V5.1, a new data manipulation module (ADM) is provided in conjunction with scoring to optimize database workloads and provide full in-database model deployment. Some leading data mining vendors are only now beginning to offer this kind of functionality, and then only with one or two selected databases, giving KXEN a more than five-year head start. Other vendors offer only generic SQL generation, not optimized for each database, and do not provide the wealth of possible outputs for their scoring equations: real operational applications require not only scores but also decision probabilities, error bars, and individual input contributions (used to derive the reasons behind a decision), all of which are available in KXEN’s in-database scoring modules.
Since 2005, KXEN has leveraged databases as the data manipulation engine for analytical dataset generation. In 2008, the ADM (Analytical Data Management) module delivered a major enhancement by providing a very easy-to-use data manipulation environment with unmatched productivity and efficiency. ADM works as a generator of optimized, database-specific SQL code and comes with an integrated layer for managing analytics metadata.
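To illustrate what “a generator of optimized database-specific SQL” means in the simplest possible terms, here is a hypothetical Python sketch that turns a fitted linear model’s coefficients into a scoring SQL statement. The table and column names are invented, and KXEN’s real ADM generator is far richer (per-database dialects, probabilities, error bars, input contributions).

```python
# Hypothetical sketch of in-database scoring via generated SQL: the model's
# coefficients become an arithmetic expression the database evaluates itself,
# so no data ever leaves the warehouse. All names below are made up.
def scoring_sql(table, intercept, coefficients):
    # One "+coef * column" or "-coef * column" term per model input
    terms = [f"{coef:+g} * {col}" for col, coef in coefficients.items()]
    expression = f"{intercept:g} " + " ".join(terms)
    return f"SELECT customer_id, {expression} AS score FROM {table};"

sql = scoring_sql("customers", 0.5, {"age": 0.02, "tenure_months": -0.01})
print(sql)
# SELECT customer_id, 0.5 +0.02 * age -0.01 * tenure_months AS score FROM customers;
```

The optimization the text mentions is exactly what such a generator can do and a generic one cannot: emit the dialect, functions, and query shape that each specific database executes fastest.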
KXEN Modeling Factory (KMF) has been designed to automate the development and maintenance of predictive analytics-intensive systems, especially systems that include large numbers of models, vast amounts of data or require frequent model refreshes. Information about each project and model is monitored and disseminated to ensure complete management and oversight and to facilitate continual improvement in business performance.
Main Functions
Schedule: creation of the Analytic Data Set (ADS); setup of how and when to score; setup of when and how to perform model retraining and refreshes…
Report: monitor model execution over time, track changes in model quality over time, and see how useful a variable is by considering its multiple instances across models…
Notification: rather than having to wade through pages of event logs, KMF lets users manage by exception through notifications.
That’s all for the KXEN update. All the best to the new management team, and a splendid job done by Roger Haddad in creating what is France’s (and Europe’s) best-known data mining company.
Here is a great new tool for techies to start creating Android apps right away, even with no knowledge of the platform. Of course, there are already a great number of apps, including my favorite Android data mining app in R, AnalyticDroid: http://analyticdroid.togaware.com/
Basically, it calls the Rattle (R Analytical Tool To Learn Easily) data mining GUI, enabling data mining from an Android mobile using remote computing.
I don’t know if any other statistical application is available on Android mobiles, though SAS did have a presentation on using SAS on the iPhone.
Because App Inventor provides access to a GPS-location sensor, you can build apps that know where you are. You can build an app to help you remember where you parked your car, an app that shows the location of your friends or colleagues at a concert or conference, or your own custom tour app of your school, workplace, or a museum.
You can write apps that use the phone features of an Android phone. You can write an app that periodically texts “missing you” to your loved ones, or an app “No Text While Driving” that responds to all texts automatically with “sorry, I’m driving and will contact you later”. You can even have the app read the incoming texts aloud to you (though this might lure you into responding).
App Inventor provides a way for you to communicate with the web. If you know how to write web apps, you can use App Inventor to write Android apps that talk to your favorite web sites, such as Amazon and Twitter.
Here is a not-so-statistical Android app I am trying to create, called Hang-Out.
It uses the current GPS location of your phone to find the nearest pub, movie or diner, and to catch a bus or train based on your city, the GPS fix and time of the request, and the schedule of that city’s public transport. Very much WIP.
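The nearest-venue part of that idea can be sketched with a plain haversine (great-circle) distance calculation. The venue names and coordinates below are made up; a real app would of course pull the fix from the phone’s GPS sensor and the venues from a database.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance in kilometres between two (lat, lon) points
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_venue(lat, lon, venues):
    # venues: list of (name, lat, lon) tuples; pick the closest one
    return min(venues, key=lambda v: haversine_km(lat, lon, v[1], v[2]))

# Made-up venues for illustration
venues = [
    ("Pub on 5th", 28.6139, 77.2090),
    ("Diner at Central", 28.7041, 77.1025),
]
print(nearest_venue(28.61, 77.21, venues)[0])  # Pub on 5th
```

A linear scan like this is fine for a handful of venues; a city-scale venue list would want a spatial index instead.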