Interview: Kelci Miclaus, SAS Institute, on Using #rstats with JMP

Here is an interview with Kelci Miclaus, a research statistician developer with the JMP division of SAS Institute, in which she describes how the R programming language has become a great hit with JMP customers who value flexibility.

 

Ajay- How has JMP been using integration with R? What has been the feedback from customers so far? Is there a single case study you can point out where the combination of JMP and R was better than any one of them alone?

Kelci- Feedback from customers has been very positive. Some customers are using JMP to foster collaboration between SAS and R modelers within their organizations. Many are using JMP’s interactive visualization to complement their use of R. Many SAS and JMP users are using JMP’s integration with R to experiment with more bleeding-edge methods not yet available in commercial software. It can be used simply to smooth the transition with regard to sending data between the two tools, or used to build complete custom applications that take advantage of both JMP and R.

One customer has been using JMP and R together for Bayesian analysis. He uses R to create MCMC chains and has found that JMP is a great tool for preparing the data for analysis, as well as displaying the results of the MCMC simulation. For example, the Control Chart platform and the Bubble Plot platform in JMP can be used to quickly verify convergence of the algorithm. The use of both tools together can increase productivity since the results of an analysis can be achieved faster than through scripting and static graphics alone.
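As a rough illustration of that workflow (not the customer's actual code), here is a minimal R sketch that runs a random-walk Metropolis sampler and writes the chain to a CSV file, which could then be opened in JMP and checked for convergence with the Control Chart or Bubble Plot platforms.

```r
# Minimal random-walk Metropolis sampler for the mean of a normal model (illustrative only)
set.seed(42)
y <- rnorm(100, mean = 2, sd = 1)        # observed data
log_post <- function(mu) sum(dnorm(y, mu, 1, log = TRUE)) + dnorm(mu, 0, 10, log = TRUE)

n_iter <- 5000
chain  <- numeric(n_iter)
mu     <- 0                              # starting value
for (i in seq_len(n_iter)) {
  prop <- mu + rnorm(1, sd = 0.3)        # propose a nearby value
  if (log(runif(1)) < log_post(prop) - log_post(mu)) mu <- prop
  chain[i] <- mu
}

# Write the chain out so JMP can read it for convergence diagnostics
write.csv(data.frame(iteration = seq_len(n_iter), mu = chain),
          "mcmc_chain.csv", row.names = FALSE)
```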

I, along with a few other JMP developers, have written applications that use JMP scripting to call out to R packages and perform analyses like multidimensional scaling, bootstrapping, support vector machines, and modern variable selection methods. These really show the benefit of interactive visual analysis coupled with modern statistical algorithms. We’ve packaged these scripts as JMP add-ins and made them freely available on our JMP User Community file exchange. Customers can download them and now employ these methods as they would a regular JMP platform. We hope that our customers familiar with scripting will also begin to contribute their own add-ins so a wider audience can take advantage of these new tools.

(see http://www.decisionstats.com/jmp-and-r-rstats/)
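As an example of the kind of computation such an add-in delegates to R, here is a minimal sketch of classical multidimensional scaling using base R's cmdscale. It is illustrative only, not the add-in's actual code; mtcars stands in for whatever table JMP would send over.

```r
# Classical multidimensional scaling on a distance matrix (illustrative only)
X <- scale(as.matrix(mtcars))   # any numeric table JMP might hand to R
D <- dist(X)                    # pairwise Euclidean distances
coords <- cmdscale(D, k = 2)    # 2-D configuration preserving those distances

# These two coordinate columns would be returned to JMP for interactive plotting
head(coords)
```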

Ajay- Are there plans to extend JMP integration with other languages like Python?

Kelci- We do have plans to integrate with other languages and are considering integrating with more based on customer requests. Python has certainly come up and we are looking into possibilities there.

Ajay- How is R a complementary fit to JMP’s technical capabilities?

Kelci- R has an incredible breadth of capabilities. JMP has extensive interactive, dynamic visualization intrinsic to its largely visual analysis paradigm, in addition to a strong core of statistical platforms. Since our brains are designed to visually process pictures and animated graphs more efficiently than numbers and text, this environment is all about supporting faster discovery. Of course, JMP also has a scripting language (JSL) that lets you incorporate SAS code and R code, and build analytical applications that leverage SAS, R and other tools for users who don’t code or who don’t want to code.

JSL is a powerful scripting language on its own. It can be used for dialog creation, automation of JMP statistical platforms, and custom graphic scripting. In other ways, JSL is very similar to the R language: it can also be used for data and matrix manipulation and to create new analysis functions. With the scripting capabilities of JMP, you can create custom applications that provide both a user interface and an interactive visual back-end to R functionality. Alternatively, you could create a dashboard using statistical and/or graphical platforms in JMP to explore the data and, with the click of a button, send a portion of the data to R for further analysis.
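To picture the R side of such a round trip, here is a minimal sketch of the sort of function a JMP dashboard might call on the rows it sends over. The JSL calling code is omitted, and the column names (y, x1, x2) are hypothetical.

```r
# Hypothetical function an R-enabled JMP dashboard might call on a subset of rows
fit_and_score <- function(dat) {
  # dat is assumed to arrive from JMP as a data.frame with columns y, x1, x2
  model <- lm(y ~ x1 + x2, data = dat)
  data.frame(dat, fitted = fitted(model), residual = residuals(model))
}

# Simulated data standing in for the rows sent from JMP
dat <- data.frame(x1 = rnorm(50), x2 = rnorm(50))
dat$y <- 1 + 2 * dat$x1 - dat$x2 + rnorm(50)
scored <- fit_and_score(dat)   # results could be pulled back into JMP for visualization
```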

Another JMP feature that complements R is the add-in architecture, which is similar to how R packages work. If you’ve written a cool script or analysis workflow, you can package it into a JMP add-in file and send it to your colleagues so they can easily use it.

Ajay- What is the official view on R from your organization? Do you think it is a threat, a complementary product, or another statistical platform that coexists with your offerings?

Kelci- Most definitely, we view R as complementary. R contributors are providing a tremendous service to practitioners, allowing them to try a wide variety of methods in the pursuit of more insight and better results. The R community as a whole is playing a valued role in the greater analytical community by focusing attention on newer methods that hold the most promise in so many application areas. Data analysts should be encouraged to use the tools available to them in order to drive discovery, and JMP can help with that by providing an analytic hub that supports both SAS and R integration.

Ajay- While you do use R, are there any plans to give back something to the R community in terms of your involvement and participation (say at useR events) or sponsoring contests?

Kelci- We are certainly open to participating in useR groups. At Predictive Analytics World in NY last October, they didn’t have a local useR group, but they did have a Predictive Analytics Meet-up group made up of many R users. We were happy to sponsor this. Some of us within the JMP division have joined local R user groups, myself included. Given that some local R user groups have entertained topics like Excel and R, Python and R, and databases and R, we would be happy to participate more fully here. I also hope to attend the useR! annual meeting later this year to gain more insight on how we can continue to provide tools that help both the JMP and R communities with their work.

We are also exploring options to sponsor contests and would invite participants to use their favorite tools, languages, etc. in pursuit of the best model. Statistics is about learning from data and this is how we make the world a better place.

About- Kelci Miclaus

Kelci is a research statistician developer for JMP Life Sciences at SAS Institute. She has a PhD in Statistics from North Carolina State University and has been using SAS products and R for several years. In addition to research interests in statistical genetics, clinical trials analysis, and multivariate analysis/visualization methods, Kelci works extensively with JMP, SAS, and R integration.


CrowdANALYTIX

Here is a contest-based community called CrowdANALYTIX.com, which is quite nice and offers you free Revolution R for the statistical and analytical contests hosted there (a bit like Kaggle.com, http://www.kaggle.com/). There are only 3 contests right now, and those are low volume, but I expect that number to increase. They also seem to have a consulting arm.

Latest analytics website, welcome! http://www.crowdanalytix.com/contests

Free and Open Source cannot get basic economics correct

[Image: Nutch robots, via Wikipedia]

Before you rev up those keyboards and shoot off a snarky comment, consider this statement: there are many ways to run (and ruin) economies, but no one has yet found a replacement for money. Yes, happiness is important. Search engines are good.

So unless someone starts a new branch of economics with a lot more motivational theory and psychology and a lot less quant, especially for open source projects, money (revenue, sales) is the only true measure of success in enterprise software, particularly if you have competitors who are making more money selling the same class of software.

Popularity contests are for high school quarterbacks. So even if your open source software is popular in downloads, email discussions, Stack Overflow or ... Continue reading “Free and Open Source cannot get basic economics correct”

Intel® Threading Challenge 2011 Software Contest

[Image: Logo of Intel, Jul 1968 - Dec 2005, via Wikipedia]

One more software contest for you, but in the sub-million-dollar prize range:

http://software.intel.com/en-us/contests/intel-threading-challenge-2011/contests.php

Intel® Threading Challenge 2011 – Win a Trip to Intel Developer Forum in San Francisco

Intel® Threading Challenge 2011 is going BIG this year! After three exciting threading competitions, our fourth Threading Challenge is stepping up the excitement with a BIG Grand Prize, a trip to the Intel Developer Forum (IDF) in San Francisco (September 13-15, 2011).

Since 2008, the Intel® Threading Challenge has attracted developers of varying experience from around the world. The active participation from the community has made the Threading Challenge not only a great programming competition, but a great way for community members to engage with each other, trade threading tips, and discover new parallel programming resources.

Last year’s format of two competition levels, Master and Apprentice, generated great excitement and opened the Threading Challenge to a new group of participants. So, we are going to continue the competition with a Master level and Apprentice level, each competing for the Grand Prize for their level, as well as individual problem awards. We know you love a great challenge and great prizes, so our Threading Challenge Team is putting together some exciting threading problems for you.

Monday, April 18, 2011 – Threading Challenge 2011 (Phase 1) launches (both levels) at 12:00 PM (noon PDT). The competition for 2011 is very similar to last year’s, but read on, whether you’re a previous participant or new to the Threading Challenge, so you’ll be aware of all elements of the competition and how to compete. Then you can start threading your way to prizes today!

Choose the right level for you!

 

Threading Challenge 2011:

• Two levels available for entry: Apprentice & Master
• Phase 1: 3 problems in each level
• Phase 2: Stay tuned for details, coming in Autumn 2011
• We will award 1st, 2nd & 3rd place prizes for each problem in each level
• No overlap of problems and each level’s problems will be offered consecutively
• Participants have the option to use the Intel® Manycore Testing Lab (MTL), consisting of 40 cores, 80 threads
• To enter the Threading Challenge 2011, please read the Official Rules and register for the competition using the link in the “To Enter” section.

The Threading Challenge will be implemented in two phases, with the 1st Phase consisting of 3 problems in each level. The details of the 2nd Phase will be announced in September 2011. For Phase 1, a new problem in each level will be launched on the days listed below at 12:00 noon (PDT) and will be open for entry for 22 days (inclusive of the problem starting day), until closing on the final problem day at 12:00 noon (PDT).

Problem Start and Closing Dates (both Master and Apprentice levels):

Problem 1:
Starts: Monday, April 18, 2011 at 12:00pm (PDT)
Ends: Monday, May 9, 2011 at 12:00pm (PDT)

Problem 2:
Starts: Monday, May 9, 2011 at 12:00pm (PDT)
Ends: Monday, May 30, 2011 at 12:00pm (PDT)

Problem 3: (Due to the U.S. Memorial Day holiday, Problem 3 will start on Tuesday, May 31, 2011)
Starts: Tuesday, May 31, 2011 at 12:00pm (PDT)
Ends: Tuesday, June 21, 2011 at 12:00pm (PDT)

*All problems start and end at 12:00 noon (Pacific Daylight Time)

Contestants will have 22 days to complete their entry submission (solution only for Apprentice, OR solution and write-up for Master) for each problem. You may enter ONLY 1 problem at a time and will need to choose which level (Apprentice or Master) you wish to participate in during each problem cycle. You will be awarded points based on the solution you submit. Be sure to take advantage of our threading resources and tools; you may optionally validate your solution using the Intel® Manycore Testing Lab, and you can get involved in the dedicated forums to earn extra points.

Each problem’s winners will be announced on the site after the problem closes, and prizes will be awarded to those winners (see the official rules for prize distribution information). The Grand Prize, a trip to the Intel® Developer Forum (IDF) in San Francisco, will be awarded for each level to the participant with the highest total points earned across the three problems in that level (i.e., highest total points for the Master level problems and for the Apprentice level problems).

The Intel® Threading Challenge attracts some of the most talented developers in the world to solve parallelism code challenges. Now is your chance to take multithreading to the next level and possibly win great prizes. Demonstrate your threading expertise today!

More Details:

Intel® Threading Challenge 2011 is organized so any level of developer can have the opportunity to participate. Two levels of participation are available. The Apprentice level gives those just getting started in multithreading development a chance to try out and improve their threading skills. The Master level will be executed similarly to previous threading challenges, providing those with more experience a chance to test their skills and compete against other experienced developers.

Intel® Manycore Testing Lab – Available as Option for Threading Challenge 2011 Participants

This year competitors will have the optional opportunity to develop and validate their code using the Intel® Manycore Testing Lab. This 40-core, 80-thread development environment has the latest hardware and software available and will be used by this year’s judges to test the winning entries in Threading Challenge 2011 Phase 1.

The Intel® Manycore Testing Lab (MTL) will be made available to Threading Challenge 2011 contestants. Use of the MTL will give participants the opportunity to write and test their code on systems configured exactly like those the judges will be using to score submitted entries. No more guessing about whether your code will build or how it will run. (There is no requirement to use the MTL for any part of the contest. It is strictly an optional alternative made available to those who wish to use it.)
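The contest problems are not R-specific, but for readers of this blog, here is a minimal R sketch of the kind of many-core experiment a 40-core, 80-thread machine invites, using the built-in parallel package (note that mclapply forks processes and does not run in parallel on Windows).

```r
library(parallel)

# Embarrassingly parallel toy workload: bootstrap the standard error of a mean
x <- rnorm(1e6)
boot_mean <- function(i) mean(sample(x, length(x), replace = TRUE))

n_cores <- max(1, detectCores() - 1)              # leave one core free
res <- mclapply(1:200, boot_mean, mc.cores = n_cores)
sd(unlist(res))                                   # bootstrap standard error of the mean
```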

Dataists shake up R community with a rocking contest

[Image: Flipboard, by Johan Larsson via Flickr]

The newly created Dataists are making waves on Hacker News and beyond with their innovative contest: a recommendation engine for R packages.

Not only is the contest useful, it is likely to teach R users some data hacking skills, as well as the basics of creating a GitHub project.

Read more here: http://www.dataists.com/2010/10/using-data-tools-to-find-data-tools-the-yo-dawg-of-data-hacking/

For that reason, we’ve settled on the more manageable question, “which packages are most often installed by normal R users?”

This last question could potentially be answered in a variety of ways. Our current approach uses a convenience sample of installation data that we’ve collected from volunteers in the R community, who kindly agreed to send us a list of the packages they have on their systems. We’ve anonymized this data and compiled a set of metadata-based predictors that allow us to predict the installation probabilities quite well. We’re releasing all of our current work, including the data we have and all of the code we’ve used so far for our exploratory analyses. The contest itself will go live on Kaggle on Sunday and will end four months from Sunday on February 10, 2011. The rules, prizes and official data sets are all described below.

Rules and Prizes

To win the contest, you need to predict the probability that a user U has a package P installed on their system for every pair, (U, P). We’ll assess your performance using ROC methods, which will be evaluated against a held out test data set. The winning team will receive 3 UseR! books of their choosing. In order to win the contest, you’ll have to provide your analysis code to us by creating a fork of our GitHub repository. You’ll also be required to provide a written description of your approach. We’re asking for so much openness from the winning team because we want this contest to serve as a stepping stone for the R community. We’re also hoping that enterprising data hackers will extend the lessons learned through this contest to other programming languages.

Extract from: http://www.dataists.com/2010/10/using-data-tools-to-find-data-tools-the-yo-dawg-of-data-hacking/

Read the full article there.
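To make the task concrete, here is a minimal R sketch of a metadata-based baseline and the kind of ROC check the judges will apply. The feature names (views, depends_count) and the simulated data are hypothetical stand-ins, not the contest's actual variables, and the AUC is computed with the rank (Mann-Whitney) formulation so no extra packages are needed.

```r
# Hypothetical training data: one row per (user, package) pair
set.seed(1)
train <- data.frame(
  views         = rpois(1000, 50),      # e.g. package page views (hypothetical feature)
  depends_count = rpois(1000, 3),       # e.g. number of dependencies (hypothetical feature)
  installed     = rbinom(1000, 1, 0.3)  # 1 if user U has package P installed
)

# Simple logistic-regression baseline for P(installed)
fit  <- glm(installed ~ views + depends_count, data = train, family = binomial)
prob <- predict(fit, type = "response")

# Area under the ROC curve via the rank (Mann-Whitney) formulation
auc <- function(scores, labels) {
  pos <- scores[labels == 1]; neg <- scores[labels == 0]
  r <- rank(c(pos, neg))
  (sum(r[seq_along(pos)]) - length(pos) * (length(pos) + 1) / 2) /
    (length(pos) * length(neg))
}
auc(prob, train$installed)   # in-sample AUC; the contest scores a held-out test set
```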