Amazon goes HPC and GPU: Dirk E to revise his R HPC book


Amazon just delivered a cluster Christmas present for us tech geeks, before Google could out-Google them with the end of the betas (cough, it's under NDA).

Clusters used by academic departments now have a great chance to reduce costs without downsizing, but only if the CIO gets the email.

While Professor Goodnight of SAS / North Carolina State University is still playing time-sharing versus mind-sharing games with analytical birdies, his $70 million server farm, set up last February, is about to get ready.

(I heard they got public subsidies for the environmental features, but that's historic for SAS: taking public things private. Right, Prof? SAS itself began as a publicly funded project back in the 1960s, and they didn't even have lobbyists then.)

In related R news, Dirk E has been thinking about an R HPC book without paying attention to Amazon, but would now have to include Amazon.

(He has been thinking of writing that book for five years, but hey, he's got a day job, consulting gigs with Revolution, photo ops at Google, a blog, and packages to maintain without binaries. Dirk E, we await thy book with bated breath.)

Who's Dirk E? Well, http://dirk.eddelbuettel.com/ is like the Terminator of the R project (in terms of unpronounceable surnames).

Back to the cause du jour:

 

From http://aws.amazon.com/ec2/hpc-applications/, minus the corporate buzzwords:

 

Unique to Cluster Compute and Cluster GPU instances is the ability to group them into clusters of instances for use with HPC applications. This is particularly valuable for those applications that rely on protocols like Message Passing Interface (MPI) for tightly coupled inter-node communication.

Cluster Compute and Cluster GPU instances function just like other Amazon EC2 instances but also offer the following features for optimal performance with HPC applications:

  • When run as a cluster of instances, they provide low latency, full bisection 10 Gbps bandwidth between instances. Cluster sizes up through and above 128 instances are supported.
  • Cluster Compute and Cluster GPU instances include the specific processor architecture in their definition to allow developers to tune their applications by compiling applications for that specific processor architecture in order to achieve optimal performance.

The Cluster Compute instance family currently contains a single instance type, the Cluster Compute Quadruple Extra Large with the following specifications:

23 GB of memory
33.5 EC2 Compute Units (2 x Intel Xeon X5570, quad-core “Nehalem” architecture)
1690 GB of instance storage
64-bit platform
I/O Performance: Very High (10 Gigabit Ethernet)
API name: cc1.4xlarge

The Cluster GPU instance family currently contains a single instance type, the Cluster GPU Quadruple Extra Large with the following specifications:

22 GB of memory
33.5 EC2 Compute Units (2 x Intel Xeon X5570, quad-core “Nehalem” architecture)
2 x NVIDIA Tesla “Fermi” M2050 GPUs
1690 GB of instance storage
64-bit platform
I/O Performance: Very High (10 Gigabit Ethernet)
API name: cg1.4xlarge
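To make this concrete for R users (and for that future R HPC book), here is a minimal sketch of the kind of MPI-coupled job these cluster instances are aimed at. It assumes the snow and Rmpi packages are installed on every node and that MPI is configured across the instances; the worker count and the toy task are mine, purely illustrative, not from the AWS page.

library(snow)                                  # assumes Rmpi is also installed and MPI is set up across nodes
cl <- makeCluster(8, type = "MPI")             # spawn 8 MPI worker processes (count is illustrative)
res <- clusterApplyLB(cl, 1:1000, function(i)  # load-balanced scatter of 1000 toy tasks over the workers
  mean(rnorm(1e5)))                            # stand-in for a real per-task computation
stopCluster(cl)                                # shut the MPI workers down

The same pattern carries over to the Cluster GPU instances; the per-task function would simply call GPU-aware code on each worker instead of plain rnorm().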



Interview: James Dixon, Pentaho

Here is an interview with James Dixon, the founder of Pentaho and self-confessed Chief Geek and CTO. Pentaho has been growing very rapidly and makes open source Business Intelligence solutions, basically the biggest chunk of the enterprise software market currently.

Ajay-  How would you describe Pentaho as a BI product for someone who is completely used to traditional BI vendors (read: non open source)? Do the Oracle lawsuits over Java bother you from a business perspective?

James-

Pentaho has a full suite of BI software:

* ETL: Pentaho Data Integration

* Reporting: Pentaho Reporting for desktop and web-based reporting

* OLAP: Mondrian ROLAP engine, and Analyzer or Jpivot for web-based OLAP client

* Dashboards: CDF and Dashboard Designer

* Predictive Analytics: Weka

* Server: Pentaho BI Server, handles web access, security, scheduling, sharing, report bursting, etc.

We have all of the standard BI functionality.

The Oracle/Java issue does not bother me much. There are a lot of software companies dependent on Java. If Oracle abandons Java, a lot of resources will suddenly focus on OpenJDK. It would be good for OpenJDK and might be the best thing for Java in the long term.

Ajay-  What parts of Pentaho’s technology do you personally like the best as having an advantage over other similar proprietary packages?

Describe the latest Pentaho for Hadoop offering and Hadoop/Hive's advantage over, say, MapReduce and SQL.

James- The coolest thing is that everything is pluggable:

* ETL: New data transformation steps can be added. New orchestration controls (job entries) can be added. New perspectives can be added to the design UI. New data sources and destinations can be added.

* Reporting: New content types and report objects can be added. New data sources can be added.

* BI Server: Every factory, engine, and layer can be extended or swapped out via configuration. BI components can be added. New visualizations can be added.

This means it is very easy for Pentaho, partners, customers, and community members to extend our software to do new things.

In addition, every engine and component can be fully embedded into a desktop or web-based application. I made a YouTube video about our philosophy: http://www.youtube.com/watch?v=uMyR-In5nKE

Our Hadoop offerings allow ETL developers to work in a familiar graphical design environment, instead of having to code MapReduce jobs in Java or Python.

90% of the Hadoop use cases we hear about are transformation/reporting/analysis of structured/semi-structured data, so an ETL tool is perfect for these situations.

Using Pentaho Data Integration reduces implementation and maintenance costs significantly. The fact that our ETL engine is Java and is embeddable means that we can deploy the engine to the Hadoop data nodes and transform the data within the nodes.

Ajay-  Do you think the combination of recession, outsourcing, cost cutting, and unemployment is a suitable environment for companies to cut technology costs by going outside their usual vendor lists and trying open source for a change / for test projects?

James- Absolutely. Pentaho grew (downloads, installations, revenue) throughout the recession. We are on target to do 250% of what we did last year, while the established vendors are flat in terms of new license revenue.

Ajay-  How would you compare the user interface of reports using Pentaho versus other reporting software? Please feel free to be as specific as you like.

James- We have all of the everyday, standard reporting features covered.

Over the years the old tools, like Crystal Reports, have become bloated and complicated.

We don’t aim to have 100% of their features, because we’d end up just as complicated.

The 80:20 rule applies here. 80% of the time people only use 20% of their features.

We aim for 80% feature parity, which should cover 95-99% of typical use cases.

Ajay-  Could you describe the Pentaho integration with R as well as your relationship with Weka. Jaspersoft already has a partnership with Revolution Analytics for RevoDeployR (R on a web server)-

Any  R plans for Pentaho as well?

James- The feature sets of R and Weka overlap to a small extent: both include basic statistical functions. Weka is focused on predictive models and machine learning, whereas R is focused on a full suite of statistical models. The creator and main Weka developer is a Pentaho employee. We have integrated R into our ETL tool. (makes me happy 🙂 )

(Probably not a good time to ask if SAS integration is done as well, for the big chunk of legacy Base SAS / WPS users.)

About-

As “Chief Geek” (CTO) at Pentaho, James Dixon is responsible for Pentaho’s architecture and technology roadmap. James has over 15 years of professional experience in software architecture, development and systems consulting. Prior to Pentaho, James held key technical roles at AppSource Corporation (acquired by Arbor Software which later merged into Hyperion Solutions) and Keyola (acquired by Lawson Software). Earlier in his career, James was a technology consultant working with large and small firms to deliver the benefits of innovative technology in real-world environments.

SAS for Job Interviews


Yeah. I hope someone wrote a book like that.

Basically,

  1. Libname
  2. Proc Datasets
  3. Proc Import
  4. Proc Contents
  5. Proc Freq
  6. Proc Means
  7. Proc Univariate
  8. Proc Reg
  9. Proc Logistic
  10. Proc Export (to Excel, where you do the graphs)
  11. ODS
  12. Proc Tabulate

(note: it would be interesting to do a proc freq on all the procs used on, say, SAS On Demand)

Anything else is not needed for an entry-level job for a fresh grad student, or for a retrained veteran worker.

Just like society needs science and commerce as twin pillars, analytics needs SAS (great marketing) and R (great research) to expand the pie of analytics, which is woefully underutilized and stupidly overcapitalized by jazzy copy-paste-data-from-query software disguised as "intelligent software". R has no certification and no formal training for jobs (as yet), though this should change. SAS looks great (still) for getting jobs for grad students. R looks great (yup) for getting research jobs, though probably not corporate analytics jobs. What do you think?
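For comparison, here is my rough sketch of everyday R equivalents to that proc list; the file and variable names are hypothetical and the mapping is mine, not an official one.

df <- read.csv("survey.csv")                 # ~ Libname / Proc Import (file name is hypothetical)
str(df)                                      # ~ Proc Contents
table(df$region)                             # ~ Proc Freq
summary(df$income)                           # ~ Proc Means / Proc Univariate
fit  <- lm(income ~ age, data = df)          # ~ Proc Reg
lfit <- glm(default ~ age + income,          # ~ Proc Logistic
            family = binomial, data = df)
write.csv(df, "survey_out.csv")              # ~ Proc Export (the graphs can stay in R)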

 

Using Reshape2 for transposing datasets in R

Note (problem statement): this is quite similar to using Proc Transpose in SAS; see http://analytics.ncsu.edu/sesug/2005/TU12_05.PDF
Diagram using geneplotter from R Graph Gallery
In R, however, this can be done as follows. We want to convert this data frame:

  Subject    Item    Score
1 Subject 1  Item 1  1
2 Subject 1  Item 2  0
3 Subject 1  Item 3  1
4 Subject 2  Item 1  1
5 Subject 2  Item 2  1
6 Subject 2  Item 3  0

to this wide layout:

  Subject    Item 1  Item 2  Item 3
1 Subject 1  1       0       1
2 Subject 2  1       1       0
Note- I am using http://www.inside-r.org/pretty-r/tool for auto-generating the color coded R Code.
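For reproducibility, here is a minimal sketch that builds the long-format data frame shown above; I name it tDat so it matches the code that follows.

tDat <- data.frame(Subject = rep(c("Subject 1", "Subject 2"), each = 3),
                   Item    = rep(c("Item 1", "Item 2", "Item 3"), times = 2),
                   Score   = c(1, 0, 1, 1, 1, 0))
# str(tDat) shows 6 rows in long format, one Score per Subject-Item pair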
library("reshape2")
tDat.m <- melt(tDat)                        # melt to long format: Subject, Item, variable, value
tDatCast <- acast(tDat.m, Subject ~ Item)   # cast to wide, one column per Item
and that's it!
Another way (this one is not recommended, as it seems to take longer and use more memory):
tDat.wide <- reshape(tDat, idvar = "Subject", timevar = "Item", direction = "wide")

Open Source's worst enemy is itself, not Microsoft/SAS/SAP/Oracle

The decision of quality open source makers to offer their software at bargain-basement prices, even to enterprise customers who are used to paying prices many times higher, is the reason open source software is taking so long to command respect in enterprise software.

I hate to be the messenger who brings bad news to my open source brethren, but their worst nightmare is not the actions of their proprietary competitors like Oracle, SAP, SAS, and Microsoft (they hate each other even more than they hate open source), nor their collective marketing tactics, which are textbook stuff (but referred to as Fear, Uncertainty and Doubt by those outside that golden quartet). It is their own communities and their own cheap pricing.

It is community action that keeps them offering their software at ridiculously low, bargain-basement prices. James Dixon, head geek and founder at Pentaho, has a point when he says traditional metrics like revenue need to be adjusted for this impact, in his article at http://jamesdixon.wordpress.com/2010/11/02/comparing-open-source-and-proprietary-software-markets/

But James, why offer software to enterprise customers at one tenth the price of the next competitor? One reason is that open source companies, more often than not, compete more with their own free community versions than with the big proprietary packages.

Communities, including academics, are used to free. Hey, how about paying, say, $1 for each download?

There are two million R users. If even 50% of them paid $1 as a lifetime license fee, that is a million dollars, enough to sponsor more new packages than twenty years of Google Summer of Code does right now.

Secondly, this pricing can easily be adjusted by shifting the licensing: say, free for businesses with fewer than 2 people (even for the enhanced corporate version, not just the plain vanilla community software, thus further increasing the spread of the plain vanilla versions), and for businesses of 10 to 20 people, a six-month trial rather than a one-month trial.

But adjust the pricing to much more realistic levels compared to competing software. Make enterprise customers pay a real value.

That’s the only way to earn respect, as well as a few dollars more.

As for SAS, it is time it started ridiculing Python now that it has accepted R.

Python is even MORE powerful than R in some use cases for statistical computing.

Dixon’s Pentaho and the Jaspersoft/Revolution combo are nice; I tested both Jasper and Pentaho thanks to these remarks this week 🙂 (see slides at http://www.jaspersoft.com/sites/default/files/downloads/events/Analytics%20-Jaspersoft-SEP2010.pdf or http://www.revolutionanalytics.com/news-events/free-webinars/2010/deploying-r/index.php )

Pentaho and Jasper do give great graphics in BI. (Graphical display in BI is not a SAS forte, though I don't know how much they cross-sell JMP to BI customers; probably too much "JMP is another division" syndrome there.)


JMP Genomics 5 released


Close on the heels of the JMP 9 launch, with its R integration, comes the announcement of the JMP Genomics 5 release. The product brief is available at http://jmp.com/software/genomics/pdf/103112_jmpg5_prodbrief.pdf and it has an interesting mix of features. If you want to try them out, see http://jmp.com/software/license.shtml

As for me, I snagged some "new" stuff in this release:

  • Perform enrichment analysis using functional information from Ingenuity Pathways Analysis.+
  • New bar chart track allows summarization of reads or intensities.
  • New color map track displays heat plots of information for individual subjects.
  • Use a variety of continuous measures for summarization.
  • Using a common identifier, compare list membership for up to five groups and display overlaps with Venn diagrams.
  • Filter or shade segments by mean intensity, with an option to display segment mean intensity and set a reference value for shading.
  • Adjust intensities or counts for experimental samples using paired or grouped control samples.
  • Screen paired DNA and RNA intensities for allele-specific expression.
  • Standardize using a shifting factor and perform log2 transformation after standardization.
  • Use kernel density information in loess and quantile normalization.
  • Depict partition tree information graphically for standard models with the new Tree Viewer.
  • Predictive modeling for survival analysis with Harrell’s assessment method and integration with Cross-Validation Model Comparison.

That’s right: that is incorporating the work of our favorite professor from the R project himself, http://biostat.mc.vanderbilt.edu/wiki/Main/FrankHarrell

Apparently Prof Frank E was quite a SAS coder himself (see http://biostat.mc.vanderbilt.edu/wiki/Main/SasMacros)
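For the R-minded, here is a minimal sketch of Harrell-style survival model assessment using his rms package; the data set and variable names below are invented for illustration, and this is a sketch of the general technique, not the JMP Genomics implementation.

library(rms)                                  # Frank Harrell's regression modelling strategies package
set.seed(42)
d <- data.frame(time   = rexp(200, 0.1),      # hypothetical follow-up times
                status = rbinom(200, 1, 0.7), # hypothetical event indicator
                age    = rnorm(200, 60, 10))
dd <- datadist(d); options(datadist = "dd")   # rms housekeeping for the model data
fit <- cph(Surv(time, status) ~ age, data = d, x = TRUE, y = TRUE)
val <- validate(fit, B = 100)                 # bootstrap validation of the Cox model
Dxy <- val["Dxy", "index.corrected"]          # optimism-corrected Somers' Dxy
cindex <- Dxy/2 + 0.5                         # Harrell's concordance index (c-statistic)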

Back to JMP Genomics 5-

The JMP software platform provides:

• New integration capabilities let R users leverage JMP’s interactive graphics to display analytic results.

• Tools for R programmers to build and package user interfaces that let them share customized R analytics with a broader audience.

• A new add-in infrastructure that simplifies the integration of external analytics into JMP.

 

+ For people in life sciences who like new stats software: you can also download a trial version of IPA at http://www.ingenuity.com/products/IPA/Free-Trial-Software.html