IBM and Revolution team to create new in-database R

From the Press Release at http://www.revolutionanalytics.com/news-events/news-room/2011/revolution-analytics-netezza-partnership.php

Under the terms of the agreement, the companies will work together to create a version of Revolution’s software that takes advantage of IBM Netezza’s i-class technology so that Revolution R Enterprise can run in-database in an optimal fashion.

About IBM

For information about IBM Netezza, please visit: http://www.netezza.com.
For information on IBM Information Management, please visit: http://www.ibm.com/software/data/information-on-demand/
For information on IBM Business Analytics, please visit the online press kit: http://www.ibm.com/press/us/en/presskit/27163.wss
Follow IBM and Analytics on Twitter: http://twitter.com/ibmbizanalytics
Follow IBM analytics on Tumblr: http://smarterplanet.tumblr.com/tagged/new_intelligence
IBM YouTube Analytics Channel: http://www.youtube.com/user/ibmbusinessanalytics
For information on IBM Smarter Systems: http://www-03.ibm.com/systems/smarter/

About Revolution Analytics

Revolution Analytics is the leading commercial provider of software and services based on the open source R project for statistical computing.  Led by predictive analytics pioneer Norman Nie, the company brings high performance, productivity and enterprise readiness to R, the most powerful statistics language in the world. The company’s flagship Revolution R product is designed to meet the production needs of large organizations in industries such as finance, life sciences, retail, manufacturing and media.  Used by over 2 million analysts in academia and at cutting-edge companies such as Google, Bank of America and Acxiom, R has emerged as the standard of innovation in statistical analysis. Revolution Analytics is committed to fostering the continued growth of the R community through sponsorship of the Inside-R.org community site, funding of worldwide R user groups, and free licenses of Revolution R Enterprise for everyone in academia.


About IBM Netezza

Netezza, an IBM Company, is the global leader in data warehouse, analytic and monitoring appliances that dramatically simplify high-performance analytics across an extended enterprise. IBM Netezza’s technology enables organizations to process enormous amounts of captured data at exceptional speed, providing a significant competitive and operational advantage in today’s data-intensive industries, including digital media, energy, financial services, government, health and life sciences, retail and telecommunications.

The IBM Netezza TwinFin® appliance is built specifically to analyze petabytes of detailed data significantly faster than existing data warehouse options, and at a much lower total cost of ownership. It stores, filters and processes terabytes of records within a single unit, analyzing only the relevant information for each query.

Using Revolution R Enterprise & Netezza Together

Revolution Analytics and IBM Netezza have announced a partnership to integrate Revolution R Enterprise and the IBM Netezza TwinFin Data Warehouse Appliance. For the first time, customers seeking to run high-performance, full-scale predictive analytics from within a data warehouse platform will be able to directly leverage the power of the open source R statistics language. The companies are working together to create a version of Revolution’s software that takes advantage of IBM Netezza’s i-class technology so that Revolution R Enterprise can run in-database in an optimal fashion.

This partnership integrates Revolution R Enterprise with IBM Netezza’s high performance data warehouse and advanced analytics platform to help organizations combat the challenges that arise as complexity and the scale of data grow.  By moving the analytics processing next to the data, this integration will minimize data movement – a significant bottleneck, especially when dealing with “Big Data”.  It will deliver high performance on large scale data, while leveraging the latest innovations in analytics.

With Revolution R Enterprise for IBM Netezza, advanced R computations are available for rapid analysis of hundreds of terabytes of data, and can deliver 10-100x performance improvements at a fraction of the cost of traditional analytics vendors.
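
The following is a minimal sketch of that in-database idea using the generic RODBC package from CRAN, not the Revolution R Enterprise for IBM Netezza product itself; the ODBC data source name "NZSQL" and the "transactions" table are hypothetical placeholders.

library(RODBC)

con <- odbcConnect("NZSQL")          # hypothetical DSN pointing at the warehouse

# Pull-then-compute: ship every row across the network, then aggregate in R
tx   <- sqlQuery(con, "SELECT customer_id, amount FROM transactions")
in_r <- aggregate(amount ~ customer_id, data = tx, FUN = sum)

# In-database: let the appliance do the aggregation and return only the small
# result set, avoiding the data-movement bottleneck described above
in_db <- sqlQuery(con,
  "SELECT customer_id, SUM(amount) AS total_amount
   FROM transactions GROUP BY customer_id")

odbcClose(con)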

HIGHLIGHTS from Rexer Survey: R gives best satisfaction


A summary report from the Rexer Analytics Annual Data Miner Survey:

 

HIGHLIGHTS from the 4th Annual Data Miner Survey (2010):

 

•   FIELDS & GOALS: Data miners work in a diverse set of fields.  CRM / Marketing has been the #1 field in each of the past four years.  Fittingly, “improving the understanding of customers”, “retaining customers” and other CRM goals are also the goals identified by the most data miners surveyed.

 

•   ALGORITHMS: Decision trees, regression, and cluster analysis continue to form a triad of core algorithms for most data miners.  However, a wide variety of algorithms are being used.  This year, for the first time, the survey asked about Ensemble Models, and 22% of data miners report using them.  A third of data miners currently use text mining and another third plan to in the future.  (A minimal R sketch of this core triad appears after this list.)

 

•   MODELS: About one-third of data miners typically build final models with 10 or fewer variables, while about 28% generally construct models with more than 45 variables.

 

•   TOOLS: After a steady rise across the past few years, the open source data mining software R overtook other tools to become the tool used by more data miners (43%) than any other.  STATISTICA, which has also been climbing in the rankings, is selected as the primary data mining tool by the most data miners (18%).  Data miners report using an average of 4.6 software tools overall.  STATISTICA, IBM SPSS Modeler, and R received the strongest satisfaction ratings in both 2010 and 2009.

 

•   TECHNOLOGY: Data Mining most often occurs on a desktop or laptop computer, and frequently the data is stored locally.  Model scoring typically happens using the same software used to develop models.  STATISTICA users are more likely than other tool users to deploy models using PMML.

 

•   CHALLENGES: As in previous years, dirty data, explaining data mining to others, and difficult access to data are the top challenges data miners face.  This year data miners also shared best practices for overcoming these challenges.  The best practices are available online.

 

•   FUTURE: Data miners are optimistic about continued growth in the number of projects they will be conducting, and growth in data mining adoption is the number one “future trend” identified.  There is room to improve:  only 13% of data miners rate their company’s analytic capabilities as “excellent” and only 8% rate their data quality as “very strong”.
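
Here is a minimal R sketch of the core triad named in the ALGORITHMS item above (decision tree, regression, hierarchical clustering), using only packages and datasets that ship with R. It is illustrative only and is not taken from the Rexer report.

# Decision tree: classify iris species from sepal/petal measurements
library(rpart)                        # rpart ships with R as a recommended package
tree <- rpart(Species ~ ., data = iris)
print(tree)

# Regression: model fuel efficiency from weight and horsepower
fit <- lm(mpg ~ wt + hp, data = mtcars)
summary(fit)

# Cluster analysis: hierarchical clustering of the numeric iris columns
hc <- hclust(dist(scale(iris[, 1:4])))
plot(hc, labels = FALSE, main = "Hierarchical clustering of iris")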

 

Please contact us if you have any questions about the attached report or this annual research program.  The 5th Annual Data Miner Survey will be launching next month.  We will email you an invitation to participate.

 

Information about Rexer Analytics is available at www.RexerAnalytics.com. Rexer Analytics continues its impressive journey; see http://www.rexeranalytics.com/Clients.html

My only thought: since most data miners are using multiple tools, including free tools as well as paid software, perhaps a pie chart of market share by revenue and by volume would be handy.

It would also be useful to have some way of comparing diverse data mining projects by data size or complexity.

 

Pentaho and R: working together


I interviewed the Pentaho co-founder here at https://decisionstats.com/2010/11/14/pentaho/ and recently became aware of the R-Pentaho integration.

“R” is a popular open source statistical and analytical language that academics and commercial organizations alike have used for years to get maximum insight out of information using advanced analytic techniques. In this twelve-minute video, David Reinke from Pentaho Certified Partner OpenBI provides an overview of R, as well as a demonstration of integration between R and Pentaho.

http://www.pentaho.com/products/demos/r_project_with_pentaho/

or http://www.pentaho.com/products/demos/showNtell.php

Related-

M.S. in Applied Statistics

http://www.information-management.com/blogs/analytics_business_intelligence_BI_statistics-10019474-1.html

R and BI – Integrating R with Open Source Business Intelligence Platforms Pentaho and Jaspersoft

http://www.r-project.org/conferences/useR-2010/abstracts/Reinke+Miller.pdf

Web development with R

http://www.r-project.org/conferences/useR-2010/slides/Ooms.pdf

In-database analytics with R

http://www.r-project.org/conferences/useR-2010/slides/Hess+Chambers_1.pdf

R’s role in Business Intelligence Software Architecture

http://www.r-project.org/conferences/useR-2010/slides/Colombo+Ronzoni+Fontana.pdf

Using R from other Software

Bridge to R for WPS

http://www.minequest.com/Bridge2R.html

SAS/IML Interface to R

http://www.sas.com/technologies/analytics/statistics/iml/index.html


RapidMiner Extension to R

https://rapid-i.com/content/view/202/206/lang,en/#r


IBM SPSS plugin for R

https://www.spss.com/software/statistics/developer/

and

https://www.spss.com/devcentral/index.cfm?pg=rresources

Tutorial-

https://sites.google.com/site/r4statistics/running-r-from-spss

http://rwiki.sciviews.org/doku.php?id=tips:callingr:spss


Knime

http://www.knime.org/downloads/extensions


Oracle Data Miner

http://www.oracle.com/technetwork/database/options/odm/odm-r-integration-089013.html


JMP

http://jmp.com/software/jmp9/keyfeatures.shtml

and

http://www.jmp.com/applications/analytical_apps/

Tutorial

http://blogs.sas.com/jmp/index.php?/archives/298-JMP-Into-R!.html


WPS Version 2.5.1 Released – can still run SAS language/data and R

Here is what Phil Rack, the reseller, is quoting on http://www.minequest.com/Pricing.html:

Windows Desktop Price: $884 on 32-bit Windows and $1,149 on 64-bit Windows.

The Bridge to R is available on the Windows platforms and is available for free to customers who license WPS through MineQuest, LLC. Companies and organizations outside of North America may purchase a license for the Bridge to R, which starts at $199 per desktop or $599 per server.

Windows Server Price: $1,903 per logical CPU for 32-bit and $2,474 for 64-bit.

Note that Linux server versions are available but do not yet support the Eclipse IDE and are command line only.

WPS sure seems to be going well, but their pricing is no longer fixed, and on the home website you have to fill in a form to get a quote. Ditto for the 30-day free evaluation.

http://www.teamwpc.co.uk/products/wps/modules/core

Data File Formats

The WPS Core module presently supports the following data file formats:

  • SD2 (SAS version 6 data set)
  • SAS7BDAT (SAS version 7, 8 and 9 data sets)
  • SASSEQ (SAS version 8/9 sequential file)
  • V8SEQ (SAS version 8 sequential file)
  • V9SEQ (SAS version 9 sequential file)
  • WPD (WPS native data set)
  • WPDSEQ (WPS native sequential file)
  • XPORT (transport format)

Additional access to EXCEL, SPSS and dBASE files is supported by utilising the WPS Engine for DB Files module.

and they have a new product release on Valentine’s Day 2011 (oh, these Europeans!)

From the press release at http://www.teamwpc.co.uk/press/wps2_5_1_released

WPS Version 2.5.1 Released 

New language support, new data engines, larger datasets, improved scalability

LONDON, UK – 14 February 2011 – World Programming today released version 2.5.1 of their WPS software for workstations, servers and mainframes.

WPS is a competitively priced, high performance, highly scalable data processing and analytics software product that allows users to execute programs written in the language of SAS. WPS is supported on a wide variety of hardware and operating system platforms and can connect to and work with many types of data with ease. The WPS user interface (Workbench) is frequently praised for its ease of use and flexibility, with the option to include numerous third-party extensions.

This latest version of the software has the ability to manipulate even greater volumes of data, removing the previous 2^31 (2 billion) limit on number of observations.

Complementing the extended data processing capabilities, World Programming has worked hard to boost the performance, scalability and reliability of the WPS software to give users the confidence they need to run heavy workloads whilst delivering maximum value from available computer power.

WPS version 2.5.1 offers additional flexibility with the release of two new data engines for accessing Greenplum and SAND databases. WPS now comes with eleven data engines and can access a huge range of commonly used and industry-standard file-formats and databases.

Support in WPS for the language of SAS continues to expand with more statistical procedures, data step functions, graphing controls and many other language items and options.

WPS version 2.5.1 is available as a free upgrade to all licensed users of WPS.

Summary of Main New Features:

  • Supporting Even Larger Datasets
    WPS is now able to process very large data sets, as the previous size limit of 2^31 observations has been completely lifted.
  • Performance and Scalability Boosted
    Performance and scalability improvements across the board combine to ensure even the most demanding large and concurrent workloads are processed efficiently and reliably.
  • More Language Support
    WPS 2.5.1 continues the expansion of its language support with over 70 new language items, including new Procedures, Data Step functions and many other language items and options.
  • Statistical Analysis
    The procedure support in WPS Statistics has been expanded to include PROC CLUSTER and PROC TREE.
  • Graphical Output
    The graphical output from WPS Graphing has been expanded to accommodate more configurable graphics.
  • Hash Tables
    Support is now provided for hash tables.
  • Greenplum®
    A new WPS Engine for Greenplum provides dedicated support for accessing the Greenplum database.
  • SAND®
    A new WPS Engine for SAND provides dedicated support for accessing the SAND database.
  • Oracle®
    Bulk loading support now available in the WPS Engine for Oracle.
  • SQL Server®
    To enhance existing SQL Server database access, a new SQLSERVR (please note spelling) facility has been added to the ODBC engine.

More Information:

Existing Users should visit www.teamwpc.co.uk/support/wps/release where you can download a readme file containing more information about all the new features and fixes in WPS 2.5.1.

New Users should visit www.teamwpc.co.uk/products/wps where you can explore in more detail all the features available in WPS or request a free evaluation.

and from http://www.teamwpc.co.uk/products/wps/data it seems they are boarding the BIG DATA submarine as well:

Data Support 

Extremely Large Data Size Handling

WPS is now able to handle extremely large data sets, as the previous limit of 2^31 observations has been lifted.

Access Standard Databases

Use I/O Features in WPS Core

  • CLIPBOARD (Windows only)
  • DDE (Windows only)
  • EMAIL (via SMTP or MAPI)
  • FTP
  • HTTP
  • PIPE (Windows and UNIX only)
  • SOCKET
  • STDIO
  • URL

Use Standard Data File Formats

Linux Counter: Use Linux and be counted

Here’s a nice website at

http://counter.li.org/

You can basically spend two minutes and register yourself (publicly or anonymously) and your machines.

There is also some fun to be had at http://counter.li.org/estimates.php

SAS Knowledge Exchange


Here is an interesting website by SAS.com. It showcases lots of business analytics content, more from a conceptual than a tool-based perspective. Have a glance yourself.

http://www.sas.com/knowledge-exchange/business-analytics/
