SAS Sentiment Analysis wins Award

From Business Wire: the new Sentiment Analysis product by SAS Institute (built on its acquisition of Teragram) has won an award. As per Wikipedia:

http://en.wikipedia.org/wiki/Sentiment_analysis

Sentiment analysis or opinion mining refers to a broad (definitionally challenged) area of natural language processing, computational linguistics and text mining. Generally speaking, it aims to determine the attitude of a speaker or a writer with respect to some topic. The attitude may be their judgment or evaluation (see appraisal theory), their affective state (that is to say, the emotional state of the author when writing) or the intended emotional communication (that is to say, the emotional effect the author wishes to have on the reader).

It was developed by Teragram. Here is another sentiment analysis tool, from Stanford graduate students, at http://twittersentiment.appspot.com/search?query=sas

See: Sentiment analysis for SAS

Image Citation-

http://threeminds.organic.com/2009/09/five_reasons_sentiment_analysi.html

Read an article on sentiment analysis here at http://www.nytimes.com/2009/08/24/technology/internet/24emotion.html

And the complete press release at http://goo.gl/iVzf

SAS Sentiment Analysis delivers insights on customer, competitor and organizational opinions to a degree never before possible via manual review of electronic text. As a result, SAS, the leader in business analytics software and services, has earned the prestigious Communications Solutions Product of the Year Award from Technology Marketing Corporation (TMC).


“SAS Sentiment Analysis has shown benefits for its customers and it provides ROI for the companies that use it,” said Rich Tehrani, CEO, TMC. “Congratulations to the entire team at SAS, a company distinguished by its dedication to software quality and superiority to address marketplace needs.”

Derive positive and negative opinions, evaluations and emotions

SAS Sentiment Analysis’ high-performance crawler locates and extracts sentiment from digital content sources, including mainstream websites, social media outlets, internal servers and incoming news feeds. SAS’ unique hybrid approach combines powerful statistical techniques with linguistics rules to improve accuracy to the detailed feature level. It summarizes the sentiment expressed in all available text collections – identifying trends and creating graphical reports that describe the expressed feelings of consumers, partners, employees and competitors in real time. Output from SAS Sentiment Analysis can be stored in document repositories, surfaced in corporate portals and used as input to additional SAS Text Analytics software or search engines to help decision makers evaluate trends, predict future outcomes, minimize risks and capitalize on opportunities.
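SAS does not publish its implementation, but the hybrid idea the release describes (statistical scoring combined with linguistic rules) can be sketched in a few lines. The lexicon weights and the single negation rule below are invented for illustration only; a real system would learn weights from labeled data and use far richer rules:

```python
# Toy sketch of hybrid sentiment scoring: a lexicon-based statistical
# score adjusted by one linguistic rule (negation flips polarity).
# Word lists and weights here are invented for illustration.

LEXICON = {"good": 1.0, "great": 2.0, "poor": -1.0, "terrible": -2.0,
           "fast": 1.0, "slow": -1.0}
NEGATORS = {"not", "never", "no"}

def sentiment(text: str) -> float:
    """Sum lexicon weights; flip the sign of a word preceded by a negator."""
    tokens = text.lower().split()
    score = 0.0
    for i, tok in enumerate(tokens):
        weight = LEXICON.get(tok.strip(".,!?"), 0.0)
        if weight and i > 0 and tokens[i - 1] in NEGATORS:
            weight = -weight  # linguistic rule: negation reverses polarity
        score += weight
    return score

print(sentiment("The reports are great but the crawler is not slow"))  # 3.0
```

Even this toy shows why the hybrid matters: a pure lexicon tally would score "not slow" as negative, while the rule corrects it to positive.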

“SAS has automated the time-consuming process of reading individual documents and manually extracting relevant information,” said Fiona McNeill, Global Analytics Product Marketing Manager at SAS. “Our integrated analytics framework helps organizations maximize the value of information to improve their effectiveness.”

SAS Sentiment Analysis is included in the SAS Text Analytics suite, which helps organizations discover insights from electronic text materials, associate them for delivery to the right person or place, and provide intelligence to select the best course of action. Whether answering complex search-and-retrieval questions, ensuring appropriate content is presented to internal or external constituencies, or predicting which activity or channel will produce the best effect on existing sentiments, SAS Text Analytics provides exceptional real-time processing speeds for large volumes of text.

SAS Text Analytics solutions are part of the SAS Business Analytics Framework, backed by the industry’s most comprehensive range of consulting, training and support services, ensuring customers maximum return from their IT investments.

Recognizing vision

The Communications Solutions Product of the Year Award recognizes vision, leadership and thoroughness. The most innovative products and services brought to the market from March 2008 through March 2009 were chosen as winners of this Product of the Year Award and are published on the INTERNET TELEPHONY and Customer Interaction Solutions websites.

Towards better analytical software

Here are some thoughts on using existing statistical software for better analytics and/or business intelligence (reporting)-

1) User Interface Design Matters- Most stats software has a legacy approach to user interface design. Graphical user interfaces need to be more business-friendly and user-friendly: for example, you can label a button "T Test", or you can label it "Compare > Means of Samples" (with a highlight noting it is a t test). You can label a button "Chi Square Test", or label it "Compare > Counts Data". Also, excessive reliance on drop-down menus ignores the next generation of OS advances, namely touchscreens instead of mouse point-and-click.

Given that base statistical procedures are the same across packages, a more thoughtfully designed (or revamped) user interface can give software an edge over legacy designs.

2) Branding of Software Matters- One notable whine against SAS Institute products is premier pricing. But that software is actually inexpensive if you compare it with other reporting software. What separates a Cognos from a Crystal Reports from a SAS BI is often branding (and user interface design). Social media is often the least expensive branding and marketing channel. The same holds for WPS and Revolution Analytics.

3) Alliances Matter- The alliances of parent companies are reflected in the sales of bundled software. For a complete solution, you need a database plus reporting plus analytical software. If you are not making all three of the above, you need to partner and cross-sell. Technically this means that software (whether DB, reporting or analytics) needs to talk to as many different kinds of other software and formats as possible. This is why ODBC in R is important, and why alliances for small companies like Revolution Analytics, WPS and Netezza are just as important as those of bigger companies like IBM SPSS, SAS Institute or SAP. Tie-ins with Hadoop (like R and the Netezza appliance) or between Teradata and SAS also help create better usage.
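The database-to-reporting handoff described above is easy to sketch. R users would typically reach for RODBC or DBI; the stand-in below uses only Python's standard library (sqlite3 in place of a real ODBC source) to show the pattern of pulling an aggregate out of a vendor's database and exporting it in a neutral format any BI tool can read. Table and column names are invented for illustration:

```python
# Sketch of the database -> analytics -> reporting handoff, using
# stdlib sqlite3 as a stand-in for an ODBC-connected vendor database.
import csv
import io
import sqlite3

# 1) The "database" side: a vendor's DB holds the raw records.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 120.0), ("west", 95.5), ("east", 80.0)])

# 2) The "analytics" side: pull an aggregate over the connection.
rows = conn.execute(
    "SELECT region, SUM(revenue) FROM sales GROUP BY region ORDER BY region"
).fetchall()

# 3) The "reporting" side: export to CSV, a format every BI tool consumes.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["region", "total_revenue"])
writer.writerows(rows)
print(buf.getvalue())
```

The design point is the middle step: as long as each layer speaks a common protocol (ODBC/SQL) or format (CSV), the three pieces can come from three different vendors, which is exactly why the alliances matter.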

4) Cloud Computing Interfaces Could Be the Edge- Maybe cloud computing is all hot air, but prudent business planning demands that any software maker in analytics or business intelligence have an extremely easy-to-load interface, whether a dedicated on-demand website or an Amazon EC2 image. Easier interfaces win, and with the cloud still in its early stages they can help create an early lead. For R software makers this is critical, since R on a PC handles larger datasets poorly compared with its counterparts; on the cloud that disadvantage vanishes. An easy-to-understand cloud interface framework is here (it is two years old but should still be okay): http://knol.google.com/k/data-mining-through-cloud-computing#

5) Platforms Matter- Software should either natively embrace all possible platforms or bundle in the middleware itself.

Here is a case study: SAS stopped supporting Apple OS after Base SAS 7. Today the Mac is strong (3.47 million Macs sold in the most recent quarter), and the only way to use SAS on a Mac is to either follow

http://goo.gl/QAs2

or install Ubuntu on the Mac (https://help.ubuntu.com/community/MacBook) and follow

http://ubuntuforums.org/showthread.php?t=1494027

Why does this matter? SAS is free to academics and students from this year, but the Mac is a preferred computer in academia. WPS, meanwhile, can be run straight away on the Mac (though they have curiously not been able to provide academic or discounted student copies 😉), as per

http://goo.gl/aVKu

Does this create a disadvantage based on platform? Yes. However, JMP continues to be supported on the Mac. This is also noteworthy given the upcoming Chromium OS by Google and the Windows Azure platform for cloud computing.

Special Issue of JSS on R GUIs

An announcement by the Journal of Statistical Software: a call for papers on R GUIs. The initial deadline is December 2010, with final versions published during 2011.

Announcement:

Special issue of the Journal of Statistical Software on

Graphical User Interfaces for R

Editors: Pedro Valero-Mora and Ruben Ledesma

Since the original paper by Ihaka and Gentleman was published, R has managed to gain an ever-increasing share of academic and professional statisticians, but the spread of its use among novice and occasional users of statistics has not progressed at the same pace. Among the reasons for this relative lack of impact, the lack of a GUI or point-and-click interface is one of the causes most widely mentioned. However, in the last few years this situation has been quietly changing, and a number of projects have equipped R with different GUIs, ranging from the very simple to the more advanced, providing the casual user with what could still be a new source of trouble: choosing which GUI is right for them. We may have moved from the "too few" situation to the "too many" situation.
One of the main goals of this special issue of the JSS is to offer a general overview of the different GUIs currently available for R. Thus, we think that somebody trying to find their way among the different alternatives may find it useful as a starting point. However, we do not want to stop at a mere listing; we want to offer a more general discussion of what good GUIs for R could be (and how to build them). Therefore, we want to see papers submitted that discuss the whole concept of a GUI in R: what elements it should include (or not), how this could be achieved, and, why not, whether it is actually needed at all. Finally, despite R's great success, other systems may treasure important features that we would like to see in R. Indeed, descriptions of nice features that we do not have in R but that exist in other systems could be another way of driving the future progress of GUIs for R.

In summary, we envision papers for this special issue on GUIs for R in the following categories:

– General discussions on GUIs for statistics, and for R.

– Implementing GUI toolboxes for R so others can program GUIs with them.

– R GUIs examples (with two subcategories, in the desktop or in the cloud).

– Is there life beyond R? What features do other systems have that R does not, and why does R need them?

Papers can be sent directly to Pedro Valero-Mora (valerop@uv.es) or Ruben Ledesma (rdledesma@gmail.com) and will follow the usual JSS reviewing procedure. The initial deadline is December 2010, with final versions published during 2011.

====================================================
Jan de Leeuw; Distinguished Professor and Chair, UCLA Department of Statistics;
Director: UCLA Center for Environmental Statistics (CES);
Editor: Journal of Multivariate Analysis, Journal of Statistical Software;

Protected: Analyzing SAS Institute-WPS Lawsuit

This content is password-protected. To view it, please enter the password below.

Protected: SAS Institute lawsuit against WPS Episode 2 The Clone Wars

This content is password-protected. To view it, please enter the password below.

Open Source and Software Strategy

Curt Monash at Monash Research pointed out some ongoing open source GPL issues for WordPress and the Thesis issue (Also see http://ma.tt/2009/04/oracle-and-open-source/ and  http://www.mattcutts.com/blog/switching-things-around/).

As a user of both for upwards of two years, I believe open source and GPL license enforcement are now standard parts of most software companies' strategies. Some thoughts on open source and software strategy follow. Thesis remains a very popular theme and has earned upwards of $100,000 for its creator (an estimate based on 20,000-plus installs at a $60 average price).

  • Little guys like to give away code to get some satisfaction or recognition; big guys give away free code only when it's necessary, or when they are not making money in that product segment anyway.
  • As Ethan Hunt said, "Every hero needs a villain". Every software market-share war needs one big company holding more market share, and an open source strategy from another player that cannot create all its code in-house and so effectively outsources it by creating an open source project. But the same open source proponent rarely gives away the secret to its own money-making project.
    • Examples: Google creates open source Android, but won't reveal the secret search algorithm that drives its main profits.
    • Google again publishes a paper on MapReduce, but it's Yahoo that champions Hadoop.
    • Apple creates open source projects (http://www.apple.com/opensource/) but won't give away its operating system source code (why?), which helps people buy its more expensive hardware.
    • IBM, which helped kickstart the whole proprietary code thing (remember MS-DOS), is the new champion of open source (http://www.ibm.com/developerworks/opensource/).
    • Microsoft continues to spark open source debate, but read http://blogs.technet.com/b/microsoft_blog/archive/2010/07/02/a-perspective-on-openness.aspx and also http://www.microsoft.com/opensource/
    • SAS gives away a lot of open source code (read Jim Davis, CMO of SAS, here), but will stick to Base SAS code (even though it seems to be making more money through its verticals focus and data mining).
    • SPSS was the first big analytics company to help support R (the open source stats software), but it clings to its own code in its own products.
    • WordPress.org gives away its software as open source (and I like Akismet just as much as the blogging), but as anyone on WordPress.com knows, you can get quite locked in by its (pricey) platform.
    • Vendor lock-in (wink wink, price escalation) is the elephant in the room for big proprietary software companies.
    • SLA quality, maintenance and IP safety are mostly the worries holding back adoption of open source software.
  • Lack of IP protection for revenue models built on open source code is the big bottleneck for a lot of companies, even though very few software users know what to do with source code if you give it to them anyway.
    • If companies were confident that they would still earn the same revenue, with less leakage or theft, they would gladly give away the source code.
    • Derivative software and extensions help popularize the original software.
      • Halfway steps like Facebook applications (Facebook was the original big company to create a platform for third-party creators), iPhone apps and Android applications show the success of creating APIs that protect IP and retain software control while still giving developers some freedom.
      • User interfaces to R in both SAS/IML and JMP are a similar example.
  • Basically, open source is mostly done by the underdog, while the top dog mostly rakes in the money (and the envy).
  • There is yet to be a big commercial success in open source software, though there are very good open source packages. Just as Google's success helped establish advertising as an alternate (and now dominant) revenue source for online companies, open source needs a big example of a company that made billions while giving its source code away and still retaining control and direction of its software strategy.
  • Open source people love to hate proprietary packages, yet there are more shades of grey (than black and white), and more hypocrisy (read: lies), within the open source software movement than in the regulated world of big software. People will still be people. Software is just a piece of code. 😉

(Art citation: http://gapingvoid.com/about/ and http://gapingvoidgallery.com/)

The Popularity of Data Analysis Software

Here is a nice page by Bob Muenchen (author of the "R for SAS and SPSS Users" and "R for Stata Users" books).

It is available at http://r4stats.com/popularity and uses a variety of methods, including Google Insights, Page Rank, Link analysis, as well as information from Rexer Analytics and KDNuggets.

I believe the following two graphs sum it all up:

1) Number of jobs at Monster.com using keywords

2) Google Scholar's analysis of academic papers

Despite R's rapid growth, which is clearly evident in terms of jobs as well as publications, it still lags behind SAS and SPSS. So whether you are a corporate user or an academic user, it makes sense to have more than one skill, just to be sure. What do you think? Is learning R mutually exclusive and completely exhaustive from learning SAS or SPSS? See http://r4stats.com/popularity for the complete analysis by Bob Muenchen.

It also shows the tremendous opportunity for companies like Revolution Analytics and XL Solutions (http://www.experience-rplus.com/), as the potential for growth is clearly evident.