Data Quality in R #rstats

Many data formats cause problems when imported into statistical software. Statistical software is generally unable to distinguish between $1,000, 1000%, 1,000 and 1000, and will treat the first three as character variables and only the last as a numeric variable by default. This issue is further compounded by the numerous ways we can represent date-time variables.

The good thing is that for specific domains like finance and web analytics, even these weird input formats are fairly standard, so we can draw up a handy list of data quality conversion functions in R for reference.

After much muddling about with converting internet formats (data used in web analytics, mostly time formats without a date, like 00:35:23) into numeric data frame formats, I found that the way to handle date-time conversions in R is

Dataset$Var2 = strptime(as.character(Dataset$Var1), "%M:%S")  # use "%H:%M:%S" if the values include hours

The problem with this approach is that you get a full date-time value (by default R adds today's date to it) when you are really interested only in time durations (4:00:45, or actually just the equivalent in seconds).

This can be handled using the as.difftime function:

dataset$Var2 = as.difftime(paste(dataset$Var1))  # the default format "%X" expects times like 04:00:45

or, to get purely numeric values so we can do numeric analysis (like summary):

dataset$Var2 = as.numeric(as.difftime(paste(dataset$Var1)))  # units are auto-chosen; pass units = "secs" to as.difftime for predictable values

(# Maybe there is a more elegant way here, but I don't know it.)

This kind of data is typically what we get in web analytics, for average time on site, etc.
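Putting it together, here is a minimal sketch with a made-up vector of time-on-site strings (the variable names are purely illustrative):

timeonsite = c("0:35:23", "4:00:45", "0:02:10")  # hypothetical imported values
seconds = as.numeric(as.difftime(timeonsite, format = "%H:%M:%S", units = "secs"))
seconds           # 2123 14445 130
summary(seconds)  # an ordinary numeric summary now works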

For factor variables:

Dataset$Var2= as.numeric(as.character(Dataset$Var1))


or

Dataset$Var2= as.numeric(paste(Dataset$Var1))


A slight problem: suppose there is data like 1,504. It will be converted to NA instead of 1504.

The way to solve this is to use the nice gsub function on that variable ONLY. Since the comma is also the most commonly used delimiter, you don't want to replace all the commas in the file, just the ones in that variable:

dataset$Variable2 = as.numeric(paste(gsub(",", "", dataset$Variable)))

Now let's assume we have data in the form of percentages, like 0.00%, 1.23%, 3.5%.

Again we use the gsub function, this time to replace the % sign in the string with "" (nothing):

dataset$Variable2 = as.numeric(paste(gsub("%", "", dataset$Variable)))
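As a minimal sketch, assuming a made-up data frame with both kinds of dirty columns (the column names are illustrative):

dataset = data.frame(Revenue = c("1,504", "2,300", "950"),
                     Bounce = c("0.00%", "1.23%", "3.5%"))
dataset$Revenue2 = as.numeric(gsub(",", "", dataset$Revenue))  # 1504 2300 950
dataset$Bounce2 = as.numeric(gsub("%", "", dataset$Bounce))    # 0.00 1.23 3.50
summary(dataset$Revenue2)  # now behaves as a proper numeric variable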


If you simply do the following for a factor variable, it will give you the internal level codes, not the values. This is a common trap when reading in CSV data, where numeric-looking columns may be read in as character or factor data types.

Dataset$Var2 = as.numeric(Dataset$Var1)  # wrong for factors: this returns the level codes
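A quick demonstration of the trap, with a made-up factor:

f = factor(c("10", "200", "3000"))
as.numeric(f)                # 1 2 3 (the internal level codes)
as.numeric(as.character(f))  # 10 200 3000 (the actual values)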

Additionally, substr (for extracting substrings) and paste (for concatenation) are useful for manipulating string/character variables.

iris$sp = substr(iris$Species, 1, 3)  # reduces the famous iris species names to three letters, without losing any analytical value
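And paste glues pieces back together; a minimal sketch (the label column is purely illustrative):

iris$label = paste(iris$sp, round(iris$Sepal.Length), sep = "-")
head(iris$label)  # "set-5" "set-5" "set-5" "set-5" "set-5" "set-5"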

The other issue is missing values. na.rm=TRUE helps with getting summaries of numeric variables that have missing values, but we still need to investigate how suitable functions like na.omit are for domains that have large amounts of missing data needing treatment.
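A minimal sketch of both, on made-up data:

x = c(1, 2, NA, 4)
mean(x)                # NA, because missing values propagate
mean(x, na.rm = TRUE)  # 2.333333

df = data.frame(a = c(1, 2, NA), b = c(4, NA, 6))
na.omit(df)  # keeps only complete rows, which is drastic when much data is missing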


New Economics Theories for the New Tech World

When I was doing my MBA (a decade ago), one of the principal theories on why corporations exist was 1) Shareholder Value creation (growing wealth for investors), and a notable second was 2) Stakeholder Value creation- creating jobs for societies, providing taxes to countries, providing employees with stable employment and incentives, and of course creating monetary value for shareholders.

There were two ways you could raise money: debt or equity. Debt had the advantage of interest payments being tax deductible, but debt payments had to be met regularly. Equity had the advantage that equity holders were the last ones to be paid if the company were wound down, which justified why the rate of return on equity is generally higher than the cost of debt. Dividend payouts to stockholders could be deferred in a low-revenue year or for planning reasons.

Or, in plain English: over the long term, borrowing money from shareholders in exchange for stock was more expensive than selling bonds or borrowing from banks.

Hybrid combinations of debt and equity were warrants and debentures that started off as one form of instrument and over a period of time gave much more flexibility and risk safety nets to both issuers and subscribers of capital. Another hybrid was stock options (now considered as a default option of rewarding employees in technology companies, but this was not always the case).

The use of call and put options in debentures, and the idea of a vesting period in stock options, was to promote long-term stability and minimize fluctuations in stock prices and employee attrition, besides of course minimizing the weighted average cost of capital. Venture capital was another class of capital, known for both huge rates of return and risk taking (?)

But in today's world, where Google has three classes of shares, companies trade shares before IPOs, and valuations of technology companies sink and rise by huge percentages over weeks (especially as they near IPO dates), I wonder if traditional theories in finance need a much stronger overhaul.

Or do markets need a regulatory overhaul that would enable stock exchanges to once more have the credibility they had as the primary sources of raising capital?

Who will guard the guardians? Their conscience- the regulators or the news media?

There are ways of raising money that are not evil.

But they are not perfectly fair as well.

Interview Prof Benjamin Alamar, Sports Analytics

Here is an interview with Prof Benjamin Alamar, founding editor of the Journal of Quantitative Analysis in Sports, a professor of sports management at Menlo College and the Director of Basketball Analytics and Research for the Oklahoma City Thunder of the NBA.

Ajay- The movie Moneyball recently sparked mainstream interest in analytics in sports. Describe the role of analytics in sports management.

Benjamin- Analytics is impacting sports organizations on both the sport and business side.
On the sport side, teams are using analytics, including advanced data management, predictive analytics, and information systems, to gain a competitive edge. The use of analytics results in more accurate player valuations and projections, as well as in determining effective strategies against specific opponents.
On the business side, teams are using the tools of analytics to increase revenue in a variety of ways, including dynamic ticket pricing and optimizing the placement of concession stands.
Ajay- What are the ways analytics is used in specific sports that you have been part of?

Benjamin- A very typical first step for a team is to utilize the tools of predictive analytics to help inform their draft decisions.

Ajay- What are some of the tools, techniques and software that analytics in sports uses?
Benjamin- The tools of sports analytics do not differ much from the tools of business analytics. Regression analysis is fairly common, as are other forms of data mining. In terms of software, R is a popular tool, as are Excel and many of the other standard analysis tools.
Ajay- Describe your career journey and how you became involved in sports management. What are some of the tips you want to tell young students who wish to enter this field?

Benjamin- I got involved in sports through a company called Protrade Sports. Protrade initially was a fantasy sports company that was looking to develop a fantasy game based on advanced sports statistics and utilize a stock market concept instead of traditional drafting. I was hired due to my background in economics to develop the market aspect of the game.

There I met Roland Beech (who now works for the Mavericks) and Aaron Schatz (owner of footballoutsiders.com) and learned about the developing field of sports statistics. I then changed my research focus from economics to sports statistics and founded the Journal of Quantitative Analysis in Sports. Through the journal and my published research, I was able to establish a reputation of doing quality, useable work.

For students, I recommend developing very strong data management skills (SQL and the like) and thinking carefully about what sorts of questions a general manager or coach would care about. Being able to demonstrate analytic skills around actionable research will generally attract the attention of pro teams.

About-

Benjamin Alamar, Professor of Sport Management, Menlo College


Professor Benjamin Alamar is the founding editor of the Journal of Quantitative Analysis in Sports, a professor of sports management at Menlo College and the Director of Basketball Analytics and Research for the Oklahoma City Thunder of the NBA. He has published academic research in football, basketball and baseball, and has presented at numerous conferences on sports analytics. He is also a co-creator of ESPN's Total Quarterback Rating and a regular contributor to the Wall Street Journal. He has consulted for teams in the NBA and NFL, provided statistical analysis for author Michael Lewis for his recent book The Blind Side, and worked with numerous startup companies in the field of sports analytics. Professor Alamar is also an award-winning economist who has worked academically and professionally in intellectual property valuation, public finance and public health. He received his PhD in economics from the University of California at Santa Barbara in 2001.

Prof Alamar is a speaker at Predictive Analytics World, San Francisco, and is doing a workshop there:

http://www.predictiveanalyticsworld.com/sanfrancisco/2012/agenda.php#day2-17

2:55-3:15pm

All-level tracks, Track 1: Sports Analytics
Case Study: NFL, MLB, & NBA
Competing & Winning with Sports Analytics

The field of sports analytics ties together the tools of data management, predictive modeling and information systems to provide sports organizations a competitive advantage. The field is rapidly developing based on new and expanded data sources, greater recognition of the value, and the past success of a variety of sports organizations. Teams in the NFL, MLB and NBA, as well as other organizations, have found a competitive edge with the application of sports analytics. The future of sports analytics can be seen by drawing on these past successes and the development of new tools.

You can learn more about Prof Alamar at his blog http://analyticfootball.blogspot.in/ or his journal at http://www.degruyter.com/view/j/jqas. His detailed background can be seen at http://menlo.academia.edu/BenjaminAlamar/CurriculumVitae

SAS Institute Financials 2011

SAS Institute has released its financials for 2011 at http://www.sas.com/news/preleases/2011financials.html.

Revenue surged across all solution and industry categories. Software to detect fraud saw a triple-digit jump. Revenue from on-demand solutions grew almost 50 percent. Growth from analytics and information management solutions was double-digit, as were gains from customer intelligence, retail, risk and supply chain solutions.

AJAY- And as a private company, it is quite nice that they are willing to share so much information every year.

The graphics are nice (and the colors much better than in 2010), but pie charts- seriously, dude, there is no way to compare how much SAS revenue is shifting across geographies or even across industries. So my two cents: lose the pie charts and stick to line graphs for the share of revenue by country/industry.

In 2011, SAS grew staff 9.2 percent and reinvested 24 percent of revenue into research and development

AJAY- So that means about 654 million dollars spent on research and development (24 percent of SAS's reported US$2.725 billion 2011 revenue). I wonder if SAS has considered investing in much smaller startups (rather than its traditional strategy of doing all research in-house or completely acquiring a smaller company).

Even a small investment of, say, 5-10 million USD in open source, or even in PhD-level research projects, could greatly increase the ROI on that.

Analyzing a private company's financials is much more fun than analyzing a public company's, and, remembering the words of my finance professor ("dig, dig"), I compared the 2011 results with the 2010 results:

http://www.sas.com/news/preleases/2010financials.html

The percentage invested in R&D is exactly the same (24%), and the percentages of revenue earned from each geography are exactly the same. So even though revenue growth increased from 5.2% in 2010 to 9% in 2011, both the geographic spread of revenues and the share of R&D costs remained EXACTLY the same.

The Americas accounted for 46 percent of total revenue; Europe, Middle East and Africa (EMEA) 42 percent; and Asia Pacific 12 percent.

Overall, I think SAS retains about a 35% market share (despite all that noise from IBM, SAS clones and open source) because they are good at providing solutions customized for industries (instead of just software products), because the market for analytics is not saturated (it seems to be growing faster than 12%, or is it?), and because of their ability to attract and retain the best analytical talent (which, in a non-American tradition for a software company, means no stock options but job security and great benefits- SAS remains almost Japanese in HR practices).

In 2010, SAS grew staff by 2.4 percent; in 2011, SAS grew staff by 9 percent.

But I liked the directional statement made here- and I think that better design interfaces and algorithmic and computational efficiencies should increase analytical time and time to think about the business, and further reduce data management time!

“What would you do with the extra time if your code ran in two minutes instead of five hours?” Goodnight challenged.

Rcpp Workshop in San Francisco Oct 8th


Following the successful one-day master class on Rcpp preceding this year's R/Finance conference, a full-day master class on Rcpp and related topics will be held on Saturday, October 8, in San Francisco.

Join Dirk Eddelbuettel for six hours of detailed and hands-on instruction and discussion around Rcpp, inline, RInside, RcppArmadillo, RcppGSL, RcppEigen and other packages, in an intimate small-group setting.

The full-day format allows combining an introductory morning session with a more advanced afternoon session while leaving room for sufficient breaks. We plan on having about six hours of instruction, a one-hour lunch break and two half-hour coffee breaks (lunch and refreshments will be provided).

Morning session: “A Hands-on Introduction to R and C++”

The morning session will provide a practical introduction to the Rcpp package (and other related packages).  The focus will be on simple and straightforward applications of Rcpp in order to extend R and/or to significantly accelerate the execution of simple functions.

The tutorial will cover the inline package, which permits embedding of self-contained C, C++ or FORTRAN code in R scripts. We will also discuss RInside, which makes it easy to embed the R engine in C++ applications, as well as standard Rcpp extension packages such as RcppArmadillo and RcppEigen for linear algebra (via highly expressive templated C++ libraries) and RcppGSL.
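As a taste of what this looks like, here is a minimal sketch (the function name sumC is purely illustrative, and it assumes the inline and Rcpp packages are installed) that compiles a small C++ loop via the Rcpp plugin:

library(inline)  # provides cxxfunction() for compiling inline C++ code

src = '
  Rcpp::NumericVector xx(x);   // view the R vector from C++
  double total = 0;
  for (int i = 0; i < xx.size(); i++)
      total += xx[i];
  return Rcpp::wrap(total);    // convert the double back to an R object
'
sumC = cxxfunction(signature(x = "numeric"), body = src, plugin = "Rcpp")
sumC(as.numeric(1:10))  # 55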

Afternoon session: “Advanced R and C++ Topics”

The afternoon tutorial will provide a hands-on introduction to more advanced Rcpp features. It will cover topics such as writing packages that use Rcpp, how Rcpp modules and the new R ReferenceClasses interact, and how Rcpp sugar lets us write C++ code that is often as expressive as R code. Another possible topic, time permitting, may be writing glue code to extend Rcpp to other C++ projects.
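For a flavor of Rcpp sugar, a sketch reusing cxxfunction from above (the function name sumsq is illustrative): the whole body is a single vectorized, R-like expression.

src2 = '
  Rcpp::NumericVector xx(x);
  return Rcpp::wrap(Rcpp::sum(xx * xx));  // sugar: vectorized arithmetic and sum, as in R
'
sumsq = cxxfunction(signature(x = "numeric"), body = src2, plugin = "Rcpp")
sumsq(as.numeric(1:3))  # 14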

We also expect to leave some time to discuss problems brought by the class participants.

October 8, 2011 – San Francisco

AMA Executive Conference Center
@ the Marriott Hotel
55 4th Street, 2nd Level
San Francisco, CA 94103
Tel. 415-442-6770

Register Now!

Instructor Bio

Dirk Eddelbuettel has been contributing packages to CRAN for nearly a decade. Among these are RQuantLib, digest, littler, random, RPostgreSQL, as well as the Rcpp family of packages comprising Rcpp, RInside, RcppClassic, RcppExamples, RcppDE, RcppArmadillo and RcppEigen. He maintains the CRAN Task Views for Finance as well as High-Performance Computing, and is a founding co-organiser of the annual R/Finance conferences in Chicago. He has a Ph.D. in Financial Econometrics from EHESS (Paris), and works in Chicago as a Quantitative Strategist.

Tableau Interactive "Viz" Contest

One more contest, though open only to the US. But the prizes are, hmm, okay. The catch is that you have to use the software Tableau created, not R or J or ggobi or ggplot or Java.

Check out http://www.tableausoftware.com/public/biz-viz-contest/?=decisionstats

Tableau Interactive “Viz” Contest


Win a trip to Vegas and a chance for $2,000 & an iPad2

Are you a business, finance or real estate geek? This contest is for you! In cooperation with The Economist Ideas Economy conference, the Tableau Software Interactive “Viz” Contest will focus on business, finance and real estate data… Find some data then use Tableau Public to analyze and visualize it. That’s all it takes.

What you’ll win

A 3-day trip to Las Vegas and a chance to win $2,000 & an iPad2

The winner chosen by our judges will also take away a free roundtrip ticket to attend the 2011 Tableau Customer Conference. This includes 3 nights' accommodations at the Encore and a chance to compete in the Iron Viz championship with the winners of two other contests. The winner of Iron Viz will take away a new iPad2 and $2,000.

Cash for the crowd favorite

After entering you’ll receive a custom bit.ly link to your viz. Tweet, Facebook and e-mail that link to everyone you can! Whoever gets the most clicks through their link will become our Crowd Favorite and receive a $250 debit card.

Recognition from The Economist Ideas Economy

Your winning entry will be announced live on stage at The Economist Ideas Economy conference, and Tableau will issue a national press release naming the winner.

Everyone who enters gets a t-shirt!

Everyone who enters will get a very cool Tableau t-shirt. The winner will also receive increased Tableau Public limits and a free copy of Tableau Desktop (a $1999 value)!

How it works


  • Step 1

    Download the FREE Tableau Public tool


  • Step 2

    Create and publish your “viz” to your blog or website


  • Step 3

    Submit your entry form: fill out the entry form and submit it by June 3, 2011. A panel of judges will evaluate all submissions based on overall appeal, design elements, and data analysis/findings.

Contest Rules Summary

The following contest is open to legal residents of the United States only. You must publish your “viz” on your blog or website to qualify. The submission form must be submitted by June 3, 2011. Winners will be notified by June 7, 2011. Incomplete applications will not be accepted.

Please read all the rules in their entirety before entering.

Protected: Happy Labour Day to American Stats-ical Association

This content is password protected.
