Text Mining Barack Obama using R #rstats

  • We copy and paste President Barack Obama’s “Yes We Can” speech into a text document and read it in. For a word cloud we need a data frame with two columns, one with words and the other with frequencies. We read the transcript from http://www.nytimes.com/2008/01/08/us/politics/08text-obama.html?pagewanted=all&_r=0 and paste it into a file in the local directory /home/ajay/Desktop/new. Note that tm is a powerful package and will read ALL the text documents within that folder.

library(tm)

library(wordcloud)

txt2="/home/ajay/Desktop/new"

b=Corpus(DirSource(txt2), readerControl = list(language = "eng"))

b = tm_map(b, stripWhitespace)    # typical clean-up: collapse extra whitespace
b = tm_map(b, tolower)            # fold everything to lower case
b = tm_map(b, removePunctuation)  # drop punctuation
tdm = TermDocumentMatrix(b)
m1 = as.matrix(tdm)
v1 = sort(rowSums(m1), decreasing = TRUE)   # word frequencies, highest first
d1 = data.frame(word = names(v1), freq = v1)
wordcloud(d1$word, d1$freq)

Now it seems we need to remove some very commonly occurring words like “the” and “and”. We are not using the standard English stopwords (the tm package provides those; see the Chapter 13 text mining case studies), because the words “we” and “can” are included in that list and we want to keep them.

b = tm_map(b, removeWords, c("the", "and"))   # drop only the filler words noted above, keeping "we" and "can"
tdm = TermDocumentMatrix(b)
m1 = as.matrix(tdm)
v1 = sort(rowSums(m1), decreasing = TRUE)
d1 = data.frame(word = names(v1), freq = v1)
wordcloud(d1$word, d1$freq)

But let’s see how the wordcloud changes if we remove all English Stopwords.

b = tm_map(b, removeWords, stopwords("english"))   # this time remove the full English stopword list
tdm = TermDocumentMatrix(b)
m1 = as.matrix(tdm)
v1 = sort(rowSums(m1), decreasing = TRUE)
d1 = data.frame(word = names(v1), freq = v1)
wordcloud(d1$word, d1$freq)

You can draw your own conclusions from the content of this famous speech based on your political preferences.

Politicians can give interesting speeches, but they may be full of simple-sounding words…

Citation-

1. Ingo Feinerer (2012). tm: Text Mining Package. R package version 0.5-7.1.

Ingo Feinerer, Kurt Hornik, and David Meyer (2008). Text Mining Infrastructure in R. Journal of Statistical Software 25(5). URL: http://www.jstatsoft.org/v25/i05/

2. Ian Fellows (2012). wordcloud: Word Clouds. R package version 2.0. URL: http://CRAN.R-project.org/package=wordcloud

3. You can see more than 100 of Obama’s speeches at http://obamaspeeches.com/

Quote- numbers don’t lie, people do.


Topic Models in R- search documents for similarity by frequency


From the marvelous Journal of Statistical Software, ignored by mainstream corporatia but beloved by academia, here is one more interesting and very timely paper.

It can be used to grade students’ homework, catch plagiarists, and spot search-engine spam linkers. Enjoy!

Multi State Models


A special issue of the Journal of Statistical Software has come out devoted to Multi-State Models and Competing Risks. It is a must-read for anyone with an interest in pharma analytics or survival analysis, even if you don’t know much R.

Here is an extract from “mstate: An R Package for the Analysis of Competing Risks and Multi-State Models”:

Multi-state models are a very useful tool to answer a wide range of questions in survival analysis that cannot, or only in a more complicated way, be answered by classical models. They are suitable for both biomedical and other applications in which time-to-event variables are analyzed. However, they are still not frequently applied. So far, an important reason for this has been the lack of available software. To overcome this problem, we have developed the mstate package in R for the analysis of multi-state models. The package covers all steps of the analysis of multi-state models, from model building and data preparation to estimation and graphical representation of the results. It can be applied to non- and semi-parametric (Cox) models. The package is also suitable for competing risks models, as they are a special category of multi-state models.
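As a flavour of how the package is used (a minimal sketch, not taken from the paper), mstate ships a helper that builds the transition matrix for the standard illness-death model, which is the usual first step before preparing data with msprep() and fitting Cox models:

library(mstate)
# transition matrix for an illness-death model: 1 = healthy, 2 = illness, 3 = death
tmat <- trans.illdeath(names = c("healthy", "illness", "death"))
tmat   # rows are "from" states, columns are "to" states; entries number the allowed transitions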

 

—————————–

 

Issues for JSS Special Volume 38: Competing Risks and Multi-State Models

Special Issue about Competing Risks and Multi-State Models

Hein Putter
Vol. 38, Issue 1, Jan 2011
Submitted 2011-01-03, Accepted 2011-01-03

Analyzing Competing Risk Data Using the R timereg Package

Thomas H. Scheike, Mei-Jie Zhang
Vol. 38, Issue 2, Jan 2011
Submitted 2009-05-25, Accepted 2010-06-22

p3state.msm: Analyzing Survival Data from an Illness-Death Model

Luís Filipe Meira Machado, Javier Roca-Pardiñas
Vol. 38, Issue 3, Jan 2011
Submitted 2009-06-30, Accepted 2010-03-02

Empirical Transition Matrix of Multi-State Models: The etm Package

Arthur Allignol, Martin Schumacher, Jan Beyersmann
Vol. 38, Issue 4, Jan 2011
Submitted 2009-01-08, Accepted 2010-03-11

Lexis: An R Class for Epidemiological Studies with Long-Term Follow-Up

Martyn Plummer, Bendix Carstensen
Vol. 38, Issue 5, Jan 2011
Submitted 2010-02-09, Accepted 2010-09-16

Using Lexis Objects for Multi-State Models in R

Bendix Carstensen, Martyn Plummer
Vol. 38, Issue 6, Jan 2011
Submitted 2010-02-09, Accepted 2010-09-16

mstate: An R Package for the Analysis of Competing Risks and Multi-State Models

Liesbeth C. de Wreede, Marta Fiocco, Hein Putter
Vol. 38, Issue 7, Jan 2011
Submitted 2010-01-17, Accepted 2010-08-20

Multi-State Models for Panel Data: The msm Package for R

Christopher Jackson
Vol. 38, Issue 8, Jan 2011
Submitted 2009-07-21, Accepted 2010-08-18


 

Cloud Computing with R


Here is a short list of resources and material I put together as starting points for R and cloud computing. It’s a bit messy, but overall it should serve quite comprehensively.

Cloud computing is a commonly used expression implying a generational change in computing, from desktops and servers to remote, massive, shared computing resources enabled by high bandwidth across the internet.

As per the National Institute of Standards and Technology Definition,
Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

(Citation: The NIST Definition of Cloud Computing

Authors: Peter Mell and Tim Grance
Version 15, 10-7-09
National Institute of Standards and Technology, Information Technology Laboratory
http://csrc.nist.gov/groups/SNS/cloud-computing/cloud-def-v15.doc)

R is an integrated suite of software facilities for data manipulation, calculation and graphical display.

From http://cran.r-project.org/doc/FAQ/R-FAQ.html#R-Web-Interfaces

R Web Interfaces

Rweb is developed and maintained by Jeff Banfield. The Rweb Home Page provides access to all three versions of Rweb—a simple text entry form that returns output and graphs, a more sophisticated JavaScript version that provides a multiple window environment, and a set of point and click modules that are useful for introductory statistics courses and require no knowledge of the R language. All of the Rweb versions can analyze Web accessible datasets if a URL is provided.
The paper “Rweb: Web-based Statistical Analysis”, providing a detailed explanation of the different versions of Rweb and an overview of how Rweb works, was published in the Journal of Statistical Software (http://www.jstatsoft.org/v04/i01/).

Ulf Bartel has developed R-Online, a simple on-line programming environment for R which intends to make the first steps in statistical programming with R (especially with time series) as easy as possible. There is no need for a local installation since the only requirement for the user is a JavaScript capable browser. See http://osvisions.com/r-online/ for more information.

Rcgi is a CGI WWW interface to R by MJ Ray. It had the ability to use “embedded code”: you could mix user input and code, allowing the HTML author to do anything from load in data sets to enter most of the commands for users without writing CGI scripts. Graphical output was possible in PostScript or GIF formats and the executed code was presented to the user for revision. However, it is not clear if the project is still active.

Currently, a modified version of Rcgi by Mai Zhou (actually, two versions: one with (bitmap) graphics and one without) as well as the original code are available from http://www.ms.uky.edu/~statweb/.

CGI-based web access to R is also provided at http://hermes.sdu.dk/cgi-bin/go/. There are many additional examples of web interfaces to R which basically allow users to submit R code to a remote server; see for example the collection of links available from http://biostat.mc.vanderbilt.edu/twiki/bin/view/Main/StatCompCourse.

David Firth has written CGIwithR, an R add-on package available from CRAN. It provides some simple extensions to R to facilitate running R scripts through the CGI interface to a web server, and allows submission of data using both GET and POST methods. It is easily installed using Apache under Linux and in principle should run on any platform that supports R and a web server provided that the installer has the necessary security permissions. David’s paper “CGIwithR: Facilities for Processing Web Forms Using R” was published in the Journal of Statistical Software (http://www.jstatsoft.org/v08/i10/). The package is now maintained by Duncan Temple Lang and has a web page at http://www.omegahat.org/CGIwithR/.

Rpad, developed and actively maintained by Tom Short, provides a sophisticated environment which combines some of the features of the previous approaches with quite a bit of JavaScript, allowing for a GUI-like behavior (with sortable tables, clickable graphics, editable output), etc.
Jeff Horner is working on the R/Apache Integration Project which embeds the R interpreter inside Apache 2 (and beyond). A tutorial and presentation are available from the project web page at http://biostat.mc.vanderbilt.edu/twiki/bin/view/Main/RApacheProject.

Rserve is a project actively developed by Simon Urbanek. It implements a TCP/IP server which allows other programs to use facilities of R. Clients are available from the web site for Java and C++ (and could be written for other languages that support TCP/IP sockets).
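As a rough illustration (not from the FAQ), the client/server round trip looks something like the sketch below, assuming the Rserve package on the server and the RSclient package on the client, with Rserve listening on its default port 6311:

# server side: start Rserve from a running R session
library(Rserve)
Rserve(args = "--no-save")

# client side (possibly another machine): connect, evaluate remotely, disconnect
library(RSclient)
con <- RS.connect(host = "localhost", port = 6311)
RS.eval(con, R.version.string)   # evaluated on the server, result returned to the client
RS.close(con)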

OpenStatServer is being developed by a team led by Greg Warnes; it aims “to provide clean access to computational modules defined in a variety of computational environments (R, SAS, Matlab, etc) via a single well-defined client interface” and to turn computational services into web services.

Two projects use PHP to provide a web interface to R. R_PHP_Online by Steve Chen (though it is unclear if this project is still active) is somewhat similar to the above Rcgi and Rweb. R-php is actively developed by Alfredo Pontillo and Angelo Mineo and provides both a web interface to R and a set of pre-specified analyses that need no R code input.

webbioc is “an integrated web interface for doing microarray analysis using several of the Bioconductor packages” and is designed to be installed at local sites as a shared computing resource.

Rwui is a web application to create user-friendly web interfaces for R scripts. All code for the web interface is created automatically. There is no need for the user to do any extra scripting or learn any new scripting techniques. Rwui can also be found at http://rwui.cryst.bbk.ac.uk.

Finally, the R.rsp package by Henrik Bengtsson introduces “R Server Pages”. Analogous to Java Server Pages, an R server page is typically HTML with embedded R code that gets evaluated when the page is requested. The package includes an internal cross-platform HTTP server implemented in Tcl, so provides a good framework for including web-based user interfaces in packages. The approach is similar to the use of the brew package with Rapache, with the advantage of cross-platform support and easy installation.

Also see these additional R cloud computing use cases:
http://wwwdev.ebi.ac.uk/Tools/rcloud/

ArrayExpress R/Bioconductor Workbench

Remote access to R/Bioconductor on EBI’s 64-bit Linux Cluster

Start the workbench by downloading the package for your operating system (Macintosh or Windows), or via Java Web Start, and you will get access to an instance of R running on one of EBI’s powerful machines. You can install additional packages, upload your own data, work with graphics and collaborate with colleagues, all as if you are running R locally, but unlimited by your machine’s memory, processor or data storage capacity.

  • Most up-to-date R version built for multicore CPUs
  • Access to all Bioconductor packages
  • Access to our computing infrastructure
  • Fast access to data stored in EBI’s repositories (e.g., public microarray data in ArrayExpress)

Using R with Google Docs
http://www.omegahat.org/RGoogleDocs/run.pdf
It uses the XML and RCurl packages and illustrates that it is relatively quick and easy to use their primitives to interact with Web services.
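A rough sketch of the basic workflow, going by the RGoogleDocs documentation linked above (treat the exact function arguments as assumptions and check the vignette before relying on them):

library(RGoogleDocs)
# authenticate and open a connection to the Google Docs service
auth <- getGoogleAuth("you@gmail.com", "your-password")
con <- getGoogleDocsConnection(auth)
# list the documents available in the account
docs <- getDocs(con)
names(docs)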

Using R with Amazon
Citation
http://rgrossman.com/2009/05/17/running-r-on-amazons-ec2/

Amazon’s EC2 is a type of cloud that provides on-demand computing infrastructure in the form of Amazon Machine Images, or AMIs. In general, these types of cloud provide several benefits:

  • Simple and convenient to use. An AMI contains your applications, libraries, data and all associated configuration settings. You simply access it. You don’t need to configure it. This applies not only to applications like R, but also can include any third-party data that you require.
  • On-demand availability. AMIs are available over the Internet whenever you need them. You can configure the AMIs yourself without involving the service provider. You don’t need to order any hardware and set it up.
  • Elastic access. With elastic access, you can rapidly provision and access the additional resources you need. Again, no human intervention from the service provider is required. This type of elastic capacity can be used to handle surge requirements when you might need many machines for a short time in order to complete a computation.
  • Pay per use. The cost of 1 AMI for 100 hours and 100 AMIs for 1 hour is the same. With pay-per-use pricing, which is sometimes called utility pricing, you simply pay for the resources that you use; a quick sanity check of this follows the list.
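Here is that sanity check in R, using a purely hypothetical hourly rate (actual EC2 prices vary by instance type and region):

rate_per_hour <- 0.10                          # hypothetical price per AMI-hour, in USD
cost_one_ami   <- 1 * 100 * rate_per_hour      # 1 AMI running for 100 hours
cost_many_amis <- 100 * 1 * rate_per_hour      # 100 AMIs running for 1 hour each
c(one_ami = cost_one_ami, many_amis = cost_many_amis)   # both come to 10 USD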

Connecting to R on Amazon EC2- Detailed tutorials
Ubuntu Linux version
https://decisionstats.com/2010/09/25/running-r-on-amazon-ec2/
and Windows R version
https://decisionstats.com/2010/10/02/running-r-on-amazon-ec2-windows/

Connecting R to Data on Google Storage and Computing on Google Prediction API
https://github.com/onertipaday/predictionapirwrapper
R wrapper for working with Google Prediction API

This package consists of a set of functions allowing the user to test the Google Prediction API from R.
It requires the user to have access to both Google Storage for Developers and the Google Prediction API: see
http://code.google.com/apis/storage/ and http://code.google.com/apis/predict/ for details.

Example usage:

# This example requires that you have previously created a bucket named data_language in your Google Storage account and uploaded a CSV file named language_id.txt (your data) into that bucket – see for details
library(predictionapirwrapper)

and Elastic R for Cloud Computing
http://user2010.org/tutorials/Chine.html

Abstract

Elastic-R is a new portal built using the Biocep-R platform. It enables statisticians, computational scientists, financial analysts, educators and students to use cloud resources seamlessly; to work with R engines and use their full capabilities from within simple browsers; to collaborate, share and reuse functions, algorithms, user interfaces, R sessions, servers; and to perform elastic distributed computing with any number of virtual machines to solve computationally intensive problems.
Also see Karim Chine’s http://biocep-distrib.r-forge.r-project.org/

R for Salesforce.com

At the time of writing, there seem to be zero R-based apps on Salesforce.com. This could be a big opportunity for developers, as Apex and R have similar structures: developers could write free code in R and charge for a translated version in Apex on Salesforce.com.

Force.com and Salesforce have many (1009) apps at
http://sites.force.com/appexchange/home for cloud computing for
businesses, but very few forecasting and statistical simulation apps.

An example of a Monte Carlo based app is here:
http://sites.force.com/appexchange/listingDetail?listingId=a0N300000016cT9EAI#

These are like iPhone apps, except meant for business purposes. (I am unaware of any university offering Salesforce.com integration, though Google Apps and Amazon related research does seem to be going on.)

Force.com uses a language called Apex, and you can see
http://wiki.developerforce.com/index.php/App_Logic and
http://wiki.developerforce.com/index.php/An_Introduction_to_Formulas
Apex is similar to R in that it is object-oriented.

SAS Institute has an existing product for taking in Salesforce.com data.

A new SAS data surveyor is available to access data from the Customer Relationship Management (CRM) software vendor Salesforce.com; see
http://support.sas.com/documentation/cdl/en/whatsnew/62580/HTML/default/viewer.htm#datasurveyorwhatsnew902.htm

Personal Note- Mentioning SAS in an email to an R list is a big no-no in terms of getting a response and love. The same goes for being careless about which R mailing list you email (R-devel, R-packages, or R-help).

For a Python-based cloud, see http://pi-cloud.com

Special Issue of JSS on R GUIs

An announcement by the Journal of Statistical Software: a call for papers on R GUIs. The initial deadline is December 2010, with final versions published during 2011.

Announce

Special issue of the Journal of Statistical Software on

Graphical User Interfaces for R

Editors: Pedro Valero-Mora and Ruben Ledesma

Since the original paper by Gentleman and Ihaka was published, R has managed to gain an ever-increasing share of academic and professional statisticians, but the spread of its use among novice and occasional users of statistics has not progressed at the same pace. Among the reasons for this relative lack of impact, the lack of a GUI or point-and-click interface is one of the causes most widely mentioned. However, in the last few years this situation has been quietly changing, and a number of projects have equipped R with a range of different GUIs, from the very simple to the more advanced, providing the casual user with what could still be a new source of trouble: choosing which GUI is right for them. We may have moved from the “too few” situation to the “too many” situation.
This special issue of JSS intends, as one of its main goals, to offer a general overview of the different GUIs currently available for R. Thus, we think that somebody trying to find their way among the different alternatives may find it useful as a starting point. However, we do not want to stop at a mere listing; we want to offer a more general discussion of what could make a good GUI for R (and how to build one). Therefore, we want to see papers submitted that discuss the whole concept of a GUI for R, what elements it should include (or not), how this could be achieved, and, why not, whether it is actually needed at all. Finally, despite the high success of R, other systems may treasure important features that we would like to see in R. Indeed, descriptions of nice features that are not in R but exist in other systems could be another way of driving the future progress of GUIs for R.

In summary, we envision papers for this special issue on GUIs for R in the following categories:

– General discussions on GUIs for statistics, and for R.

– Implementing GUI toolboxes for R so others can program GUIs with them.

– R GUIs examples (with two subcategories, in the desktop or in the cloud).

– Is there life beyond R? What features do other systems have that R lacks, and why does R need them?

Papers can be sent directly to Pedro Valero-Mora (valerop@uv.es) or Ruben Ledesma (rdledesma@gmail.com) and they will follow the usual JSS reviewing procedure. Initial deadline is December 2010 with final versions published along 2011.

====================================================
Jan de Leeuw; Distinguished Professor and Chair, UCLA Department of Statistics;
Director: UCLA Center for Environmental Statistics (CES);
Editor: Journal of Multivariate Analysis, Journal of Statistical Software;

Interview Professor John Fox Creator R Commander

Here is an interview with Prof John Fox, creator of the very popular R language based GUI, RCmdr.

Ajay- Describe your career in science from your high school days to the science books you have written. What do you think can be done to increase interest in science among young people?

John Fox- I’m a sociologist and social statistician, so I don’t have a career in science, as that term is generally understood. I was interested in science as a child, however: I attended a science high school in New York City (Brooklyn Tech), and when I began university in 1964 at New York’s City College, I started in engineering. I moved subsequently through majors in philosophy and psychology, before finishing in sociology — had I not graduated in 1968 I probably would have moved on to something else. I took a statistics course during my last year as an undergraduate and found it fascinating. I enrolled in the sociology graduate program at the University of Michigan, where I specialized in social psychology and demography, and finished with a PhD in 1972 when I was 24 years old. I became interested in computers during my first year in graduate school, where I initially learned to program in Fortran. I also took quite a few courses in statistics and math.

I haven’t written any science books, but I have written and edited a number of books on social statistics, including, most recently, Applied Regression Analysis and Generalized Linear Models, Second Edition (Sage, 2008).

I’m afraid that I don’t know how to interest young people in science. Science seemed intrinsically interesting to me when I was young, and still does.

Ajay- What prompted you to create R Commander? How would you describe R Commander as a tool, say for users of other languages who want to learn R but are afraid of the syntax?

John- I originally programmed the R Commander so that I could use R to teach introductory statistics courses to sociology undergraduates. I previously taught this course with Minitab or SPSS, which were programs that I never used for my own work. I waited for someone to come up with a simple, portable, easily installed point-and-click interface to R, but nothing appeared on the horizon, and so I decided to give it a try myself.

I suppose that the R Commander can ease users into writing commands, inasmuch as the commands are displayed, but I suspect that most users don’t look at them. I think that serious prospective users of R should be encouraged to use the command-line interface along with a script editor of some sort. I wouldn’t exaggerate the difficulty of learning R: I came to R — actually S then — after having programmed in perhaps a dozen other languages, most recently at that point Lisp, and found the S language particularly easy to pick up.

Ajay- I particularly like the R Cmdr plug-ins. Is it possible for anyone to extend R Commander with a customized package plug-in?

John- That’s the basic idea, though the plug-in author has to be able to program in R and must learn a little Tcl/Tk.

Ajay- Have you thought of using the R Commander GUI on Amazon EC2, thus making R high-performance computing available on demand (similar to Zementis’ model deployment using Amazon EC2)? What are your views on the future of statistical computing?

John- I’m not sure whether or how an interface like the Rcmdr, which is Tcl/Tk-based, can be adapted to cloud computing. I also don’t feel qualified to predict the future of statistical computing.

I think that R is where the action is for the near future.

Ajay- What are the best ways of using R Commander as a teaching tool? (I noticed the help is a bit outdated.)

John- Is the help a bit outdated? My intention is that the R Commander should be largely self-explanatory. Most people know how to use point-and-click interfaces. In the basic courses for which it is principally designed, my goals are to teach the essential ideas of statistical reasoning and some skills in data analysis. In this kind of course, statistical software should facilitate the basic goals of the course.

As I said, for serious data analysis, I believe that it’s a good idea to encourage use of the command-line interface.

Ajay- What are your views on R being recognized by SAS Institute for its IML product? Do you think there can be a middle way for open source and proprietary software to coexist?

John- I imagine that R is a challenge for producers of proprietary software like SAS, partly because R development moves more quickly, but also because R is giving away something that SAS and other vendors of proprietary statistical software are selling. For example, I once used SAS quite a bit but don’t anymore. I also have the sense that for some time SAS has directed its energies more toward business uses of its software than toward purely statistical applications.

Ajay- Do people on the R Core team recognize the importance of GUIs? What does the rest of the R community feel? What has the feedback from users been to you? Any plans for corporate sponsors for R Commander? (Rattle, an R language data mining GUI, has a version called Rstat at http://www.informationbuilders.com/products/webfocus/predictivemodeling.html, while the free version and code are at rattle.togaware.com.)

John- I feel that the R Commander GUI has been generally positively received, both by members of R Core who have said something about it to me and by others in the R community. Of course, a nice feature of the R package system is that people can simply ignore packages in which they have no interest. I noticed recently that a Journal of Statistical Software paper that I wrote several years ago on the Rcmdr package has been downloaded nearly 35,000 times.

Because I wouldn’t expect many students using the Rcmdr package in a course to read that paper, I expect that the package is being used fairly widely.

Ajay- What does John Fox do for fun or as a hobby?

John- I’m tempted to say that much of my work is fun — particularly doing research, writing programs, and writing papers and books. I used to be quite a serious photographer, but I haven’t done that in years, and the technology of photography has changed a great deal. I run and swim for exercise, but that’s not really fun. I like to read and to travel, but who doesn’t?

Biography-

Prof John Fox is a giant in his chosen fields and has edited or authored 13 books and written chapters for 12 more. He has also written almost 49 journal articles. He is also editor-in-chief of the R News newsletter. You can read more about Dr Fox at http://socserv.mcmaster.ca/jfox/

On R Cmdr-

R Cmdr has substantially lowered the barrier for people wanting to learn R: they begin with the GUI and later transition to customization using the command line. It is so simple in its design that even undergraduates have started basic data analysis with R Cmdr after just one class. You can read more on it here: http://socserv.mcmaster.ca/jfox/Misc/Rcmdr/Getting-Started-with-the-Rcmdr.pdf
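Getting started is a one-liner once R is installed; loading the package opens the GUI:

install.packages("Rcmdr")   # one-time install; pulls in the Tcl/Tk based dependencies
library(Rcmdr)              # loading the package launches the R Commander window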

Journal of Statistical Software

Here is a good open content Journal for people wanting to keep track of latest in statistical software.

It is called Journal of Statistical Software.

Citation: http://www.jstatsoft.org/

Established in 1996, the Journal of Statistical Software publishes articles, book reviews, code snippets, and software reviews on the subject of statistical software and algorithms.  The contents are freely available on-line.  For both articles and code snippets the source code is published along with the paper.

Implementations can use languages such as C, C++, S, Fortran, Java, PHP, Python and Ruby or environments such as Mathematica, MATLAB, R, S-PLUS, SAS, Stata, and XLISP-STAT.

E.g., book reviews of A Handbook of Statistical Analyses Using SAS (Third Edition)

and Statistics and Data with R: An Applied Approach Through Examples


It is really cutting-edge stuff for someone who wants to keep up with the latest and fast-moving tech trends in statistical software, and it has convenient RSS feeds as well as email announcement alerts.

Note- Various journals can be ranked using a quantitative index called the impact factor.

Citation http://in-cites.com/research/2007/august_27_2007-2.html

E.g., for statistics:

In these columns, total citations to a journal’s published papers are divided by the total number of papers that the journal published, producing a citations-per-paper impact score over a five-year period (middle column) and a 26-year period (right-hand column).
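As a toy illustration of that calculation (the numbers below are made up and not taken from the table):

total_citations <- 5000    # hypothetical citations to papers published in the window
total_papers    <- 850     # hypothetical number of papers published in the window
round(total_citations / total_papers, 2)   # citations-per-paper impact score, here 5.88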

Journals Ranked by Impact: Statistics & Probability

Rank  2006 Impact Factor             Impact 2002-06                  Impact 1981-2006
1     Bioinformatics (4.89)          Bioinformatics (9.87)           Econometrica (52.93)
2     Biostatistics (3.01)           J. Royal Stat. Soc. B (6.75)    J. Royal Stat. Soc. B (27.32)
3     Chemom. Intell. Lab. (2.45)    Biostatistics (6.56)            J. Am. Stat. Assoc. (25.11)
4     Econometrica (2.40)            J. Computat. Biology (6.49)     Biometrika (22.75)
5     J. Royal Stat. Soc. B (2.32)   Econometrica (5.82)             Annals of Statistics (21.31)
6     IEEE ACM T Comp. Bi. (2.28)    J. Chemometrics (5.08)          Biometrics (20.32)
7     J. Am. Stat. Assoc. (2.17)     J. Am. Stat. Assoc. (4.95)      Technometrics (17.74)
8     Multivar. Behav. Res. (2.10)   Statistical Science (4.19)      Multivar. Behav. Res. (16.62)
9     J. Computat. Biology (2.00)    Annals of Statistics (3.94)     Bioinformatics (16.37)
10    Annals of Statistics (1.90)    Stat. in Medicine (3.62)        J. Royal Stat. Soc. A (14.46)