I used the mirror server that Dataspora provides, as I have had latency issues with Jeroen's website.
I got this error while trying to connect the Dataspora app to my Google spreadsheet:
The page you have requested cannot be displayed. Another site was requesting access to your Google Account, but sent a malformed request. Please contact the site that you were trying to use when you received this message to inform them of the error. A detailed error message follows:
Wow, it works! That's cloud computing. Now I wonder why Google and Amazon continue to ignore rApache and Jeroen's cloud app. Surely Google Fusion Tables can always be improved or tweaked, not to mention the next-gen version of R, which will have its own server.
Pretty cool screenshot (but click to see more)
I get the following pretty graph. Hadley Wickham would be ashamed of me by now.
What went wrong? Well, one page has 36,000 views. Scale is the key to graphical coherence. So I redo it: delete the home page row in the Google spreadsheet, re-import, and replot. (I didn't know how to modify data in the cloud app; maybe we need a cloud plyr.) I redo it again, as I have another big outlier: the Top 10 Statistical GUIs article, which ironically covers only 5 GUIs (but hush, don't tell the high-quality search engine).
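The trim-and-replot step above can be sketched in plain R rather than in the cloud app. This is a minimal sketch with made-up numbers; the data frame and column names (`views`, `page`, `hits`) are assumptions, not the actual blog data:

```r
# Hypothetical page-view data; values and column names are illustrative only
views <- data.frame(
  page = c("home", "top-10-statistical-guis", "r-gui-ubuntu", "wps-sas-lawsuit"),
  hits = c(36000, 5200, 800, 650)
)

# Drop the home-page outlier so the scale stays graphically coherent
trimmed <- subset(views, page != "home")

# Replot with base graphics: one short call
barplot(trimmed$hits, names.arg = trimmed$page, las = 2,
        main = "Page views (home page removed)")
```
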
So, belatedly, I discover something called a layer in ggplot2.
The base graphics engine has really spoilt me with short functions for plots.
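To illustrate the contrast being made here, a quick side-by-side sketch (the `hits` vector is hypothetical, not the blog's actual data):

```r
hits <- c(120, 90, 300, 45, 60, 210, 80)  # hypothetical page-view counts

# Base graphics: a histogram in one short call
hist(hits, main = "Page views", xlab = "Hits")

# ggplot2: the same plot needs a data frame, an aesthetic mapping,
# and a geom layer added with "+"
library(ggplot2)
ggplot(data.frame(hits = hits), aes(x = hits)) +
  geom_histogram(bins = 5)
```

The ggplot2 version is more verbose for a one-off histogram, which is the point being made, though the layering pays off once plots get complicated.
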
I give up; I rather prefer hist(). I go to my favorite GUI, Rattle, but it has some dating issues with the GTK+ DLL.
So I go to John Fox's simple GUI, R Commander. It is the best GUI if you use Occam's Razor, and I am using Occam's Chainsaw now.
I get the analysis I want in 12 seconds.
Summary: ggplot2 is more complicated than the base graphics engine.
The Deducer GUI is not as simple either.
R Commander is the best GUI because it retains simplicity.
Ignore the long tail of the internet at your peril.
Almost two-thirds of my daily traffic of 400+ visitors comes from old archived content. That is why Search Engine Optimization and keyword alerts are critical for any poor soul trying to write on a blog (which has no journal-like prestige or rewards).
If you make life easier for the search engine, it, being a fair chap, rewards you well.
Existing web traffic estimates like Comscore and Google Trends ignore this long tail.
Comments are welcome (the data, 500 rows × 2 columns, is pasted below if you can come up with a better analysis).
Since SAS has ignored web analytics and Google Analytics is, hmm, hmm, this could be an area of opportunity for R developers to create a web analytics package.
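As a rough sketch of what an entry point for such a package might look like; the function, file format, and column names (`page`, `views`) are all hypothetical, not any existing package:

```r
# Hypothetical helper for a would-be R web-analytics package.
# Assumes a CSV export of blog stats with columns "page" and "views"
# (both names are assumptions for illustration).
summarize_traffic <- function(path) {
  stats <- read.csv(path, stringsAsFactors = FALSE)
  stats <- stats[order(-stats$views), ]       # rank pages by traffic
  data.frame(
    page  = stats$page,
    views = stats$views,
    share = round(100 * stats$views / sum(stats$views), 1)  # % of total
  )
}
```

A summary like this would make the long-tail effect visible directly: the `share` column shows how much traffic the archive pages contribute relative to the top posts.
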
A highly optimized blog post or web content can get you a lot of attention, just like Rebecca Black's video (provided it passes the new quality metrics in the search engine).
But if the underlying content is weak, or based on a shoddy understanding of the subject, it can draw lots of horrid comments as well as spread bad word of mouth about the content, or about you, despite your hard work.
An example of this is copy-and-paste journalism, especially in technology circles, where a higher-PageRanked website or blog can get away with scraping or stealing content from a lower-ranked website (or many websites) after adding a cursory "expert comment". This is also true when someone who is basically a corporate communications specialist (or PR, public relations, person) is given a technical text and encouraged to write about it without completely understanding it.
A mild technical defect in the search engine algorithm is that it does not seem to pay attention to when the content was published, so the copying website or blog can actually pass as fresher content even if it practically has 90% of the same words. The second flaw is over-punishment, or manual punishment, of excessive linking; this can encourage search-optimization-minded people to hoard links or discourage trackbacks.
A free internet is one which promotes free sharing of content and does not encourage stealing, unauthorized scraping, or content copying. Unfortunately, current search engine optimization can encourage scraping and content copying without paying much attention to the origin of the words.
In addition, the analytical rigor with which search algorithms search your inbox (as in searching all emails for a keyword) or media-rich sites (like YouTube) is on quite a different level of quality altogether. The chances of garbage results are much higher when searching for media content and/or emails.
Using WP-Stats, I set about answering this question:
What search keywords lead here?
Clearly Michael Jackson is down this year, and R GUI and data mining are up.
How does that affect my writing, given that I get almost 250 visitors daily from search engines alone? Assume I write nothing on this blog from now on.
It doesn't: I still write whatever code or poem comes to my mind. So it is hurtful when people underestimate the effort in writing and jump to conclusions (especially if I write about a company, I am not on the payroll of that company; just as, if I write about a poem, I am not a full-time poet).
Over to xkcd
All Time (for Decisionstats.Wordpress.com)
michael jackson history
wps sas lawsuit
sas wps lawsuit
google maps jet ski
google maps jetski
sas sues wps
donald farmer microsoft
best statistics software
r gui ubuntu
tamilnadu advanced technical training institute tatti
Just go to Users, then Personal Settings, and check the options shown. That's it: every time you write a post, it suggests links and tags. Links are helpful for your readers (like Wikipedia links to understand dense technical jargon, or associated websites). Tags help classify your content so that all visitors to the website, including spiders, search engines, and your readers, can search it better.
The bad thing is I need to go back to all 1,025 posts on this site and auto-generate tags for the archives! Oh well. Great collaboration between Zemanta and Automattic on this new feature.
Google Instant is a relatively new feature of the Google search engine: it suggests websites at each keystroke rather than waiting for you to type the whole keyword.
The impact on user experience is incredible: rather than searching or scrolling through the results, you are more likely to click on one of the ten websites you will have seen by the time you finish typing, or just click on the relevant ad (which probably changes in the right margin as fast as the websites below).
This spells death for all those who indulged in black-hat SEO, link building, and link exchanging, as these techniques pushed up your rank in the search page only incrementally and rarely into the top 2-3 for a keyword.
Remember, the size of the screen is such that each Google Instant snapshot basically shows you, or rather makes you focus on, the top-ranked result (and then presumably you type on to get a newer result, rather than scrolling down as was the case before).
It would be interesting to research the effect of this on keyword auction pricing, as well as to compare keyword pricing with Bing.com. Maybe there should be a website API tool for advertisers, an "AdWords Instant" of sorts, that would instantly show them the price of keywords, a comparison with Bing, and the search-engine results for the keyword in a visual way.
Anyways, it is an incredible innovation, and it is good Google is back to the math after its flings with being the "Mad Men" of advertising.
and yes- I heard there is a new movie coming- it is called “The Search Engine” 🙂