Dashboard Design: Google Activity

I quite like Google’s monthly email on account activity. It is the Google way of offering free services and treating users as special that continues to command loyalty, despite occasional exasperation with corporate quirks.

See this dashboard:

Notice the use of a bigger font for the overall number of emails, as well as the smaller bar plots- I would say they are almost sparklines, or spark bar plots, if you will excuse my Tufte.
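(As an aside for R users: a spark-bar of this kind is only a couple of lines in base R. A minimal sketch follows; the email counts are invented purely for illustration.)

# a minimal spark-bar in base R; the counts are made up for illustration
emails <- c(120, 95, 140, 80, 160, 130, 110)
par(mar = c(0.2, 0.2, 0.2, 0.2))   # shrink margins down to sparkline scale
barplot(emails, axes = FALSE, col = "grey40", border = NA)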

The medium-range font shows the per-person sent/received statistics, and the color shades are used to emphasize or de-emphasize each metric.

The colors used are black/grey, green, and blue, consistent with the corporate logo.

However, some of the JS for the visualizations needs to be tweaked. Clearly the hover script (an integral part of dashboard design) needs better elucidation or formatting.


I would also stick my neck out and suggest that, rather than just monthly snapshots, at least some way of comparing snapshots across periods (or even over the total time period) be enabled, rather than keeping them in separate views. This may give the user a bit more analytical value.

Overall, a nice and simple dashboard which may be of some use to the business user who makes or views a lot of reports on online properties. Minimal and effective, and in keeping with Open Data and Data Liberation principles. I guess Google is secure in the knowledge that users do not view time spent on Google services as a total waste, unlike some of the other more social 😉 websites they spend time on.

Easier Tagging for E Commerce by Google Tag Manager

Ok, I guess I am a bit late to this, but I really like the concept of Google Tag Manager https://developers.google.com/tag-manager/ and the fact that they have a WordPress plugin ready http://wordpress.org/extend/plugins/wp-google-tag-manager/. What does it do? It integrates all the tags on your websites in one dashboard. That makes web analytics so much easier for marketing people who don't want to learn regex, JS, etc.


• IT-friendly – Google Tag Manager has lots of features to set your mind at ease—like user permissions, automated error checking, the Debug Console, and asynchronous technology. So everything runs efficiently, with no unpleasant surprises.

• Quick and easy – Users add or change tags whenever they want, to keep sites running smoothly and quickly. Tags are managed with an easy-to-use web interface, so there’s no need to write or rewrite site code following implementation.

• Verified tags & templates – Google Tag Manager makes it easy to verify that new tags are working properly, so users don’t need to call on IT to check the tags. Built-in tag templates and automatic error checking also prevent tags with improper formatting from even being deployed on your site.

• Swift loading – Google Tag Manager replaces all your measurement and marketing tags with a single, asynchronously loading tag—so your tags can fire faster without getting in each other’s way.

 

Google Analytics using #Rstats – Updated

Due to changes in Google APIs, my earlier post on using Google Analytics in R is deprecated. Unfortunately it is still in the top 10 Google results for using Google Analytics with R.

That post is here https://decisionstats.com/2012/03/20/using-google-analytics-with-r/

A more updated R package for Google Analytics and R is here: https://github.com/skardhamar/rga
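As a quick illustration, a minimal session with the rga package might look like the sketch below. The profile ID and date range are placeholders, and exact arguments may differ across package versions, so treat this as a sketch rather than gospel:

# install the rga package from GitHub (needs devtools)
library(devtools)
install_github("rga", "skardhamar")
library(rga)

# opens a browser for OAuth the first time; caches the token locally
rga.open(instance = "ga")

# pull daily visits for a profile (ga:XXXXXXX is a placeholder profile ID)
visits <- ga$getData(ids = "ga:XXXXXXX",
                     start.date = "2012-01-01",
                     end.date   = "2012-12-31",
                     metrics    = "ga:visits",
                     dimensions = "ga:date")
head(visits)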

A better, updated, easy-to-use tutorial on using Google Analytics with R via the OAuth 2.0 Playground is here:

http://www.tatvic.com/blog/ga-data-extraction-in-r/

  1. Set the Google analytics query parameters for preparing the request URI
  2. Get the access token from Oauth 2.0 Playground
  3. Retrieve and select the Profile
  4. Retrieving GA data
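A rough sketch of those four steps in R, assuming the RCurl and rjson packages and a token pasted in from the OAuth 2.0 Playground (the profile ID and token below are placeholders, not real values):

library(RCurl)
library(rjson)

# Steps 1-2: build the request URI with query parameters and the access token
access.token <- "ya29.PLACEHOLDER"   # paste your token from the OAuth 2.0 Playground
query.uri <- paste("https://www.googleapis.com/analytics/v3/data/ga",
                   "?ids=ga:XXXXXXX",          # placeholder profile ID
                   "&start-date=2012-09-01",
                   "&end-date=2012-09-30",
                   "&metrics=ga:visits",
                   "&dimensions=ga:date",
                   "&access_token=", access.token, sep = "")

# Steps 3-4: retrieve the data and parse the JSON response
ga.json <- getURL(query.uri)
ga.list <- fromJSON(ga.json)
ga.df <- as.data.frame(do.call(rbind, ga.list$rows),
                       stringsAsFactors = FALSE)
names(ga.df) <- c("date", "visits")
head(ga.df)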

Note it is also excellent for learning the RJSON approach to parsing API responses. You can see the details on the Tatvic blog above.

Hat tip- Vignesh Prajapati

 

 

Note on Internet Privacy (Updated) and a note on DNSCrypt

I noticed the brouhaha over Google’s privacy policy. I am afraid social networks capture much more private information than search engines do. Even if they integrate my browser history, my social network, my emails, and my search engine keywords – I am still okay. All they are going to do is sell me better ads (rather than just flood me with ads hoping to get a click). Of course Microsoft should take it one step further and capture data from my desktop as well for better ads; that would really complete the circle. In any case, with the Patriot Act, most information is available to the Government anyway.

But it does make sense to have an easier-to-understand privacy policy, and one of my disappointments is the complete lack of visual appeal in such notices. Make things as simple as possible, but no simpler, as Einstein said.

 

Privacy activists forget that ads run on models built on AGGREGATED data, and most models are scored automatically. Unless you do something really weird or fake, chances are the data pertaining to you gets automatically collected, algorithmically aggregated, then modeled and scored, and an ad corresponding to your score or segment is shown to you. Probably no human eyes ever see your raw data (but big G can clarify that).

 

(I also noticed Google gets a lot of free advice from bloggers. Hey, if you were really good at giving advice to Google, they WILL hire you!)

On to a tool-based (rather than legalese-based) approach to privacy:

I noticed tools like DNSCrypt increase internet security, so that all my integrated data goes straight to the people I am okay with having it (ad sellers, not governments!).

Unfortunately it is Mac-only, and I will wait for Windows or X-based tools for a better review. I noticed some lag in updating these tools, so I can only guess that the boys of Baltimore have been there; it is best used by home users alone.

 

Maybe they can find a Chrome extension for DNS dummies.

http://www.opendns.com/technology/dnscrypt/

Why DNSCrypt is so significant

In the same way that SSL turns HTTP web traffic into HTTPS encrypted Web traffic, DNSCrypt turns regular DNS traffic into encrypted DNS traffic that is secure from eavesdropping and man-in-the-middle attacks. It doesn’t require any changes to domain names or how they work, it simply provides a method for securely encrypting communication between our customers and our DNS servers in our data centers. We know that claims alone don’t work in the security world, however, so we’ve opened up the source to our DNSCrypt code base and it’s available on GitHub.

DNSCrypt has the potential to be the most impactful advancement in Internet security since SSL, significantly improving every single Internet user’s online security and privacy.

and

http://dnscurve.org/crypto.html

The DNSCurve project adds link-level public-key protection to DNS packets. This page discusses the cryptographic tools used in DNSCurve.

Elliptic-curve cryptography

DNSCurve uses elliptic-curve cryptography, not RSA.

RSA is somewhat older than elliptic-curve cryptography: RSA was introduced in 1977, while elliptic-curve cryptography was introduced in 1985. However, RSA has shown many more weaknesses than elliptic-curve cryptography. RSA’s effective security level was dramatically reduced by the linear sieve in the late 1970s, by the quadratic sieve and ECM in the 1980s, and by the number-field sieve in the 1990s. For comparison, a few attacks have been developed against some rare elliptic curves having special algebraic structures, and the amount of computer power available to attackers has predictably increased, but typical elliptic curves require just as much computer power to break today as they required twenty years ago.

IEEE P1363 standardized elliptic-curve cryptography in the late 1990s, including a stringent list of security criteria for elliptic curves. NIST used the IEEE P1363 criteria to select fifteen specific elliptic curves at five different security levels. In 2005, NSA issued a new “Suite B” standard, recommending the NIST elliptic curves (at two specific security levels) for all public-key cryptography and withdrawing previous recommendations of RSA.

Some specific types of elliptic-curve cryptography are patented, but DNSCurve does not use any of those types of elliptic-curve cryptography.
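As a rough gloss on the contrast described above (mine, not DNSCurve’s): the best known factoring attack on RSA, the number-field sieve, runs in subexponential time in the modulus n, while the best known generic attack on a well-chosen elliptic-curve group of prime order n, Pollard’s rho, remains fully exponential:

\[
T_{\mathrm{NFS}}(n) = \exp\!\left( \left( (64/9)^{1/3} + o(1) \right) (\ln n)^{1/3} (\ln \ln n)^{2/3} \right),
\qquad
T_{\rho}(n) \approx \sqrt{\pi n / 4}\ \text{group operations.}
\]

That gap is why increasing the key size helps an elliptic curve much more than it helps an RSA modulus of comparable length.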

 

Adding / to robots.txt again

So I tried to move without a search engine, relying only on social sharing, but for a small blog like mine almost 75% of traffic comes via search engines.
Maybe the ratio of traffic from search to social will change in the future.

I now have enough data to conclude that search is the ONLY statistically significant driver of traffic (for a small blog).
If you are a blogger, you should definitely give the tools at Google Webmaster Tools a go,

e.g.

 

https://www.google.com/webmasters/tools/googlebot-fetch

URL                          Googlebot type   Fetch Status                                        Fetch date
https://decisionstats.com/   Web              Denied by robots.txt                                1/19/12 8:25 PM
https://decisionstats.com/   Web              Success – URL and linked pages submitted to index   12/27/11 9:55 PM

 

Also, from Google Analytics, I see that denying search traffic does not increase direct/referral traffic in any meaningful way.

So my hypothesis that some direct traffic was mis-counted as search traffic due to Chrome toolbar search – well, the hypothesis was wrong 🙂

Also, Google seems to drop URLs quite quickly (within 18 hours), and I will test the rebound in SERPs in a few hours. I was using meta tags, blocking via robots.txt, and removal via Webmaster Tools (a combination of the three may have helped).

To my surprise, search traffic declined to 5-10 visits a day, but it did not become 0. I wonder why that happens (I even got a few Google queries per day) even though I was blocking “/” in robots.txt.

 

Net net – the numbers below show that, as of now, in a non-SOPA, non-social world, search engines remain the webmaster’s only true friend (till they come up with another Panda or whatever update 😉).

Going off Search Radar for 2012 Q1

I just used the really handy tools at https://www.google.com/webmasters/tools/crawl-access, clicked Remove URL (https://www.google.com/webmasters/tools/crawl-access?hl=en&siteUrl=https://decisionstats.com/&tid=removal-list), and submitted http://www.decisionstats.com,

and I also modified my robots.txt file to

User-agent: *
Disallow: /

Just to make sure, I added this meta tag to the right margin of my blog:

<meta name="robots" content="noindex">

Now, for the last six months of 2011, as per Analytics, search engines were really generous to me, giving almost 170K page views:

Source             Visits    Pages/Visit
1. google          58,788    2.14
2. (direct)        10,832    2.24
3. linkedin.com     2,038    2.50
4. google.com       1,823    2.15
5. bing             1,007    2.04
6. reddit.com         749    1.93
7. yahoo              740    2.25
8. google.co.in       576    2.13
9. search             572    2.07
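As a quick sanity check in R, here is the search share of the nine sources listed above, a back-of-the-envelope calculation on the table and nothing more:

# visits by source, taken from the table above
search.visits <- c(google = 58788, google.com = 1823, bing = 1007,
                   yahoo = 740, google.co.in = 576, search = 572)
other.visits  <- c(direct = 10832, linkedin.com = 2038, reddit.com = 749)

# share of visits coming from search engines
sum(search.visits) / (sum(search.visits) + sum(other.visits))
# [1] 0.8234 – about 82% of the listed visits came via search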

 

I do like to experiment though, and I wonder if search engines just:

1) Make people too lazy to bookmark sites or type the whole website name into the Chrome/Opera toolbars

2) Help disguise sources of traffic by encrypted search terms

3) Help disguise corporate traffic watchers and aggregators

So I am giving all spiders a leave of absence for Q1 2012. I am interested in seeing the impact of this on my traffic, and I suspect the curves will not be as linear as I think.

Is search engine optimization overrated? Let the data decide… 🙂

I am also interested in seeing how social sharing can impact traffic in the absence of search engine interaction effects- and whether it is possible to retain a bigger chunk of traffic by reducing SEO efforts and increasing social efforts!

 

Careers in #Rstats

1) Revolution Analytics

I saw a posting for a career with Revolution Analytics. Now I am probably on the wrong side of an H1 visa and the C/R skill-o-meter, but these look great for any aspiring R coder. It includes one freelance opportunity as well.

http://www.revolutionanalytics.com/aboutus/careers.php

We have many opportunities opening up—among them:

Job Title                                        Location
Pre-sales Consultants / Technical Sales          Palo Alto, CA
Parallel Computing Developer                     Palo Alto, CA or Seattle, WA
R Programmer (Freelance)                         Palo Alto, CA
Software Training Course Developer (Freelance)   Palo Alto, CA
Build / Release Engineer                         Seattle, WA
QA Engineer                                      Seattle, WA
Technical Writer                                 Seattle, WA

 

Please send your resume to careers@revolutionanalytics.com

2) Indeed.com

Searching for “R” jobs (with the quotes), and not just R jobs, gives better results in search engines and on job sites. It is still a tough keyword to search, but it is getting better.

You can use this RSS feed http://www.indeed.co.in/rss?q=%22R%22++analytics+jobs or the send-by-email option to get alerts.
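If you want to script the alerting yourself, a minimal sketch in R using the XML package might look like this (the feed URL is the one above; I am assuming the standard RSS item/title and item/link fields):

library(XML)

# fetch and parse the Indeed RSS feed
feed <- xmlParse("http://www.indeed.co.in/rss?q=%22R%22++analytics+jobs")

# pull job titles and links out of the <item> nodes
titles <- xpathSApply(feed, "//item/title", xmlValue)
links  <- xpathSApply(feed, "//item/link",  xmlValue)

head(data.frame(title = titles, link = links,
                stringsAsFactors = FALSE))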

3) http://icrunchdata.com/

 

iCrunchData has a good number of analytics jobs, and again, using the keyword R within quotes (“R”), you can see lots of jobs here:

http://www.icrunchdata.com/ViewJob.aspx?id=334914&keys=%22R%22

There used to be a Google Group on R jobs, but it is too low-volume compared to the actual number of R jobs out there.

Note the big demand is for analytics, and knowing more than one platform helps you more in the job search than knowing just a single language.

 

 

 
