App to App Porting

I often wonder why bright, intelligent software programmers go out of their way to write turgid, lengthy documentation, do not make step-by-step screenshots or slides for tutorials, and practically force everyone to reinvent the wheel every time they create a new platform.

Top of my wish list for 2012-

1) Better GUI for APP CREATION-

Example: a GUI utility to create Chrome apps, something similar to the Android App Inventor at http://www.appinventorbeta.com/about/

2) Automated Porting or Translation-

An automated appspot app for reading in an iOS (iPhone) app and churning out the necessary Android app code. This would be similar to translating blogs from one blogging platform to another using Python, as at http://code.google.com/p/google-blog-converters-appengine/

But the woefully underpowered http://wordpress2blogger.appspot.com/ currently allows only downloads of less than 1 MB, while WordPress itself allows 15 MB export files.

3) Better interaction between cloud and desktop apps

Example: Google Docs and LibreOffice, or webcams with Google Hangouts and Google Voice/YouTube.

Are we there yet? Not appy enough!


Statistics on Social Media

Some official statistics on social media from the owners themselves

1) Facebook – http://www.facebook.com/press/info.php?statistics (the “People on Facebook” statistics page, as of 17 Nov 2011)

Preview- Google Cloud SQL

From http://code.google.com/apis/sql/

What is Google Cloud SQL?

Google Cloud SQL is a web service that allows you to create, configure, and use relational databases with your App Engine applications. It is a fully managed service that maintains, manages, and administers your databases, allowing you to focus on your applications and services.

By offering the capabilities of a MySQL database, the service enables you to easily move your data, applications, and services into and out of the cloud. This allows for high data portability and helps in faster time-to-market because you can quickly leverage your existing database (using JDBC and/or DB-API) in your App Engine application.

Here is where you can get an invite to the beta-only Google Cloud SQL.

Sign up for Limited Preview

Google Cloud SQL is available to a limited number of users. To sign up for the service:

  1. Visit the Google APIs Console. The console opens the All services pane.
  2. Find the SQL Service line in the Services table and click Request access…
  3. Fill out the enrollment form.
  4. Our team will review your enrollment information and respond by email to the address associated with your Google Account.
  5. Follow the link in the email to view the Terms of Service. Please read these carefully before accepting.
  6. Sign up for the google-cloud-sql-announce group to receive important announcements and product news. (NOTE- Members: 384)
And after all that violence and double talk, a walk in the clouds with SQL.
1. There are three kinds of instances in the beta view
2. Wait for the instance to be created. Note: the design of the interface so far is much better than Amazon’s.
Note that you need an appspot application from Google Apps and can choose between the Python and Java versions. Quite clearly there is a play for other languages too; I think Go is also supported.
3. You can import your data from your Google Storage bucket
4. I am not that hot at coding, or maybe the interface was too pretty. Anyway, the log tells me that the import of the text file from Google Storage to Google Cloud SQL has failed.
5. Incidentally, the Google Cloud Storage interface is also much better than the Amazon GUI for transferring data. Note: I was using the classic Boston Housing statistical dataset as the test case.
6. The SQL prompt is the weakest part of the interface design. There is no query builder, and the bare SELECT FROM WHERE prompt is slightly amusing/insulting. I mean, guys, either throw in a fully fledged query-builder GUI similar to MySQL Workbench, or at least something more than a pretty white command prompt.
7. You can also export your data back to your Google Storage bucket 
These are early days, and I am trying to see if there is a play for some cloud kind of ODBC action between R, the Prediction API, and Cloud SQL… so try it out yourself at http://code.google.com/apis/sql/ and see if there is any juice you can build here.
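If and when an instance is reachable as a standard MySQL endpoint outside App Engine (the docs above do not confirm this for the limited preview, so treat everything below purely as an assumption), the R end of that idea could look like the following sketch using the RMySQL package. The host, database, table, and credentials are placeholders.

# Hypothetical sketch: querying a Google Cloud SQL instance from R,
# assuming it is exposed as an ordinary MySQL endpoint (placeholder details).
library(DBI)
library(RMySQL)

con <- dbConnect(MySQL(),
                 host     = "my-instance-host.example.com",  # placeholder host
                 dbname   = "housing",                        # placeholder database
                 user     = "analyst",                        # placeholder user
                 password = "secret")                         # placeholder password

# For example, pull a few rows of the Boston Housing test table mentioned above
boston <- dbGetQuery(con, "SELECT * FROM boston_housing LIMIT 10")
print(boston)

dbDisconnect(con)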

Google Dart a new programming language for web applications

From Google, a new language for structured web applications:

http://www.dartlang.org/docs/technical-overview/index.html (a rather unstructured website, if I may add)

Dart is a new class-based programming language for creating structured web applications. Developed with the goals of simplicity, efficiency, and scalability, the Dart language combines powerful new language features with familiar language constructs into a clear, readable syntax.

  • Create a structured yet flexible programming language for the web.
  • Make Dart feel familiar and natural to programmers and thus easy to learn.
  • Ensure that all Dart language constructs allow high performance and fast application startup.
  • Make Dart appropriate for the full range of devices on the web—including phones, tablets, laptops, and servers.
  • Provide tools that make Dart run fast across all major modern browsers.

These design goals address the following problems currently facing web developers:

  • Small scripts often evolve into large web applications with no apparent structure—they’re hard to debug and difficult to maintain. In addition, these monolithic apps can’t be split up so that different teams can work on them independently. It’s difficult to be productive when a web application gets large.
  • Scripting languages are popular because their lightweight nature makes it easy to write code quickly. Generally, the contracts with other parts of an application are conveyed in comments rather than in the language structure itself. As a result, it’s difficult for someone other than the author to read and maintain a particular piece of code.
  • With existing languages, the developer is forced to make a choice between static and dynamic languages. Traditional static languages require heavyweight toolchains and a coding style that can feel inflexible and overly constrained.
  • Developers have not been able to create homogeneous systems that encompass both client and server, except for a few cases such as Node.js and Google Web Toolkit (GWT).
  • Different languages and formats entail context switches that are cumbersome and add complexity to the coding process.

10 Ways We will miss Steve Jobs

I am not an Apple fanboy. In fact, I don't use a Mac (because Linux works well for me, at a much cheaper rate).

I am going to miss Steve Jobs like I miss …… still.

1) The Original Pirate – I have liked Steve Jobs ever since I saw Pirates of Silicon Valley; I wanted to be like the Jobs who created jobs. http://en.wikipedia.org/wiki/Pirates_of_Silicon_Valley

Artists steal. Yeah baby!

2) Music (iTunes) – Improbably, the man who came up with the idea of music at 99 cents helped more artists earn money in the era of Napster. Music piracy is not dead, but at 99 cents you CAN afford the songs.

3) Aesthetics and design as competitive barriers – It was all about the interface. People care about interfaces. Shoddy software won't sell.

4) Portable Music – Yes, I once wrote a poem on my first iPod: http://www.decisionstats.com/ode-to-an-ipod/ No, it does not rank in the top ten poems on the iPod in the SERPs.

The Walkman's evolution was the iPod, and it was everywhere.

5) Big phones can be cool too – I loved my iPhone and so did everyone else. But that's because making cool phones before it was all about making the tiniest, thinnest phone. Video chat and web surfing on the iPhone were way cooler than anything before or since.

6) Apps for money for geeks – Yes, the apps marketplace was more enriching to the geek universe than all open source put together.

7) Turtleneck Steve – You knew when Steve Jobs was about to make a presentation, because one week before and one week after, the whole tech media behaved either like fanboys or like people too cool to be Apple fanboys who would report it anyway. The man who wrote no code sold more technology than everyone else using just a turtleneck and presentations.

8) Pixar toons – Yes, Pixar toons made sure cartoons were pieces of art and not just funny stuff anymore. This one makes me choke up.

9) Kicking Microsoft butt – Who else but Steve could borrow money from Microsoft and then beat it in every product it wanted to?

10) Not being evil – Steve Jobs made more money for more geeks than anyone, and he made it look good! The original DON'T BE EVIL guy, who never needed to say it aloud.

Take a bow, Steve Jobs (or touch the first Apple product that comes to your hand after reading this!).

This article was first written on Aug 25, 2011, on the news of Steve Jobs' resignation. It has been updated to note his departure from this planet as of yesterday.


Page Mathematics

I was looking at the site http://www.google.com/adplanner/static/top1000/index.html

and I saw the list below, in a Google Doc at https://docs.google.com/spreadsheet/pub?hl=en_US&hl=en_US&key=0AtYMMvghK2ytdE9ybmVQeUxMeXdjWlVKYzRlMkxjX0E&output=html.

I then decided to divide page views by unique visitors to check the maths.

Facebook is AAAAAmazing! And the Russian social network is amazing too!

or

The maths is wrong! (maybe sampling, maybe virtual pageviews caused by friendstream refresh)

But an average of 1,136 page views per unique visitor per month means roughly 36 page views per visitor per day! (A quick arithmetic check in R follows the tables below.)

Rank  Site              Category         Unique Visitors (users)  Page Views         Views/Visitor
1     facebook.com      Social Networks  880,000,000              1,000,000,000,000  1,136
29    linkedin.com      Social Networks  80,000,000               2,500,000,000      31
38    orkut.com         Social Networks  66,000,000               4,000,000,000      61
40    orkut.com.br      Social Networks  62,000,000               43,000,000,000     694
65    weibo.com         Social Networks  42,000,000               2,800,000,000      67
66    renren.com        Social Networks  42,000,000               3,300,000,000      79
84    odnoklassniki.ru  Social Networks  37,000,000               13,000,000,000     351
90    scribd.com        Social Networks  34,000,000               140,000,000        4
95    vkontakte.ru      Social Networks  34,000,000               48,000,000,000     1,412
and
Rank  Site           Category         Unique Visitors (users)  Page Views         Views/Visitor
1     facebook.com   Social Networks  880,000,000              1,000,000,000,000  1,136
2     youtube.com    Online Video     800,000,000              100,000,000,000    125
3     yahoo.com      Web Portals      590,000,000              77,000,000,000     131
4     live.com       Search Engines   490,000,000              84,000,000,000     171
5     msn.com        Web Portals      440,000,000              20,000,000,000     45
6     wikipedia.org  Dict             410,000,000              6,000,000,000      15
7     blogspot.com   Blogging         340,000,000              4,900,000,000      14
8     baidu.com      Search Engines   300,000,000              110,000,000,000    367
9     microsoft.com  Software         250,000,000              2,500,000,000      10
10    qq.com         Web Portals      250,000,000              39,000,000,000     156

See the complete list at http://www.google.com/adplanner/static/top1000/index.html
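As a quick sanity check of the arithmetic, here is the division in R, using the facebook.com figures quoted in the first table above:

# Quick arithmetic check on the Ad Planner figures quoted above (facebook.com row).
unique_visitors <- 880e6   # 880,000,000 unique visitors per month
page_views      <- 1e12    # 1,000,000,000,000 page views per month

views_per_visitor_month <- page_views / unique_visitors
round(views_per_visitor_month)     # ~1136 page views per visitor per month

views_per_visitor_day <- views_per_visitor_month / 31
round(views_per_visitor_day)       # ~37, i.e. roughly the 36 views per visitor a day quoted above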

Using Google Fusion Tables from #rstats

But after all that, I was quite happy to see Google Fusion Tables within Google Docs. Databases as a service? Not quite, but still quite good; let's see how it goes.

https://www.google.com/fusiontables/DataSource?dsrcid=implicit&hl=en_US&pli=1

http://googlesystem.blogspot.com/2011/09/fusion-tables-new-google-docs-app.html

 

But what interests me more is

http://code.google.com/apis/fusiontables/docs/developers_guide.html

The Google Fusion Tables API is a set of statements that you can use to search for and retrieve Google Fusion Tables data, insert new data, update existing data, and delete data. The API statements are sent to the Google Fusion Tables server using HTTP GET requests (for queries) and POST requests (for inserts, updates, and deletes) from a Web client application. The API is language agnostic: you can write your program in any language you prefer, as long as it provides some way to embed the API calls in HTTP requests.

The Google Fusion Tables API does not provide the mechanism for submitting the GET and POST requests. Typically, you will use an existing code library that provides such functionality; for example, the code libraries that have been developed for the Google GData API. You can also write your own code to implement GET and POST requests.
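As a minimal sketch of what embedding an API statement in an HTTP request looks like from R (using the RCurl package and the same endpoint, parameters, and ClientLogin-style Authorization header as the full code at the end of this post), a query goes out as a GET request:

# Minimal sketch: sending a Fusion Tables SQL query over HTTP GET from R.
# 'auth' is a ClientLogin token, obtained as in ft.connect() further below.
library(RCurl)

ft.query <- function(auth, statement) {
  getForm("http://tables.googlelabs.com/api/query",
          .params = list(sql = statement),
          .opts   = list(httpheader = c(
            Authorization = paste("GoogleLogin auth=", auth, sep = ""))))
}

# e.g. ft.query(auth, "SHOW TABLES") returns a CSV-style string of table ids and names;
# inserts, updates, and deletes go through postForm() instead, as in ft.executestatement() below.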

Also see http://code.google.com/apis/fusiontables/docs/sample_code.html

 

Google Fusion Tables API Sample Code

Libraries (SQL API)

  • Python – Fusion Tables Python Client Library (public repository: fusion-tables-client-python/), with samples
  • PHP – Fusion Tables PHP Client Library (public repository: fusion-tables-client-php/), with samples

Featured Samples

An easy way to learn how to use an API can be to look at sample code. The table above provides links to some basic samples for each of the languages shown. This section highlights particularly interesting samples for the Fusion Tables API.

Featured samples by language (API version in parentheses):

  • cURL – Hello, cURL: a simple example showing how to use curl to access Fusion Tables. (SQL API)
  • Google Apps Script (SQL API)
  • Java – Hello, World: a simple walkthrough that shows how the Google Fusion Tables API statements work; and an OAuth example on fusion-tables-api, in which the Google Fusion Tables team shows how OAuth authorization lets you use the API from a foreign web server with delegated authorization. (SQL API)
  • Python – Docs List Example: demonstrates how to list tables, set permissions on tables, and move a table to a folder. (Docs List API)
  • Android (Java) – Basic Sample Application: a demo application showing how to create a crowd-sourcing application that allows users to report potholes and save the data to a Fusion Table. (SQL API)
  • JavaScript – FusionTablesLayer: using the FusionTablesLayer, you can display data on a Google Map. Also check out FusionTablesLayer Builder, which generates all the code necessary to include a Google Map with a Fusion Table layer on your own website. (FusionTablesLayer, Google Maps API)
  • JavaScript – Google Chart Tools: using the Google Chart Tools, you can request data from Fusion Tables to use in visualizations or to display directly in an HTML page. Note: responses are limited to 500 rows of data. (Google Chart Tools)

External Resources

Google Fusion Tables is dedicated to providing code examples that illustrate typical uses, best practices, and really cool tricks. If you do something with the Google Fusion Tables API that you think would be interesting to others, please contact us at googletables-feedback@google.com about adding your code to our Examples page.

  • Shape Escape – a tool for uploading shape files to Fusion Tables.
  • GDAL – the OGR Simple Features Library has incorporated Fusion Tables as a supported format.
  • Arc2Cloud – Arc2Earth has included support for upload to Fusion Tables via Arc2Cloud.
  • Java and Google App Engine – ODK Aggregate, an App Engine application by the Open Data Kit team, uses Google Fusion Tables to store survey data collected through input forms on Android mobile phones.
  • R package – Andrei Lopatenko has written an R interface to Fusion Tables, so that Fusion Tables can be used as the data store for R.
  • Ruby – Simon Tokumine has written a Ruby gem for access to Fusion Tables from Ruby.

 

Updated: You can use Google Fusion Tables from within R, using the code from http://andrei.lopatenko.com/rstat/fusion-tables.R

 

# The functions below use the RCurl package for the HTTP requests (postForm/getForm).
library(RCurl)

# Authenticate via Google ClientLogin and return the auth token used by the other functions.
ft.connect <- function(username, password) {
  url = "https://www.google.com/accounts/ClientLogin";
  params = list(Email = username, Passwd = password, accountType="GOOGLE", service= "fusiontables", source = "R_client_API")
 connection = postForm(uri = url, .params = params)
 if (length(grep("error", connection, ignore.case = TRUE))) {
 	stop("Wrong username or password")
 }
 authn = strsplit(connection, "\nAuth=")[[c(1,2)]]
 auth = strsplit(authn, "\n")[[c(1,1)]]
 return (auth)
}

# No explicit logout is needed for ClientLogin tokens; kept for API symmetry.
ft.disconnect <- function(connection) {
}

# Execute a SQL statement (INSERT/UPDATE/DELETE/CREATE) via an HTTP POST request.
ft.executestatement <- function(auth, statement) {
      url = "http://tables.googlelabs.com/api/query"
      params = list( sql = statement)
      connection.string = paste("GoogleLogin auth=", auth, sep="")
      opts = list( httpheader = c("Authorization" = connection.string))
      result = postForm(uri = url, .params = params, .opts = opts)
      if (length(grep("<HTML>\n<HEAD>\n<TITLE>Parse error", result, ignore.case = TRUE))) {
      	stop(paste("incorrect sql statement:", statement))
      }
      return (result)
}

# List the tables (id and name) visible to the authenticated user, via SHOW TABLES over HTTP GET.
ft.showtables <- function(auth) {
   url = "http://tables.googlelabs.com/api/query"
   params = list( sql = "SHOW TABLES")
   connection.string = paste("GoogleLogin auth=", auth, sep="")
   opts = list( httpheader = c("Authorization" = connection.string))
   result = getForm(uri = url, .params = params, .opts = opts)
   tables = strsplit(result, "\n")
   tableid = c()
   tablename = c()
   for (i in 2:length(tables[[1]])) {
     	str = tables[[c(1,i)]]
   	    tnames = strsplit(str,",")
   	    tableid[i-1] = tnames[[c(1,1)]]
   	    tablename[i-1] = tnames[[c(1,2)]]
   	}
   	tables = data.frame( ids = tableid, names = tablename)
    return (tables)
}

# Describe the columns (id, name, type) of a table, given its table id.
ft.describetablebyid <- function(auth, tid) {
   url = "http://tables.googlelabs.com/api/query"
   params = list( sql = paste("DESCRIBE", tid))
   connection.string = paste("GoogleLogin auth=", auth, sep="")
   opts = list( httpheader = c("Authorization" = connection.string))
   result = getForm(uri = url, .params = params, .opts = opts)
   columns = strsplit(result,"\n")
   colid = c()
   colname = c()
   coltype = c()
   for (i in 2:length(columns[[1]])) {
     	str = columns[[c(1,i)]]
   	    cnames = strsplit(str,",")
   	    colid[i-1] = cnames[[c(1,1)]]
   	    colname[i-1] = cnames[[c(1,2)]]
   	    coltype[i-1] = cnames[[c(1,3)]]
   	}
   	cols = data.frame(ids = colid, names = colname, types = coltype)
    return (cols)
}

# Describe the columns of a table, given its name.
ft.describetable <- function (auth, table_name) {
   table_id = ft.idfromtablename(auth, table_name)
   result = ft.describetablebyid(auth, table_id)
   return (result)
}

# Look up a table id from its name.
ft.idfromtablename <- function(auth, table_name) {
    tables = ft.showtables(auth)
	tableid = tables$ids[tables$names == table_name]
	return (tableid)
}

# Read a whole Fusion Table into a data frame.
ft.importdata <- function(auth, table_name) {
	tableid = ft.idfromtablename(auth, table_name)
	columns = ft.describetablebyid(auth, tableid)
	column_spec = ""
	# Loop over the table's columns (the rows of the 'columns' description frame), not its three fields.
	for (i in 1:nrow(columns)) {
		column_spec = paste(column_spec, columns[i, 2])
		if (i < nrow(columns)) {
			column_spec = paste(column_spec, ",", sep="")
		}
	}
	mdata = matrix(columns$names,
	              nrow = 1, ncol = nrow(columns),
	              dimnames = list(c("dummy"), columns$names), byrow=TRUE)
	select = paste("SELECT", column_spec)
	select = paste(select, "FROM")
	select = paste(select, tableid)
	result = ft.executestatement(auth, select)
    numcols = length(columns)
    rows = strsplit(result, "\n")
    for (i in 3:length(rows[[1]])) {
    	row = strsplit(rows[[c(1,i)]], ",")
    	mdata = rbind(mdata, row[[1]])
   	}
   	output.frame = data.frame(mdata[2:length(mdata[,1]), 1])
   	for (i in 2:ncol(mdata)) {
   		output.frame = cbind(output.frame, mdata[2:length(mdata[,i]),i])
   	}
   	colnames(output.frame) = columns$names
    return (output.frame)
}

# Quote a value for SQL unless it is numeric.
quote_value <- function(value, to_quote = FALSE, quote = "'") {
	 ret_value = ""
     if (to_quote) {
     	ret_value = paste(quote, paste(value, quote, sep=""), sep="")
     } else {
     	ret_value = value
     }
     return (ret_value)
}

# Convert a row vector into a comma-separated SQL values string,
# quoting everything except numeric columns, and return the string.
converttostring <- function(arr, separator = ", ", column_types) {
	con_string = ""
	if (length(arr) > 1) {
		for (i in 1:(length(arr) - 1)) {
			value = quote_value(arr[i], tolower(column_types[i]) != "number")
			con_string = paste(con_string, value)
			con_string = paste(con_string, separator, sep="")
		}
	}

    if (length(arr) >= 1) {
    	value = quote_value(arr[length(arr)], tolower(column_types[length(arr)]) != "number")
    	con_string = paste(con_string, value)
    }
    return (con_string)
}

# Write a data frame to a Fusion Table, optionally creating the table first;
# rows are inserted in batches of 500 statements.
ft.exportdata <- function(auth, input_frame, table_name, create_table) {
	if (create_table) {
       create.table = "CREATE TABLE "
       create.table = paste(create.table, table_name)
       create.table = paste(create.table, "(")
       cnames = colnames(input_frame)
       for (columnname in cnames) {
         create.table = paste(create.table, columnname)
    	 create.table = paste(create.table, ":string", sep="")
    	   if (columnname != cnames[length(cnames)]){
    		  create.table = paste(create.table, ",", sep="")
           }
       }
      create.table = paste(create.table, ")")
      result = ft.executestatement(auth, create.table)
    }
    if (length(input_frame[,1]) > 0) {
    	tableid = ft.idfromtablename(auth, table_name)
	    columns = ft.describetablebyid(auth, tableid)
	    column_spec = ""
	    for (i in 1:length(columns$names)) {
		   column_spec = paste(column_spec, columns[i, 2])
		   if (i < length(columns$names)) {
			  column_spec = paste(column_spec, ",", sep="")
		   }
	    }
    	insert_prefix = "INSERT INTO "
    	insert_prefix = paste(insert_prefix, tableid)
    	insert_prefix = paste(insert_prefix, "(")
    	insert_prefix = paste(insert_prefix, column_spec)
    	insert_prefix = paste(insert_prefix, ") values (")
    	insert_suffix = ");"
    	insert_sql_big = ""
    	for (i in 1:length(input_frame[,1])) {
    		data = unlist(input_frame[i,])
    		values = converttostring(data, column_types  = columns$types)
    		insert_sql = paste(insert_prefix, values)
    		insert_sql = paste(insert_sql, insert_suffix) ;
    		insert_sql_big = paste(insert_sql_big, insert_sql)
    		if (i %% 500 == 0) {
    			ft.executestatement(auth, insert_sql_big)
    			insert_sql_big = ""
    		}
    	}
        # Flush any remaining rows not yet sent in a 500-row batch.
        if (nchar(insert_sql_big) > 0) {
        	ft.executestatement(auth, insert_sql_big)
        }
    }
}
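
A minimal usage sketch of the functions above (the account, password, and table names are placeholders):

# Hypothetical usage of the Fusion Tables helpers above; credentials and table names are placeholders.
auth <- ft.connect("you@example.com", "your-password")
ft.showtables(auth)                                    # list table ids and names
df   <- ft.importdata(auth, "boston_housing")          # read a table into a data frame
ft.exportdata(auth, df, "boston_housing_copy", create_table = TRUE)  # write it back as a new table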