Note on Internet Privacy (Updated) and a Note on DNSCrypt

I noticed the brouhaha over Google’s privacy policy. I am afraid social networks capture much more private information than search engines do. Even if Google integrates my browser history, my social network, my emails, and my search-engine keywords, I am still okay with it: all they are going to do is sell me better ads (rather than just flooding me with ads and hoping for a click). Of course, Microsoft should take it one step further and capture data from my desktop as well for better ads; that would really complete the curve. In any case, with the Patriot Act, most of this information is available to the government anyway.

But it does make sense to have an easier-to-understand privacy policy, and one of my disappointments is the complete lack of visual appeal in such notices. Make things as simple as possible, but no simpler, as Al-E said.

 

Privacy activists forget that ads run on models built on AGGREGATED data, and most models are scored automatically. Unless you do something really weird or fake, chances are the data pertaining to you is automatically collected, algorithmically aggregated, then modeled and scored, and an ad corresponding to your score or segment is shown to you. Probably no human eyes ever see the raw data (but big G can clarify that).
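To make that concrete, here is a toy sketch in Python of the collect, aggregate, score, and serve loop. The events, segments, and ads are all made up, and real systems use far richer models, but the point is that every step is automated:

```python
# Toy sketch of an automated ad-targeting pipeline: raw events are aggregated,
# each user is scored into a segment, and an ad is picked for that segment.
# All data, segments, and ads here are hypothetical.
from collections import Counter

raw_events = [                      # (user_id, page_category) as logged by an ad network
    ("u1", "sports"), ("u1", "sports"), ("u1", "travel"),
    ("u2", "finance"), ("u2", "finance"), ("u2", "travel"),
]

ads_by_segment = {"sports": "running shoes", "finance": "brokerage account",
                  "travel": "hotel deals"}

def ad_for_user(user_id):
    # Aggregate: only per-category counts survive, not the raw rows.
    counts = Counter(cat for uid, cat in raw_events if uid == user_id)
    # Score: the dominant category becomes the user's segment.
    segment, _ = counts.most_common(1)[0]
    return ads_by_segment[segment]

for uid in ("u1", "u2"):
    print(uid, "->", ad_for_user(uid))   # u1 -> running shoes, u2 -> brokerage account
```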

 

(I also noticed Google gets a lot of free advice from bloggers. Hey, if you were really good at giving advice to Google, they WOULD hire you!)

On to another, tool-based (rather than legalese-based) approach to privacy.

I noticed tools like DNSCrypt increase Internet security, so that all my integrated data goes straight to the people I am okay with having it (ad sellers, not governments!).

Unfortunately it is Mac only, and I will wait for Windows- or X-based tools for a better review. I noticed some lag in the updating of these tools, so I can only guess that the boys of Baltimore have been there; it is best used by home users alone.

 

Maybe they can find a Chrome extension for DNS dummies.

http://www.opendns.com/technology/dnscrypt/

Why DNSCrypt is so significant

In the same way that SSL turns HTTP web traffic into HTTPS encrypted Web traffic, DNSCrypt turns regular DNS traffic into encrypted DNS traffic that is secure from eavesdropping and man-in-the-middle attacks. It doesn’t require any changes to domain names or how they work; it simply provides a method for securely encrypting communication between our customers and our DNS servers in our data centers. We know that claims alone don’t work in the security world, however, so we’ve opened up the source to our DNSCrypt code base and it’s available on GitHub.

DNSCrypt has the potential to be the most impactful advancement in Internet security since SSL, significantly improving every single Internet user’s online security and privacy.
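To see what DNSCrypt is protecting against, here is a minimal sketch in plain Python (it does not use DNSCrypt at all) of an ordinary DNS lookup; the hostname travels as cleartext UDP on port 53, which is exactly the traffic DNSCrypt encrypts. The query id and resolver address are just example values:

```python
# Minimal, hand-built DNS query (no DNSCrypt): the queried hostname is
# plainly visible in the packet bytes, which is what an on-path eavesdropper
# sees when DNS is not encrypted.
import socket
import struct

def build_query(hostname):
    # Header: id, flags (recursion desired), 1 question, 0 answer/authority/additional
    header = struct.pack(">HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    # Question: length-prefixed labels, terminating zero, QTYPE=A, QCLASS=IN
    labels = b"".join(bytes([len(p)]) + p.encode("ascii") for p in hostname.split("."))
    return header + labels + b"\x00" + struct.pack(">HH", 1, 1)

query = build_query("example.com")
print(query)                         # "example" and "com" appear in cleartext

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
    s.settimeout(3)
    s.sendto(query, ("208.67.222.222", 53))   # an OpenDNS resolver, plain port 53
    response, _ = s.recvfrom(512)
print(len(response), "bytes of unencrypted DNS response")
```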

And, from the related DNSCurve project:

http://dnscurve.org/crypto.html

The DNSCurve project adds link-level public-key protection to DNS packets. This page discusses the cryptographic tools used in DNSCurve.

Elliptic-curve cryptography

DNSCurve uses elliptic-curve cryptography, not RSA.

RSA is somewhat older than elliptic-curve cryptography: RSA was introduced in 1977, while elliptic-curve cryptography was introduced in 1985. However, RSA has shown many more weaknesses than elliptic-curve cryptography. RSA’s effective security level was dramatically reduced by the linear sieve in the late 1970s, by the quadratic sieve and ECM in the 1980s, and by the number-field sieve in the 1990s. For comparison, a few attacks have been developed against some rare elliptic curves having special algebraic structures, and the amount of computer power available to attackers has predictably increased, but typical elliptic curves require just as much computer power to break today as they required twenty years ago.

IEEE P1363 standardized elliptic-curve cryptography in the late 1990s, including a stringent list of security criteria for elliptic curves. NIST used the IEEE P1363 criteria to select fifteen specific elliptic curves at five different security levels. In 2005, NSA issued a new “Suite B” standard, recommending the NIST elliptic curves (at two specific security levels) for all public-key cryptography and withdrawing previous recommendations of RSA.

Some specific types of elliptic-curve cryptography are patented, but DNSCurve does not use any of those types of elliptic-curve cryptography.
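DNSCurve is built on Curve25519. As a rough feel for the kind of elliptic-curve key exchange involved, here is a sketch using the third-party Python cryptography package (this is not DNSCurve’s own code, just the same curve):

```python
# Sketch of a Curve25519 (X25519) key exchange using the third-party
# "cryptography" package. DNSCurve uses the same curve, but this is not
# DNSCurve's implementation; it only shows that two parties can derive a
# shared secret from their own private key and the other side's public key.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

resolver_key = X25519PrivateKey.generate()    # e.g. the DNS server's key pair
client_key = X25519PrivateKey.generate()      # e.g. the client's key pair

client_secret = client_key.exchange(resolver_key.public_key())
resolver_secret = resolver_key.exchange(client_key.public_key())

assert client_secret == resolver_secret       # both sides hold the same 32-byte secret
print(len(client_secret), "byte shared secret")
```

The resulting shared secret can then key a symmetric cipher that encrypts the actual DNS packets.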

 

Interview: JJ Allaire, Founder, RStudio

Here is an interview with JJ Allaire, founder of RStudio. RStudio is the IDE that has overtaken other IDEs within the R community in terms of ease of use. On the eve of their latest product launch, JJ talks to DecisionStats about RStudio and more.

Ajay-  So what is new in the latest version of RStudio and how exactly is it useful for people?

JJ- The initial release of RStudio as well as the two follow-up releases we did last year were focused on the core elements of using R: editing and running code, getting help, and managing files, history, workspaces, plots, and packages. In the meantime users have also been asking for some bigger features that would improve the overall work-flow of doing analysis with R. In this release (v0.95) we focused on three of these features:

Projects. R developers tend to have several (and often dozens) of working contexts associated with different clients, analyses, data sets, etc. RStudio projects make it easy to keep these contexts well separated (with distinct R sessions, working directories, environments, command histories, and active source documents), switch quickly between project contexts, and even work with multiple projects at once (using multiple running versions of RStudio).

Version Control. The benefits of using version control for collaboration are well known, but we also believe that solo data analysis can achieve significant productivity gains by using version control (this discussion on Stack Overflow talks about why). In this release we introduced integrated support for the two most popular open-source version control systems: Git and Subversion. This includes changelist management, file diffing, and browsing of project history, all right from within RStudio.

Code Navigation. When you look at how programmers work a surprisingly large amount of time is spent simply navigating from one context to another. Modern programming environments for general purpose languages like C++ and Java solve this problem using various forms of code navigation, and in this release we’ve brought these capabilities to R. The two main features here are the ability to type the name of any file or function in your project and go immediately to it; and the ability to navigate to the definition of any function under your cursor (including the definition of functions within packages) using a keystroke (F2) or mouse gesture (Ctrl+Click).

Ajay- What’s the product road map for RStudio? When can we expect the IDE to turn into a full-fledged GUI?

JJ- Linus Torvalds has said that “Linux is evolution, not intelligent design.” RStudio tries to operate on a similar principle—the world of statistical computing is too deep, diverse, and ever-changing for any one person or vendor to map out in advance what is most important. So, our internal process is to ship a new release every few months, listen to what people are doing with the product (and hope to do with it), and then start from scratch again making the improvements that are considered most important.

Right now some of the things which seem to be top of mind for users are improved support for authoring and reproducible research, various editor enhancements including code folding, and debugging tools.

What you’ll see us do in a given release is work on a combination of frequently requested features, smaller improvements to usability and work-flow, bug fixes, and finally architectural changes required to support current or future feature requirements.

While we do try to base what we work on as closely as possible on direct user-feedback, we also adhere to some core principles concerning the overall philosophy and direction of the product. So for example the answer to the question about the IDE turning into a full-fledged GUI is: never. We believe that textual representations of computations provide fundamental advantages in transparency, reproducibility, collaboration, and re-usability. We believe that writing code is simply the right way to do complex technical work, so we’ll always look for ways to make coding better, faster, and easier rather than try to eliminate coding altogether.

Ajay- Describe your journey in science, from high school student to your present work in R. I noticed you have been very successful in making software products that have mostly been proprietary or sold to companies.

Why did you get into open source products with RStudio? What are your plans for monetizing RStudio further down the line?

JJ- In high school and college my principal areas of study were Political Science and Economics. I also had a very strong parallel interest in both computing and quantitative analysis. My first job out of college was as a financial analyst at a government agency. The tools I used in that job were SAS and Excel. I had a dim notion that there must be a better way to marry computation and data analysis than those tools, but of course no concept of what this would look like.

From there I went more in the direction of general purpose computing, starting a couple of companies where I worked principally on programming languages and authoring tools for the Web. These companies produced proprietary software, which at the time (between 1995 and 2005) was a workable model because it allowed us to build the revenue required to fund development and to promote and distribute the software to a wider audience.

By 2005, however, it was becoming clear that proprietary software would ultimately be overtaken by open source software in nearly all domains. The cost of development had shrunk dramatically, thanks both to the availability of high-quality open source languages and tools and to the scale of global collaboration possible on open source projects. The cost of promoting and distributing software had also collapsed, thanks to the efficiency of both distribution and information diffusion on the Web.

When I heard about R and learned more about it, I became very excited and inspired by what the project had accomplished. A group of extremely talented and dedicated users had created the software they needed for their work and then shared the fruits of that work with everyone. R was a platform that everyone could rally around because it worked so well, was extensible in all the right ways, and most importantly was free (as in speech) so users could depend upon it as a long-term foundation for their work.

So I started RStudio with the aim of making useful contributions to the R community. We started with building an IDE because it seemed that a first-rate development environment for R, one that was both powerful and easy to use, was an unmet need. Being aware that many other companies had built successful businesses around open-source software, we were also convinced that we could make RStudio available under a free and open-source license (the AGPLv3) while still creating a viable business. At this point RStudio is exclusively focused on creating the best IDE for R that we can. As the core product gets where it needs to be over the next couple of years, we’ll then also begin to sell other products and services related to R and RStudio.

About-

http://rstudio.org/docs/about


JJ Allaire

JJ Allaire is a software engineer and entrepreneur who has created a wide variety of products including ColdFusion, Windows Live Writer, Lose It!, and RStudio.

From http://en.wikipedia.org/wiki/Joseph_J._Allaire
In 1995 Joseph J. (JJ) Allaire co-founded Allaire Corporation with his brother Jeremy Allaire, creating the web development tool ColdFusion.[1] In March 2001, Allaire was sold to Macromedia where ColdFusion was integrated into the Macromedia MX product line. Macromedia was subsequently acquired by Adobe Systems, which continues to develop and market ColdFusion.
After the sale of his company, Allaire became frustrated at the difficulty of keeping track of research he was doing using Google. To address this problem, he co-founded Onfolio in 2004 with Adam Berrey, former Allaire co-founder and VP of Marketing at Macromedia.
On March 8, 2006, Onfolio was acquired by Microsoft where many of the features of the original product are being incorporated into the Windows Live Toolbar. On August 13, 2006, Microsoft released the public beta of a new desktop blogging client called Windows Live Writer that was created by Allaire’s team at Microsoft.
Starting in 2009, Allaire has been developing a web-based interface to the widely used R technical computing environment. A beta version of RStudio was publicly released on February 28, 2011.
JJ Allaire received his B.A. from Macalester College (St. Paul, MN) in 1991.
RStudio-

RStudio is an integrated development environment (IDE) for R which works with the standard version of R available from CRAN. Like R, RStudio is available under a free software license. RStudio is designed to be as straightforward and intuitive as possible to provide a friendly environment for new and experienced R users alike. RStudio is also a company, and they plan to sell services (support, training, consulting, hosting) related to the open-source software they distribute.

Jim Kobielus on 2012

Jim Kobielus revisits the predictions he made in 2011 (along with a summary of 2010) and makes some fresh ones for 2012. For technology watchers, this is an article by one of the gurus of enterprise software.

 

All of those trend predictions (at http://www.decisionstats.com/brief-interview-with-james-g-kobielus/ ) came true in 2011, and are in full force in 2012 as well. Here are my predictions for 2012, and links to the three blog posts in which I made them last month:

 

The Year Ahead in Next Best Action? Here’s the Next Best Thing to a Crystal Ball!

  • The next-best-action market will continue to coalesce around core solution capabilities.
  • Data scientists will become the principal application developers for next best action.
  • Real-world experiments will become the new development paradigm in next best action.

The Year Ahead in Advanced Analytics? Advances on All Fronts!

  • Open-source platforms will expand their footprint in advanced analytics.
  • Data science centers of excellence will spring up everywhere.
  • Predictive analytics and interactive exploration will enter the mainstream BI user experience.

The Year Ahead In Big Data? Big, Cool, New Stuff Looms Large!

  • Enterprise Hadoop deployments will expand at a rapid clip.
  • In-memory analytics platforms will grow their footprint.
  • Graph databases will come into vogue.

 

And in an exclusive and generous favor for DecisionStats, Jim does some crystal gazing for the cloud computing field in 2012-

Cloud/SaaS EDWs will cross the enterprise-adoption inflection point. In 2012, cloud and software-as-a-service (SaaS) enterprise data warehouses (EDWs), offered on a public subscription basis, will gain greater enterprise adoption as a complement or outright replacement for appliance- and software-based EDWs. A growing number of established and startup EDW vendors will roll out cloud/SaaS “Big Data” offerings. Many of these will supplement and extend RDBMS and columnar technologies with Hadoop, key-value, graph, document, and other new database architectures.

About-

http://www.forrester.com/rb/analyst/james_kobielus

James G. Kobielus
Senior Analyst

RESEARCH FOCUS

 

James serves Business Process & Application Development & Delivery Professionals. He is a leading expert on data warehousing, predictive analytics, data mining, and complex event processing. In addition to his core coverage areas, James contributes to Forrester’s research in business intelligence, data integration, data quality, and master data management.

 

PREVIOUS WORK EXPERIENCE

 

James has a long history in IT research and consulting and has worked for both vendors and research firms. Most recently, he was at Current Analysis, an IT research firm, where he was a principal analyst covering topics ranging from data warehousing to data integration and the Semantic Web. Prior to that position, James was a senior technical systems analyst at Exostar (a hosted supply chain management and eBusiness hub for the aerospace and defense industry). In this capacity, James was responsible for identifying and specifying product/service requirements for federated identity, PKI, and other products. He also worked as an analyst for the Burton Group and was previously employed by LCC International, DynCorp, ADEENA, International Center for Information Technologies, and the North American Telecommunications Association. He is both well versed and experienced in product and market assessments. James is a widely published business/technology author and has spoken at many industry events.

Contact –

Twitter: http://twitter.com/jameskobielus

Who made LibreOffice

From

 

http://www.libreoffice.org/about-us/credits/

 

Credits

513 individuals contributed to OpenOffice.org (their contributions were imported into LibreOffice) or to LibreOffice directly, up to 2011-11-11 09:02:38.

Developers committing code since 2010-09-28

Ruediger Timm
Commits: 89832
Joined: 2000-10-10
Kurt Zenker
Commits: 32763
Joined: 2000-09-25
Oliver Bolte
Commits: 31795
Joined: 2000-09-19
Vladimir Glazunov
Commits: 30289
Joined: 2000-12-04
Jens-Heiner Rechtien [hr]
Commits: 29314
Joined: 2000-09-18
Ivo Hinkelmann
Commits: 10228
Joined: 2002-09-09
Caolán McNamara
Commits: 5952
Joined: 2000-10-10
Frank Schoenheit [fs]
Commits: 5019
Joined: 2000-09-19
Hans-Joachim Lankenau
Commits: 3077
Joined: 2000-09-19
Ocke Janssen [oj]
Commits: 2861
Joined: 2000-09-20
Mathias Bauer
Commits: 2606
Joined: 2000-09-20
Oliver Specht
Commits: 2458
Joined: 2000-09-21
Philipp Lohmann [pl]
Commits: 2132
Joined: 2000-09-21
Tor Lillqvist
Commits: 2035
Joined: 2010-03-23
Stephan Bergmann
Commits: 1993
Joined: 2000-10-04
Christian Lippka ORACLE
Commits: 1811
Joined: 2000-09-25

We do not distinguish between commits that were imported from the OOo code base and those that went directly into the LibreOffice code base as:
a) it is technically not possible to distinguish between commits that go directly into the LibreOffice code base and commits that were merged in from the OpenOffice.org code base, and
b) contributors to the OOo code base should also be credited for the excellent work they do.

Do note that LibreOffice is divided into 20 git repositories. Pushing a change into all repositories will be counted as 20 commits as there is no way to distinguish this from 20 separate commits.

Total contributions to the TDF Wiki

1223 individuals contributed.

PMML Augustus

Here is a new-old open-source system for building and scoring statistical models, designed to work with data sets that are too large to fit into memory.

http://code.google.com/p/augustus/

Augustus is an open source software toolkit for building and scoring statistical models. It is written in Python and its most distinctive features are:

• Ability to be used on sets of big data; these are data sets that exceed either memory capacity or disk capacity, so that existing solutions like R or SAS cannot be used. Augustus is also perfectly capable of handling problems that can fit on one computer.
• PMML compliance and the ability to both:
  – produce models with PMML-compliant formats (saved with extension .pmml).
  – consume models from files with the PMML format.

Augustus has been tested and deployed on several operating systems. It is intended for developers who work in the financial or insurance industry, information technology, or in the science and research communities.

Usage

Augustus produces and consumes Baseline, Cluster, Tree, and Ruleset models. Currently, it uses an event-based approach to building Tree, Cluster and Ruleset models that is non-standard.
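Augustus’s own API is not shown here; as a generic sketch of the out-of-core idea, assuming a hypothetical events.csv file with a value column and a baseline fitted earlier, records can be scored in fixed-size chunks so the full data set never has to fit in memory:

```python
# Generic sketch of out-of-core scoring (not Augustus's API): read a large
# CSV in fixed-size chunks and score each record against a pre-fitted
# baseline, so memory use stays constant however big the file is.
# "events.csv", the "value" field, and the baseline are hypothetical.
import csv

BASELINE_MEAN, BASELINE_STD = 100.0, 15.0     # assumed to be fitted earlier

def count_alerts(path, chunk_size=10_000):
    alerts, chunk = 0, []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            chunk.append(float(row["value"]))
            if len(chunk) == chunk_size:
                alerts += sum(abs(v - BASELINE_MEAN) > 3 * BASELINE_STD for v in chunk)
                chunk.clear()                  # nothing accumulates across chunks
    alerts += sum(abs(v - BASELINE_MEAN) > 3 * BASELINE_STD for v in chunk)
    return alerts

# print(count_alerts("events.csv"))
```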

New to PMML?

Read on http://code.google.com/p/augustus/wiki/PMML

The Predictive Model Markup Language (PMML) is a vendor-driven XML markup language for specifying statistical and data-mining models. In other words, it is an XML language for describing models so that they can be exchanged between different applications.
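To get a feel for the format, here is a minimal hand-written PMML fragment (a toy tree model, not produced by Augustus, and simplified; a full PMML file also carries Header and MiningSchema elements) parsed with Python’s standard library:

```python
# Minimal hand-written PMML fragment (toy example; real PMML also requires
# Header and MiningSchema elements), parsed with the standard library to
# show that a PMML model is ordinary, portable XML.
import xml.etree.ElementTree as ET

PMML = """<PMML version="4.1" xmlns="http://www.dmg.org/PMML-4_1">
  <DataDictionary numberOfFields="1">
    <DataField name="value" optype="continuous" dataType="double"/>
  </DataDictionary>
  <TreeModel modelName="toy_tree" functionName="classification">
    <Node score="normal">
      <True/>
      <Node score="alert">
        <SimplePredicate field="value" operator="greaterThan" value="145"/>
      </Node>
    </Node>
  </TreeModel>
</PMML>"""

NS = "{http://www.dmg.org/PMML-4_1}"
tree_model = ET.fromstring(PMML).find(NS + "TreeModel")
print(tree_model.get("modelName"),
      [node.get("score") for node in tree_model.iter(NS + "Node")])
# -> toy_tree ['normal', 'alert']
```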

Ads Alliance on the Internet

Just saw

the Digital Advertising Alliance’s (DAA) Self-Regulatory Program for Online Behavioral Advertising.

Multi-Site Data Collection Principles Broaden Self Regulation Beyond Online Behavioral Advertising
WASHINGTON, D.C., NOVEMBER 7, 2011

The new Principles consist of the following specific requirements:

  1. Transparency and consumer control for purposes other than OBA – The Multi-Site Data Principles call for organizations that collect Multi-Site Data for purposes other than OBA to provide transparency and control regarding Internet surfing across unrelated Websites.
  2. Collection / use of data for eligibility determination – The Multi-Site Data Principles prohibit the collection, use or transfer of Internet surfing data across Websites for determination of a consumer’s eligibility for employment, credit standing, healthcare treatment and insurance.
  3. Collection / use of children’s data – The Multi-Site Data Principles state that organizations must comply with the Children’s Online Privacy Protection Act (COPPA).
  4. Meaningful accountability – The Multi-Site Data Principles are subject to enforcement through strong accountability mechanisms.

http://www.aboutads.info/principles

The DAA Self-Regulatory Principles

 

The cross-industry Self-Regulatory Principles for Multi-Site Data augment the Self-Regulatory Principles for Online Behavioral Advertising (OBA) by covering the prospective collection of Web site data beyond that collected for OBA purposes. The existing OBA Principles and definitions remain in full force and effect and are not limited by the new principles.

The cross-industry Self-Regulatory Principles for Online Behavioral Advertising were developed by leading industry associations to apply consumer-friendly standards to online behavioral advertising across the Internet. Online behavioral advertising increasingly supports the convenient access to content, services, and applications over the Internet that consumers have come to expect at no cost to them.

The Education Principle calls for organizations to participate in efforts to educate individuals and businesses about online behavioral advertising and the Principles.

The Transparency Principle calls for clearer and easily accessible disclosures to consumers about data collection and use practices associated with online behavioral advertising. It will result in new, enhanced notice on the page where data is collected through links embedded in or around advertisements, or on the Web page itself.

The Consumer Control Principle provides consumers with an expanded ability to choose whether data is collected and used for online behavioral advertising purposes. This choice will be available through a link from the notice provided on the Web page where data is collected.

The Consumer Control Principle requires “service providers”, a term that includes Internet access service providers and providers of desktop application software such as Web browser “toolbars”, to obtain the consent of users before engaging in online behavioral advertising, and to take steps to de-identify the data used for such purposes.

The Data Security Principle calls for organizations to provide appropriate security for, and limited retention of, data collected and used for online behavioral advertising purposes.

The Material Changes Principle calls for obtaining consumer consent before a Material Change is made to an entity’s Online Behavioral Advertising data collection and use policies unless that change will result in less collection or use of data.

The Sensitive Data Principle recognizes that data collected from children and used for online behavioral advertising merits heightened protection, and requires parental consent for behavioral advertising to consumers known to be under 13 on child-directed Web sites. This Principle also provides heightened protections to certain health and financial data when attributable to a specific individual.

The Accountability Principle calls for development of programs to further advance these Principles, including programs to monitor and report instances of uncorrected non-compliance with these Principles to appropriate government agencies. The CBBB and DMA have been asked and agreed to work cooperatively to establish accountability mechanisms under the Principles.

 

Ajay- So why the self-regulation?

Answer- Shoddy maths in behaviorally targeted ads is producing a glut of targeted ads, far more than consumers can reasonably be expected to click on given their spending. On the Internet, unlike on television, cost is less of a barrier to OVER-ADVERTISING.

 

Oracle Public Cloud

The slick website of Oracle Public Cloud, coming soon to an office near you.

 

and including the Oracle Social Network 😉

http://cloud.oracle.com/mycloud/f?p=service:social:0

Oracle Social Network

A secure collaboration tool for everyone you work with.

Public Cloud

http://cloud.oracle.com/mycloud/f?p=5001:1:0#