LibreOffice turns six months old

On September 28th, 2010, The Document Foundation was announced. The last six months, it feels, have passed in the blink of an eye. Not only did we release three LibreOffice versions within three months, create the LibreOffice-Box DVD image, and bring LibreOffice Portable on its way; we also announced the LibreOffice Conference for October 2011 and took part in many events worldwide, with FOSDEM and CeBIT being the most prominent ones.

People follow us on Twitter, Identi.ca, XING, LinkedIn and a Facebook group and fan page; they discuss on our mailing lists with more than 6,000 subscriptions, collaborate in our wiki, get insight into our daily work through our blog, and post and blog themselves. From the very first day, openness, transparency and meritocracy have shaped the framework we want to work in. Our discussions and decisions take place on a public mailing list, and we regularly hold phone conferences for the Steering Committee and for the marketing teams, where everyone is invited to join. Our ideas and visions have made their way into our Next Decade Manifesto.

We have joined the Open Invention Network as well as the OpenDoc Society, and just last week have become an SPI-associated project, and we see a wide range of support from all over the world. Not only do Novell and Red Hat support our efforts with developers, but just recently, Canonical, creators of Ubuntu, joined as well. All major Linux distributions deliver LibreOffice with their operating systems, and more follow every day.

One of the most stunning contributions, which still leaves us speechless, is the support we receive from the community. When we asked for 50,000 € in capital stock for a German-based foundation, the community showed its support, appreciation and power: it not only donated that sum in just eight days, but has by now supported us with close to 100,000 €! Another is that, driven by our open, vendor-neutral approach combined with our easy hacks, we have included code contributions from over 150 entirely new developers to the project, alongside localisations from over 50 localizers. The community has developed better than we could ever have dreamed, and the first meetings, like the project weekend or the QA meeting of the Germanophone group, are already being organized.

What we have seen so far is just the beginning of something very big. The Document Foundation has a vision, and the creation of the foundation in Germany is about to happen soon. LibreOffice was downloaded over 350,000 times within the first week, and we have just counted more than 1.3 million downloads from our own download system alone (not counting packages directly delivered by Linux distributors, other download sites or DVDs included in magazines and newspapers), supported by 65 mirrors from all over the world, and millions already use and contribute to it worldwide. With our participation in the Google Summer of Code, we will engage more students and young developers to be part of our community. Our improved release schedule will ensure that new features and improvements reach end users quickly, and for testers, we even provide daily builds.

We are so excited by what has been achieved over the last six months, and we are immensely grateful to all those who have supported the project in whatever ways they can. It is an honour to be working with you, to be part of one united community! The future as we are shaping it has just begun, and it will be bright and excellent.

 

From the LibreOffice announce mailing list. List archive: http://listarchives.documentfoundation.org/www/announce/

Input Data in R using the top 3 R GUIs


One area where GUI methods are clearly preferable to command-line methods in R is data input. There is no need to learn read.csv or read.table when these options are only two clicks away in any R GUI. Academics and students need to easily access datasets from attached packages, just as business analysts need to access databases with a few clicks rather than read pages of PDF documentation on RODBC. However, some GUIs (like Rattle) accept data only in data frames, rather than lists or arrays, which limits R's flexibility. These are my views, but you can see and compare how data input is handled in R Commander, Rattle and Deducer.
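
For readers who do want the command-line equivalents, here is a minimal sketch of the three input routes mentioned above. The file name "sales.csv", the ODBC data source name "mydsn" and the table "orders" are placeholders for illustration, not real resources.

    # Reading a local CSV file into a data frame (file name is hypothetical)
    sales <- read.csv("sales.csv", header = TRUE, stringsAsFactors = FALSE)

    # Loading a dataset shipped with an attached package
    data(iris)    # the built-in iris data frame
    str(iris)     # inspect its structure

    # Querying a database through RODBC (assumes an ODBC DSN named "mydsn" exists)
    library(RODBC)
    channel <- odbcConnect("mydsn")
    orders  <- sqlQuery(channel, "SELECT * FROM orders")
    odbcClose(channel)

The GUIs discussed below essentially generate calls like these behind their dialog boxes.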



Ways to use both Windows and Linux together


Some ways to use both Windows and Linux together:

1) Wubi

http://wubi.sourceforge.net/

Wubi only adds an extra option to boot into Ubuntu. Wubi does not require you to modify the partitions of your PC, or to use a different bootloader, and does not install special drivers.

2) Wine

Wine lets you run Windows software on other operating systems. With Wine, you can install and run these applications just like you would in Windows. Read more at http://wiki.winehq.org/Debunking_Wine_Myths

http://www.winehq.org/about/

3) Cygwin

http://www.cygwin.com/

Cygwin is a Linux-like environment for Windows. It consists of two parts:

  • A DLL (cygwin1.dll) which acts as a Linux API emulation layer, providing substantial Linux API functionality.
  • A collection of tools which provide the Linux look and feel.

What Isn’t Cygwin?

  • Cygwin is not a way to run native Linux apps on Windows. You have to rebuild your application from source if you want it to run on Windows.
  • Cygwin is not a way to magically make native Windows apps aware of UNIX® functionality, like signals, ptys, etc. Again, you need to build your apps from source if you want to take advantage of Cygwin functionality.
4) VMware Player

https://www.vmware.com/products/player/

VMware Player is the easiest way to run multiple operating systems at the same time on your PC. With its user-friendly interface, VMware Player makes it effortless for anyone to try out Windows 7, Chrome OS or the latest Linux releases, or create isolated virtual machines to safely test new software and surf the Web.

Choosing R for business – What to consider?


Additional features in R over other analytical packages:

1) Source code is available, enabling fully customized solutions and embedding within a particular application. Open source code has the advantage of being extensively peer-reviewed in journals and the scientific literature. This means bugs are found, shared and corrected transparently.

2) A wide range of training material, in the form of books, is available for the R analytical platform.

3) Arguably the best data visualization tools among analytical software (apart from Tableau Software's latest version). The data visualization available in R takes the form of a wide variety of customizable graphs as well as animation. The principal reason third-party software initially started creating interfaces to R is that R's library of graphics packages is more advanced and is rapidly gaining features by the day (see the sketch after this list).

4) Free of upfront license costs, which makes it attractive for academics and budget friendly for both small and large analytical teams.

    5) Flexible programming for your data environment. This includes having packages that ensure compatibility with Java, Python and C++.

     

6) Easy migration from other analytical platforms to the R platform. It is relatively easy for a non-R user to migrate to R, and there is no danger of vendor lock-in thanks to the GPL-licensed source code and the open community.
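
As referenced in point 3 above, here is a minimal sketch of the kind of graphics range being described, using only the built-in mtcars dataset; it assumes the lattice and ggplot2 packages are installed.

    library(lattice)   # trellis graphics (ships with R)
    library(ggplot2)   # grammar-of-graphics plots

    # base graphics
    hist(mtcars$mpg, main = "Miles per gallon")

    # lattice: panels conditioned on a grouping variable
    xyplot(mpg ~ wt | factor(cyl), data = mtcars)

    # ggplot2: layered, highly customizable graphics
    ggplot(mtcars, aes(wt, mpg, colour = factor(cyl))) +
      geom_point() +
      geom_smooth(method = "lm", se = FALSE)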

Statistics are numbers that describe (descriptive), advise (prescriptive) or forecast (predictive). Analytics is a decision-making aid. Analytics on which no decision is to be made, or even considered, can be classified as purely statistical and non-analytical. Thus the ease of making a correct decision separates a good analytical platform from a not-so-good one. The distinction is likely to be disputed by people from either background: business analysis places more emphasis on how practical or actionable the results are, and less emphasis on the statistical metrics of a particular data analysis task. I believe one clear reason business analytics differs from statistical analysis is the cost of perfect information (data costs in the real world) and the opportunity cost of delayed and distorted decision-making.

Specific to the following domains, R has these costs and benefits:

• Business Analytics
  • R is free per license and free to download.
  • It is one of the few analytical platforms that work on Mac OS.
  • Its results are credibly established both in journals like the Journal of Statistical Software and in the work of the analytics teams at LinkedIn, Google and Facebook.
  • It has open source code available for customization under the GPL.
  • It also has the flexible option of commercial vendors like Revolution Analytics (who support 64-bit Windows as well as bigger datasets).
  • It has interfaces from almost all other analytical software, including SAS, SPSS, JMP, Oracle Data Mining and RapidMiner; existing license holders can thus invoke and use R from within these packages.
  • Huge library of packages for regression, time series, finance and modeling (see the sketch after this list).
  • High-quality data visualization packages.
• Data Mining
  • R as a computing platform is better suited to the needs of data mining, as it has a vast array of packages covering standard regression, decision trees, association rules, cluster analysis, machine learning and neural networks, as well as exotic specialized algorithms such as those based on chaos models.
  • Flexibility in tweaking a standard algorithm by reading its source code.
  • The Rattle GUI remains the standard GUI for data miners using R. It was created and developed in Australia.
• Business Dashboards and Reporting
  • Business dashboards and reporting are an essential piece of business intelligence and decision-making systems in organizations. R offers data visualization through ggplot2, and GUIs like Deducer and Red-R can help even non-R users create a metrics dashboard.
  • For online dashboards, R has packages like Rweb, Rserve and rApache, which in combination with data visualization packages offer powerful dashboard capabilities.
  • R can be combined with Microsoft Excel using the RExcel package, enabling R capabilities to be invoked from within Excel. Thus an Excel user with no knowledge of R can use the GUI within the RExcel plug-in to access powerful graphical and statistical capabilities.
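
As a minimal sketch of the regression and time-series capabilities listed above, here is an example using only functions and datasets that ship with base R (a forecast of the built-in AirPassengers series is purely illustrative):

    # ordinary least-squares regression with base R
    fit <- lm(mpg ~ wt + hp, data = mtcars)
    summary(fit)

    # a simple seasonal ARIMA forecast on the built-in AirPassengers series
    fit.ts <- arima(AirPassengers, order = c(1, 1, 1),
                    seasonal = list(order = c(0, 1, 1), period = 12))
    pred <- predict(fit.ts, n.ahead = 12)   # forecast the next 12 months
    plot(AirPassengers, xlim = c(1949, 1962))
    lines(pred$pred, col = "red")           # overlay the forecast

Contributed packages (for example for finance or machine learning) follow the same pattern: load the package, fit a model object, then summarize, predict or plot it.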

Additional factors to consider in your R installation:

There are some more choices awaiting you now:

1) Licensing choices: academic version, free version or enterprise version of R.

2) Operating system choices: which operating system to choose? Unix, Windows or Mac OS.

3) Operating system sub-choice: 32-bit or 64-bit.

4) Hardware choices: cost-benefit trade-offs for additional hardware for R, and choices between local, cluster and cloud computing.

5) Interface choices: command line versus GUI? Which GUI to choose as the default start-up option?

6) Software component choices: which packages to install? There are almost 3,000 packages; some of them are complementary, some are dependent on each other, and almost all are free (a short example follows this list).

7) Additional software choices: which additional software do you need to achieve maximum accuracy, robustness and speed of computing, and how to use existing legacy software and hardware for best complementary results with R.
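
As flagged in point 6, here is a short sketch of installing and managing packages; the package names are just an illustrative selection, not a recommendation.

    # install a few add-on packages from CRAN (names are only examples)
    install.packages(c("ggplot2", "forecast", "RODBC"))

    library(ggplot2)                       # load a package for the current session
    installed.packages()[, "Package"]      # list what is already installed
    update.packages(ask = FALSE)           # bring installed packages up to date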

1) Licensing choices
You can choose between two kinds of R installations: one is free and open source, available from http://r-project.org. The other is commercial, and is offered by several vendors, including Revolution Analytics.

Commercial vendors of R language products:
1) Revolution Analytics http://www.revolutionanalytics.com/
2) XL Solutions http://www.experience-rplus.com/
3) Information Builders – WebFOCUS RStat (Rattle GUI) http://www.informationbuilders.com/products/webfocus/PredictiveModeling.html
4) Blue Reference – Inference for R http://inferenceforr.com/default.aspx

2) Choosing an operating system

a) Windows

     

Windows remains the most widely used operating system on this planet. If you are experienced in Windows-based computing and are active on analytical projects, it would not make sense for you to move to another operating system. This is also based on the fact that compatibility problems are minimal on Microsoft Windows and the help is extensively documented. However, some R packages may not function well under Windows; if that happens, a multiple-operating-system setup is your next option.

• Enterprise R from Revolution Analytics: Enterprise R provides a complete R development environment for Windows, including the use of code snippets to make programming faster. Revolution is also expected to make a GUI available by 2011. Revolution Analytics claims several enhancements for its version of R, including the use of optimized libraries for faster performance.

b) Mac OS

     

The main reason for choosing Mac OS remains its considerable appeal in aesthetically designed software, but Mac OS is not a standard operating system for enterprise systems or for statistical computing. Open source R is nevertheless quite well optimized on the Mac and can serve existing Mac users. However, there seem to be no commercially supported versions of R available for this operating system as of now.

c) Linux

     

          1. Ubuntu
          2. Red Hat Enterprise Linux
          3. Other versions of Linux

     

Linux is considered the preferred operating system by many R users because it shares the same open source credentials, is a much better fit for all R packages, and is customizable for big data analytics.

Ubuntu Linux is recommended for people making the transition to Linux for the first time. Ubuntu had a marketing agreement with Revolution Analytics for an earlier release, and many R packages can be installed in a straightforward way because Ubuntu/Debian packages are available. Red Hat Enterprise Linux is officially supported by Revolution Analytics for its enterprise module. Another popular version of Linux is openSUSE.

d) Multiple operating systems: virtualization vs dual boot

     

You can choose between running VMware Player with a virtual machine on your computer that is dedicated to R-based computing, or choosing the operating system at startup (dual boot). A software program called Wubi helps with the dual installation of Linux and Windows.

3) 64-bit vs 32-bit: Given a choice between 32-bit and 64-bit versions of the same operating system, such as Ubuntu Linux, the 64-bit version can speed up processing by an approximate factor of two. However, you need to check whether your current hardware supports 64-bit operating systems, and if so, you may want to ask your IT manager to upgrade at least some of the machines in your analytics work environment to 64-bit operating systems.

     

4) Hardware choices: At the time of writing this book, the dominant computing paradigm is workstation computing, followed by server-client computing. However, with the introduction of cloud computing, netbooks and tablet PCs, hardware choices are much more flexible in 2011 than just a couple of years back.

Hardware is a significant cost in an analytics environment, and it also depreciates remarkably quickly. You should therefore examine your legacy hardware and your future analytical computing needs, and decide accordingly between the various hardware options available for R.
Unlike other analytical software, which may charge by number of processors, price servers higher than workstations, or price grid computing extremely high (if it is available at all), R is well suited to all kinds of hardware environments with flexible costs. Because R is memory intensive (it limits the size of data analyzed to the RAM of the machine unless special formats and/or chunking are used), hardware needs depend on the size of the datasets used and the number of concurrent users analyzing them. Thus the defining issue is not R but the size of the data being analyzed.
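
Because of this RAM constraint, it is worth checking how much memory your objects actually occupy before sizing hardware. A small sketch using only base R functions:

    x <- rnorm(1e6)                        # one million doubles
    print(object.size(x), units = "Mb")    # roughly 8 Mb in memory

    gc()                # report current memory usage and trigger garbage collection
    # memory.limit()    # on Windows builds, the RAM ceiling for the R process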

a) Local computing: this denotes the case where the software is installed locally. For big data, the data to be analyzed would be stored in databases.
  • Server version: Revolution Analytics has differential pricing for server and client versions, but the open source version is free and identical for server and workstation use.
  • Workstation
b) Cloud computing: cloud computing is the delivery of data, processing and systems via remote computers. It is similar to server-client computing, but the remote server (also called the cloud) offers flexible computing in terms of number of processors, memory and data storage. Cloud computing in the form of a public cloud enables people to run analytical tasks on massive datasets without investing in permanent hardware or software, as most public clouds are priced on a pay-per-use basis. The biggest cloud computing provider is Amazon, and many other vendors provide services on top of it. Google is also entering the space with data storage in the form of clouds (Google Storage) as well as machine learning in the form of an API (the Google Prediction API).
  • Amazon
  • Google
  • Cluster/grid computing and parallel processing: to build a cluster you would need the Rmpi and snow packages, among other packages that help with parallel processing (a small sketch follows this outline).
c) How much in the way of resources:
  • RAM, hard disk and processors for workstation computing
  • Instances or API calls for cloud computing

5) Interface choices
  • Command line
  • GUI
  • Web interfaces

6) Software component choices
  • R dependencies
  • Packages to install
  • Recommended packages

7) Additional software choices
  • Additional legacy software
  • Optimizing your R-based computing
  • Code editors
  • Code analyzers
  • Libraries to speed up R
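
As referenced in the cluster/grid item above, here is a minimal sketch of socket-based parallelism with the snow package (it assumes snow is installed; Rmpi would back an MPI cluster instead):

    library(snow)

    cl  <- makeCluster(4, type = "SOCK")           # four local worker processes
    res <- parLapply(cl, 1:8, function(i) i^2)     # evaluate the function on the workers
    stopCluster(cl)                                # shut the workers down

    unlist(res)    # 1 4 9 16 25 36 49 64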

Citation: R Development Core Team (2010). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. ISBN 3-900051-07-0. URL http://www.R-project.org.

    (Note- this is a draft in progress)

2011 Forecast-ying


I had recently asked some friends from my Twitter lists for their take on 2011; at least 3 of them responded with an answer, 1 said they were still working on it, and 1 begged off, citing a recent office event.

Anyway, I take note of this view of forecasting from

    http://www.uiah.fi/projekti/metodi/190.htm

    The most primitive method of forecasting is guessing. The result may be rated acceptable if the person making the guess is an expert in the matter.

Ajay: People will forecast at the end of 2010 and through 2011. Many of them will get their forecasts wrong, some very wrong, but by December 2011 most of them will be writing forecasts for 2012. Almost no one will get called out by irate users or readers ("hey, you got 4 out of 7 wrong in last year's forecast!"); it just won't happen. People thrive on hope. So does marketing. In 2011, and before.

And here are some forecasts from Tom Davenport's International Institute for Analytics (IIA), at

    http://iianalytics.com/2010/12/2011-predictions-for-the-analytics-industry/

    Regulatory and privacy constraints will continue to hamper growth of marketing analytics.

(I wonder how privacy and analytics can coexist in peace forever. One view is that model building can use anonymized data: suppose your IP address were anonymized using a standard, secret, Coca-Cola-style formula; then whatever model gets built would not be of concern to you individually, as your privacy is protected by the anonymization formula.)

    Anyway- back to the question I asked-

What are the top 5 events in your industry (events as in things that occurred, not conferences), and what are the top 3 trends for 2011?

I define my industry as online technology writing and research (with a heavy skew towards statistical computing).

    My top 5 events for 2010 were-

1) Consolidation: The big 5 software providers in BI and analytics bought more, sued more, and consolidated more. Valuations rose, and rose, leading to even more small players entering. Thus consolidation proved an oxymoron, as the total number of influential AND disruptive players grew.

     

2) Cloudy computing: Computing shifted away from the desktop, but more to the mobile and the tablet than to the cloud. iPad front end with an Amazon EC2 back end? Yup, it happened.

3) Open source grew louder: Yes, it got more clients, and more revenue. Did it get more market share? That depends on whether you define market share by revenues or by users.

Both open source and closed source had a good year: the pie grew faster and bigger, so no one minded as long as their slices grew bigger.

4) We didn't see that coming:

Technology continued to surprise us with events (that's what we love! the surprises).

Revolution Analytics broke through R's big data barrier, Tableau Software created a big buzz, and WikiLeaks and Chinese firewalls gave technology an entirely new dimension (though not a universally popular one).

People fought wars over email, servers and social media; unfortunately, those fighting real wars in 2009 continued to fight them in 2010 too.

5) Money:

SAP, SAS, IBM, Oracle, Google and Microsoft made more money than ever before. Only Facebook got a movie named after it. Venture capitalists pumped money into promising startups, really as if in a hurry to park money before tax cuts expired in some countries.

     

    2011 Top Three Forecasts

1) Surprises: Expect to get surprised at least 10% of the time by business events. As the internet grows, the communication cycle shortens and the hype cycle amplifies buzz; more unstructured data is created (especially for marketing analytics), leading to enhanced volatility.

2) Growth: Yes, we predict technology will grow faster than the automobile industry. Game changers may happen in the form of Chrome OS (really, it's Linux, guys) and customer adaptability to new USER INTERFACES. Design will matter much more in technology, on your phone, on your desktop and on your internet. Packaging sells.

False top trend 3) I will write a book on business analytics in 2011. Yes, it is true, and I am working with a publisher. No, it is not really going to be a top 3 event for anyone except me, the publisher, and the lucky folks who read it.

3) Creating technology and technically enabling creativity will converge at an accelerated rate. The use of widgets, GUIs, snippets and IDEs will ensure that creative left brains can code more easily, and right brains can design faster and better thanks to a global supply chain of techie and artsy professionals.

     

     

Playing with Playwith – an R Package for Interactive Data Visualizations

    While just browsing through Google Code repositories for R Packages-

    https://code.google.com/hosting/search?q=label:R

I came across playwith, which is basically a toolkit for creating interactive data visualizations. I then played with its ClusterApp (hierarchical clustering) and it really seems promising. Since I am using R 2.12 on the Windows 7 (x64) platform something broke, but overall this seemed like a promising widget-making interactive tool.

    playwith is an R package, providing a GTK+ graphical user interface for editing and interacting with R plots.

    The playwith package is maintained by Felix Andrews <felix@nfrac.org>

Here is the data visualization called ClusterApp that impressed me. There is an obvious synergy between Rattle and playwith (though some bugs with the new R 2.12 on x64 do come into play):

https://code.google.com/p/playwith/wiki/ClusterApp
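
For anyone who wants to try it, a hedged sketch of basic playwith usage (it assumes the playwith package and its GTK+ dependencies are installed; ClusterApp itself is described on the project wiki page linked above):

    library(playwith)
    library(lattice)

    # wrap an ordinary plot call to get an interactive window
    # with zoom, identify and annotation tools
    playwith(xyplot(mpg ~ wt, data = mtcars, groups = cyl))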

Enterprise Linux rises rapidly: New Report


A new report from the Linux Foundation found significant growth trends in enterprise usage of Linux, which should be welcome news to software companies that offer Linux versions of their software, to service providers that offer Linux-based consulting (note: less competition, lower overheads), and to application creators.

    From –

    http://www.linuxfoundation.org/news-media/announcements/2010/10/new-linux-foundation-user-survey-shows-enterprise-linux-achieve-sig

    Key Findings from the Report
    • 79.4 percent of companies are adding more Linux relative to other operating systems in the next five years.

    • More people are reporting that their Linux deployments are migrations from Windows than any other platform, including Unix migrations. 66 percent of users surveyed say that their Linux deployments are brand new (“Greenfield”) deployments.

    • Among the early adopters who are operating in cloud environments, 70.3 percent use Linux as their primary platform, while only 18.3 percent use Windows.

    • 60.2 percent of respondents say they will use Linux for more mission-critical workloads over the next 12 months.

    • 86.5 percent of respondents report that Linux is improving and 58.4 percent say their CIOs see Linux as more strategic to the organization as compared to three years ago.

    • Drivers for Linux adoption extend beyond cost: technical superiority is the primary driver, followed by cost and then security.

    • The growth in Linux, as demonstrated by this report, is leading companies to increasingly seek Linux IT professionals, with 38.3 percent of respondents citing a lack of Linux talent as one of their main concerns related to the platform.

    • Users participate in Linux development in three primary ways: testing and submitting bugs (37.5 percent), working with vendors (30.7 percent) and participating in The Linux Foundation activities (26.0 percent).

    and from the report itself-

    download here-

    http://www.linuxfoundation.org/lp/page/download-the-free-linux-adoption-trends-report