TwitterFox – Twitter for busy people

Here is a nice Firefox plugin for people who want to start using Twitter without losing too much time. It sits quietly in one corner of the browser and surfaces tweets gently – think of it as instant messaging scaled up to your whole list of followers, for people who need to use Twitter but are short on time. The screenshot says it all; all you need to do is run Firefox and install it from http://twitterfox.net/.

Highly recommended for non-users of Twitter who are curious about what this thing is all about.

Screenshots courtesy of myself and the gentlepeople at http://twitterfox.net/.


TwitterFox is a Firefox extension that notifies you of your friends’ tweets on Twitter.

This extension adds a tiny icon on the status bar which notifies you when your friends update their tweets. Also it has a small text input field to update your tweets.

Install TwitterFox

If you want to get updates of TwitterFox, feel free to follow @TwitterFox.

New Features and Changes in Version 1.7.7.1

  • Supported Firefox 3.1b3
  • Added a context menu to each tweet with:
    • Copy
    • Re-tweet
    • Open this tweet in new tab
    • Delete tweet
  • Auto extract is.gd and bit.ly links.
  • Added Mark all as read menu item to main context menu.
  • Increased contrast of background color of read/unread messages.
  • Added in-reply-to-status-id parameter for status update.
  • Added da-DK, th-TH, vi-VN, ar-SA, ar, and kw-GB translations.
  • Bug fixes.

Interview KNIME Fabian Dill

We have covered KNIME.com‘s open source platform earlier. On the eve of its new product launch, KNIME.com co-founder Fabian Dill reveals his thoughts in an exclusive interview.

From the Knime.com website

The modular data exploration platform KNIME, originally solely developed at the University of Konstanz, Germany, enables the user to visually create data flows – or pipelines, execute selected analysis steps, and later investigate the results through interactive views on data and models. KNIME already has more than 2,000 active users in diverse application areas, ranging from early drug discovery and customer relationship analysis to financial information integration.

Ajay – What prompted you personally to be part of KNIME and not join a big technology company? What does the future hold for KNIME in 2009-10?

Fabian – I was excited when I first joined the KNIME team in 2005. Back then, we were working exclusively on the open source version backed by some academic funding. Being part of the team that put together such a professional data mining environment from scratch was a great experience. Growing this into a commercial support and development arm has been a thrill as well. The team and the diverse experiences gained from helping get a new company off the ground and being involved in everything it takes to enable this to be successful made it unthinkable for me to work anywhere else.

We continue to develop the open source arm of KNIME and many new features lie ahead: text, image, and time series processing as well as better support for variables. We are constantly working on adding new nodes. KNIME 2.1 is expected in the fall and some of the ongoing development can already be found on the KNIME Labs page (http://labs.knime.org).

The commercial division is providing support and maintenance subscriptions for the freely available desktop version. At the same time we are developing products which will streamline the integration of KNIME into existing IT infrastructures:

  • the KNIME Grid Support lets you run your compute-intensive (sub-) workflows or nodes on a grid or cluster;

  • KNIME Reporting makes use of KNIME’s flexibility in order to gather the data for your report and provides simplified views (static, or interactive dashboards) of the resulting workflow and its results; and

  • the KNIME Enterprise Server facilitates company-wide installation of KNIME and supports collaboration between departments and sites by providing central workflow repositories, scheduled and remote execution, and user rights management.

Ajay -Software as a service and Cloud Computing is the next big thing in 2009. Are there any plans to put KNIME on a cloud computer and charge clients for the hour so they can build models on huge data without buying any hardware but just rent the time?

Fabian – Cloud computing is an agile and client-centric approach and therefore fits nicely into the KNIME framework, especially considering that we are already working on support for distributed computing of KNIME workflows (see above). However, we have no immediate plans for KNIME workflow processing on a per-use charge or similar. That’s an interesting idea, though. The way KNIME nodes are nicely encapsulated (and often even distributable themselves) would make this quite natural.

Ajay – What differentiates KNIME from other products such as RPro and Rapid Miner, for example? What are the principal challenges you have faced in developing it? Why do customers like and dislike it?

Fabian – Every tool has its strengths and weaknesses depending on the task you actually want to accomplish. The focus of KNIME is to support users in their quest to understand large, heterogeneous data and make sense of it. For this task you cannot rely only on classical data mining techniques wrapped in a command line or otherwise configurable environment; simple, intuitive access to those tools is required, along with support for visual exploration using interactive linking and brushing techniques.

By design, KNIME is a modular integration platform, which makes it easy to write your own nodes (with the easy-to-use API) or integrate existing libraries or tools.

We integrated Weka, for example, because of its vast library of state-of-the-art machine learning algorithms, the open source program R – in order to provide access to a rich library of statistical functions (and of course many more) – and parts of the Chemistry Development Kit (CDK). All these integrations follow the KNIME requirements for easy and intuitive usage so the user does not need to understand the details of each tool in great depth.

A number of our commercial partners, such as Schroedinger, Infocom, Symyx, and Tripos, among others, also follow this paradigm and similarly integrate their tools into KNIME. Academic collaborations with ETH Zurich, Switzerland on the High Content Screening Platform HC/DC represent another positive outcome of this open architecture. We believe that this strictly result-oriented approach, based on a carefully designed and professionally coded framework, is a key factor in KNIME’s broad acceptance. I guess this is another big differentiator: right from the start, KNIME has been developed by a team of software developers with decades of industrial software engineering experience.

Ajay – Are there any Asian plans for KNIME? Any other open source partnerships in the pipeline?

Fabian – We have a Japan-based partner, Infocom, who operates in the field of life sciences. But we are always open to other partnerships, supporters, or collaborations.

In addition to the open source integrations mentioned above (Weka, R, CDK, HC/DC), there are many other different projects in the works and partnerships under negotiation. Keep an eye on our blog and on our Labs@KNIME page (labs.knime.org).

ABOUT

KNIME – development started in January 2004. Since then: 10 releases; approx. 350,000 lines of code; 25,000 downloads; an estimated 2000 active users. KNIME.com was founded in June 2008 in Zurich, Switzerland.

Fabian Dill – has been working for and with KNIME since 2005; co-founder of KNIME.com.

Interview Visual Numerics Alicia McGreevey


Here is an interview with the head of marketing of Visual Numerics, Alicia McGreevey.

Visual Numerics® is the leading provider of data analysis software, visualization solutions and expert consulting for technical, business and scientific communities worldwide (see http://www.vni.com ).

Ajay – Describe your career in science so far. How would you explain embeddable analytics to a high school student who has to decide between getting an MBA or a science degree?

Alicia – I think of analytics as analyzing a situation so you can make a decision. To do that objectively, you need data about your situation. Data can be anything: foreign currency exchange rates, the daily temperature here in Houston, or Tiger Woods’ record at the Masters tournament when he’s not leading after the 3rd round.

Embedding analytics is simply making the analysis part of an application close to, or embedded with, your data. As an example, we have a customer in Germany, GFTA (Gesellschaft Fuer Trendanalysen), who has built an application that embeds analytics to analyze historic and live tick foreign exchange rate data. Their application gives treasuries and traders predictions on what is about to happen to exchange rates so they can make good decisions on when to buy or sell.

Embedding analytics is as much a business discipline as it is science. Historically, our analytics have been used predominantly by the government and scientific community to perform heavy science and engineering research. As business intelligence becomes increasingly important to compete in today’s marketplace, our analytics can now be found driving business decisions in industries like financial services, healthcare and manufacturing. Partners like Teradata and SAP are embedding our analytics into their software as a way to extend their current offerings. As their customers demand more custom BI solutions to fit unique data sets, our analytics provide a more affordable approach to meet that need. Customers now have an option to implement custom BI without incurring the massive overhead that you would typically find in a one-size-fits-all solution.

If you’re a student, I’d recommend you invest time and course work in the area of analytics regardless of the discipline you choose to study. The term analytics is really just a fancy term for math and statistics. I’ve taken math and statistics courses as part of a science curriculum and as part of a business curriculum. Being able to make optimal decisions by objectively analyzing data is a skill that will help you in business, science, engineering, or any area.

Ajay – You have been working behind the scenes quietly building math libraries that power many partners. Could you name a few success stories so far?

Alicia – One of the most interesting things about working at Visual Numerics is our customers. They create fascinating analytic applications using mathematic and statistical functions from our libraries. A few examples:

  • Total, who you probably know as one of the world’s super major oil companies, uses our math optimization routines in an application that automatically controls the blending of components in the production of gasoline, diesel and heavy fuels. By making best use of components, Total helps minimize their refining costs while maximizing revenue.

  • The Physics Department at the University of Kansas uses nonlinear equation solvers from our libraries to develop more efficient particle beam simulations. By simulating the behavior of particle beams in particle accelerators, scientists can better design particle accelerators, like the LHC or Large Hadron Collider, for high-energy research.

  • A final example that I think is interesting, given the current economic situation, is from one of our financial customers, RiskMetrics Group. RiskMetrics uses functions from our libraries to do financial stress testing that allows portfolio fund managers to simulate economic events, like the price of oil spiking 10% or markets diving 20%. They use this information to predict impacts on their portfolio and make better decisions for their clients.
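The stress-testing idea described above, shocking a portfolio with events like a 10% oil spike or a 20% market dive, can be sketched in a few lines of Python. This is an illustrative toy, not RiskMetrics' or IMSL's actual API; the portfolio value and sensitivity numbers are made up for the example:

```python
import random

# Hypothetical portfolio exposed to equity markets and to oil prices.
# Betas are illustrative assumptions, not real sensitivities.
portfolio_value = 1_000_000.0
equity_beta = 0.8   # portfolio moves 0.8% for every 1% market move
oil_beta = -0.2     # portfolio loses 0.2% for every 1% rise in oil

def stressed_value(market_shock, oil_shock):
    """Re-price the portfolio under given fractional shocks."""
    pnl_pct = equity_beta * market_shock + oil_beta * oil_shock
    return portfolio_value * (1 + pnl_pct)

# Deterministic scenarios from the interview: markets -20%, oil +10%.
print(stressed_value(market_shock=-0.20, oil_shock=0.0))
print(stressed_value(market_shock=0.0, oil_shock=0.10))

# A simple Monte Carlo around those scenarios: jitter each shock and
# look at the tail of outcomes across many simulated paths.
random.seed(42)
outcomes = [
    stressed_value(random.gauss(-0.20, 0.05), random.gauss(0.10, 0.03))
    for _ in range(10_000)
]
outcomes.sort()
print("5th-percentile stressed value:", round(outcomes[500], 2))
```

The real systems re-price every instrument rather than using two linear betas, but the structure, defined shocks plus simulation around them, is the same idea.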

Ajay – What have been the key moments in Visual Numerics’ path so far?

Alicia – Our company has been in business for over 38 years, rooted in the fundamentals of mathematics and statistics. It started off as IMSL, offering IMSL Numerical Libraries as a high performance computing tool for numerical analysis. Before visualization was fashionable, we saw visualization as an important part of the data analysis process. As a result, the company merged with Precision Visuals, makers of PV-WAVE (our visual data analysis product) in the 1990s to become what is now known as Visual Numerics.

Looking back at recent history, a major event for Visual Numerics was definitely when SAP AG licensed the libraries at the end of 2007. For several years leading up to 2007, we’d seen increased interest in our libraries from independent software vendors (ISVs). More and more ISVs with broad product offerings were looking to provide their customers with analytic capabilities, so we had invested considerably in making the libraries more attractive to this type of customer. Having SAP, one of the largest and most respected ISVs in the world, license our products gave us confidence that we could be a valued OEM partner to this type of customer.

Ajay – What are the key problems you face in your day-to-day job as a Visual Numerics employee? How do you have fun when not building math libraries?

Alicia – In marketing, our job is to help potential users of our libraries understand what it is we offer so that they can determine if what we offer is of value to them. Often the hardest challenge we face is simply finding that person. Since our libraries are embeddable, they’ve historically been used by programmers. So we’ve spent a lot of time at developer conferences and sponsoring developer websites, journals and academic programs.

One product update this year is that we’ve made the libraries available from Python, a dynamic scripting language. Making IMSL Library functions available from Python basically means that someone who is not a trained programmer can now use the math and stats capabilities in the IMSL Libraries just like a C, Java, .Net or Fortran developer. It’s an exciting development, though it brings with it the challenge of letting a whole new set of potential users know about the capabilities of the libraries. It’s a fun challenge though.
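To give a flavor of what calling library math and stats routines from a scripting language looks like, here is a minimal sketch using Python's standard `statistics` module as a stand-in. The actual IMSL Python bindings have their own API, which is not shown here, and the sales figures are made-up illustrative data:

```python
import statistics

# Quarterly sales for a hypothetical product line (made-up numbers).
sales = [102.5, 98.3, 110.1, 95.7, 120.4, 99.8, 105.2, 112.9]

# One-liners that would otherwise need C or Fortran boilerplate:
mean = statistics.mean(sales)
stdev = statistics.stdev(sales)    # sample standard deviation
median = statistics.median(sales)

print(f"mean={mean:.2f}  stdev={stdev:.2f}  median={median:.2f}")

# Quartiles of the same series, for a quick distribution summary:
q1, q2, q3 = statistics.quantiles(sales, n=4)
print(f"quartiles: {q1:.2f}, {q2:.2f}, {q3:.2f}")
```

The point Alicia makes is exactly this: a scripting user gets tested numerical routines in one line each, with no compilation or memory management.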

On the more fun side of things, you may be interested to know that our expertise in math and statistics led us to some Hollywood fame. At one point we were selected to review scripts for the crime-busting drama NUMB3RS, which aired on CBS in the US and features an FBI Special Agent who recruits his brilliant mathematician brother to use mathematics, with its complex equations, to solve the trickiest crimes in Los Angeles. So yes, the math behind the show is real, and it is exciting indeed to see how math can be applied in all aspects of our lives, including ferreting out criminals on TV!

Ajay – What is the story ahead? How do you think Visual Numerics can help demand forecasting and BI say BYE to the recession?

Alicia – We’re seeing more success stories from customers using analytics and data to make good decisions, and I think the more organizations leverage analytics, the faster we’ll emerge from this economic slump.

As an example, we have a partner, nCode International, who makes software to help manufacturers collect and analyze test data and use the analysis to make design decisions. Using it, automobile manufacturers can, for example, analyze real-world driving pattern data for different geographic areas (e.g., emerging markets like China and India versus established markets like the USA and Europe) and design the perfect vehicle for specific markets.

So the analytic successes are out there and we know that organizations have multitudes of data. Certainly every organization that we work with has more data today than ever before. For analytics to help us say Bye to the recession, I think we need to continue to promote our successes, make analytic tools available to more users, and get users across multiple disciplines and industries using analytics to make the best possible decisions for their organizations.

Personal Biography:

As Director of Marketing for Visual Numerics, Alicia is an authority on how organizations are using advanced analytics to improve performance. Alicia brings over 15 years of experience working with scientists and customers in the planning and development of new technology products and developing go to market plans. She has a B.A. in Mathematics from Skidmore College and an M.B.A. from the University of Chicago Booth School of Business.

Does Twitter reduce Blogging ?

One more post on Twitter, you may sigh, but wait. I am examining Twitter as an economic complement or substitute to blogging, and trying to come up with a testable rule to disprove the null hypothesis:

Twitter does not affect the blogging of individuals or communities as a whole. Or does it?

Twitter reduces blogging because

  1. Twitter is easier to do. Creating a blog is a different ball game.

  2. Tweeting is two-way and interactive, while blogging is mostly a one-way broadcast.

  3. People respond to tweets and re-tweet them much more than they comment on or forward blog posts. This is due to the inherent design of the software.

  4. Twitter is chaotic, but so is real life, in which the human brain processes different information from colleagues, family, and friends and sorts it all. Blogging has a structure which helps the reader more than the writer.

  5. It is easier to tweet and faster to get your point across than in Blogging.

  6. People allocate a set amount of time for social media activities and personal branding. This may be elastic, but not totally so. Hence the rise of Twitter time in people’s lives would mean less time to read and write blogs.

Now to a more quantitative study.
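Before looking at the aggregate statistics, it is worth sketching what a formal test of the null hypothesis would look like: take per-blogger posting frequency before and after Twitter adoption and run a paired t-test on the differences. A minimal sketch in Python, with synthetic illustrative numbers rather than the Technorati data:

```python
import math
import statistics

# Monthly blog posts for ten hypothetical bloggers, before and after
# they started tweeting (synthetic numbers purely for illustration).
before = [12, 9, 15, 8, 11, 14, 10, 13, 7, 12]
after  = [10, 9, 12, 8, 9, 11, 10, 12, 6, 10]

# Paired t-test on the differences: H0 says the mean difference is zero,
# i.e. Twitter does not change posting frequency.
diffs = [b - a for b, a in zip(before, after)]
n = len(diffs)
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)
t_stat = mean_d / (sd_d / math.sqrt(n))

print(f"mean difference: {mean_d:.2f} posts/month, t = {t_stat:.2f}")
# Compare t against the critical value for n-1 = 9 degrees of freedom
# (about 2.26 at the 5% level, two-sided) to decide whether to reject H0.
```

With real per-blogger panel data this is the cleanest way to separate a Twitter effect from overall blogosphere growth; the aggregate charts below can only hint at it.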

We get statistics from Technorati’s State of the Blogosphere and add in WordPress stats to boot.

(credit -http://technorati.com/blogging/state-of-the-blogosphere/ )

A chart of total WordPress.com blogs since  launch:

(credit- http://en.wordpress.com/stats/ )

Note new signups can be seen for WordPress.com at http://en.wordpress.com/stats/signups/

Fatigue could be a reason why Twitter is heating up while blogging sees steady-state growth.

The following figure from Technorati’s 2008 report sums it up best.

http://technorati.com/blogging/state-of-the-blogosphere/who-are-the-bloggers/

But if I compare the June 2008 blogging frequency numbers with the 2007 report, the numbers are not directly comparable.

(Source -http://technorati.com/blogging/state-of-the-blogosphere/the-how-of-blogging/ )

http://www.sifry.com/alerts/archives/000493.html

It seems that blog posts did get a boost from the 2008 elections, and the current low traffic may simply be due to a lack of issues in the blogosphere. The rise in Twitter traffic is also due to the creation of applications by third-party providers, a trend that has made Twitter the number 3 social media site.

Based on the data, it does not seem Twitter reduces blog posts to a significant degree. After all, Twitter is also a great medium to disseminate or spread the word on good blog posts.

It is simply too early to say that Twitter is reducing blogging, though there seem to be clear trends along those lines.

What about you? If you are a blogger, is your blog post frequency affected by your tweeting activities?

iBi – Business Intelligence Applications on the iPhone

From the press release at QlikTech at http://www.qlikview.com/Contents.aspx?id=9836

They actually end up promoting Oracle’s mobile BI app even though they are trying to bash it.

QlikTech, the world’s fastest-growing Business Intelligence (BI) company, today announced the immediate availability of QlikView for iPhone, the very first truly interactive mobile BI app built specifically for the iPhone. Unlike Oracle’s mobile BI offering that features a rigid interface and limited functionality, QlikView for iPhone fully leverages iPhone’s multitouch and GPS features to deliver QlikView’s renowned, industry-defining interactive capabilities. The result is a groundbreaking app that puts the power of sophisticated, real-time business answers in the hands of mobile users worldwide. It can be downloaded for free from Apple’s Mobile App Store on iTunes.

Product Highlights:

  • Interactive – click through line items on a list box or chart to get to answers, going deep into regional or product data.
  • Coverflow – flip through relevant business analysis, make a new selection, and those changes are instantly reflected throughout.
  • GPS-enabled – automatically delivers local customer sales, service or inventory data as reps approach a customer or supplier facility.
  • Feature-rich – use Search, Bookmark and Shake to Erase.

Smarter, Faster, Real-Time Interactive Analysis
Mobile professionals need access to comprehensive, real-time information, not static reports that lack detail from offerings like Oracle’s mobile BI tool. With QlikView for iPhone, salespeople can drill deep into accounts and get granular, up-to-the minute answers and analysis that help them do their job better. From specific customer or product data, down to a single SKU or employee name, QlikView for iPhone gets users what they need, the moment they need it.

“We comprehensively surveyed the BI mobile landscape and it was clear all previous attempts at addressing user needs failed miserably,” said Anthony Deighton, SVP Product, QlikTech.   “Just posting a static report on a mobile screen, as Oracle’s solution does, may be marginally helpful, but creates a tremendously frustrating user experience, leaving no opportunity to interact with the data. With QlikView for iPhone, users get a mobile view of a relevant data subset, as well as access to the specific answers they seek. This interactive dynamic is the only way to truly fulfill the promise of mobile BI.”

The Only Mobile BI Tool with Multitouch, Coverflow and GPS Integration
QlikView for iPhone takes full advantage of the iPhone’s native interface. The entire application is multitouch driven with complete implementation of the iPhone finger gestures users are accustomed to. Simple finger-swipes and finger-pinches enable users to select, interact and drill down into data. And to clear selections, all users have to do is shake the device. Apple’s popular 3-D Coverflow feature is also enabled, allowing users to “flip” through analyses in the same way they would through album covers and artists in iTunes. Real-time data changes are also instantly reflected in every Coverflow chart.

And here is the actual Oracle application

http://www.oracle.com/appserver/business-intelligence/business-indicators.html

Enhance Productivity for Mobile Business Users
Oracle Business Indicators is the first in a series of business applications for delivering Oracle business information to the Apple iPhone. The application provides mobile business users with real-time, secure access to business performance information on one of the industry’s most exciting and engaging mobile devices – Apple iPhone.

Oracle Business Indicators allows users to view and interact with Oracle Business Intelligence (BI) Applications that include financial, human resources, supply chain, and customer relationship management analytics, as well as analytical alerts generated by Oracle Delivers, an integrated component of Oracle Business Intelligence Enterprise Edition Plus (OBIEE). Taking full advantage of the Apple iPhone mobile platform, Oracle Business Indicators is built as a native application to offer highly intuitive and flexible features including browse, search, and favorites for a superior overall end user experience.

BENEFITS
* Pre-defined business indicators-Pre-built metrics and reports include financial, human resources, supply chain, and customer relationship management analytics.
* Timely alerts on exception conditions-Enables the mobile user to review alerts generated by conditions pre-defined in Oracle Delivers. A user can select an alert entry and immediately review an associated analytic report.
* Superior user experience-Offers a highly intuitive user interface for browsing, searching, and locating business performance metrics.
* Robust security-Based on the same user security model as Oracle BI Applications. Also supports Secure Sockets Layer (SSL) encryption technology.

SAS commits $70 million to Cloud Computing

From the official SAS website

http://www.sas.com/news/preleases/CCF2009.html

SAS to build $70 million cloud computing facility

New cloud computing facility will support needed data-intensive customer solutions

CARY, NC  (Mar. 19, 2009)  –  SAS, the leader in business analytics software and services, announces today it is building a 38,000-square-foot cloud computing facility to provide the additional data-handling capacity needed to expand SAS’ OnDemand offerings and hosted solutions.

As the need for hosted solutions grows, new research and development jobs will be generated at SAS’ Cary, N.C., world headquarters, where the majority of R&D employees (more than 1,400) are located.

“This project is proof that, despite the down economy, SAS continues to grow and innovate,” said Jim Goodnight, CEO of SAS. “The growing demand by our customers for hosted solutions has given us this opportunity to invest even further in North Carolina and the Cary community.”

In keeping with SAS’ commitment to protecting the environment, the facility will be built to Leadership in Energy and Environmental Design (LEED) standards for water and energy conservation. The sustainable construction methods encourage recycling of materials, similar to the Executive Briefing Center under construction on the Cary campus. SAS’ first LEED building, SAS Canada’s headquarters in Toronto, opened in April 2006.

In keeping with LEED standards, about 60 percent of the project’s construction and equipment spending will be in North Carolina.  Approximately 1,000 people will be involved in its design and construction.

The facility will include two 10,000-square-foot server farms. Server Farm 1 is anticipated to be on-line mid-2010 and support growth for three to five years.  Server Farm 2 will be constructed as a shell and will be populated with mechanical and electrical infrastructure once Server Farm 1 reaches 80 percent capacity.  The facility will be built on SAS’ Cary campus.

Apparently SAS Institute believes in creating jobs (and thousands of them) during the recession! Jim is clearly in top intellectual shape despite his, err, vintage. Imagine: with just a browser you could be crunching billions of bytes of data from a beach in Goa! Thankfully they did not believe the hot air that McKinsey put out on cloud computing (read here: http://smartdatacollective.com/Home/17942 ).

McKinsey attacks Cloud Computing as making no sense

McKinsey, that fine think tank of intellectuals, recently dubbed cloud computing as not making sense, thus trying to throttle in its infancy a paradigm that could make companies across the world more competitive by helping cut costs precisely when they need it most. Framing the question as virtualization rather than remote computing is another attempt to cloud the air rather than clear it. Most consulting companies would have pointed out industry affiliations and disclaimers on which companies they represent or have represented.

Read other comments at the NYT article

Its study uses Amazon.com’s Web service offering as the price of outsourced cloud computing, since its service is the best-known and it publishes its costs. On that basis, according to McKinsey, the total cost of the data center functions would be $366 a month per unit of computing output, compared with $150 a month for the conventional data center. “The industry has assumed the financial benefits of cloud computing and, in our view, that’s a faulty assumption,” said Will Forrest, a principal at McKinsey, who led the study.

My take on this is here-

Cloud computing will have lower costs as economies of scale kick in, as they did for nearly all technologies. McKinsey partners must be having a hard time meeting their annual bonuses if they have not factored this basic assumption into their cost projections. Cloud computing converts computing into a mass infrastructure, replacing the present scenario where you pay annual licenses for software that you use for less than 60% of the day, and buy hardware that is obsolete in 3-4 years, which of course gives accountants a reason to help you with depreciation and tax benefits. Renting a computer in the sky is simpler, and you would not need any consultant to advise you on what configuration you need.
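The utilization argument can be made concrete with back-of-the-envelope arithmetic on McKinsey's own quoted figures ($366 vs $150 per unit of computing output per month). The point is that the in-house number prices capacity, not usage; the utilization levels below are illustrative assumptions, not from the report:

```python
# McKinsey's quoted monthly costs per unit of computing output.
cloud_cost_per_unit = 366.0       # pay only for what you use
datacenter_cost_per_unit = 150.0  # pay for capacity whether used or not

# If owned capacity is used only a fraction of the time, the effective
# cost per *used* unit of the in-house data center is:
def effective_datacenter_cost(utilization):
    return datacenter_cost_per_unit / utilization

for u in (1.0, 0.6, 0.4, 0.2):
    print(f"utilization {u:.0%}: in-house "
          f"${effective_datacenter_cost(u):.0f} vs cloud "
          f"${cloud_cost_per_unit:.0f} per used unit")

# Break-even utilization: below this, the cloud is cheaper per used unit.
break_even = datacenter_cost_per_unit / cloud_cost_per_unit
print(f"break-even utilization: {break_even:.0%}")
```

On these numbers the in-house data center only wins if it runs above roughly 41% utilization, well above the sub-60%-of-a-day usage described above, and that is before any cloud economies of scale pull the $366 figure down.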

McKinsey has deep ties to the outsourcing industry in India, from their seminal paper in 1999, to their first concept knowledge center that helped start it, to their alumni across the outsourcing sector, a mutually symbiotic relationship particularly in business research. Cloud computing actually helps virtual teams: with no need for server farms and IT bureaucracies, Indian outsourcing firms, along with direct users in America, can reduce a lot of costs. The intermediaries and consultants would be affected the most.

Indeed, I am speaking at Cloud Slam ’09 precisely on how cloud computing can help lower the digital divide by giving high-powered computing to anyone with a thin-client laptop and a browser. Developing countries need access to HPC to plan their resources and growth in an environmentally optimized manner.

http://www.decisionstats.com