McKinsey attacks Cloud Computing as making no sense

McKinsey, that fine think tank of intellectuals, recently dubbed cloud computing as not making sense, thus trying to throttle in its infancy a paradigm that could make companies across the world more competitive by helping them cut costs precisely when they need it most. Framing the debate around virtualization rather than remote computing is another attempt to cloud the air rather than clear it on cloud computing. Most consulting companies would have disclosed industry affiliations and disclaimers about which companies they represent or have represented.

Read the other comments at the NYT article.

Its study uses Amazon.com’s Web service offering as the price of outsourced cloud computing, since its service is the best-known and it publishes its costs. On that basis, according to McKinsey, the total cost of the data center functions would be $366 a month per unit of computing output, compared with $150 a month for the conventional data center. “The industry has assumed the financial benefits of cloud computing and, in our view, that’s a faulty assumption,” said Will Forrest, a principal at McKinsey, who led the study.

My take on this is here:

Cloud computing will have lower costs as economies of scale kick in, as they did for nearly all technologies. McKinsey partners must be having a hard time meeting their annual bonuses if they have not factored this basic assumption into their cost projections. Cloud computing converts this to mass infrastructure from the present scenario, where you pay annual licenses for software that you use for less than 60% of the day, and hardware that becomes obsolete in 3-4 years, which of course gives accountants a reason to help you with depreciation and tax benefits. Renting a computer in the sky is simpler, and you would not need any consultant to advise you on what configuration you need.
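A back-of-the-envelope sketch of the utilization argument. The only real figures are McKinsey's $366 and $150 monthly costs from the quote above; the assumption that cloud spend scales with how much of the day you actually use the machine is mine:

```python
# Break-even utilization: below this fraction of the day, paying the cloud
# sticker price only for the hours you use beats owning a server outright.
OWNED_MONTHLY = 150    # McKinsey's conventional data-center cost (quoted above)
CLOUD_MONTHLY = 366    # McKinsey's Amazon-based cloud cost (quoted above)

break_even = OWNED_MONTHLY / CLOUD_MONTHLY   # ~0.41

def cheaper_option(utilization):
    """Assume cloud spend scales with utilization; owned cost is fixed."""
    cloud_cost = CLOUD_MONTHLY * utilization
    return "cloud" if cloud_cost < OWNED_MONTHLY else "owned"

print(f"break-even utilization: {break_even:.0%}")   # 41%
print(cheaper_option(0.30))  # lightly used server -> "cloud"
print(cheaper_option(0.90))  # heavily used server -> "owned"
```

In other words, under this toy model McKinsey's numbers only favor the in-house data center when the box is busy more than about 41% of the time, which is exactly the utilization assumption their study glosses over.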

McKinsey has deep ties to the outsourcing industry in India, from its seminal 1999 paper, to its first concept Knowledge Center that helped start the sector, to its alumni across the outsourcing industry, who maintain a mutually symbiotic relationship, particularly in business research. Cloud computing actually helps with virtual teams: no need for server farms or IT bureaucracies, and Indian outsourcing firms could cut a lot of costs, along with American direct users. The intermediaries and consultants would be affected the most.

Indeed, I am speaking at Cloud Slam 09, precisely on how cloud computing can help narrow the digital divide by giving high-powered computing to anyone with a thin-shell laptop and a browser. Developing countries need access to HPC to better plan their resources and growth in an environmentally optimized manner.

http://www.decisionstats.com

Cloud say hello to R. R say hello to Cloud.


Here is a terrific project, Biocep, which I have covered before, in January, at http://www.decisionstats.com/2009/01/r-and-cloud-computing/

There are some exciting steps ahead at http://biocep-distrib.r-forge.r-project.org/

Basically: take open source R, add a user-friendly GUI, and host it on a cloud computer to crunch data better and save hardware costs as well. Upload, crunch data, download.
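A minimal sketch of that upload-crunch-download loop. In Biocep the "crunch" step would run on a remote R engine; here Python's statistics module stands in so the round trip is visible end to end. Every function name below is my own illustration, not Biocep's actual API:

```python
# Sketch of the upload -> crunch -> download loop, with a local stand-in
# for the hosted R engine. All names here are illustrative, not Biocep's API.
import statistics

def upload(rows):
    """Stand-in for shipping a CSV up to the cloud workbench."""
    return list(rows)                       # pretend this went over the wire

def crunch(data):
    """Stand-in for an R script, e.g. summary(x), run on the hosted engine."""
    return {"mean": statistics.mean(data),
            "sd": statistics.stdev(data)}

def download(result):
    """Stand-in for pulling results back to the thin client."""
    return result

summary = download(crunch(upload([2, 4, 4, 4, 5, 5, 7, 9])))
print(summary["mean"])   # 5
```

The point of the sketch is the shape of the workflow: the laptop only ever moves data and results, while all the heavy lifting happens on the rented machine.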

Save hardware and software costs in the recession, before your boss decides to save on staffing costs.


    Biocep combines the capabilities of R and the flexibility of a Java based distributed system to create a tool of considerable power and utility. A Biocep based R virtualization infrastructure has been successfully deployed on the British National Grid Service, demonstrating its usability and usefulness for researchers. 

    The virtual workbench enhances the user experience and the productivity of anyone working with R.

A lovely presentation on it is here

and I am taking an extract

What is missing now

•High Level Java API for Accessing R

•Stateful, Reusable, Remotable R Components

•Scalable, Distributed, R Based Infrastructure

•Safe multiple clients framework for components usage as a pool of indistinguishable Remote Resources

•User friendly Interface for the remote resources creation, tracking and debugging

    Citation: Karim Chine, "Biocep: Towards a Federative, Collaborative, User-Centric, Grid-Enabled and Cloud-Ready Computational Open Platform," eScience, pp. 321-322, 2008 Fourth IEEE International Conference on eScience, 2008.

Ajay- With thanks to Bob Marcus for pointing this out from an older post of mine. I did write on this in August, on the Ohri framework, but that was before the recession moved me from cloud computing to blog computing.

What is Cloud Computing

Here is a nice video on ‘What is Cloud Computing’. It was created by Joyent (http://joyent.com/). It shows you perspectives on cloud computing without getting into jargon, technical gab or semantics.

Enjoy!


It is also available at http://www.youtube.com/watch?v=6PNuQHUiV3Q

KNIME 2.0.2 released

From the makers of KNIME http://www.knime.org/blog/knime-202-released

KNIME 2.0.2 has been released and is available for download. This release includes a number of important bug fixes, amongst others addressing some chemical related issues, as well as a few new features. For further details see the detailed changelog.

In addition, a Windows 64bit version has been just released. Both versions, KNIME Desktop (win64) and KNIME SDK (win64), are still in experimental state and are intended to evaluate KNIME under Windows 64bit.

The 64-bit OS version is a BIG milestone, though the launch did have a temporary issue with the R plug-in being disabled:

In KNIME 2.0.2, we have encountered a problem within the R Snippet (Local) node when special characters are used within the R script. Therefore we have temporarily disabled the R 2.0.2 plugin from our update site. For a detailed work-around see the FAQ section.

Note KNIME has been covered before here.

Cloud Nine

I got a note saying my entry below has been accepted for the Cloud Slam 09 webinar.

If you want to hear me speak on Cloud Computing, please mark your calendars for April 21.

Here is the link to the April 21 webinar:

http://tr.im/ixF5


Abstract.


The cloud computing paradigm offers developing countries unparalleled access to computing resources, both in terms of storage and processing power. The use of predictive analytics and data mining has hitherto been restricted to an elite set of universities and organizations willing to invest tens of thousands of dollars in annual license fees to software companies like SAS, SPSS, Oracle and SAP, and even more in network and server hardware costs to companies like HP, Dell and IBM.

Every two or three years, the hardware needed to be upgraded, putting the total cost of ownership of predictive analytics, data-driven decision making and resource planning well out of reach of a major part of the planet's population. Copyright infringements and intellectual property violations further widened the divide between advanced computing and those who needed it most.
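The upgrade-cycle argument above can be made concrete with a rough total-cost-of-ownership sketch. Every figure below is illustrative, invented for the comparison, and not from any vendor's price list:

```python
# Rough TCO comparison behind the abstract's argument.
# All dollar figures are illustrative assumptions, not real vendor prices.
YEARS = 3

def owned_tco(license_per_year=20_000, server_capex=15_000, refresh_years=3):
    """Annual proprietary licenses plus hardware replaced every few years."""
    refreshes = -(-YEARS // refresh_years)      # ceiling division
    return license_per_year * YEARS + server_capex * refreshes

def cloud_tco(hourly_rate=0.50, hours_per_month=100):
    """Pay-per-use: open-source software, compute rented only when needed."""
    return hourly_rate * hours_per_month * 12 * YEARS

print(f"owned : ${owned_tco():,.0f} over {YEARS} years")   # $75,000
print(f"cloud : ${cloud_tco():,.0f} over {YEARS} years")   # $1,800
```

Even if the illustrative rates are off by an order of magnitude, the gap is what puts advanced analytics within reach of a modestly funded university or government office.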

Now, thanks to open source software, software as a service and cloud-hosted processing, even a relatively unfunded Indian, Asian or African university, government office, or small and medium enterprise can avail of the cost savings of predictive analytics. This in turn will lead to a new era of resource-optimized decision making, one which benefits all companies that offer the flexibility of cloud-hosted applications to hitherto closed markets.

Which reminds me, I have to prepare the presentation as well… I will post the slides and the full article here on this blog too.

Speaking of webinars, here is one I am helping with, which tries to showcase technological methods, including CRM and BI, to help manage cost challenges and marketing operations in the recession. It's at 11 a.m. EST on April 16, 2009.

http://tr.im/isf

PASW 13: The preview

Here are some previews of PASW, the new suite of software from SPSS.


Auto Cluster

The Auto Cluster feature in PASW Modeler 13 creates, ranks, browses and visualizes models to identify which clusters offer the most effective cross-sell/up-sell opportunities, or reduce the propensity to churn.

Automatic Data Preparation

Automatic Data Preparation, a one-click feature in PASW Modeler 13, quickly flags problems such as missing data and recommends which sets of data to use for optimal results.

Comments

The new Comments feature is an invaluable collaborative tool that enables users to post quick notes directly into a particular model stream and communicate detail behind the logic used to create it.

PASW Statistics Integration

Now all the PASW Statistics modules and functionality can be used directly within Modeler 13 to conduct all statistical analysis without having to switch between applications.

The images are courtesy of SPSS PR. But the website itself talks of much more:

http://www.spss.com/software/modeling/

(Ajay- Much better revamped website for much better revamped software 🙂 )

Use BI to say BYE to the recession

While the failure of predictive analytics models in the mortgage and financial services industries to PREDICT default rates started the recession,

here is something which may just ensure

why 2009 won't be like 1929:

Business Intelligence for better Decision Management.

Bad news on the coming demand mismatch enabled faster decision making: interest rate cuts and coordinated global action. Inventories fell faster than expected, as companies aligned supply chains faster.


Featured-

A webinar coming up on April 16 on using technology to beat back the recession.

Is this the beginning of the end of the recession? As Winston Churchill said, this may be the end of the beginning.

How are you using 2009 technology to align decision management in the current economic downturn? How much training are you giving yourself for these interesting times?