Carole-Ann’s 2011 Predictions for Decision Management

For Ajay Ohri on DecisionStats.com

What were the top 5 events in 2010 in your field?
  1. Maturity: the Decision Management space was made up of technology vendors, big and small, that typically focused on one or two aspects of this discipline.  Over the past few years, we have seen a lot of consolidation in the industry – first with Business Intelligence (BI), then Business Process Management (BPM), and lately in Business Rules Management (BRM) and Advanced Analytics.  As a result, the giant platform vendors have helped create visibility for this discipline.  Lots of tiny clues finally bubbled up in 2010 to attest to the increasing activity around Decision Management.  For example, more products than ever were named Decision Manager; companies advertised for Decision Managers as a job title in their job sections; and most people understand what I do when I am introduced in a social setting!
  2. Boredom: unfortunately, as the industry matures, innovation inevitably slows down…  At the main BRMS shows we heard here and there complaints that the technology was stalling.  We heard it from vendors like Red Hat (Drools) and we heard it from bored end-users hoping for some excitement at Business Rules Forum’s vendor panel.  They sadly did not get it.
  3. Scrum: I am not thinking about the methodology here!  If you have ever seen a rugby game, you can probably understand why this is the term that comes to mind when I look at the messy & confusing technology landscape.  Feet blindly try to kick the ball out while superhuman forces randomly move the whole pack – or so it felt when I played!  Business users in search of business solutions are facing more and more technology choices that feel like comparing apples to oranges.  There is value in all of them, and each one addresses a specific aspect of Decision Management, but I regret that the industry did not simplify the picture in 2010.  On the contrary!  Many buzzwords were created, or at least made popular, last year, creating even more confusion on a muddy field.  A few examples: Social CRM, Collaborative Decision Making, Adaptive Case Management, etc.  Don’t get me wrong, I *do* like the technologies.  I sympathize with the decision maker who is trying to pick the right solution though.
  4. Information: Analytics have been used for years of course, but the volume of data surrounding us has been growing to unparalleled levels.  We can blame or thank (depending on our perspective) Social Media for that.  Sites like Facebook and LinkedIn have made it possible and easy to publish relevant (as well as fluffy) information in real time.  As we all started to get the hang of it and potentially over-publish, technology evolved to enable the storage, correlation and analysis of humongous volumes of data that we could not dream of before.  25 billion tweets were posted in 2010.  Every month, over 30 billion pieces of data are shared on Facebook alone.  This is not just about vanity and marketing though.  This data can be leveraged for the greater good.  Carlos pointed to some fascinating facts about catastrophic-event response teams getting organized thanks to crowd-sourced information.  We are also seeing, in the Decision Management world, more and more applicability for those very technologies that have been developed for the needs of Big Data – I’ll name for example Hadoop, which Carlos (yet again) discussed in his talks at Rules Fest at the end of 2009 and 2010.
  5. Self-Organization: it may be a side effect of the Social Media movement but I must admit that I was impressed by the success of self-organizing initiatives.  Granted, this last trend has nothing to do with Decision Management per se but I think it is a great evolution worth noting.  Let me point to a couple of examples.  I usually attend traditional conferences and tradeshows in which the content can be good but is sometimes terrible.  I was pleasantly surprised by the professionalism and attendance at *un-conferences* such as P-Camp (P stands for Product – an event for Product Managers).  When you think about it, it is already difficult to get a show together when people are dedicated to the tasks.  How crazy is it to have volunteers set one up with no budget and no agenda?  Well, people simply show up to do their part and everyone has fun voting on-site for what seems the most appealing content at the time.  Crowdsourcing applied to shows: it works!  Similar experience with meetups or tweetups.  I also enjoyed attending some impromptu Twitter jam sessions on a given topic.  Social Media is certainly helping people reach out and get together in person or virtually and that is wonderful!


What are the top three trends you see in 2011?

  1. Performance:  I might be cheating here.   I was very bullish about predicting much progress for 2010 in the area of Performance Management in your Decision Management initiatives.  I believe that progress was made but Carlos did not give me full credit for the right prediction…  Okay, I am a little optimistic on timeline…  I admit it…  If it did not fully happen in 2010, can I predict it again in 2011?  I think that companies want to better track their business performance in order to correct the trajectory of course but also to improve their projections.  I see that it is turning into reality already here and there.  I expect it to become a trend in 2011!
  2. Insight: Big Data being available all around us with new technologies and algorithms will continue to propagate in 2011, leading to more widely spread Analytics capabilities.  The buzz at Analytics shows around Social Network Analysis (SNA) is a sign that there is interest in those kinds of things.  There is tremendous information that can be leveraged for smart decision-making.  I think there will be more of that in 2011 as initiatives launched in 2010 mature into material results.
  3. Collaboration:  Social Media for the Enterprise is a discipline in the making.  Social Media was initially seen for the most part as a Marketing channel.  Over the years, companies have started experimenting with external communities and ideation capabilities with moderate success.  The few strategic initiatives started in 2010 by “old-fashioned” companies seem to be an indication that we are past the early adopters.  This discipline may very well materialize in 2011 as a core capability – well, or at least a new trend.  I believe that capabilities such as Chatter, offered by Salesforce, will transform (slowly) how people interact in the workplace and leverage the volumes of social data captured in LinkedIn and other Social Media sites.  Collaboration is of course a topic of interest for me personally.  I even signed up for Kare Anderson’s collaboration collaboration site – yes, the word “collaboration” twice: it is really about collaborating on collaboration techniques.  Even though collaboration does not require Social Media, this medium offers perspectives not available until now.

Brief Bio-

Carole-Ann is a renowned guru in the Decision Management space. She created the vision for Decision Management that is widely adopted now in the industry. Her claim to fame is the strategy and direction of Blaze Advisor, the then-leading BRMS product, while she also managed all the Decision Management tools at FICO (business rules, predictive analytics and optimization). She has a vision for Decision Management both as a technology and a discipline that can revolutionize the way corporations do business, and will never get tired of painting that vision for her audience. She speaks often at Industry conferences and has conducted university classes in France and Washington DC.

Leveraging her Master’s degree in Applied Mathematics / Computer Science from a “Grande Ecole” in France, she started her career building advanced systems using all kinds of technologies — expert systems, rules, optimization, dashboarding and cubes, web search, and a beta version of database replication — as well as conducting strategic consulting gigs around change management at Cleversys (acquired by Kurt Salmon & Associates).

She now tweets as @CMatignon, blogs at blog.sparklinglogic.com and interacts at community.sparklinglogic.com.

While playing with advanced software components, she found a passion for technology and joined ILOG (acquired by IBM).  She developed a growing interest in Optimization as well as Business Rules.  At ILOG, she coined the term BRMS while brainstorming with her Sales counterpart.  She led the Presales organization for Telecom in the Americas up until 2000 when she joined Blaze Software (acquired by Brokat Technologies, HNC Software and finally FICO).

Her 360-degree experience allowed her to gain appreciation for all aspects of a software company, giving her a unique perspective on the business.  Her technical background kept her very much in touch with technology as she advanced.

She also became addicted to Twitter in the process.  She is active on all kinds of social media, always looking for new digital experience!

Outside of work, Carole-Ann loves spending time with her two boys.  They grow fruit at their Northern California home and cook together in the French tradition.


Interview Jamie Nunnelly NISS

An interview with Jamie Nunnelly, Communications Director of the National Institute of Statistical Sciences

Ajay– What does NISS do? And What does SAMSI do?

Jamie– The National Institute of Statistical Sciences (NISS) was established in 1990 by the national statistics societies and the Research Triangle universities and organizations, with the mission to identify, catalyze and foster high-impact, cross-disciplinary and cross-sector research involving the statistical sciences.

NISS is dedicated to strengthening and serving the national statistics community, most notably by catalyzing community members’ participation in applied research driven by challenges facing government and industry. NISS also provides career development opportunities for statisticians and scientists, especially those in the formative stages of their careers.

The Institute identifies emerging issues to which members of the statistics community can make key contributions, and then catalyzes the right combinations of researchers from multiple disciplines and sectors to tackle each problem. More than 300 researchers from over 100 institutions have worked on our projects.

The Statistical and Applied Mathematical Sciences Institute (SAMSI) is a partnership of Duke University,  North Carolina State University, The University of North Carolina at Chapel Hill, and NISS in collaboration with the William Kenan Jr. Institute for Engineering, Technology and Science and is part of the Mathematical Sciences Institutes of the NSF.

SAMSI focuses on 1-2 programs of research interest in the statistical and/or applied mathematical area and visitors from around the world are involved with the programs and come from a variety of disciplines in addition to mathematics and statistics.

Many come to SAMSI to attend workshops, and also participate in working groups throughout the academic year. Many of the working groups communicate via WebEx so people can be involved with the research remotely. SAMSI also has a robust education and outreach program to help undergraduate and graduate students learn about cutting edge research in applied mathematics and statistics.

Ajay– What successes have you had in 2010, and what do you need to succeed in 2011? What’s planned for 2011 anyway?

Jamie– NISS has had a very successful collaboration with the National Agricultural Statistical Service (NASS) over the past two years that was just renewed for the next two years. NISS & NASS had three teams, each consisting of a faculty researcher in statistics, a NASS researcher, a NISS mentor, a postdoctoral fellow and a graduate student, working on statistical modeling and other areas of research for NASS.

NISS is also working on a syndromic surveillance project with Clemson University, Duke University, The University of Georgia and The University of South Carolina. The group is currently working with some hospitals to test out a model they have been developing to help predict disease outbreaks.

SAMSI had a very successful year with two programs ending this past summer: the Stochastic Dynamics program and the Space-time Analysis for Environmental Mapping, Epidemiology and Climate Change program. Several papers were written and published, and many presentations have been made at various conferences around the world regarding the work that was conducted at SAMSI last year.

Next year’s program, on uncertainty quantification, is so big that the institute has decided to devote all its time and energy to it. The opening workshop, in addition to the main methodological theme, will be broken down into three areas of interest under this broad umbrella of research: climate change, engineering and renewable energy, and geosciences.

Ajay– Describe your career in science and communication.

Jamie– I have been in communications since 1985, working for large Fortune 500 companies such as General Motors and Tropicana Products. I moved to the Research Triangle region of North Carolina after graduate school and got into economic development and science communications first working for the Research Triangle Regional Partnership in 1994.

From 1996-2005 I was the communications director for the Research Triangle Park, working for the Research Triangle Foundation of NC. I published a quarterly magazine called The Park Guide for a while, then came to work for NISS and SAMSI in 2008.

I really enjoy working with the mathematicians and statisticians. I always joke that I am the least educated person working here and that is not far from the truth! I am honored to help get the message out about all of the important research that is conducted here each day that is helping to improve the lives of so many people out there.

Ajay– Research Triangle or Silicon Valley– Which is better for tech people and why? Your opinion

Jamie– Both the Silicon Valley and Research Triangle are great regions for tech people to locate, but of course, I have to be biased and choose Research Triangle!

Really any place in the world that you find many universities working together with businesses and government, you have an area that will grow and thrive, because the collaborations help all of us generate new ideas, many of which blossom into new businesses, or new endeavors of research.

The quality of life in places such as the Research Triangle is great because you have people from around the world moving to a place, each bringing his/her culture, food, and uniqueness to this place, and enriching everyone else as a result.

One advantage the Research Triangle has over Silicon Valley is a greater diversity of industries: when the telecommunications industry went bust back in 2001-02, the region took a hit, but the biotechnology industry was still growing, so unemployment rose, but not to the extent that other areas might have experienced.

The latest recession has hit us all very hard, so even this strategy has not made us immune to having high unemployment, but the Research Triangle region has been pegged by experts to be one of the first regions to emerge out of the Great Recession.

The other advantage I think we have is that our cost of living is still much more reasonable than Silicon Valley. It’s still possible to get a nice sized home, some land and not break the bank!

Ajay– How do you manage an active online social media presence, your job and your family? How important is balance in professional life, and when should young professionals realize this?

Jamie– Balance is everything, isn’t it? When I leave the office, I turn off my iPhone and disconnect from Twitter/Facebook etc.

I know that is not recommended by some folks, but I am a one-person communications department and I love my family and friends and feel it’s important to devote time to them as well as to my career.

I think it is very important for young people to establish this early in their careers because if they don’t they will fall victim to working way too many hours and really, who loves you at the end of the day?

Your company may appreciate all you do for them, but if you leave, or you get sick and cannot work for them, you will be replaced.

Lee Iacocca, former CEO of Chrysler, said, “No matter what you’ve done for yourself or for humanity, if you can’t look back on having given love and attention to your own family, what have you really accomplished?” I think that is what is really most important in life.

About-

Jamie Nunnelly has been in communications for 25 years. She is currently on the board of directors for the Chatham County Economic Development Corporation and Leadership Triangle, and is a member of the International Association of Business Communicators and the Public Relations Society of America. She earned a bachelor’s degree in interpersonal and public communications at Bowling Green State University and a master’s degree in mass communications at the University of South Florida.

You can contact Jamie at http://niss.org/content/jamie-nunnelly or on twitter at

Cisco SocialMiner


A new product from Cisco to mine social media for analytics on sentiment-

http://www.cisco.com/en/US/products/ps11349/index.html

Cisco SocialMiner is a social media customer care solution that can help you proactively respond to customers and prospects communicating through public social media networks like Twitter, Facebook, or other public forums or blogging sites. By providing social media monitoring, queuing, and workflow to organize customer posts on social media networks and deliver them to your social media customer care team, your company can respond to customers in real time using the same social network they are using.

Cisco SocialMiner provides:

  • The ability to configure multiple campaigns to search for customer postings on the public social web about your company’s products, services, or area of expertise
  • Filtering of social contacts based on preconfigured campaign filters to focus campaign searches
  • Routing of social contacts to skilled customer care representatives in the contact center or to experts in the enterprise–multiple people can work together to handle responses to customer postings through shared work queues
  • Detailed metrics for social media customer care activities, campaign reports, and team reports

With Cisco SocialMiner, your company can listen and respond to customer conversations originating in the social web. Being proactive can help your company enhance its service, improve customer loyalty, garner new customers, and protect your brand.

Table 1. Features and Benefits of Cisco SocialMiner 8.5

Product Baseline Features

Social media feeds
  • Feeds are configurable sources to capture public social contacts that contain specific words, terms, or phrases.
  • Feeds enable you to search for information on the public social web about your company’s products, services, or area of expertise.
  • Cisco SocialMiner supports the following types of feeds: Facebook and Twitter.

Campaign management
  • Groups feeds into campaigns to organize all posting activity related to a product category or business objective
  • Produces metrics on campaign activity
  • Provides the ability to configure multiple campaigns to search for customer postings on specific products or services
  • Groups social contacts for handling by the social media customer care team
  • Enables filtering of social contacts based on preconfigured campaign filters to focus campaign searches

Route and queue social contacts
  • Enables routing of social contacts to skilled customer care representatives in the contact center
  • Draws on expertise in the enterprise by allowing multiple people in the enterprise to work together to handle responses to customer postings through shared work queues
  • Enables automated distribution of work to improve efficiency and effectiveness of social media engagement

Tagging
  • Allows work to be routed to the appropriate team by grouping each post or social contact into different categories; for example, a post can be marked with the “customer_support” tag; this post will then appear on a customer support agent’s queue for processing

Social media customer care metrics
  • Provides detailed metrics on social media customer care activities, campaign reports, and team reports
  • Measures work and results
  • Manages to service-level goals
  • Supports brand management
  • Optimizes staffing
  • Includes dashboarding of social media posting activity when Cisco Unified Intelligence Center is used

Reporting for social contacts
  • Provides a reporting database that can be accessed using any reporting tool, including Cisco Unified Intelligence Center
  • Enables customer care management to accurately report on and track social media interactions by the contact center

OpenSocial-compliant gadgets and Representational State Transfer (REST) application programming interfaces (APIs)
  • Provides flexible user interface options
  • Enables extensive opportunities for customization

Optional integration with full suite of Cisco Collaboration tools
  • Allows you to take advantage of the full suite of Cisco Collaboration tools, including Cisco Quad, Cisco Show and Share, and Cisco Pulse technology, to help your social media customer care team quickly find answers to help customers efficiently and effectively
  • Easy to maintain with existing IT personnel

Operating Environment

Cisco Unified Computing System (UCS) C-Series or B-Series Servers
  • Requires a Cisco UCS C-Series or B-Series Server.
  • Server consolidation means lower cost per server with Cisco UCS Servers.

Architecture

Scalability
  • One server supports up to 30 simultaneous social media customer care users and 10,000 social contacts per hour.

Management

Cisco Unified Real-Time Monitoring Tool (RTMT)
  • Operational management is enhanced through integration with the Cisco Unified RTMT, providing consistent application monitoring across Cisco Unified Communications Solutions.

Simple Network Management Protocol (SNMP)
  • SNMP with an associated MIB is supported through the Cisco Voice Operating System (VOS).

Reporting

Cisco Unified Intelligence Center
  • Create customizable reports of social media customer care events using Cisco Unified Intelligence Center (purchased separately).
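
The routing and tagging model in the table above boils down to "poll a feed, then push each post onto the right team's queue." The sketch below illustrates that flow in Python; the endpoint URL, JSON field names, and tag values are assumptions for illustration, not the actual SocialMiner REST API.

```python
import requests  # third-party HTTP client (pip install requests)

# Hypothetical endpoint and payload shape -- an illustration of routing
# social contacts to per-team work queues by tag, not the real SocialMiner API.
FEED_URL = "https://socialminer.example.com/api/contacts"  # assumed URL

def route_contacts(queues):
    """Poll a (hypothetical) social-contact feed and append each post to the
    work queue matching its tag, e.g. 'customer_support'."""
    response = requests.get(FEED_URL, params={"status": "unhandled"}, timeout=10)
    response.raise_for_status()
    for contact in response.json():            # assumed: a list of JSON objects
        tag = contact.get("tag", "untagged")   # assumed field name
        queues.setdefault(tag, []).append(contact)
    return queues

if __name__ == "__main__":
    work_queues = route_contacts({})
    for tag, posts in work_queues.items():
        print(f"{tag}: {len(posts)} posts waiting")
```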


Scoring SAS and SPSS Models in the cloud


An announcement from Zementis and Predixion Software about using cloud computing to score models via PMML. Note that R also has a pmml package, used by Rattle (a data mining GUI) for exporting models.
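
Since PMML is plain XML, any consumer can at least inspect a model exported from SAS, SPSS or R before scoring it. The standard-library sketch below, not tied to any vendor's toolkit, lists the data fields and model elements in a PMML file; the file name and the PMML 4.0 namespace are assumptions (older exports use 3.x namespaces).

```python
import xml.etree.ElementTree as ET

# PMML 4.0 namespace; adjust if the exporting tool produced an older PMML version.
PMML_NS = {"pmml": "http://www.dmg.org/PMML-4_0"}

def describe_pmml(path):
    """List the input fields and model elements found in a PMML document."""
    root = ET.parse(path).getroot()
    for field in root.findall(".//pmml:DataDictionary/pmml:DataField", PMML_NS):
        print("field:", field.get("name"), field.get("dataType"))
    for child in root:
        if child.tag.endswith("Model"):  # e.g. TreeModel, RegressionModel
            print("model:", child.tag.split("}")[-1],
                  "function:", child.get("functionName"))

if __name__ == "__main__":
    describe_pmml("model.pmml")  # placeholder for a file exported from R/SAS/SPSS
```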

Source- http://www.marketwatch.com/story/predixion-software-introduces-new-product-to-run-sas-and-spss-predictive-models-in-the-cloud-2010-10-19?reflink=MW_news_stmp

——————————————————————————————————–

ALISO VIEJO, Calif., Oct 19, 2010 (BUSINESS WIRE) — Predixion Software today introduced Predixion PMML Connexion(TM), an interface that provides Predixion Insight(TM), the company’s low-cost, self-service in the cloud predictive analytics solution, direct and seamless access to SAS, SPSS (IBM) and other predictive models for use by Predixion Insight customers. Predixion PMML Connexion enables companies to leverage their significant investments in legacy predictive analytics solutions at a fraction of the cost of conventional licensing and maintenance fees.

The announcement was made at the Predictive Analytics World conference in Washington, D.C. where Predixion also announced a strategic partnership with Zementis, Inc., a market leader in PMML-based solutions. Zementis is exhibiting in Booth #P2.

The Predictive Model Markup Language (PMML) standard allows for true interoperability, offering a mature standard for moving predictive models seamlessly between platforms. Predixion has fully integrated this PMML functionality into Predixion Insight, meaning Predixion Insight users can now effortlessly import PMML-based predictive models, enabling information workers to score the models in the cloud from anywhere and publish reports using Microsoft Excel(R) and SharePoint(R). In addition, models can also be written back into SAS, SPSS and other platforms for a truly collaborative, interoperable solution.

“Predixion’s investment in this PMML interface makes perfect business sense as the lion’s share of the models in existence today are created by the SAS and SPSS platforms, creating compelling opportunity to leverage existing investments in predictive and statistical models on a low-cost cloud predictive analytics platform that can be fed with enterprise, line of business and cloud-based data,” said Mike Ferguson, CEO of Intelligent Business Strategies, a leading analyst and consulting firm specializing in the areas of business intelligence and enterprise business integration. “In this economy, Predixion’s low-cost, self-service predictive analytics solutions might be welcome relief to IT organizations chartered with quickly adding additional applications while at the same time cutting costs and staffing.”

“We are pleased to be partnering with Zementis, truly a PMML market leader and innovator,” said Predixion CEO Simon Arkell. “To allow any SAS or SPSS customer to immediately score any of their predictive models in the cloud from within Predixion Insight, compare those models to those created by Predixion Insight, and share the results within Excel and Sharepoint is an exciting step forward for the industry. SAS and SPSS customers are fed up with the high prices they must pay for their business users just to access reports generated by highly skilled PhDs who are burdened by performing routine tasks and thus have become a massive bottleneck. That frustration is now a thing of the past because any information worker can now unlock the power of predictive analytics without relying on experts — for a fraction of the cost and from anywhere they can connect to the cloud,” Arkell said.

Dr. Michael Zeller, Zementis CEO, added, “Our mission is to significantly shorten the time-to-market for predictive models in any industry. We are excited to be contributing to Predixion’s self-service, cloud-based predictive analytics solution set.”

About Predixion Software

Predixion Software develops and markets collaborative predictive analytics solutions in the public and private cloud. Predixion enables self-service predictive analytics, allowing customers to use and analyze large amounts of data to make actionable decisions, all within the familiar environment of Excel and PowerPivot. Predixion customers are achieving immediate results across a multitude of industries including: retail, finance, healthcare, marketing, telecommunications and insurance/risk management.

Predixion Software is headquartered in Aliso Viejo, California with development offices in Redmond, Washington. The company has venture capital backing from established investors including DFJ Frontier, Miramar Venture Partners and Palomar Ventures. For more information please contact us at 949-330-6540, or visit us at www.predixionsoftware.com.

About Zementis

Zementis, Inc. is a leading software company focused on the operational deployment and integration of predictive analytics and data mining solutions. Its ADAPA(R) decision engine successfully bridges the gap between science and engineering. ADAPA(R) was designed from the ground up to benefit from open standards and to significantly shorten the time-to-market for predictive models in any industry. For more information, please visit www.zementis.com.


The auto-suggest link/tags for WP.com blogs

WordPress.com blogs have a great new option for generating tags and links, thus improving their search engine optimization for posts.

Just go to Users > Personal Settings and check the options shown. That’s it: every time you write a post, it suggests links and tags. Links are helpful for your readers (like Wikipedia links to explain dense technical jargon, or associated websites). Tags help to classify your content so that all visitors to the web site, including spiders, search engines and your readers, can search it better.

The bad thing is I need to go back to all 1025 posts on this site and auto-generate tags for the archives! Oh well. Great collaboration between Zemanta and Automattic for this new feature.

New Deal in Statistical Training

The United States Government is planning a new initiative aimed at providing employable skills to people, to cope with unemployment.
One skill perpetually in short supply is analytics, along with statistics.

It is time for corporates like IBM (SPSS), SAS Institute and Revolution Analytics, as well as offshore companies in India and Asia, to ramp up their on-demand trainings, certifications and academic partnership bundles. Indeed, offshoring companies can earn revenue as well as goodwill if they help with trainers available via video-conferencing. The New Deal initiative would require creative thinking as well as direct top-management support to focus their best internal brains on developing this new revenue stream. Again, the company that trains the most users (be it Revolution for R, IBM for SPSS-Cognos, SAS Institute for Base SAS-JMP, WPS for the SAS language) is going to get a bigger chunk of new users and analysts.

Analytics skills are hot, and there is big new demand for them from millions of unemployed Americans and Asians. How do you think this services market will play out?

If the US government could pump 800 billion dollars into bailouts, how much, in your opinion, should it spend on training programs to help citizens compete globally?

From http://www.nytimes.com/2010/10/03/business/economy/03skills.html?hpw

The national program is a response to frustrations from both workers and employers who complain that public retraining programs frequently do not provide students with employable skills. This new initiative is intended to help better align community college curriculums with the demands of local companies.

SAS recognizes the market –

see http://www.sas.com/news/preleases/aba-tech-engage.html

In tough economic times, it is more important than ever that companies be able to make better decisions using analytics. SAS is involved in two programs this summer that offer MBAs and unemployed technology workers the opportunity to learn and enhance analytics skills, and increase their marketability.

SAS is a partner in TechEngage, a week-long program of training classes that offer unemployed technology professionals new skills at a low cost to help them compete effectively in the marketplace.

So does IBM-

http://www-03.ibm.com/press/us/en/pressrelease/28994.wss

“Fordham has a long history of collaboration with IBM that has brought innovative new skills to our curriculum to prepare students for future jobs. With this effort, Fordham is preparing students with marketable skills for a coming wave of jobs in healthcare, sustainability, and social services where analytics can be applied to everyday challenges.”

and R

Well TIBCO and Revolution ….hmmm…mmmm

I am not sure there is even an R analytics certification program, at the least.

Windows Azure vs Amazon EC2 (and Google Storage)

Here is a comparison of Windows Azure instances vs Amazon compute instances

Compute Instance Sizes:

Developers have the ability to choose the size of VMs to run their application based on the application’s resource requirements. Windows Azure compute instances come in four unique sizes to enable complex applications and workloads.

Compute Instance Size   CPU           Memory    Instance Storage   I/O Performance
Small                   1.6 GHz       1.75 GB   225 GB             Moderate
Medium                  2 x 1.6 GHz   3.5 GB    490 GB             High
Large                   4 x 1.6 GHz   7 GB      1,000 GB           High
Extra large             8 x 1.6 GHz   14 GB     2,040 GB           High

Standard Rates:

Windows Azure

  • Compute
    • Small instance (default): $0.12 per hour
    • Medium instance: $0.24 per hour
    • Large instance: $0.48 per hour
    • Extra large instance: $0.96 per hour
  • Storage
    • $0.15 per GB stored per month
    • $0.01 per 10,000 storage transactions
  • Content Delivery Network (CDN)
    • $0.15 per GB for data transfers from European and North American locations*
    • $0.20 per GB for data transfers from other locations*
    • $0.01 per 10,000 transactions*

Source –

http://www.microsoft.com/windowsazure/offers/popup/popup.aspx?lang=en&locale=en-US&offer=MS-AZR-0001P

and

http://www.microsoft.com/windowsazure/windowsazure/

Amazon EC2 has more options though –

http://aws.amazon.com/ec2/pricing/

On-Demand Instance                 Linux/UNIX Usage   Windows Usage
Standard On-Demand Instances
  Small (Default)                  $0.085 per hour    $0.12 per hour
  Large                            $0.34 per hour     $0.48 per hour
  Extra Large                      $0.68 per hour     $0.96 per hour
Micro On-Demand Instances
  Micro                            $0.02 per hour     $0.03 per hour
High-Memory On-Demand Instances
  Extra Large                      $0.50 per hour     $0.62 per hour
  Double Extra Large               $1.00 per hour     $1.24 per hour
  Quadruple Extra Large            $2.00 per hour     $2.48 per hour
High-CPU On-Demand Instances
  Medium                           $0.17 per hour     $0.29 per hour
  Extra Large                      $0.68 per hour     $1.16 per hour
Cluster Compute Instances
  Quadruple Extra Large            $1.60 per hour     N/A*

* Windows is not currently available for Cluster Compute Instances.
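
To compare the two price lists on a like-for-like basis, the sketch below converts the hourly rates quoted above for the small instances into rough monthly figures. The 730-hour month is an assumption, and storage, transactions and bandwidth are ignored.

```python
# Rough monthly compute cost from the hourly rates quoted above.
# Assumes a 730-hour month; excludes storage, transactions and bandwidth.
HOURS_PER_MONTH = 730

rates_per_hour = {
    "Azure small instance":    0.12,
    "EC2 small (Linux/UNIX)":  0.085,
    "EC2 small (Windows)":     0.12,
}

for name, rate in rates_per_hour.items():
    print(f"{name}: ${rate * HOURS_PER_MONTH:,.2f} per month")
```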

http://aws.amazon.com/ec2/instance-types/

Standard Instances

Instances of this family are well suited for most applications.

Small Instance – default*

1.7 GB memory
1 EC2 Compute Unit (1 virtual core with 1 EC2 Compute Unit)
160 GB instance storage (150 GB plus 10 GB root partition)
32-bit platform
I/O Performance: Moderate
API name: m1.small

Large Instance

7.5 GB memory
4 EC2 Compute Units (2 virtual cores with 2 EC2 Compute Units each)
850 GB instance storage (2×420 GB plus 10 GB root partition)
64-bit platform
I/O Performance: High
API name: m1.large

Extra Large Instance

15 GB memory
8 EC2 Compute Units (4 virtual cores with 2 EC2 Compute Units each)
1,690 GB instance storage (4×420 GB plus 10 GB root partition)
64-bit platform
I/O Performance: High
API name: m1.xlarge

Micro Instances

Instances of this family provide a small amount of consistent CPU resources and allow you to burst CPU capacity when additional cycles are available. They are well suited for lower-throughput applications and web sites that consume significant compute cycles periodically.

Micro Instance

613 MB memory
Up to 2 EC2 Compute Units (for short periodic bursts)
EBS storage only
32-bit or 64-bit platform
I/O Performance: Low
API name: t1.micro

High-Memory Instances

Instances of this family offer large memory sizes for high throughput applications, including database and memory caching applications.

High-Memory Extra Large Instance

17.1 GB of memory
6.5 EC2 Compute Units (2 virtual cores with 3.25 EC2 Compute Units each)
420 GB of instance storage
64-bit platform
I/O Performance: Moderate
API name: m2.xlarge

High-Memory Double Extra Large Instance

34.2 GB of memory
13 EC2 Compute Units (4 virtual cores with 3.25 EC2 Compute Units each)
850 GB of instance storage
64-bit platform
I/O Performance: High
API name: m2.2xlarge

High-Memory Quadruple Extra Large Instance

68.4 GB of memory
26 EC2 Compute Units (8 virtual cores with 3.25 EC2 Compute Units each)
1690 GB of instance storage
64-bit platform
I/O Performance: High
API name: m2.4xlarge

High-CPU Instances

Instances of this family have proportionally more CPU resources than memory (RAM) and are well suited for compute-intensive applications.

High-CPU Medium Instance

1.7 GB of memory
5 EC2 Compute Units (2 virtual cores with 2.5 EC2 Compute Units each)
350 GB of instance storage
32-bit platform
I/O Performance: Moderate
API name: c1.medium

High-CPU Extra Large Instance

7 GB of memory
20 EC2 Compute Units (8 virtual cores with 2.5 EC2 Compute Units each)
1690 GB of instance storage
64-bit platform
I/O Performance: High
API name: c1.xlarge

Cluster Compute Instances

Instances of this family provide proportionally high CPU resources with increased network performance and are well suited for High Performance Compute (HPC) applications and other demanding network-bound applications. Learn more about use of this instance type for HPC applications.

Cluster Compute Quadruple Extra Large Instance

23 GB of memory
33.5 EC2 Compute Units (2 x Intel Xeon X5570, quad-core “Nehalem” architecture)
1690 GB of instance storage
64-bit platform
I/O Performance: Very High (10 Gigabit Ethernet)
API name: cc1.4xlarge
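
When the choice among the instance types above is driven mainly by memory, a simple lookup over the published specs is enough. The dictionary below copies a few of the memory figures listed above; the selection helper is only a sketch.

```python
# Memory (GB) for some of the EC2 instance types listed above.
INSTANCE_MEMORY_GB = {
    "m1.small": 1.7, "m1.large": 7.5, "m1.xlarge": 15.0,
    "m2.xlarge": 17.1, "m2.2xlarge": 34.2, "m2.4xlarge": 68.4,
    "c1.medium": 1.7, "c1.xlarge": 7.0,
}

def smallest_instance_with(min_memory_gb):
    """Return the instance type with the least memory that still meets the requirement."""
    candidates = {k: v for k, v in INSTANCE_MEMORY_GB.items() if v >= min_memory_gb}
    return min(candidates, key=candidates.get) if candidates else None

print(smallest_instance_with(16))  # -> m2.xlarge (17.1 GB)
```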

Also, http://www.microsoft.com/en-us/sqlazure/default.aspx offers SQL databases as a service, with a free trial offer.

If you are into .NET/SQL big time, or too dependent on MS, Azure is a nice alternative to EC2: http://www.microsoft.com/windowsazure/offers/popup/popup.aspx?lang=en&locale=en-US&offer=COMPARE_PUBLIC

Updated – I just got approved for Google Storage, so am adding their info, though it is in preview (and it’s free right now) 🙂

https://code.google.com/apis/storage/docs/overview.html

Functionality

Google Storage for Developers offers a rich set of features and capabilities:

Basic Operations

  • Store and access data from anywhere on the Internet.
  • Range-gets for large objects.
  • Manage metadata.

Security and Sharing

  • User authentication using secret keys or Google account.
  • Authenticated downloads from a web browser for Google account holders.
  • Secure access using SSL.
  • Easy, powerful sharing and collaboration via ACLs for individuals and groups.

Performance and scalability

  • Up to 100 gigabytes per object and 1,000 buckets per account during the preview.
  • Strong data consistency—read-after-write consistency for all upload and delete operations.
  • Namespace for your domain—only you can create bucket URIs containing your domain name.
  • Data replicated in multiple data centers across the U.S. and within the same data center.

Tools

  • Web-based storage manager.
  • GSUtil, an open source command line tool.
  • Compatible with many existing cloud storage tools and libraries.

Read the Getting Started Guide to learn more about the service.

Note: Google Storage for Developers does not support Google Apps accounts that use your company domain name at this time.


Pricing

Google Storage for Developers pricing is based on usage.

  • Storage—$0.17/gigabyte/month
  • Network
    • Upload data to Google
      • $0.10/gigabyte
    • Download data from Google
      • $0.15/gigabyte for Americas and EMEA
      • $0.30/gigabyte for Asia-Pacific
  • Requests
    • PUT, POST, LIST—$0.01 per 1,000 requests
    • GET, HEAD—$0.01 per 10,000 requests
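
Putting the preview rates above into numbers, here is a quick cost estimate for a hypothetical workload; all rates are taken from the list above, with downloads assumed to be in the Americas/EMEA tier.

```python
# Monthly cost estimate from the Google Storage for Developers preview rates above.
STORAGE_PER_GB  = 0.17   # $/GB/month
UPLOAD_PER_GB   = 0.10   # $/GB
DOWNLOAD_PER_GB = 0.15   # $/GB, Americas and EMEA tier
GET_PER_10K     = 0.01   # $ per 10,000 GET/HEAD requests

def monthly_cost(stored_gb, uploaded_gb, downloaded_gb, get_requests):
    return (stored_gb * STORAGE_PER_GB
            + uploaded_gb * UPLOAD_PER_GB
            + downloaded_gb * DOWNLOAD_PER_GB
            + (get_requests / 10_000) * GET_PER_10K)

# Example: 50 GB stored, 10 GB uploaded, 100 GB served, 1 million GET requests.
print(f"${monthly_cost(50, 10, 100, 1_000_000):.2f}")  # $25.50
```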