Interview: R for Stata Users

Here is an interview with Bob Muenchen, author of “R for SAS and SPSS Users” and co-author, with Joe Hilbe, of “R for Stata Users”.

Describe your new book R for Stata Users and how it is helpful to users.

Stata is a marvelous software package. Its syntax is well designed, concise and easy to learn. However, R offers Stata users advantages in two key areas: education and analysis.

Regarding education, R is quickly becoming the universal language of data analysis. Books, journal articles and conference talks often include R code because it’s a powerful language and everyone can run it. So R has become an essential part of the education of data analysts, statisticians and data miners.

Regarding analysis, R offers a vast array of methods that R users have written. Next to R, Stata probably has more useful user-written add-ons than any other analytic software. The Statistical Software Components collection at Boston College’s Department of Economics is quite impressive (http://ideas.repec.org/s/boc/bocode.html), containing hundreds of useful additions to Stata. However, R’s collection of add-ons currently contains 3,680 packages, and more are being added every week. Stata users can access these fairly easily by doing their data management in Stata, saving a Stata-format data set, importing it into R, and running what they need. Working this way, the R program may only be a few lines long.
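
To give a feel for how short that R program can be, here is a minimal sketch of the round trip, assuming a hypothetical Stata file mydata.dta containing variables y, x1 and x2; read.dta() comes from the foreign package and gam() from the user-contributed mgcv package:

# Import a data set that was prepared and saved in Stata.
library(foreign)                  # ships with R; read.dta() handles Stata files
mydata <- read.dta("mydata.dta")  # hypothetical file name

# Run a method not built into Stata, e.g. a generalized additive model.
library(mgcv)                     # user-contributed package from CRAN
fit <- gam(y ~ s(x1) + x2, data = mydata)  # y, x1, x2 are invented names
summary(fit)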

In our book, the section “Getting Started Quickly” outlines the most essential 50 pages for Stata users to read to work in this way. Of course the book covers all the basics of R, should the reader wish to learn more. Being enthusiastic programmers, we’ll be surprised if they don’t want to read it all.

There are many good books on R, but as I learned the language I found myself constantly wondering how each concept related to the packages I already knew. So in this book we describe R first using Stata terminology and then using R terminology. For example, when introducing the R data frame, we start out saying that it’s just like a Stata data set: a rectangular set of variables that are usually numeric with perhaps one or two character variables. Then we move on to say that R also considers it a special type of “list” which constrains all its “components” to be equal in length. That then leads into entirely new territory.
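
A quick illustration of that dual nature, with invented data:

# A data frame looks like a Stata data set: rows of observations,
# columns of variables.
df <- data.frame(id     = 1:3,
                 income = c(35000, 42000, 51000),
                 gender = c("m", "f", "f"))

df$income        # extract a variable by name, much as in Stata
# But R also treats it as a list whose components have equal length:
is.list(df)      # TRUE
length(df)       # 3 components, i.e. the variables
df[["income"]]   # list-style extraction of the same variable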

The entire book is laid out to make learning easy for Stata users. The names used in the table of contents are Stata-based. The reader may look up how to “collapse” a data set by a grouping variable to find that one way R can do that is with the mysteriously named “tapply” function. A Stata user would never have guessed to look for that name. When reading from cover to cover that may not be a big deal, but as you go back to look things up it’s a huge time saver. The index is similar in that you can look every subject up by its Stata name to find the R function or vice versa. People see me with both my books near my desk and chuckle that they’re there for advertising. Not true! I look details up in them all the time.
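
For instance, a minimal sketch of a Stata-style collapse to group means, with invented data:

# Collapse a data set by a grouping variable, as Stata's "collapse" does:
# tapply() applies a function to one variable within groups of another.
wages <- data.frame(salary = c(30, 45, 50, 38, 60),
                    region = c("east", "east", "west", "west", "west"))

tapply(wages$salary, wages$region, mean)
#  east     west
#  37.5     49.33333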

I didn’t have enough in-depth knowledge of Stata to pull this off by myself, so I was pleased to get Joe Hilbe as a co-author. Joe is a giant in the world of Stata. He wrote several of the Stata commands that ship with the product including glm, logistic and manova. He was also the first editor of the Stata Technical Bulletin, which later turned into the Stata Journal. I have followed his work from his days as editor of the statistical software reviews section in the journal The American Statistician. There he not only edited but also wrote many of the reviews which I thoroughly enjoyed reading over the years. If you don’t already know Stata, his review of Stata 9.0 is still good reading (November 1, 2005, 59(4): 335-348).

Describe the relationship between Stata and R and how it is the same or different from SAS / SPSS and R.

This is a very interesting question. I pointed out in R for SAS and SPSS Users that SAS and SPSS are structured very similarly while R is totally different. Stata, on the other hand, has many similarities to R. Here I’ll quote directly from the book:

• Both include rich programming languages designed for writing new analytic methods, not just a set of prewritten commands.

• Both contain extensive sets of analytic commands written in their own languages.

• The pre-written commands in R, and most in Stata, are visible and open for you to change as you please.

• Both save command or function output in a form you can easily use as input to further analysis.

• Both do modeling in a way that allows you to readily apply your models for tasks such as making predictions on new data sets. Stata calls these postestimation commands and R calls them extractor functions (see the sketch after this list).

• In both, when you write a new command, it is on an equal footing with commands written by the developers. There are no additional “Developer’s Kits” to purchase.

• Both have legions of devoted users who have written numerous extensions and who continue to add the latest methods many years before their competitors.

• Both can search the Internet for user-written commands and download them automatically to extend their capabilities quickly and easily.

• Both hold their data in the computer’s main memory, offering speed but limiting the amount of data they can handle.
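
To make the postestimation/extractor parallel concrete, here is a minimal R sketch; the model and the new-data values are invented for illustration:

# Extractor functions in R play the role of Stata's postestimation commands.
fit <- lm(mpg ~ wt + hp, data = mtcars)    # mtcars ships with R

coef(fit)                         # estimated coefficients
confint(fit)                      # confidence intervals
newcars <- data.frame(wt = c(2.5, 3.0), hp = c(100, 150))
predict(fit, newdata = newcars)   # apply the model to new data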

Can the book be used by an R user for learning Stata?

That’s certainly not ideal. The sections that describe the relationship between the two languages would be good to know and all the example programs are presented in both R and Stata form. However, we spend very little time explaining the Stata programs while going into the R ones step by step. That said, I continue to receive e-mails from R experts who learned SAS or SPSS from R for SAS and SPSS Users, so it is possible.

Describe the response to your earlier work R for SAS and SPSS Users and whether any new editions are forthcoming.

I am very pleased with the reviews for R for SAS and SPSS Users. You can read them all, even the one really bad one, at http://r4stats.com. We incorporated all the advice from those reviews into R for Stata Users, so we hope that this book will be well received too.

In the first book, Appendix B: A Comparison of SAS and SPSS Products with R Packages and Functions has been particularly popular for helping people find the R packages they need. As it expanded, I moved it to the web site: http://r4stats.com/add-on-modules. All three packages are changing so fast that I sometimes edit that table several times per week!
The second edition of R for SAS and SPSS Users is due to the publisher by the end of February, so it should be in bookstores sometime in April 2011, if all goes as planned. I have a list of thirty new topics to add, and they won’t all fit. I have some tough decisions to make!
On a personal note, Ajay, it was a pleasure getting to meet you when you came to UT, especially our chats on the current state of the analytics market and where it might be headed. I love the fact that the Internet allows people to meet across thousands of miles. I look forward to reading more on DecisionStats!
About –

Bob Muenchen has twenty-eight years of experience consulting, managing and teaching in a variety of complex, research oriented computing environments. You can read about him here http://web.utk.edu/~muenchen/RobertMuenchenResume.html

Norman Nie: R GUI and More

Here is an interview with Norman Nie, SPSS founder and CEO of REvolution Computing (R platform).

Some notable thoughts:

For example, SPSS was really among the first to deliver rich GUIs that make it easier to use by more people. This is why one of the first things you’ll see from REvolution is a GUI for R – to make R more accessible and thereby further accelerate adoption.

This is good news if executed. I have often written (in agony, actually, because I use it) about the need for GUIs for R. My last post on that was here. Indeed, one reason SPSS was easily adopted by business school students (like me) in India in 2001–03 was its much better GUI compared to SAS’s GUIs.

However, some self-delusion / PR / cognitive dissonance seems at play in Dr. Nie’s words:

If you look at the last 40 years of university curriculum, SPSS – the product I helped build – has been the dominant player, even becoming the common thread uniting a diverse range of disciplines, which have in turn been applied to business. Data is ubiquitous: tools and data warehouses allow you to query a given set of data repeatedly. R does these things better than the alternatives out there; it is indeed the wave of the future.

SPSS has been a strong number two, but it has never overtaken SAS. Part of that is because SAS handles much bigger datasets far more easily than SPSS did (and that is where R’s RAM-only data size can be a concern). Given the decreasing price of RAM, the biglm-like packages, and the shift to cloud-based computing (with memory rampable on demand), this may become less of an issue, but analysts generally like to have a straightforward way of handling bigger datasets. Indeed SAS, with its vertical focus and recent social media analytics, continues to innovate both on its own and through its alliance partnerships in the enterprise software world. REvolution Computing would need to tie up with or win over such analytical partners, especially data warehousing and BI providers, to ensure R’s analytical functions can be used where they deliver maximum value to corporate as well as academic customers.

Part 2 of Nie’s interview should be interesting.

2010-2011 would likely see

Round 2: Red Corner (Nie) vs. Gray Corner (Goodnight)

if

Norman Nie can truly deliver a REvolution in Computing

or else

he becomes number two to Jim Goodnight’s software giant for the second time.

Interview: Jeanne Harris, Co-Author of Analytics at Work and Competing on Analytics

Here is an interview with Jeanne Harris, Executive Research Fellow and a Senior Executive at the Accenture Institute for High Performance and co-author of “Analytics at Work.”


Ajay- Describe your background in analytics and strategy

Jeanne- I’ve been involved in strategy and analytics since the mid 1980s, when I worked as part of a project that resulted in a Harvard Business Review article with Michael Porter entitled Information for Competitive Advantage. Since that time, I have led Accenture’s business intelligence, analytics, performance management, knowledge management, and data warehousing consulting practices. I have worked extensively with clients seeking to improve their managerial information, decision-making, analytical and knowledge management capabilities.

I am currently an Executive Research Fellow and a Senior Executive at the Accenture Institute for High Performance in Chicago. I lead the Institute’s global research agenda in the areas of information, technology, analytics and talent.

My research into building an enterprise analytical capability began in 1999, which led to an article called Data to Knowledge to Results: Building an Analytical Capability in California Management Review the following year.

In 2007, I co-authored “Competing on Analytics” with Tom Davenport, which argued that successful businesses must start making decisions based on data, not instinct.

Ajay- How is “Analytics at Work” an extension of your previous work? How is it different and distinct?

Jeanne- In Competing on Analytics we argued that there are big rewards for organizations that embrace fact-based decision-making. High performers are five times as likely to view analytical capabilities as a key element of their strategy. Across the board, high performance is associated with more extensive and sophisticated use of analytical capabilities. Companies like US-based Progressive Insurance, which build their corporate culture around analytics, have a real and sustainable competitive advantage.

As I spoke to clients, I realized that for every company that viewed analytics as their new core competitive strategy, there were many more who just wanted pragmatic advice on how to use analytics to make smarter business decisions and achieve better results.

If an analytical organization could be established by executive fiat we would not have needed to write another book. But like anything worthwhile, putting analytics to work takes effort and thought.

Fortunately, every organization, regardless of its current analytical capability, can benefit by becoming more analytical, gaining better insights, making better decisions and translating that into improved business performance. In Analytics at Work, we describe the elements an organization needs to establish a sustainable, robust, enterprise-wide analytical capability. This book contains a lot of pragmatic “how to” advice gleaned from our research and Accenture’s many client experiences.

Ajay- Do you see Analytics as a tool for cost cutting especially in the present economic scenario?

Jeanne- Certainly, analytics are an important tool for cutting costs and improving efficiency. Optimization techniques and predictive models can anticipate market shifts, enabling companies to move quickly to slash costs and eliminate waste. But there are other reasons analytics are more important than ever:

Manage risk. More precise metrics and risk management models will enable managers to make better decisions, reduce risk and monitor changing business conditions more effectively.

Know what is really working. Rigorous testing and monitoring of metrics can establish whether your actions are really making desired changes in your business or not.

Leverage existing investments (in IT and information) to get more insight, faster execution and more business value in business processes.

Invest to emerge stronger as business conditions improve. High performers take advantage of downturns to retool, gain insights into new market dynamics and invest so that they are prepared for the upturn. Analytics give executives insight into the dynamics of their business and how shifts influence business performance.

Identify and seize new opportunities for competitive advantage and differentiation.

Ajay- What are some analytics vendors and what do you think are their various differentiating points among each other?

Jeanne- Certain tools are ideally suited for some situations and not others.  It’s the same with analytics vendors.  At Accenture, we focus on matching the right tool with the demands of the situation, regardless of the vendor.

Ajay- What areas in an organization do you see Analytics applied the most and where do you think it is currently applied the least?

Jeanne- According to new Accenture research, two-thirds of senior managers in all areas of organizations in the US and UK say their top long-term objective is to develop the ability to model and predict behavior, actions and decisions to the point where individual decisions and offers can be made in real time based on the analysis at hand[1]. Analytics is most frequently used in customer-facing applications to generate customer insights, although in certain industries such as transportation it is also commonly used for logistics and yield management. Right now, analytics is probably most infrequently used in HR, although talent management analytics is a very hot topic.

Ajay- What is the best case study you can think of where analytics made a difference? And name a case study where it backfired.

Jeanne- It is hard to pick one favorite case study when we wrote a whole book full of them! Harrah’s is a great case study of a company that uses analytics for competitive differentiation. Netflix is another company that has built its entire business model around an algorithm. Of course, Google is essentially an analytical company too. What is of note is that during previous downturns, companies that thrived used data-derived insights, made by informed decision makers, to produce lasting competitive advantage.

In the book, we discuss the use and misuse of analytics as it relates to the global financial crisis, which we found to be a fascinating case study.

Ajay- Universities now offer a Master’s in Analytics. What are your thoughts on an ideal MS (Analytics) curriculum?

Jeanne- Yes, there are several universities around the world with degrees such as:

· Applied Analytics

· Applied Mathematics

· Econometrics

· Statistics

· Informatics

· Operations Research

Examples of universities in the US offering these programs include:

· Kellogg School of Management, Northwestern University

· Miami University (in Ohio)

· Central Michigan University

· Villanova University

· North Carolina State University


Obviously analysts require extensive expertise in quantitative methods and modeling technology. But this is just the starting point. To be effective in business they also require industry and business process knowledge. They need to understand enough about IT to see how analytics fit into the overall IT infrastructure. They need excellent written and verbal communication skills so their insights are understood. Analysts must also have collaboration skills to work with business managers to apply insights and achieve better business performance. So relationship and consultative skills are critical. As analytics become more central to the organization, more analysts need to know how to lead, coach and develop other professional analysts. They also will need to help coach and develop the analytical acumen of other information workers in their organizations.

Leading academic institutions are building more analytics into their business curriculums.  And the best analytics degree programs are adding more training to develop industry & business process acumen, as well as relationship, communication & consultative skills.

Biography-

Jeanne G. Harris has worked with analytics, decision support and business intelligence at Accenture for over 23 years and headed the firm’s consulting practice in that area for several years. She is now Executive Research Fellow and Director of Research for the Accenture Institute for High Performance Business. She co-authored, with Tom Davenport, the seminal, path-breaking book Competing on Analytics and now Analytics at Work.

Here is a link to the new book. Having read some of it (and still reading it), I highly recommend it as a practical, actionable guide.






Interview: Hadley Wickham, R Project Data Visualization Guru

Here is an interview with the genius behind many of the R Project’s graphical packages, Dr. Hadley Wickham.

Ajay– Describe the pivotal moments in your career in science, from high school science student up to your current role as a professor.

Hadley– After high school I went to medical school. After three years and a degree I realised that I really didn’t want to be a doctor so I went back to two topics that I had enjoyed in high school: programming and statistics. I really loved the practice of statistics, digging in to data and figuring out what was going on, but didn’t find the theoretical study of computer science so interesting. That spurred me to get my MSc in Statistics and then to apply to graduate school in the US.

The next pivotal moment occurred when I accepted a PhD offer from Iowa State. I applied to ISU because I was interested in multivariate data and visualisation and heard that the department had a focus on those two topics, through the presence of Di Cook and Heike Hofmann. I couldn’t have made a better choice – Di and Heike were fantastic major professors and I loved the combination of data analysis, software development and teaching that they practiced. That in turn led to my decision to look for a job in academia.

Ajay– You have created almost ten R Packages as per your website http://had.co.nz/. Do you think there is a potential for a commercial version for a data visualization R software? What are your views on the current commercial R packages?

Hadley– I think there’s a lot of opportunity for the development of user-friendly data visualisation tools based on R. These would be great for novices and casual users, wrapping up the complexities of the command line into an approachable GUI – see Jeroen Ooms’ http://yeroon.net/ggplot2 for an example.

Developing these tools is not something that is part of my research endeavors. I’m a strong believer in the power of computational thinking and the advantages that programming (instead of pointing and clicking) brings. Creating visualizations with code makes reproducibility, automation and communication much easier – all of which are important for good science.

Commercial packages fill a hole in the R ecosystem. They make R more palatable to enterprise customers with guaranteed support, and they can offer a way to funnel some of that money back into the R ecosystem. I am optimistic about the future of these endeavors.

Ajay– Clearly with your interest in graphics, you seem to favor visual solutions. Do you also feel that R Project could benefit from better R GUIs or GUIs for specific packages?

Hadley– See above – while GUIs are useful for novices and casual users, they are not a good fit for the demands of science. In my opinion, what R needs more is better tutorials and documentation so that people don’t need to use GUIs. I’m very excited about the new dynamic HTML help system – I think it has huge potential for making R easier to use.

Compared to other programming languages, R currently lacks good online (free) introductions for new users. I think this is because many R developers are academics and the incentives aren’t there to make freely available documentation. Personally, I would love to make (e.g.) the ggplot2 book openly available under a Creative Commons license, but I would receive no academic credit for doing so.

Ajay– Describe the top 3–5 principles you explain in your book, ggplot2: Elegant Graphics for Data Analysis. What other important topics do you cover in the book?

Hadley– The ggplot2 book gives you the theory to understand the construction of almost any statistical graphic. With this theory in hand, you are much better equipped to create visualisations that are tailored to the exact problem you face, rather than having to rely on a canned set of pre-made graphics.

The book is divided into sections based on the components of this theory, called the layered grammar of graphics, which is based on Lee Wilkinson’s excellent “The Grammar of Graphics”. It’s quite possible to use ggplot2 without understanding these components, but the better you understand them, the better your ability to critique and improve your graphics.
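
As a minimal illustration of the layered grammar, using the mtcars data that ships with R (the particular variables are chosen only for example):

# A plot is built by composing components with "+": data, aesthetic
# mappings, and layers of geoms and statistics.
library(ggplot2)

ggplot(mtcars, aes(x = wt, y = mpg)) +      # data and aesthetic mappings
  geom_point(aes(colour = factor(cyl))) +   # layer 1: points, coloured by group
  geom_smooth(method = "lm") +              # layer 2: a fitted trend line
  labs(x = "Weight", y = "Miles per gallon")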

Ajay– What are the five best tutorials that you would recommend for students learning data visualization in R? As a data visualization person do you feel that R could do with more video tutorials?

Hadley– If you want to learn about ggplot2, I’d highly recommend the following two resources:

* The Learning R blog, http://learnr.wordpress.com/
* The ggplot2 mailing list, http://groups.google.com/group/ggplot2

For general data management and manipulation (often needed before you can visualise data) and visualisation using base graphics, Quick-R (http://www.statmethods.net/) is very useful.

Local useR groups can be excellent if you live nearby. Lately, the Bay Area (http://www.meetup.com/R-Users/) and New York (http://www.meetup.com/nyhackr/) useR groups have had some excellent speakers on visualisation, and they often post slides and videos online.

Ajay– What are your personal hobbies? How important are work-life balance and serendipity for creative, scientific and academic people?

Hadley– When I’m not working, I enjoy reading and cooking. I find it’s important to take regular breaks from my research and software development work. When I come back I’m usually bursting with new ideas. Two resources that have helped shape my views on creativity and productivity are Elizabeth Gilbert’s TED talk on nurturing creativity (http://www.ted.com/index.php/talks/elizabeth_gilbert_on_genius.html) and “The Creative Habit: Learn It and Use It for Life” by Twyla Tharp (http://amzn.com/0743235266). I highly recommend both of them.

Dr Wickham’s impressive biography can be best seen at http://had.co.nz/

Data Mining 2009 – Using iPhone 3GS for Audio Interviews

I just used an iPhone 3GS for the interviews at Data Mining 2009. The next time someone comes to your company claiming to be a social media expert to do a podcast or webcast, etc. –

you can show them the iPhone and use the Voice Memos function to record it yourself.

Caveat audio: it records at 1 MB/minute, more than 5 minutes can be boring to listeners, and the iPhone can only email files up to 5 MB to yourself.

If you insist on doing interviews longer than 5 minutes, you can save them to the iPhone’s storage and use iTunes to set up your podcast.

I believe that traditional PR needs to adapt and learn these techniques, lest they fail to keep up with the technology changes, and this is especially true for public relations at technology companies.

The iPhone has a lot of features, and you can even combine them with websites like www.ustream.com to create a video podcast.

For free.

SAS Data Mining 2009 Las Vegas

I am going to Las Vegas as a guest of SAS Institute for the Data Mining 2009 conference. (Note: FCC regulations on bloggers come into effect in December, but my current policies are on the ADVERTISE page, unchanged for some months now.)

The big heavyweight of analytics, SAS Institute, showcases events in both the SAS Global Forum and the Data Mining 2009 conference, which has a virtual who’s-who of partners there. This includes my friends at Aster Data and Shawn Rogers of the BeyeNETWORK, in addition to Anne Milley, Senior Product Director. Anne is a frequent speaker for SAS Institute and has shrugged off the beginning-of-the-year NY Times spat over R and open source. True to their word, they did go ahead and launch SAS/IML with an interface to R – mindful of GPL as well as open-source sentiments.

While SPSS does have a data mining product, there is considerable discussion on its help list today about what direction IBM will allow the data mining product to evolve in.

Charlie Berger of Oracle Data Mining also announced at Oracle OpenWorld that he is going to launch a GUI-based data mining product for free (or probably on a Software-as-a-Service model) – thanks to Karl Rexer of Rexer Analytics for this tip.

While this is my first trip to Las Vegas (a change from cold TN weather), I hope to learn new stuff on data mining, including sessions on blog and text mining and statistical usage of the same. Data mining continues to be an enduring passion for me, even though I may need a divine miracle for my PhD on that topic to get funded.

Also, I may have some tweets at #M2009 for you, and some video interviews/photos. OK – watch this space.

PS – We lost to Alabama, #2 in the country, by two points because two punts were blocked by hand; it was as close as it gets.

Next week I hope to watch the South Carolina match in Orange Country.


Interview: Timo Elliott, SAP

Here is an interview with Timo Elliott, Senior Product Director, SAP BusinessObjects.

Ajay- Describe your career in science from school to Senior Director at SAP to blogger/speaker. How do you think we can convince students of the benefits of learning science and maths?

Timo- I studied economics with statistics in the UK, but I had always been a closet geek and had dabbled with computers ever since I was a kid, starting with Z80 assembler code. I started my career doing low-level computer consulting in Hong Kong, and worked on a series of basic business intelligence projects at Shell in New Zealand, cobbling together a solution based on a mainframe HR system, floppy-disk transfers, and Lotus 1-2-3 macros. When I returned to Europe, I stumbled across a small French startup that provided exactly the “decision support systems” that I had been looking for, and enthusiastically joined the company.

Over the last eighteen years, I’ve worked with hundreds of companies around the world on their BI strategy and my job today is to help evangelize what works and what doesn’t, to help organizations avoid the mistakes that others have made.

When it comes to BI initiatives, I see the results of one fundamental problem almost on a daily basis: 75% of project success depends on people, process, organization, culture, and leadership, but we typically spend 92% of our time on data and technology.

BI is NOT about technology – it’s about helping people do their jobs. So when it comes to education, we need to teach our technologists more about people, not science!

Ajay- You were the 8th employee of SAP BusinessObjects. What are the key turning points or transition stages in the BI industry that you remember seeing in the past 18 years, and how has SAP BusinessObjects responded to them?

Timo- Executive information systems and multidimensional databases have been around since at least the 1970s, but modern business intelligence dates from the early 1990s, driven by the widespread use of relational databases, graphical user interfaces, and the invention of the “semantic layer”, pioneered by BusinessObjects, that separated business terms from technical logic. For the first time, non-expert business people had self-service access to data.

This was followed by a period of rapid expansion, as leading vendors combined reporting, multidimensional, and dashboard approaches into fully-fledged suites. During this period, BusinessObjects acquired a series of related technology companies to complete the existing offer (such as the leader in operational reporting, Crystal Reports) and extend into enterprise information management and financial performance management.

Finally, the theme of the last few years has clearly been consolidation – according to Gartner, the top four “megavendors” (SAP, IBM, Microsoft, and Oracle) now make up almost two-thirds of the market, and accounted for fully 83% of the growth since last year. Perhaps as a result, user deployments are accelerating, with usage growth rates doubling last year.

Ajay- How do you think Business Intelligence would be affected by the following

a) Predictive Analytics.

Timo- Predictive analytics has been the “next big thing in BI” for at least a decade. It has been extremely important in some key areas, such as fraud detection, but the dream of “no longer managing by looking out of the rear-view mirror” has proved hard to achieve, notably because business conditions are forever changing.

We offer predictive analytics with our Predictive Workbench product – but I think the real opportunity for this technology in the future is “power analytics”, rather than “prediction”. For example, helping business people automatically cluster similar values, spot outliers, determine causal factors, and detect trend inflection points, using the data that they already have access to with traditional BI.
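
Those tasks map onto standard statistical routines. As a hedged sketch in R (not SAP’s Predictive Workbench), clustering similar values and flagging outliers on data that ships with R might look like this:

# Cluster similar observations, then flag outliers, using base R.
data(iris)                                    # example data shipped with R

clusters <- kmeans(iris[, 1:4], centers = 3)  # group similar flower measurements
table(clusters$cluster, iris$Species)         # compare clusters to known species

z <- as.vector(scale(iris$Sepal.Width))       # standardize one measure
iris[abs(z) > 3, ]                            # rows more than 3 SDs from the mean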

b) Cloud Computing.

Timo- In terms of architecture, it’s clearly not about on-demand OR on-premise: it’s about having a flexible approach that combines both approaches. You can compare information to money: today, we tend to keep our money in the bank rather than under our own mattress, because it’s safer, more convenient, and more cost-efficient. At the same time, there are situations where the convenience of cash is still essential.

Companies should be able to choose a BI strategy, and decide how to deploy it later. This is what we offer with our BI on-demand solutions, which use the same technology as on-premise. You can start to build on-premise and move it to on-demand, or vice-versa, or have a mix of both.

In terms of data, “cloud intelligence” is still a work in progress. As with modern financial instruments, we can expect to see the growth of new information services, such as our “information on-demand” product that provides data feeds from Reuters, Thomson Financial, and other providers to augment internal information systems. Looking further into the future, we can imagine new information marketplaces that would pay us “interest” to store our data in the cloud, where it can be adapted, aggregated and sold to others.

c) Social Media.

Timo- Conversations and collaboration are an essential part of effective business intelligence. We often talk about the notion of a “single view of the truth” in this industry, but that’s like saying we can have “a single view of politics” – while it’s vital to try to give everybody access to the same data, there will always be plenty of room for interpretation and discussion. BI platforms need to support this collaborative decision-making.

In particular, there are many, many studies that show up our all-too-human limitations when it comes to analyzing data. For example, did you know that children with bigger feet have better handwriting?

It’s absolutely true — because the children are older! Mixing up correlation and causality is a common issue in business intelligence, and one answer to the problem is to add more people: the more reviewers there are of the decision-making process, the better the decisions will be.
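
A small simulated sketch in R makes the point: age drives both variables, so they correlate strongly even though neither causes the other (all numbers are invented):

# Simulate a confounder: age drives both foot size and handwriting quality.
set.seed(42)
age         <- runif(200, 5, 12)                  # children aged 5 to 12
foot_size   <- 12 + 1.5 * age + rnorm(200)        # grows with age
handwriting <- 20 + 4.0 * age + rnorm(200, 0, 3)  # improves with age

cor(foot_size, handwriting)                  # strongly positive, yet spurious
# Controlling for age removes the association:
summary(lm(handwriting ~ foot_size + age))   # foot_size is no longer significant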

Analysis is also critical to the development of social media, such as analyzing sentiment trends in Twitter — a functionality we offer with SAP CRM — or tracking social communities. For example, Jive, the leader in Enterprise 2.0 platforms, offers our BI products as part of their solution, to help their customers analyze and optimize use of the system. Administrators can track if usage is trailing off in a particular department, for example.

d) Social Network Analysis.

Timo- Over the last twenty years, partly as a result of extensive automation of operational tasks with systems such as SAP, there has been a huge shift from “routine” to “non-routine” work. Today, fully 90% of business users say that their work involves decision making, problem solving, and the creation of new analysis and insight.

To help support this new creativity, organizations are becoming more porous as we work closer with our ecosystem of customers, partners, and suppliers, and we work in ever-more matrixed environments and cross-functional teams.

We’ve developed a Social Network Analyzer prototype that combines BI and social networking to create a “single view of relationships”. It can gather information from multiple different systems, such as HR, CRM, email distribution lists, project teams, Twitter, etc., to create a multi-layered view of how people are connected, across and beyond the enterprise. For more information, see the SAP Web 2.0 blog post, and you can try it yourself on our ondemand.com web site.

Ajay- What is the area that SAP BusinessObjects is very good at (strengths)? What are the key areas that you are currently seeking to improve (opportunities)?

Timo- Companies evaluating BI solutions should look at four things: product functionality for their users’ needs, fit with the overall IT architecture, the vendor’s reputation and ecosystem, and (of course) price. SAP BusinessObjects is the clear leader in the BI industry, and I’d say that SAP BusinessObjects has the best overall solution if you’re a large organization (or looking to become one) with a variety of user needs, multiple data sources, and a heterogeneous IT infrastructure.

In terms of opportunities, we have high expectations for new interfaces for casual users, and in-memory processing, which we have combined in our SAP BusinessObjects Explorer product. Initial customer feedback has been excellent, with quotes such as “finding information is as easy as using the internet” and “if you can use a computer, you can use Explorer”.

In terms of future directions, we’re taking a very transparent, Web 2.0 approach. The SAP Business Objects innovation center is modeled on Google Labs and we share our prototypes (including the Social Network Analyzer mentioned above) with anybody who’s interested, and let our customers give us early feedback on what directions we should go.

Ajay- What does Timo Elliott do for work life balance when not writing, talking, and evangelizing about Business Intelligence?

Timo- I’m a keen amateur photographer – see www.timoelliott.com/personal for more!

Biography- http://timoelliott.com/blog/about

Timo Elliott is Senior Director of Strategic Marketing for SAP BusinessObjects. For the last twenty years he has been a thought leader and conference speaker in business intelligence and performance management.

A popular and engaging speaker, Elliott presents regularly to IT and business audiences at international conferences, drawing on his experience working with enterprise customers around the globe. Topics include the latest developments in BI/PM technology, how best to succeed with BI/PM projects, and future trends in the industry.

Prior to Business Objects, Elliott was a computer consultant in Hong Kong and led analytics projects for Shell in New Zealand. He holds a first-class honors degree in Economics with Statistics from Bristol University, England.

Additional websites: http://www.sapweb20.com —  web 2.0 technology by, with, and at SAP

Email: telliott@timoelliott.com or timo.elliott@sap.com

LinkedIn: http://www.linkedin.com/in/timoelliott

Twitter: http://twitter.com/timoelliott

Flickr: http://www.flickr.com/photos/timoelliott/

Facebook: http://www.facebook.com/people/Timo-Elliott/544744135

For an earlier interview with Oracle Data Mining Product Management, Charlie Berger see https://decisionstats.wordpress.com/2009/09/02/oracle/
