Interview Timo Elliott SAP

Here is an interview with Timo Elliott, Senior Director of Strategic Marketing at SAP BusinessObjects.

Ajay- Describe your career in science from school to Senior Director at SAP to blogger/speaker. How do you think we can convince students of the benefits of learning science and maths?

Timo- I studied economics with statistics in the UK, but I had always been a closet geek and had dabbled with computers ever since I was a kid, starting with Z80 assembler code. I started my career doing low-level computer consulting in Hong Kong, and worked on a series of basic business intelligence projects at Shell in New Zealand, cobbling together a solution based on a mainframe HR system, floppy-disk transfers, and Lotus 1-2-3 macros. When I returned to Europe, I stumbled across a small French startup that provided exactly the “decision support systems” that I had been looking for, and enthusiastically joined the company.

Over the last eighteen years, I’ve worked with hundreds of companies around the world on their BI strategy and my job today is to help evangelize what works and what doesn’t, to help organizations avoid the mistakes that others have made.

When it comes to BI initiatives, I see the results of one fundamental problem almost on a daily basis: 75% of project success depends on people, process, organization, culture, and leadership, but we typically spend 92% of our time on data and technology.

BI is NOT about technology – it’s about helping people do their jobs. So when it comes to education, we need to teach our technologists more about people, not science!

Ajay- You were the 8th employee of Business Objects. What are the key turning points or transition stages in the BI industry that you remember seeing in the past 18 years, and how has SAP BusinessObjects responded to them?

Timo- Executive information systems and multidimensional databases have been around since at least the 1970s, but modern business intelligence dates from the early 1990s, driven by the widespread use of relational databases, graphical user interfaces, and the invention of the “semantic layer”, pioneered by BusinessObjects, that separated business terms from technical logic. For the first time, non-expert business people had self-service access to data.

This was followed by a period of rapid expansion, as leading vendors combined reporting, multidimensional, and dashboard approaches into fully-fledged suites. During this period, BusinessObjects acquired a series of related technology companies to complete the existing offer (such as the leader in operational reporting, Crystal Reports) and extend into enterprise information management and financial performance management.

Finally, the theme of the last few years has clearly been consolidation – according to Gartner, the top four “megavendors” (SAP, IBM, Microsoft, and Oracle) now make up almost two-thirds of the market, and accounted for fully 83% of the growth since last year. Perhaps as a result, user deployments are accelerating, with usage growth rates doubling last year.

Ajay- How do you think Business Intelligence would be affected by the following:

a) Predictive Analytics.

Timo- Predictive analytics has been the “next big thing in BI” for at least a decade. It has been extremely important in some key areas, such as fraud detection, but the dream of “no longer managing by looking out of the rear-view mirror” has proved hard to achieve, notably because business conditions are forever changing.

We offer predictive analytics with our Predictive Workbench product – but I think the real opportunity for this technology in the future is “power analytics”, rather than “prediction”. For example, helping business people automatically cluster similar values, spot outliers, determine causal factors, and detect trend inflection points, using the data that they already have access to with traditional BI.
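The "power analytics" ideas described above can be sketched in a few lines. This is a hypothetical illustration, not SAP product code; the revenue figures and the two-sigma threshold are invented for the example.

```python
# Toy "power analytics" pass over a monthly revenue series a business
# user might already have in BI: flag outliers and trend inflection points.
from statistics import mean, stdev

revenue = [100, 104, 109, 115, 122, 210, 118, 112, 107, 103, 100, 98]

# Outliers: values more than two standard deviations from the mean.
mu, sigma = mean(revenue), stdev(revenue)
outliers = [(month, v) for month, v in enumerate(revenue)
            if abs(v - mu) > 2 * sigma]

# Inflection points: months where the month-over-month change flips sign.
deltas = [b - a for a, b in zip(revenue, revenue[1:])]
inflections = [i + 1 for i, (d1, d2) in enumerate(zip(deltas, deltas[1:]))
               if (d1 > 0) != (d2 > 0)]

print("outliers:", outliers)
print("trend reversals at month indexes:", inflections)
```

The point is that neither step needs a predictive model: both run on data the user already has, which is exactly the "power analytics rather than prediction" framing.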

b) Cloud Computing.

Timo- In terms of architecture, it’s clearly not about on-demand OR on-premise: it’s about having a flexible approach that combines both approaches. You can compare information to money: today, we tend to keep our money in the bank rather than under our own mattress, because it’s safer, more convenient, and more cost-efficient. At the same time, there are situations where the convenience of cash is still essential.

Companies should be able to choose a BI strategy, and decide how to deploy it later. This is what we offer with our BI on-demand solutions, which use the same technology as on-premise. You can start to build on-premise and move it to on-demand, or vice-versa, or have a mix of both.

In terms of data, “cloud intelligence” is still a work in progress. As with modern financial instruments, we can expect to see the growth of new information services, such as our “information on-demand” product that provides data feeds from Reuters, Thomson Financial, and other providers to augment internal information systems. Looking further into the future, we can imagine new information marketplaces that would pay us “interest” to store our data in the cloud, where it can be adapted, aggregated, and sold to others.

c) Social Media.

Timo- Conversations and collaboration are an essential part of effective business intelligence. We often talk about the notion of a “single view of the truth” in this industry, but that’s like saying we can have “a single view of politics” – while it’s vital to try to give everybody access to the same data, there will always be plenty of room for interpretation and discussion. BI platforms need to support this collaborative decision-making.

In particular, there are many, many studies that show up our all-too-human limitations when it comes to analyzing data. For example, did you know that children with bigger feet have better handwriting?

It’s absolutely true — because the children are older! Mixing up correlation and causality is a common issue in business intelligence, and one answer to the problem is to add more people: the more reviewers there are of the decision-making process, the better the decisions will be.
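The foot-size confound is easy to reproduce with simulated data: age drives both variables, so they correlate across all children, but the relationship largely vanishes within a single age group. All numbers below are made up purely for illustration.

```python
# Simulated illustration of the confound: age drives both foot size and
# handwriting skill, so the two correlate even though neither causes the other.
import random

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(42)
ages = [random.randint(6, 12) for _ in range(500)]
foot_size = [15 + 0.8 * a + random.gauss(0, 1) for a in ages]
handwriting = [20 + 5 * a + random.gauss(0, 8) for a in ages]

# Pooled across all ages: a strong (but spurious) correlation.
r_all = pearson(foot_size, handwriting)

# Holding age constant, the relationship largely disappears.
eight = [i for i, a in enumerate(ages) if a == 8]
r_within = pearson([foot_size[i] for i in eight],
                   [handwriting[i] for i in eight])

print(f"all children: r = {r_all:.2f}; 8-year-olds only: r = {r_within:.2f}")
```

This is the statistical version of "add more reviewers": anyone who thinks to condition on age immediately sees that the headline correlation is an artifact.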

Analysis is also critical to the development of social media, such as analyzing sentiment trends in Twitter — a functionality we offer with SAP CRM — or tracking social communities. For example, Jive, the leader in Enterprise 2.0 platforms, offers our BI products as part of their solution, to help their customers analyze and optimize use of the system. Administrators can track if usage is trailing off in a particular department, for example.

d) Social Network Analysis.

Timo- Over the last twenty years, partly as a result of extensive automation of operational tasks with systems such as SAP, there has been a huge shift from “routine” to “non-routine” work. Today, fully 90% of business users say that their work involves decision making, problem solving, and the creation of new analysis and insight.

To help support this new creativity, organizations are becoming more porous as we work more closely with our ecosystem of customers, partners, and suppliers, and we work in ever-more matrixed environments and cross-functional teams.

We’ve developed a Social Network Analyzer prototype that combines BI and social networking to create a “single view of relationships”. It can gather information from multiple different systems, such as HR, CRM, email distribution lists, project teams, Twitter, etc., to create a multi-layered view of how people are connected, across and beyond the enterprise. For more information, see the SAP Web 2.0 blog post, and you can try it yourself on our ondemand.com web site.
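As a rough illustration of the "single view of relationships" idea, and not the actual Social Network Analyzer code, merging relationship layers from several systems can be sketched as below. The source names and people are invented.

```python
# Hypothetical sketch: merge relationship "layers" from several systems
# (HR, CRM, mailing lists) into one multi-layered view per pair of people.
from collections import defaultdict

# Each source contributes edges (person, person) for its own layer.
sources = {
    "hr_org_chart": [("alice", "bob"), ("bob", "carol")],
    "crm_accounts": [("alice", "carol"), ("alice", "bob")],
    "mailing_lists": [("bob", "carol"), ("dave", "alice")],
}

# For every undirected pair, record which layers connect them.
graph = defaultdict(set)
for layer, edges in sources.items():
    for a, b in edges:
        graph[frozenset((a, b))].add(layer)

for pair, layers in sorted(graph.items(), key=lambda kv: sorted(kv[0])):
    print(sorted(pair), "->", sorted(layers))
```

Pairs connected in several layers (Alice and Bob appear in both the org chart and the CRM) are exactly the "multi-layered" relationships the prototype is described as surfacing.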

Ajay- What is the area that SAP BusinessObjects is very good at (strength)? What are the key areas that you are currently seeking to improve (opportunities)?

Timo- Companies evaluating BI solutions should look at four things: product functionality for their users’ needs, fit with the overall IT architecture, the vendor’s reputation and ecosystem, and (of course) price. SAP BusinessObjects is the clear leader in the BI industry, and I’d say that SAP BusinessObjects has the best overall solution if you’re a large organization (or looking to become one) with a variety of user needs, multiple data sources, and a heterogeneous IT infrastructure.

In terms of opportunities, we have high expectations for new interfaces for casual users, and in-memory processing, which we have combined in our SAP BusinessObjects Explorer product. Initial customer feedback has been excellent, with quotes such as “finding information is as easy as using the internet” and “if you can use a computer, you can use Explorer”.

In terms of future directions, we’re taking a very transparent, Web 2.0 approach. The SAP BusinessObjects innovation center is modeled on Google Labs: we share our prototypes (including the Social Network Analyzer mentioned above) with anybody who’s interested, and let our customers give us early feedback on what directions we should go.

Ajay- What does Timo Elliott do for work life balance when not writing, talking, and evangelizing about Business Intelligence?

Timo- I’m a keen amateur photographer – see www.timoelliott.com/personal for more!

Biography- http://timoelliott.com/blog/about

Timo Elliott is Senior Director of Strategic Marketing for SAP BusinessObjects. For the last twenty years he has been a thought leader and conference speaker in business intelligence and performance management.

A popular and engaging speaker, Elliott presents regularly to IT and business audiences at international conferences, drawing on his experience working with enterprise customers around the globe. Topics include the latest developments in BI/PM technology, how best to succeed with BI/PM projects, and future trends in the industry.

Prior to Business Objects, Elliott was a computer consultant in Hong Kong and led analytics projects for Shell in New Zealand. He holds a first-class honors degree in Economics with Statistics from Bristol University, England.

Additional websites: http://www.sapweb20.com —  web 2.0 technology by, with, and at SAP

Email: telliott@timoelliott.com or timo.elliott@sap.com

LinkedIn: http://www.linkedin.com/in/timoelliott

Twitter: http://twitter.com/timoelliott

Flickr: http://www.flickr.com/photos/timoelliott/

Facebook: http://www.facebook.com/people/Timo-Elliott/544744135

For an earlier interview with Oracle Data Mining Product Management, Charlie Berger see https://decisionstats.wordpress.com/2009/09/02/oracle/

Interview Professor John Fox Creator R Commander

Here is an interview with Prof John Fox, creator of the very popular R language based GUI, RCmdr.

Ajay- Describe your career in science from your high school days to the science books you have written. What do you think can be done to increase interest in science in young people?

John Fox- I’m a sociologist and social statistician, so I don’t have a career in science, as that term is generally understood. I was interested in science as a child, however: I attended a science high school in New York City (Brooklyn Tech), and when I began university in 1964 at New York’s City College, I started in engineering. I moved subsequently through majors in philosophy and psychology, before finishing in sociology — had I not graduated in 1968 I probably would have moved on to something else. I took a statistics course during my last year as an undergraduate and found it fascinating. I enrolled in the sociology graduate program at the University of Michigan, where I specialized in social psychology and demography, and finished with a PhD in 1972 when I was 24 years old. I became interested in computers during my first year in graduate school, where I initially learned to program in Fortran. I also took quite a few courses in statistics and math.

I haven’t written any science books, but I have written and edited a number of books on social statistics, including, most recently, Applied Regression Analysis and Generalized Linear Models, Second Edition (Sage, 2008).

I’m afraid that I don’t know how to interest young people in science. Science seemed intrinsically interesting to me when I was young, and still does.

Ajay- What prompted you to create R Commander? How would you describe R Commander as a tool, say, for users of other languages who want to learn R but are afraid of the syntax?

John- I originally programmed the R Commander so that I could use R to teach introductory statistics courses to sociology undergraduates. I previously taught this course with Minitab or SPSS, which were programs that I never used for my own work. I waited for someone to come up with a simple, portable, easily installed point-and-click interface to R, but nothing appeared on the horizon, and so I decided to give it a try myself.

I suppose that the R Commander can ease users into writing commands, inasmuch as the commands are displayed, but I suspect that most users don’t look at them. I think that serious prospective users of R should be encouraged to use the command-line interface along with a script editor of some sort. I wouldn’t exaggerate the difficulty of learning R: I came to R — actually S then — after having programmed in perhaps a dozen other languages, most recently at that point Lisp, and found the S language particularly easy to pick up.

Ajay- I particularly like the R Cmdr plugins. Is it possible for anyone to extend R Commander with a customized package plug-in?

John- That’s the basic idea, though the plug-in author has to be able to program in R and must learn a little Tcl/Tk.

Ajay- Have you thought of using the R Commander GUI on Amazon EC2, thus making R high-performance computing available on demand (similar to Zementis model deployment using Amazon EC2)? What are your views on the future of statistical computing?

John- I’m not sure whether or how an interface like the Rcmdr, which is Tcl/Tk-based, can be adapted to cloud computing. I also don’t feel qualified to predict the future of statistical computing.

I think that R is where the action is for the near future.

Ajay- What are the best ways of using R Commander as a teaching tool? (I noticed the help is a bit outdated.)

John- Is the help a bit outdated? My intention is that the R Commander should be largely self-explanatory. Most people know how to use point-and-click interfaces. In the basic courses for which it is principally designed, my goals are to teach the essential ideas of statistical reasoning and some skills in data analysis. In this kind of course, statistical software should facilitate the basic goals of the course.

As I said, for serious data analysis, I believe that it’s a good idea to encourage use of the command-line interface.

Ajay- What are your views on R being recognized by SAS Institute for its IML product? Do you think there can be a middle way for open source and proprietary software to exist?

John- I imagine that R is a challenge for producers of proprietary software like SAS, partly because R development moves more quickly, but also because R is giving away something that SAS and other vendors of proprietary statistical software are selling. For example, I once used SAS quite a bit but don’t anymore. I also have the sense that for some time SAS has directed its energies more toward business uses of its software than toward purely statistical applications.

Ajay- Do people on the R Core team recognize the importance of GUIs? What does the rest of the R community feel? What has the feedback from users been? Any plans for corporate sponsors for R Commander? (Rattle, an R language data mining GUI, has a version called RStat at http://www.informationbuilders.com/products/webfocus/predictivemodeling.html while the free version and code is at rattle.togaware.com)

John- I feel that the R Commander GUI has been generally positively received, both by members of R Core who have said something about it to me and by others in the R community. Of course, a nice feature of the R package system is that people can simply ignore packages in which they have no interest. I noticed recently that a Journal of Statistical Software paper that I wrote several years ago on the Rcmdr package has been downloaded nearly 35,000 times.

Because I wouldn’t expect many students using the Rcmdr package in a course to read that paper, I expect that the package is being used fairly widely.

Ajay- What does John Fox do for fun or as a hobby?

John- I’m tempted to say that much of my work is fun — particularly doing research, writing programs, and writing papers and books. I used to be quite a serious photographer, but I haven’t done that in years, and the technology of photography has changed a great deal. I run and swim for exercise, but that’s not really fun. I like to read and to travel, but who doesn’t?

Biography-

Prof John Fox is a giant in his chosen fields and has edited/authored 13 books and written chapters for 12 more. He has also published some 49 journal articles, and is editor-in-chief of the R News newsletter. You can read more about Dr Fox at http://socserv.mcmaster.ca/jfox/

On R Cmdr-

R Cmdr has substantially lowered the barrier to entry for people wanting to learn R: they begin with the GUI and later transition to customization using the command line. It is so simple in its design that even undergraduates have started basic data analysis with R Cmdr after just a class. You can read more on it here at http://socserv.mcmaster.ca/jfox/Misc/Rcmdr/Getting-Started-with-the-Rcmdr.pdf

Interview Stephen Baker Author The Numerati

Here is an interview with Stephen Baker, the author of the famous and remarkable book The Numerati. Stephen is a senior writer at BusinessWeek, and his book made the world sit up and pay attention because it was among the first to describe the increasingly quant-driven lives we lead thanks to the internet and the analytical brains that power its stimulus, design, and targeting. More data is collected about consumers than at any previous point in human history, and the number crunchers, or quant jocks, are the ones who increasingly help with decision making and decision management. Steve calls these people “The Numerati”: the new math people who help shape our lives.

There will always be lawyers and financiers who make loads of money. But they will have quantitative experts on their teams- Stephen L Baker

Ajay- Describe your career journey from high school to a technology writer to author of The Numerati.

Steve- I was always interested in history and in literature, and in college I fell in love with Spanish. So after college, I moved to Ecuador, taught English, and wrote fiction. I saw early on that I wasn’t going to be able to make a living with fiction. So I went into journalism. My goal was to become a correspondent in Latin America. Through my 20s, I worked in Vermont, Madrid, Argentina, Venezuela, Washington DC, and El Paso, Texas. And I finally got the job I was looking for, bureau chief for the Mexico bureau of BusinessWeek magazine.

After Mexico, my family and I moved to Pittsburgh. It appeared that the magazine was losing interest in heavy industry in the mid-90s, so I began to write about software and robotics coming out of Carnegie Mellon University. That was my transition into technology. A year later, BusinessWeek offered me a job covering technology in Europe. I moved with my family to Paris, where we lived for four years. I focused largely on mobile communications. It seemed to me that the combination of mobility and the Internet would fundamentally change communications.

I returned to New York in 2002. I focused on big picture stories. One day in 2005, I proposed a story about the decline of the U.S. technology industry. I argued that we were behind in wireless and in broadband, and that we were graduating fewer scientists and engineers than other great powers, especially in Asia. One editor pointed out that mathematics was critical for these competitive issues. The editor in chief, Steve Adler, called for a cover story on math, and he assigned it to me. I didn’t know much about math at the time, and I still don’t. But this gave me the chance to dive into the world of data analysis. I wrote a cover story, Math Will Rock Your World, and later got the contract to write The Numerati.

Ajay- How do you think the government can motivate more American students to science careers?

Steve- I think focusing on the science that kids find cool – robotics, space and ocean exploration – would help. Funding basic research would be useful. But I don’t think it’s entirely a governmental issue. Parents, companies, universities – they all have to participate.

Ajay- What are the top tips you would give to aspiring technology writers and bloggers (like myself)?

Steve-
1) Learn about non tech subjects, such as history, literature, art and psychology
2) Work on writing clearly for non experts. Avoid jargon.
3) Do reporting
4) Do more reporting

Ajay- The Numerati portrays a math elite that breaks the stereotype of the lonely, nerdy geek. How important do you think it is that common people be more educated in math, so they are more aware of marketing operations and credit offers?

Steve- I think it’s important for common people, as you call them, to understand basic statistics. More and more of our lives are going to be analyzed and communicated to us statistically. Those who do not understand this will not know to ask the right questions, and will be easily fooled. This is also true within companies. CEOs can be fooled by numbers, just like anyone else.

Ajay- Asia delivers a disproportionate number of science graduates. Yet one generation ago, American and European heritage scientists made the trip to the moon with very basic computers. As our lives get increasingly shaped by the Numerati, how important are geo-cultural influences in its membership?

Steve- Most of the Numerati I met in the United States were born outside the U.S. The US has long relied on foreign brains, especially for its technology industry. As the Numerati study people’s lives, the quantitative experts will increasingly need to work closely with linguists, anthropologists, and psychologists. And they’ll need to understand different global cultures and languages. In this sense, the international nature of the Numerati is an advantage.

Ajay- Do you think the shift in money and influence from lawyers and financiers to scientists and mathematicians is temporary or is it here to stay?

Steve- I think it’s here to stay. There will always be lawyers and financiers who make loads of money. But they will have quantitative experts on their teams.

Ajay- What influenced your decision to be associated with Predictive Analytics world?

Steve- I had the privilege of interviewing Eric Siegel as I was researching the book. We’ve kept in touch since then. I think he’s very bright and does excellent work.

Ajay- What does Stephen Baker do when not writing books or articles or observing the world go around him?

Steve- I like to ride bicycles and I like to travel. I love Spanish and French and baseball and music.

Biography-

Stephen L. Baker is the author of The Numerati and a senior writer at BusinessWeek, covering technology. Previously he was a Paris correspondent. Baker joined BusinessWeek in March, 1987, as manager of the Mexico City bureau, where he was responsible for covering Mexico and Latin America. He was named Pittsburgh bureau manager in 1992. Before BusinessWeek, Baker was a reporter for the El Paso Herald-Post. Prior to that, he was chief economic reporter for The Daily Journal in Caracas, Venezuela. Baker holds a bachelor’s degree from the University of Wisconsin and a master’s from the Columbia University Graduate School of Journalism.

You can read more about the Numerati at http://thenumerati.net/index.cfm?catID=18. Stephen L Baker is the keynote speaker at Predictive Analytics World, and you can check the details at http://www.predictiveanalyticsworld.com/register.php if you want to listen to him at the event.

You can follow Steve on twitter at http://twitter.com/stevebaker and follow his blog here http://www.businessweek.com/the_thread/blogspotting/

Interview Jeff Bass, Bass Institute (Part 2)

During the 1980s and early 1990s, the Bass Institute managed to attract a loyal following with its SAS language compiler, ultimately bowing to the financial and technological pressures of the move to the desktop. In 2009, as the SAS language gains a new compiler in the form of WPS, and computing paradigms begin to shift from the desktop to cloud computing, Jeff Bass, founder of the Bass Institute and gifted tech coder, brings a perspective rich in experience.

If we don’t learn from history, we are condemned to repeat it.

Ajay- Describe your career in science. How would you motivate children in classrooms today to be as excited about science as the moon generation was?

J Bass- My graduate training was in economics and statistics.  I have used that training in ways that I would never have anticipated when I was in graduate school 30 years ago.  But it is still exciting for me.  I started out building microeconomic models, then went on to write statistical language compilers and build health policy macroeconomic models.  These days I develop and articulate health policy to help increase patients’ access to cutting-edge medicines.  The company I work for now is very science-based and even applies scientific thinking, measurement and testing of alternatives in the business side of its operations.

I spend volunteer time as a guest teacher at local middle schools, high schools and community colleges.  I often talk about math and statistics and have found that one way to help motivate students is to give them “fun” example problems.  I often use an example of the 1969 lunar orbital calculations to motivate basic trigonometry, and quite a number of students who say they don’t like math end up loving solving parts of that problem.  I think our school curricula need to come up with problems and examples that the students find interesting.  I’m not sure our existing curriculum processes make this an easy thing to do.  All too often we teach techniques without combining that teaching with strong motivating examples that make learning fun.
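One example in the spirit of the lunar-orbit problems described above (this specific exercise and its figures are my own illustration, using approximate values): from a spacecraft about 110 km above the Moon, how far away is the horizon? The sight line to the horizon is tangent to the surface, so it meets the lunar radius at a right angle and basic trigonometry takes over.

```python
# Classroom-style lunar horizon problem: right-triangle trigonometry.
import math

R = 1737.0   # mean lunar radius, km (approximate)
h = 110.0    # rough Apollo parking-orbit altitude, km

# Tangent sight line => right triangle: d^2 + R^2 = (R + h)^2.
d = math.sqrt((R + h) ** 2 - R ** 2)

# Dip angle: how far below "level" the horizon appears from orbit.
dip = math.degrees(math.acos(R / (R + h)))

print(f"horizon distance: {d:.0f} km, dip angle: {dip:.1f} degrees")
```

Students get a concrete payoff (the horizon is over 600 km away, far below "level") from nothing more than the Pythagorean theorem and an inverse cosine.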

Ajay- What are the changes in paradigms that you have seen across the decades? What are the key insights and summaries that you can provide?

J Bass- Our increasing understanding of biology and DNA is a major paradigm shift that is combining molecular biology and protein chemistry with computer science.  Identifying the human DNA sequence was only the beginning.  Imagine that you were handed the bit sequence of a CD-ROM and were told to figure out what parts of it were a text document, what parts were a JPEG photograph and what parts were an MP3 music file – if you did NOT know the coding schemes of such files.  That’s analogous to where we are today with DNA sequences…we know the ATCG sequence, but we are only scratching the surface of understanding the things that the DNA sequence codes for – proteins, cell metabolism, differentiating cell reproduction.
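The CD-ROM analogy can be made concrete with a toy sketch: guessing what a run of bytes "codes for" purely from its statistics, without knowing any file format. The thresholds and stand-in data below are invented for illustration, not a real file-type detector.

```python
# Guess what a byte run "codes for" from statistics alone, knowing no formats.
import math
from collections import Counter

def entropy(chunk: bytes) -> float:
    """Shannon entropy in bits per byte."""
    n = len(chunk)
    return -sum(c / n * math.log2(c / n) for c in Counter(chunk).values())

def guess(chunk: bytes) -> str:
    printable = sum(32 <= b < 127 for b in chunk) / len(chunk)
    if printable > 0.95:
        return "text"
    # Compressed media (JPEG, MP3) looks nearly random; record-style
    # binary formats repeat a small alphabet of bytes.
    return "media or compressed" if entropy(chunk) > 6.5 else "structured binary"

print(guess(b"the quick brown fox jumps over the lazy dog " * 20))
print(guess(bytes(range(256)) * 4))       # stands in for JPEG/MP3-like data
print(guess(b"\x00\x01\x00\x02" * 200))   # stands in for a record format
```

Genomics faces the same problem at vastly greater scale: we can read the "bytes" (ATCG) but are still learning which statistical signatures mark genes, regulatory regions, and the rest.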

Interview Neil Raden Founder of Hired Brains Inc

Here is an interview with one terrific person who has always inspired my writing (or at least my attempts to write) on data and systems. Neil Raden is a giant in the publishing and consulting space for business intelligence, analytics, and decision management. In a nice interview Neil talks of his passion for his work, his prolific authoring of white papers, his seminal work with James Taylor, and how he sees the BI space evolve.

The history of BI pretty much follows the history of computing platforms. First we had time-sharing, then mainframes, then minis, then client-server vs. PC, then a number of passes at distributed computing, such as CORBA, then SOA and now the cloud. - Neil Raden


Ajay- Describe your career in math and technology and your current activities. How would you explain what you do for a living to a group of high school students who are wondering whether or not to take up mathematical and technical subjects?

Neil- I didn’t earn a dime at the career I was meant for, consulting, until I was 33 years old. So I would tell college students not to be in such a hurry to corner themselves into a career. It may take a while to figure out what you really want.

Though I went to college to study theatre, within a few weeks I was inspired by a math professor and switched my major. From that point on, it was pads of paper and sharp pencils. I was totally in my own head with math. I never took a statistics course, or even differential equations, because I was consumed by discrete math (graph theory too), topology and logic and later game theory/economics.

When I went looking for a job in 1974, in the midst of a deep recession, I was confronted with the stark reality (in New York ) that I could be a COBOL programmer or an actuary. I chose the latter. Working at AIG in New York in the 70’s was pretty exciting. We broke new ground in commercial property and casualty insurance and reinsurance every week. I was part of a small R&D group under the chief actuary, who reported directly to Maurice Greenberg, the legendary (but now maligned) builder of AIG, and I loved the work.

I had to go back and teach myself probability and statistics to get through the exams, but ultimately, two kids and one on the way in NYC on one not-so-great salary was a deal-breaker. I left AIG and joined a software company doing modeling and prediction. The rest, as they say, is history. I formed my own consulting company in 1985 and I’m still at it.

To me, consulting isn’t something you do between jobs or a title you get because you implement software for clients. Consulting is a craft, it’s a career and it is rather easy to do but very difficult to learn. I work very hard to teach this to people who work for me. It’s about commitment, hard work and, most of all, ethics and being authentic with your client.

Ajay- Writing books is lonely yet rewarding work. Could you briefly tell us about your recent book, Smart (Enough) Systems?

Neil- I have to credit my partner, James Taylor, with the concept for the book. He was working at Fair Isaac (now FICO) at the time and this was exactly what he was doing there. It was a little tangential to my work, but when James approached me, he said he wanted a partner who was proficient in the data integration and analytics aspects of EDM (Enterprise Decision Management).

James made it pretty easy because

1) he is very prolific and 2) he took most of my comments and integrated them without argument.

I’d say I was pretty lucky and it went very well. I don’t know if I’ll ever write another book. I suppose I won’t know until the idea hits me. I’m sure it will be more difficult doing it on my own.

Ajay- What are the various stages that you have seen the BI industry go through? What are the next few years going to bring us?

What is your wishlist for changes the industry could make for better customer ROI?

Neil- The history of BI pretty much follows the history of computing platforms. First we had time-sharing, then mainframes, then minis, then client-server vs. PC, then a number of passes at distributed computing, such as CORBA, then SOA and now the cloud. But while the locus of BI storage, computing and presentation has changed, its focus changes very slowly.

Historically, there have been two major subject areas in BI: finance and sales/marketing. All of the other subject areas still rest on the periphery.

Complex Event Processing (CEP), for example, is making a lot of noise lately, but not much implementation. Visualization is here to stay. When the BI app and the Web app are the same, BI will be everywhere, but it will be a sort of pyrrhic victory because it won’t be recognized as such. Now you can take all of this with a grain of salt because I don’t really follow the industry per se; I’m more interested in how my clients can apply the technology to get the results they need.

Ajay- There is a lot of buzz about predictive analytics lately. Do you think it will have a noticeable impact or is it just the latest thing?

Neil- There are only so many people who understand quantitative methods, and that number isn’t going to grow very much. This puts a damper on PA (Predictive Analytics), because no manager is going to act on the recommendations of a black box without an articulate quant who can explain the methodology and the limits of its precision.

That isn’t a bad thing, and those who practice predictive analytics will prosper.

On the other hand, I believe there will be an expansion of the use of generic PA models that have been vetted in practice. The FICO score is a good example, and the ability to develop and implement these applications (it’s much easier now thanks to PA software and computing environments in general) should allow for a nice market to develop around them. This is especially true with decision automation systems, like logistics, material handling, credit authorization, etc.

Ajay- What were your most interesting projects as an implementer? Most rewarding?

Neil- Most interesting: I was the Chairman of an Advisory Board at Sandia National Laboratories for a few years. Our goal was to encourage the lab to adopt more modern and effective information management tools for their dual purpose of

1) designing and manufacturing nuclear weapons (frightening isn’t it?) and

2) certification of nuclear waste repositories.

I was able to work with scientists, physicists, engineers, geologists and computer scientists, all from backgrounds very different from those I normally engaged with. The problems were monumental.

Most rewarding: We developed a data warehouse to capture the daily sales of products at the most detailed level for a cosmetics company. They had never had this information before because their retail outlets were counters in hundreds of department stores. Thus they were able, for the first time, to truly understand the “sell-through” of their products. Beyond just allowing a better understanding of the flow, they could tailor their promotions and, not much later, implement a continuous replenishment system.

The president of the company came to the launch and explained how we had allowed the company to do things it had never done before which would change it for the better. You don’t get those accolades from the CEO very often.

Ajay- You’ve written forty white papers. That’s a lot. What impact do you think they’ve had?

Neil- I couldn’t tell you. I don’t track downloads, and my website doesn’t even require registration. I don’t see them quoted or cited very often, but then, people don’t quote or cite others’ work in this field very often anyway. I can say that I have many repeat customers among the vendors, so they must be deriving some value from them.

Ajay- What are your views on creating a community for the top 100 BI analysts in the world – a bit like a Reuters or a partnership firm. How pleased do you think BI vendors would be with this?

Neil- I was actually involved in an effort like this about a dozen years ago, called BI Alliance. Doug Hackney and I started it, and we had about a dozen BI luminaries in the organization. I’ll try to remember some: Sid Adelman, David Marco, Richard Winter, David Foote, Herb Edelstein.

You could only join if you were an independent or the head of your own firm.

It was a useful marketing tool as we were able to 1) share references and 2) staff projects. But it sort of lost its momentum after a few years.

But a few hundred BI analysts? Are there that many?? LOL I don’t know how the vendors would react, but I sort of doubt this sort of organization would have any kind of clout – too many divergent opinions.

Ajay- Do you think the work you do matters?

Neil- It certainly has an economic impact on my family! LOL I don’t know; I hope it does, and proportionate to my income versus the size of the industry, yes, I guess it does. Not necessarily directly, though.

A company in Dayton or Macon doesn’t make a decision because I said so, but I think I do influence some analysts and vendors, and to the extent I influence them, then I guess I do. I limit my analysis to my clients. If they think this work matters, then it does.

Biography-

Neil Raden, consultant, analyst and author, is followed by technology providers, consultants and even other analysts. His knowledge of analytical applications is the result of thirty years of intensive work. He is the founder of Hired Brains, a research and advisory firm in Santa Barbara, CA, offering research and analysis services to technology providers as well as consulting and implementation services. Mr. Raden began his career as a casualty actuary with AIG before moving into software engineering and consulting in the application of analytics in fields as diverse as health care, nuclear waste management and cosmetics marketing. His blog can be found at intelligententerprise.com/experts/raden/. He is the author of dozens of articles and white papers, has contributed to numerous books, and is the co-author of “Smart (Enough) Systems” (Prentice Hall, 2007) with James Taylor. nraden@hiredbrains.com

Alternatively, you can follow Neil Raden on Twitter at his id neilraden.

Interview Dylan Jones DataQualityPro.com

Here is an interview with Dylan Jones, the founder and editor of Dataqualitypro.com, the site to go to for anything related to data quality discussions. Dylan is a charming person, and in this interview he talks candidly about his views.

Ajay: Describe your career in science and in business intelligence. How would you convince young students to take more maths and science courses for scientific careers?

Dylan: My main education for the profession was a degree in Information Technology and Software Development. No surprises what my first job entailed – software development for an IT company!

That role took me straight into the trials and tribulations of business intelligence and data quality. After a couple of years I went freelance and have pretty much worked for myself ever since. There has been a constant thread of data quality, business intelligence and data migration throughout my career which culminated in me setting up the more recent social media initiatives to try and pull professionals together in this space.

In all honesty, I’m probably the worst person to give career advice Ajay as I’m a hopeless dreamer. I’ve never really structured my career. I fell into data quality early on and it has led me to work in some wonderful places and with some great people, largely by accident and fate.

I have a simple philosophy, do what you love doing. I’m incredibly lucky to wake up every day with an absolute passion for what I do. In the past, whenever I have found myself working in a situation that I find soul destroying (and in our profession that can happen regularly) I move on to something new.

So, my advice for people starting out would be to first question what makes them happy in life. Don’t simply follow the herd. The internet has totally transformed the rules of the game in terms of finding an outlet for your skills so follow your heart, not conventional wisdom.

That said, I think there are some core skills that will always provide a springboard. Maths is obviously one of those skills that can open many doors but I would also advise people to learn about marketing, sales and other business fundamentals. From a business intelligence perspective it really adds an attractive dimension to your skills if you can link technical ability with a deeper understanding of how businesses operate.

Ajay- You are a top expert and publisher on BI topics. Tell us something about

a) http://www.datamigrationpro.com/

b) http://www.dataqualitypro.com/

c) Involvement with the DataFlux community of experts

d) Your latest venture http://www.dqvote.com

Dylan- Data Migration Pro was my first foray into the social media space. I realised that very few people were talking about the challenges and techniques of data migration. On average, large organisations implement around 4 migration projects a year and most end in failure. A lot of this is due to a lack of awareness. Having worked for so long in this space I felt it was time to create a social media site to bring the wider community together. So we now have forums, regular articles, tools and techniques on the site with about 1400 members worldwide plus lots of plans in the pipeline for 2010.

Data Quality Pro followed on from the success of Data Migration Pro and our speed of growth really demonstrates how important data quality is right now. Again, awareness of the basic techniques and best-practices is key. I think many organisations are really starting to recognise the importance of better data quality management practices so a lot of our focus is on giving people practical advice and tools to get started. We are a community publishing platform, I do write regularly but we’ve always had a significant community contribution from expert practitioners and authors.

I didn’t just want to take a corporate viewpoint with these communities. As a result they are very much focused on the individual. That is why we post so many features on how to promote your skills, search for work, gain personal skills and generally get ahead in the profession. Data Quality Pro has just under 2,000 members and about 6,000 regular visitors a month, which demonstrates just how many people are committed to learning about this discipline, as it impacts practically every part of the business. I also think it is an excellent career choice: with so many projects dependent on good quality data, there will always be demand.

The DataFlux community of experts is a great resource that I’ve actually admired for some time. I am a big fan of Jill Dyche who used to write on the community and of course there is a great line-up on there now with experts like David Loshin, Joyce Norris-Montanari and Mike Ferguson so I was delighted to be invited to participate. DataFlux have sponsored our sites from the very beginning and without their support we wouldn’t have grown to our current size. So although I’m vendor independent, it’s great to be sharing my thoughts and ideas with people who visit their site.

DQVote.com is a relatively new initiative. I noticed that there was some great data quality content being linked through platforms like Twitter but it would essentially become hard to find after several days. Also, there was no way for the community to vote on what content they found especially useful. DQVote.com allows people to promote their own content but also to vote and share other useful data quality articles, blogs, presentations, videos, tutorials – anything that adds value to the data quality community. It is also a great springboard for emerging data quality bloggers and publishers of useful content.

Ajay- Do you think BI projects can be more successful if we reward data entry people, or at least pay more for better quality data rather than ask them to fill in database tables as fast as they can? Especially in offshore call centres.

Dylan- Data entry is a pet frustration of mine. I regularly visit companies who are investing hundreds of thousands of pounds in data quality technology and consultants but nothing in grass-roots education and cultural change. They would rather create cleansing factories than resolve the issues at source.

So, yes I completely agree, the reward system has to change. I personally suffer from this all the time – call centre staff record incorrect or incomplete information about my service or account and it leads to billing errors, service problems, annoyance and eventually lost business. Call centre staff are not to blame, they are simply rewarded on the volume of customer service calls they can make, they are not encouraged to enter good quality data. The fault ultimately lies with the corporations that use these services and I don’t think offshore or onshore makes a difference. I’ve witnessed terrible data quality in-house also. The key is to have service level agreements on what quality of data is acceptable. I also think a reward structure as opposed to a penalty structure can be a much more progressive way of improving the quality of call-centre data.

Ajay- What are the top 5 points that summarize your views on Business Intelligence – assume you are speaking to a class of freshman statisticians.

Dylan- Business intelligence is wholly dependent on data quality. Accessibility, timeliness, accuracy, completeness, duplication – data quality dimensions like these can dramatically change the value of business intelligence to the organisation. Take nothing for granted with data, assume nothing. I have never, ever, assessed a dataset in a large business that did not have some serious data defects that were impacting decision making.

As statisticians, they therefore possess the tools to help organisations discover and measure these defects. They can find ways to continuously improve and ensure that future decisions are based on reliable data.
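The dimensions Dylan lists, such as completeness and duplication, can be measured directly from a dataset. A minimal sketch in Python of profiling those two dimensions over a toy customer table; the records, field names, and key choice are hypothetical examples, not data from the interview:

```python
# Profile two data quality dimensions over a list of record dicts:
# completeness (share of non-missing values per field) and
# duplication (share of rows whose key fields repeat an earlier row).

def profile(records, key_fields):
    n = len(records)
    fields = {f for r in records for f in r}
    completeness = {
        f: sum(1 for r in records if r.get(f) not in (None, "")) / n
        for f in sorted(fields)
    }
    seen, dupes = set(), 0
    for r in records:
        key = tuple(r.get(f) for f in key_fields)
        if key in seen:
            dupes += 1
        seen.add(key)
    return completeness, dupes / n

# Toy dataset: one duplicate customer, two missing values.
customers = [
    {"name": "Ann Lee", "email": "ann@example.com", "phone": "555-0100"},
    {"name": "Ann Lee", "email": "ann@example.com", "phone": ""},
    {"name": "Bo Chan", "email": None,              "phone": "555-0101"},
]

completeness, dup_rate = profile(customers, key_fields=["name", "email"])
```

Even a profile this simple makes defects visible and trackable over time, which is the continuous improvement Dylan is pointing statisticians toward.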

I would also add that business intelligence is not just about technology, it is about interpreting data to determine trends that will enable a company to improve their competitive advantage. Statistics are important but freshmen must also understand how organisations really create value for their customers.

My advice is to therefore step away from the tools and learn how the business operates on the ground. Really listen to workers and customers as they can bring the data to life. You will be able to create far more accurate dashboards and reports of where the issues and opportunities lie within a business if you immerse yourself with the people who create the data and the senior management who depend on the quality of your business intelligence platforms.

Ajay- Which software have you personally coded or implemented? Which one did you like best, and why?

Dylan- I’ve used most of the BI and DQ tools out there, all have strengths and weaknesses so it is very subjective. I have my favourites but I try to remain vendor neutral so I’ll have to gracefully decline on this one Ajay!

However, I did build a data profiling and data quality assessment tool several years ago. To be honest, that is the tool I like best because it had a range of features I still haven’t seen implemented in any other tool. If I ever get the chance, and if no other vendor comes up with the same concept, I may yet take it to market. For now though, two young kids, two communities and a 12-hour day mean it is something of a pipe dream.

Ajay- What does Dylan Jones do when not helping make the world’s data quality better?

Dylan- I’ve recently had another baby boy so kids take up most of whatever free time I have left. When we do get a break though I like to head to my home town and just hang out on the beach or go up into the mountains. I love travelling and as I effectively work completely online now, we’re really trying to figure out a way of combining travel and work.

Biography-

Dylan Jones is the founder and editor of Data Quality Pro and Data Migration Pro, the leading online expert community resources. Since the early nineties he has been helping large organisations tackle major information management challenges. He now devotes his time to fostering greater awareness, community and education in the fields of data quality and data migration via the use of social media channels. Dylan can be contacted via his profile page at http://www.dataqualitypro.com/data-quality-dylan-jones/ or at http://www.twitter.com/dataqualitypro

Best of Decision Stats- Modeling and Text Mining Part3

Here are some of the top articles by way of views, in an area I love: modeling and text mining.

1) Karl Rexer – Rexer Analytics

http://www.decisionstats.com/2009/06/09/interview-karl-rexer-rexer-analytics/

Karl produces one of the most respected surveys that captures emerging trends in data mining and technology. Karl was also one of the most enthusiastic people I have interviewed- and I am thankful for his help in getting me some more interviews.

2) Gregory Piatetsky-Shapiro

One of the earliest and easily the best knowledge discoverers of all time, Gregory produces http://www.kdnuggets.com, and his newsletter is easily the must-have newsletter to be on. Gregory was doing data mining while the Google boys were still debating whether to drop out of Stanford or not.