Brief Interview Timo Elliott

Here is a brief interview with Timo Elliott of SAP BusinessObjects.

Ajay- What are the top 5 events in Business Integration and Data Visualization services you saw in 2010, and what are the top three trends you see in these for 2011?


Timo-

Top five events in 2010:

(1) Back to strong market growth. IT spending plummeted last year (BI continued to grow, but more slowly than previous years). This year, organizations reopened their wallets and funded new analytics initiatives — all the signs indicate that BI market growth will be double that of 2009.

(2) The launch of the iPad. Mobile BI has been around for years, but the iPad opened the floodgates of organizations taking a serious look at mobile analytics — and the easy-to-use, executive-friendly iPad dashboards have considerably raised the profile of analytics projects inside organizations.

(3) Data warehousing got exciting again. Decades of incremental improvements (column databases, massively parallel processing, appliances, in-memory processing…) all came together with robust commercial offers that challenged existing data storage and calculation methods. And new “NoSQL” approaches, designed for the new problems of massive amounts of less-structured web data, started moving into the mainstream.

(4) The end of Google Wave, the start of social BI. Google Wave was launched as a rethink of how we could bring together email, instant messaging, and social networks. While Google decided to close down the technology this year, it has left its mark, notably by influencing the future of “social BI”, with several major vendors bringing out commercial products this year.

(5) The start of the big BI merge. While several small independent BI vendors reported strong growth, the major trend of the year was consolidation and integration: the BI megavendors (SAP, Oracle, IBM, Microsoft) increased their market share (sometimes by acquiring smaller vendors, e.g. IBM/SPSS and SAP/Sybase) and integrated analytics with their existing products, blurring the line between BI and other technology areas.

Top three trends next year:

(1) Analytics, reinvented. New DW techniques make it possible to do sub-second, interactive analytics directly against row-level operational data. Now BI processes and interfaces need to be rethought and redesigned to make best use of this — notably by blurring the distinctions between the “design” and “consumption” phases of BI.

(2) Corporate and personal BI come together. The ability to mix corporate and personal data for quick, pragmatic analysis is a common business need. The typical solution to the problem — extracting and combining the data into a local data store (either Excel or a departmental data mart) — pleases users, but introduces duplication and extra costs and makes a mockery of information governance. 2011 will see the rise of systems that let individuals and departments load their data into personal spaces in the corporate environment, allowing pragmatic analytic flexibility without compromising security and governance.

(3) The next generation of business applications. Where are the business applications designed to support what people really do all day, such as implementing this year’s strategy, launching new products, or acquiring another company? 2011 will see the first prototypes of people-focused, flexible, information-centric, and collaborative applications, bringing together the best of business intelligence, “enterprise 2.0”, and existing operational applications.

And one that should happen, but probably won’t:

(4) Intelligence = Information + PEOPLE. Successful analytics isn’t about technology — it’s about people, process, and culture. The biggest trend in 2011 should be organizations spending the majority of their efforts on user adoption rather than technical implementation.

About- http://timoelliott.com/blog/about

Timo Elliott is a 19-year veteran of SAP BusinessObjects, and has spent the last twenty years working with customers around the world on information strategy.

He works closely with SAP research and innovation centers around the world to evangelize new technology prototypes.

His popular Business Analytics and SAPWeb20 blogs track innovation in analytics and social media, including topics such as augmented corporate reality, collaborative decision-making, and social network analysis.

His PowerPoint Twitter Tools lets presenters see and react to tweets in real time, embedded directly within their slides.

A popular and engaging speaker, Elliott presents regularly to IT and business audiences at international conferences, on subjects such as why BI projects fail and what to do about it, and the intersection of BI and enterprise 2.0.

Prior to Business Objects, Elliott was a computer consultant in Hong Kong and led analytics projects for Shell in New Zealand. He holds a first-class honors degree in Economics with Statistics from Bristol University, England. He blogs at http://timoelliott.com/blog/ (one of the best-designed blogs in BI). You can see more on his personal web site and photo/sketch blog. You should follow Timo at http://twitter.com/timoelliott

Art Credit- Timo Elliott

Related Articles

Short Interview Jill Dyche

Here is a brief one-question interview with Jill Dyché, founder of Baseline Consulting.


In 2010, it was more about consciousness-raising in the executive suite—getting C-level managers to understand the ongoing value proposition of BI, why MDM isn’t their father’s database, and how data governance can pay for itself over time. Some companies succeeded with these consciousness-raising efforts. Some didn’t.


But three big ones in 2011 would be:

  1. Predictive analytics in the cloud. The technology is now ready, and so is the market—and that includes SMB companies.
  2. Enterprise search being baked into (commoditized) BI software tools. (The proliferation of static reports is SO 2006!)
  3. Data governance will begin paying dividends. Until now it was all about common policies for data. In 2011, it will be about ROI.

I do a “Predictions for the coming year” article every January for TDWI.

Note- Jill’s January TDWI article seems worth waiting for in this case.

About-

Source- http://www.baseline-consulting.com/pages/page.asp?page_id=49125

Partner and Co-Founder

Jill Dyché is a partner and co-founder of Baseline Consulting. She is responsible for key client strategies and market analysis in the areas of data governance, business intelligence, master data management, and customer relationship management.

Jill counsels boards of directors on the strategic importance of their information investments.

Author

Jill is the author of three books on the business value of IT. Jill’s first book, e-Data (Addison Wesley, 2000), has been published in eight languages. She is a contributor to Impossible Data Warehouse Situations: Solutions from the Experts (Addison Wesley, 2002), and her book, The CRM Handbook (Addison Wesley, 2002), is the bestseller on the topic.

Jill’s work has been featured in major publications such as Computerworld, Information Week, CIO Magazine, the Wall Street Journal, the Chicago Tribune and Newsweek.com. Jill’s latest book, Customer Data Integration (John Wiley and Sons, 2006) was co-authored with Baseline partner Evan Levy, and shows the business breakthroughs achieved with integrated customer data.

Industry Expert

Jill is a featured speaker at industry conferences, university programs, and vendor events. She serves as a judge for several IT best practice awards. She is a member of the Society of Information Management and Women in Technology, a faculty member of TDWI, and serves as a co-chair for the MDM Insight conference. Jill is a columnist for DM Review, and a blogger for BeyeNETWORK and Baseline Consulting.

 

Brief Interview with James G Kobielus

Here is a brief one-question interview with James Kobielus, Senior Analyst, Forrester.

Ajay- Describe the five most important events in Predictive Analytics you saw in 2010, and the top three trends you see for 2011.

Jim-

Five most important developments in 2010:

  • Continued emergence of enterprise-grade Hadoop solutions as the core of the future cloud-based platforms for advanced analytics
  • Development of the market for analytic solution appliances that incorporate several key features for advanced analytics: massively parallel EDW appliance, in-database analytics and data management function processing, embedded statistical libraries, prebuilt logical domain models, and integrated modeling and mining tools
  • Integration of advanced analytics into core BI platforms with user-friendly, visual, wizard-driven tools for quick, exploratory predictive modeling, forecasting, and what-if analysis by nontechnical business users
  • Convergence of predictive analytics, data mining, content analytics, and CEP in integrated tools geared to real-time social media analytics
  • Emergence of CRM and other line-of-business applications that support continuously optimized “next-best action” business processes through embedding of predictive models, orchestration engines, business rules engines, and CEP agility

Three top trends I see in the coming year, above and beyond deepening and adoption of the above-bulleted developments:

  • All-in-memory, massively parallel analytic architectures will begin to gain a foothold in complex EDW environments in support of real-time elastic analytics
  • Further crystallization of a market for general-purpose “recommendation engines” that, operating inline to EDWs, CEP environments, and BPM platforms, enable “next-best action” approaches to emerge from today’s application siloes
  • Incorporation of social network analysis functionality into a wider range of front-office business processes to enable fine-tuned behavioral-based customer segmentation to drive CRM optimization

About- http://www.forrester.com/rb/analyst/james_kobielus

James G. Kobielus
Senior Analyst, Forrester Research

RESEARCH FOCUS

James serves Business Process & Applications professionals. He is a leading expert on data warehousing, predictive analytics, data mining, and complex event processing. In addition to his core coverage areas, James contributes to Forrester’s research in business intelligence, data integration, data quality, and master data management.

PREVIOUS WORK EXPERIENCE

James has a long history in IT research and consulting and has worked for both vendors and research firms. Most recently, he was at Current Analysis, an IT research firm, where he was a principal analyst covering topics ranging from data warehousing to data integration and the Semantic Web. Prior to that position, James was a senior technical systems analyst at Exostar (a hosted supply chain management and eBusiness hub for the aerospace and defense industry). In this capacity, James was responsible for identifying and specifying product/service requirements for federated identity, PKI, and other products. He also worked as an analyst for the Burton Group and was previously employed by LCC International, DynCorp, ADEENA, International Center for Information Technologies, and the North American Telecommunications Association. He is both well versed and experienced in product and market assessments. James is a widely published business/technology author and has spoken at many industry events.

Interview Jamie Nunnelly NISS

An interview with Jamie Nunnelly, Communications Director of the National Institute of Statistical Sciences.

Ajay– What does NISS do? And what does SAMSI do?

Jamie– The National Institute of Statistical Sciences (NISS) was established in 1990 by the national statistics societies and the Research Triangle universities and organizations, with the mission to identify, catalyze and foster high-impact, cross-disciplinary and cross-sector research involving the statistical sciences.

NISS is dedicated to strengthening and serving the national statistics community, most notably by catalyzing community members’ participation in applied research driven by challenges facing government and industry. NISS also provides career development opportunities for statisticians and scientists, especially those in the formative stages of their careers.

The Institute identifies emerging issues to which members of the statistics community can make key contributions, and then catalyzes the right combinations of researchers from multiple disciplines and sectors to tackle each problem. More than 300 researchers from over 100 institutions have worked on our projects.

The Statistical and Applied Mathematical Sciences Institute (SAMSI) is a partnership of Duke University, North Carolina State University, The University of North Carolina at Chapel Hill, and NISS, in collaboration with the William Kenan Jr. Institute for Engineering, Technology and Science, and is part of the Mathematical Sciences Institutes of the NSF.

SAMSI focuses on 1-2 programs of research interest in the statistical and/or applied mathematical areas; visitors from around the world are involved with the programs and come from a variety of disciplines in addition to mathematics and statistics.

Many come to SAMSI to attend workshops, and also participate in working groups throughout the academic year. Many of the working groups communicate via WebEx so people can be involved with the research remotely. SAMSI also has a robust education and outreach program to help undergraduate and graduate students learn about cutting edge research in applied mathematics and statistics.

Ajay– What successes have you had in 2010, and what do you need to succeed in 2011? What’s planned for 2011, anyway?

Jamie– NISS has had a very successful collaboration with the National Agricultural Statistical Service (NASS) over the past two years that was just renewed for the next two years. NISS & NASS had three teams consisting of a faculty researcher in statistics, a NASS researcher, a NISS mentor, a postdoctoral fellow and a graduate student working on statistical modeling and other areas of research for NASS.

NISS is also working on a syndromic surveillance project with Clemson University, Duke University, The University of Georgia, and The University of South Carolina. The group is currently working with some hospitals to test a model they have been developing to help predict disease outbreaks.

SAMSI had a very successful year, with two programs ending this past summer: the Stochastic Dynamics program and the Space-time Analysis for Environmental Mapping, Epidemiology and Climate Change program. Several papers were written and published, and many presentations have been made at various conferences around the world regarding the work that was conducted at SAMSI last year.

Next year’s program, uncertainty quantification, is so big that the institute has decided to devote all its time and energy to it. The opening workshop, in addition to the main methodological theme, will be broken down into three areas of interest under this broad umbrella of research: climate change, engineering and renewable energy, and geosciences.

Ajay– Describe your career in science and communication.

Jamie– I have been in communications since 1985, working for large Fortune 500 companies such as General Motors and Tropicana Products. I moved to the Research Triangle region of North Carolina after graduate school and got into economic development and science communications, first working for the Research Triangle Regional Partnership in 1994.

From 1996-2005 I was the communications director for the Research Triangle Park, working for the Research Triangle Foundation of NC. I published a quarterly magazine called The Park Guide for a while, then came to work for NISS and SAMSI in 2008.

I really enjoy working with the mathematicians and statisticians. I always joke that I am the least educated person working here and that is not far from the truth! I am honored to help get the message out about all of the important research that is conducted here each day that is helping to improve the lives of so many people out there.

Ajay– Research Triangle or Silicon Valley– Which is better for tech people and why? Your opinion

Jamie– Both the Silicon Valley and Research Triangle are great regions for tech people to locate, but of course, I have to be biased and choose Research Triangle!

Really any place in the world that you find many universities working together with businesses and government, you have an area that will grow and thrive, because the collaborations help all of us generate new ideas, many of which blossom into new businesses, or new endeavors of research.

The quality of life in places such as the Research Triangle is great because you have people from around the world moving to a place, each bringing his/her culture, food, and uniqueness to this place, and enriching everyone else as a result.

The Research Triangle has two advantages over Silicon Valley. The first is a bigger diversity of industries: when the telecommunications industry busted back in 2001-02, the region took a hit, but the biotechnology industry was still growing, so unemployment rose, though not to the extent that other areas might have experienced.

The latest recession has hit us all very hard, so even this strategy has not made us immune to having high unemployment, but the Research Triangle region has been pegged by experts to be one of the first regions to emerge out of the Great Recession.

The other advantage I think we have is that our cost of living is still much more reasonable than Silicon Valley. It’s still possible to get a nice sized home, some land and not break the bank!

Ajay– How do you manage an active online social media presence, your job, and your family? How important is balance in professional life, and when should young professionals realize this?

Jamie– Balance is everything, isn’t it? When I leave the office, I turn off my iPhone and disconnect from Twitter/Facebook etc.

I know that is not recommended by some folks, but I am a one-person communications department, and I love my family and friends and feel it’s important to devote time to them as well as to my career.

I think it is very important for young people to establish this early in their careers, because if they don’t, they will fall victim to working way too many hours. And really, who loves you at the end of the day?

Your company may appreciate all you do for them, but if you leave, or you get sick and cannot work for them, you will be replaced.

Lee Iacocca, former CEO of Chrysler, said, “No matter what you’ve done for yourself or for humanity, if you can’t look back on having given love and attention to your own family, what have you really accomplished?” I think that is what is really most important in life.

About-

Jamie Nunnelly has been in communications for 25 years. She is currently on the board of directors for the Chatham County Economic Development Corporation and Leadership Triangle, and is a member of the International Association of Business Communicators and the Public Relations Society of America. She earned a bachelor’s degree in interpersonal and public communications at Bowling Green State University and a master’s degree in mass communications at the University of South Florida.

You can contact Jamie at http://niss.org/content/jamie-nunnelly or on Twitter.

Interview James Dixon Pentaho

Here is an interview with James Dixon, founder of Pentaho and self-confessed Chief Geek and CTO. Pentaho has been growing very rapidly, and it makes open source Business Intelligence solutions- with BI being basically the biggest chunk of the enterprise software market currently.

Ajay-  How would you describe Pentaho as a BI product for someone who is completely used to traditional BI vendors (read non open source). Do the Oracle lawsuits over Java bother you from a business perspective?

James-

Pentaho has a full suite of BI software:

* ETL: Pentaho Data Integration

* Reporting: Pentaho Reporting for desktop and web-based reporting

* OLAP: Mondrian ROLAP engine, and Analyzer or Jpivot for web-based OLAP client

* Dashboards: CDF and Dashboard Designer

* Predictive Analytics: Weka

* Server: Pentaho BI Server, which handles web access, security, scheduling, sharing, report bursting, etc.

We have all of the standard BI functionality.

The Oracle/Java issue does not bother me much. There are a lot of software companies dependent on Java. If Oracle abandons Java, a lot of resources will suddenly focus on OpenJDK. It would be good for OpenJDK and might be the best thing for Java in the long term.

Ajay- What parts of Pentaho’s technology do you personally like best as having an advantage over other similar proprietary packages?

Also, describe the latest Pentaho for Hadoop offering and Hadoop/Hive’s advantage over, say, MapReduce and SQL.

James- The coolest thing is that everything is pluggable:

* ETL: New data transformation steps can be added. New orchestration controls (job entries) can be added. New perspectives can be added to the design UI. New data sources and destinations can be added.

* Reporting: New content types and report objects can be added. New data sources can be added.

* BI Server: Every factory, engine, and layer can be extended or swapped out via configuration. BI components can be added. New visualizations can be added.

This means it is very easy for Pentaho, partners, customers, and community members to extend our software to do new things.

In addition, every engine and component can be fully embedded into a desktop or web-based application. I made a YouTube video about our philosophy: http://www.youtube.com/watch?v=uMyR-In5nKE

Our Hadoop offerings allow ETL developers to work in a familiar graphical design environment, instead of having to code MapReduce jobs in Java or Python.

90% of the Hadoop use cases we hear about are transformation/reporting/analysis of structured/semi-structured data, so an ETL tool is perfect for these situations.

Using Pentaho Data Integration reduces implementation and maintenance costs significantly. The fact that our ETL engine is Java and is embeddable means that we can deploy the engine to the Hadoop data nodes and transform the data within the nodes.
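To make concrete what an ETL developer avoids writing by hand, here is a minimal, framework-free sketch of a MapReduce-style transformation in plain Python. The log records and the page-count task are invented for illustration; a real Hadoop job would express the same map and reduce phases through Hadoop's APIs rather than in-process function calls.

```python
# Sketch of the kind of hand-coded MapReduce-style job a graphical ETL
# tool abstracts away. Plain Python, no Hadoop; the data is made up.

from itertools import groupby
from operator import itemgetter

def map_phase(records):
    # Map: emit one (key, value) pair per record - here, one count per page hit.
    for line in records:
        page = line.split()[0]
        yield (page, 1)

def reduce_phase(pairs):
    # Reduce: group by key and sum the values, as a reducer would.
    for key, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield (key, sum(v for _, v in group))

logs = ["/home 10.0.0.1", "/about 10.0.0.2", "/home 10.0.0.3"]
counts = dict(reduce_phase(map_phase(logs)))
print(counts)  # {'/about': 1, '/home': 2}
```

In a graphical tool the same pipeline would be two or three drag-and-drop steps, which is the cost reduction Dixon is describing.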

Ajay- Do you think the combination of recession, outsourcing, cost cutting, and unemployment makes a suitable environment for companies to cut technology costs by going outside their usual vendor lists and trying open source for a change, or for test projects?

James- Absolutely. Pentaho grew (downloads, installations, revenue) throughout the recession. We are on target to do 250% of what we did last year, while the established vendors are flat in terms of new license revenue.

Ajay- How would you compare the user interface of reports using Pentaho versus other reporting software? Please feel free to be specific.

James- We have all of the everyday, standard reporting features covered.

Over the years the old tools, like Crystal Reports, have become bloated and complicated.

We don’t aim to have 100% of their features, because we’d end up just as complicated.

The 80:20 rule applies here. 80% of the time people only use 20% of their features.

We aim for 80% feature parity, which should cover 95-99% of typical use cases.

Ajay- Could you describe the Pentaho integration with R, as well as your relationship with Weka? Jaspersoft already has a partnership with Revolution Analytics for RevoDeployR (R on a web server). Any R plans for Pentaho as well?

James- The feature set of R and Weka overlap to a small extent – both of them include basic statistical functions. Weka is focused on predictive models and machine learning, whereas R is focused on a full suite of statistical models. The creator and main Weka developer is a Pentaho employee. We have integrated R into our ETL tool. (makes me happy 🙂 )

(Probably not a good time to ask if SAS integration is done as well, for the big chunk of legacy Base SAS/WPS users.)

About-

As “Chief Geek” (CTO) at Pentaho, James Dixon is responsible for Pentaho’s architecture and technology roadmap. James has over 15 years of professional experience in software architecture, development and systems consulting. Prior to Pentaho, James held key technical roles at AppSource Corporation (acquired by Arbor Software which later merged into Hyperion Solutions) and Keyola (acquired by Lawson Software). Earlier in his career, James was a technology consultant working with large and small firms to deliver the benefits of innovative technology in real-world environments.

Is 21st century cloud computing the same as 1960s time-sharing?

Diagram showing the three main types of cloud computing (image via Wikipedia)

And yes, Prof. Goodnight, cloud computing is not time-sharing. (Dr. J was on a roll there- bashing open source AND cloud computing in the SAME interview at http://www.cbronline.com/news/sas-ceo-says-cep-open-source-and-cloud-bi-have-limited-appeal)

What was time-sharing? In the 1960s, when people had longer hair, listened to the Beatles, and IBM actually owned ALL computers-

http://en.wikipedia.org/wiki/Time-sharing

or is it?

The Internet has brought the general concept of time-sharing back into popularity. Expensive corporate server farms costing millions can host thousands of customers all sharing the same common resources. As with the early serial terminals, websites operate primarily in bursts of activity followed by periods of idle time. This bursting nature permits the service to be used by many website customers at once, and none of them notice any delays in communications until the servers start to get very busy.
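The economics of that bursting behavior can be made concrete with a little probability. The sketch below is a hypothetical back-of-the-envelope model (the site count, duty cycle, and server capacity are invented numbers, not from any source): each site is treated as independently busy with some small probability, so the chance that demand exceeds capacity is a binomial tail.

```python
# Back-of-the-envelope illustration of statistical multiplexing: if each
# hosted site is busy only a small fraction of the time, one server can
# carry many sites with little contention. All figures are hypothetical.

from math import comb

def p_overload(n_sites, p_busy, capacity):
    """Probability that more than `capacity` sites are busy at once,
    modelling each site as independently busy with probability p_busy."""
    p_ok = sum(comb(n_sites, k) * p_busy**k * (1 - p_busy)**(n_sites - k)
               for k in range(capacity + 1))
    return 1 - p_ok

# 1,000 bursty sites, each active 2% of the time, on a server that can
# handle 50 concurrent sites: overload is rare despite 20x oversubscription.
print(f"{p_overload(1000, 0.02, 50):.6f}")
```

Under these assumed numbers the overload probability is vanishingly small, which is exactly why both 1960s time-sharing and modern hosting can oversubscribe hardware.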

What is 21st century cloud computing? Well… they are still writing papers to define it, but: http://en.wikipedia.org/wiki/Cloud_computing

Cloud computing is Web-based processing, whereby shared resources, software, and information are provided to computers and other devices (such as smartphones) on demand over the Internet.


John Sall sets JMP 9 free to tango with R

 

Diagnostic graphs produced by the plot.lm() function (image via Wikipedia)

 

John Sall, founder of SAS and JMP, has released the latest blockbuster edition of his flagship, JMP 9 (JMP stands for John’s Macintosh Program).

To kill all birds with one software, it is integrated with R and SAS, and the brochure frankly lists all the qualities. Why am I excited about JMP 9’s integration with R and with SAS? Well, it combines manipulation of bigger datasets (thanks to SAS) with R’s superb library of statistical packages and a great statistical GUI (JMP). This makes JMP the latest software, apart from SAS/IML, RapidMiner, KNIME, and Oracle Data Miner, to showcase its R integration (without getting into the GPL compliance need for showing source code- it does not ship R, and advises you to just freely download R). I am sure Peter Dalgaard and Frank Harrell are overjoyed that the R base and Hmisc packages will be used by fellow statisticians and students through JMP- which, after all, is made in the neighboring state of North Carolina.

Best of all, a 30-day JMP trial is free, so no money is lost if you download JMP 9 (and no, they don’t ask for your credit card number- but they do have a huge form to register before you download). Still, JMP 9 the software itself is more thoughtfully designed than the email-prospect-leads form, and the extra functionality in the free 30-day trial is worth it.

Also see “New Features in JMP 9” (http://www.jmp.com/software/jmp9/pdf/new_features.pdf), which has this regarding R:

Working with R

R is a programming language and software environment for statistical computing and graphics. JMP now supports a set of JSL functions to access R. The JSL functions provide the following options:

• open and close a connection between JMP and R

• exchange data between JMP and R

• submit R code for execution

• display graphics produced by R

JMP and R each have their own sets of computational methods.

R has some methods that JMP does not have. Using JSL functions, you can connect to R and use these R computational methods from within JMP.

Textual output and error messages from R appear in the log window. R must be installed on the same computer as JMP.

JMP is not distributed with a copy of R. You can download R from the Comprehensive R Archive Network Web site: http://cran.r-project.org

Because JMP is supported as both a 32-bit and a 64-bit Windows application, you must install the corresponding 32-bit or 64-bit version of R.

For details, see the Scripting Guide book.
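Based on the capabilities listed above, a JSL round trip to R looks roughly like the following. This is an illustrative sketch, not copied from the JMP documentation; the function names (R Init, R Submit, R Get, R Term) follow JMP 9's JSL interface to R as commonly documented, so treat the exact signatures as an assumption and check the Scripting Guide.

```jsl
// Open a connection between JMP and the locally installed R
R Init();

// Submit R code for execution; R's textual output lands in the JMP log window
R Submit( "x <- rnorm( 100 ); m <- mean( x )" );

// Exchange data: pull the R result back into a JSL variable
m = R Get( m );
Show( m );

// Close the connection between JMP and R
R Term();
```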

And the download trial page (search-optimized URL):

http://www.sas.com/apps/demosdownloads/jmptrial9_PROD__sysdep.jsp?packageID=000717&jmpflag=Y

In related news (“Richest man in North Carolina also ranks nationally,” charlotte.news14.com), Jim Goodnight is now just as rich as Mark Zuckerberg, creator of Facebook- though probably they are not creating a movie on Jim yet (imagine a movie titled “The Statistical Software”- not quite the same dude feel as “The Social Network”).

See John’s latest interview:

The People Behind the Software: John Sall

http://blogs.sas.com/jmp/index.php?/archives/352-The-People-Behind-the-Software-John-Sall.html

Interview John Sall Founder JMP/SAS Institute

https://decisionstats.com/2009/07/28/interview-john-sall-jmp/

SAS Early Days

https://decisionstats.com/2010/06/02/sas-early-days/