The World of Data as I think

After discussions about my performance at grad school and what exactly I want to work in, I drew the following curves.

Feel free to draw better circles- and I will include your reference here

Caution- Based upon a very ordinary understanding of extraordinary technical things.

THE WORLD OF DATA

[Image: Screenshot 18]

AND WHAT I WANT TO DO IN IT

[Image: Screenshot 19]

PS- What do you think? Add a comment.

“Build a better mousetrap, and the world will beat a path to your door.”- Emerson

Interview Ken O’Connor Business Intelligence Consultant

Here is an interview with an industry veteran of Business Intelligence, Ken O’Connor.

Ajay- Describe your career journey across the full development cycle of Business Intelligence.

Ken- I started my career in the early 80’s in the airline industry, where I worked as an application programmer and later as a systems programmer. I took a computer science degree by night. The airline industry was one of the first to implement computer systems in the ‘60s, and the legacy of being an early adopter was that airline reservation systems were developed in Assembler. Remarkable as it sounds now, as application programmers, we wrote our own file access methods. Even more remarkable, as systems programmers, we modified the IBM-supplied Operating System, originally known as the Airline Control Program (ACP), later renamed Transaction Processing Facility (TPF). The late ‘80s saw the development of Global “Computer Reservations Systems” (CRS systems), including AMADEUS and GALILEO. I moved from Aer Lingus, a small Irish airline, to work in London on the British Airways systems, to enable them to share information and communicate with the new Global CRS systems.

I learnt very important lessons during those years.

* The criticality of standards

* The drive for interoperability of systems

* The drive towards information sharing

* The drive away from bespoke development

In the 90’s I returned to Dublin, where I worked as an independent consultant with IBM on many data intensive projects. On one project I was lead developer in the IBM Dublin Laboratory on the development of the data replication tool called “Data Propagator NonRelational”. This tool automatically propagates updates made on IMS databases to DB2 databases. On this project, we successfully piloted the Cleanroom Development Method, as part of IBM’s drive towards Six Sigma quality.

In the past 15 years I have moved away from IT towards the business. I describe myself as a Hybrid. I believe there is a serious communications gap between business users and IT, and this is a frequent cause of project failures. I seek to bridge that gap. I ensure that requirements are clear, measurable, testable, and capable of being easily understood and signed off by business owners.

One of my favorite programmes was Euro Changeover. This was a hugely data intensive programme, and the largest changeover undertaken by European Financial Institutions. I worked as an independent consultant with the IBM Euro Centre of Competence. I developed changeover strategies for a number of Irish Enterprises, and was the End to End IT changeover process owner in a major Irish bank. Every application and every data store holding currency sensitive data (not just amounts, but currency signs etc.) had to be converted at exactly the same time to ensure that all systems successfully switched to euro processing on 1st January 2002.
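To make the conversion task concrete, here is a minimal sketch of the rounding rule that applied, assuming the fixed rate of 0.787564 Irish pounds to the euro and round-half-up to the nearest cent; it is an illustration of the rule only, not code from the programme.

```python
from decimal import Decimal, ROUND_HALF_UP

# Fixed conversion rate adopted for the Irish pound: 1 euro = 0.787564 IEP.
# The six-significant-figure rate had to be applied directly (never a rounded
# inverse), and the converted amount rounded to the nearest cent.
IEP_PER_EUR = Decimal("0.787564")

def iep_to_eur(amount_iep: Decimal) -> Decimal:
    """Convert an Irish pound amount to euro, rounding to the nearest cent."""
    return (amount_iep / IEP_PER_EUR).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

print(iep_to_eur(Decimal("100.00")))  # -> 126.97
```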

I learnt many, many lasting lessons about data the hard way on Euro Changeover programmes, such as:

* The extent to which seemingly separate applications share operational data – often without the knowledge of the owning application.

* The extent to which business users use (abuse) data fields to hold information never intended for the data field.

* The critical distinction between the underlying data (in a data store) and the information displayed to a business user.

I have worked primarily on what I call “End of food chain” projects and programmes, such as Single View of Customer, data migrations, and data population of repositories for BASEL II and Anti Money Laundering (AML) systems. Business Intelligence is another example of an “End of food chain” project. “End of food-chain” projects share the following characteristics:

* Dependent on existing data

* No control over the quality of existing data they depend on

* No control over the data entry processes by which the data they require is captured.

* The data required may have been captured many years previously.

Recently, I have shared my experience of “Enterprise wide data issues” in a series of posts on my blog, together with a process for assessing the status of those issues within an Enterprise (more details). In my experience, the success of a Business Intelligence programme and the ease with which an Enterprise completes “End of food chain” data dependent programmes directly depends on the status of the common Enterprise Wide data issues I have identified.

Ajay- Describe the educational scene for science graduates in Ireland. What steps do you think governments and universities can take to better teach science and keep young people excited about it?

Ken- I am not in a position to comment on the educational scene for science graduates in Ireland. However, I can say that currently there are insufficient numbers of school children studying science in primary and 2nd level education. There is a need to excite young people about science. There is a need for more interactive science museums, like W5 in Belfast which is hugely successful. Kids love to get involved, and practical science can be great fun.

Ajay- What are some of the key trends in business intelligence that you have seen?

Ken- Since the earliest days of my career, I have seen an ever increasing move towards standards based interoperability of systems, and interchange of data. This has accelerated dramatically in recent years. This is the good news. Further good news is the drive towards the use of external reference databases to verify the accuracy of data, at point of data entry (See blog post on Upstream prevention by Henrik Liliendahl Sørensen). One example of this drive is cloud based verification services from new companies like Ireland based Clavis Technology.
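As a simple illustration of the upstream-prevention idea, the sketch below validates a record against reference data at the point of data entry. The field names and the reference set are hypothetical; a real deployment would call an external verification service rather than a hard-coded list.

```python
# Hypothetical reference data; in practice this would be an external
# verification service (postal file, company registry, etc.) queried at entry time.
VALID_COUNTRY_CODES = {"IE", "GB", "FR", "DE", "US"}

def validate_at_entry(record):
    """Return data quality errors so bad values are rejected before they are stored."""
    errors = []
    if record.get("country_code") not in VALID_COUNTRY_CODES:
        errors.append("Unknown country code: %r" % record.get("country_code"))
    if not record.get("postal_code", "").strip():
        errors.append("Postal code is required for address verification")
    return errors

print(validate_at_entry({"country_code": "XX", "postal_code": ""}))
```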

The harsh reality is that “Old hardware goes into museums, while old software goes into production every night”. Enterprises have invested vast amounts of money in legacy applications over decades. These legacy systems access legacy data in legacy data stores. This legacy data will continue to pose challenges in the delivery of Business Intelligence to the Business community that needs it. These challenges will continue to provide opportunities for Data Quality professionals.

Ajay- What is going to be the next fundamental change in this industry in your opinion?

Ken- The financial crisis will result in increased regulatory requirements. This will be good news for the Business Intelligence / Data Quality industry. In time, it will no longer be sufficient to provide the regulator with ‘just’ the information requested. The regulator will want to see the process by which the information was gathered, the process controls, and evidence of the quality of the underlying data from which the information was derived. This move will result in funding for Data Governance programmes, which will lead to increased innovation in our industry.

Ajay- Describe your startup Map My Business, your target customer and your vision for it.

Ken- I started MapMyBusiness.com as a “recession buster”. Ireland was hit particularly hard by the financial crisis. I had become over dependent on the financial services industry, and a blanket ban on the use of external consultants left me with no option but to reinvent myself. MapMyBusiness.com helps small businesses to attract clients, by getting them on Google page one. Having been burnt by an over dependence on one industry, my vision is to diversify. I believe that Data Governance is industry independent, and I am focussing on increasing my customer base for my Data Governance consultancy skills, via my company Professional IT Personnel Ltd.

Ajay- What do you do when not working with customers or blogging on your website?

Ken- I try to achieve a reasonable work/life balance. I am married with two children aged 12 and 10, and like to spend time with them, especially outdoors, walking, hiking, playing tennis etc. I am involved in my community, lobbying for improved cycling infrastructure in our area (more details). Ireland, like most countries, is facing an obesity epidemic, due to an increasingly sedentary lifestyle. Too many people get little or no exercise, and don’t have the time, willpower, or perhaps money, to regularly work out in a gym. By including “Active Travel” in our daily lives – by walking or cycling to schools and local amenities, we can get enough physical exercise to prevent obesity, and obesity related health problems. We need to make our cities, towns and villages more pedestrian and cyclist friendly, to encourage “active travel”. My voluntary work in this area introduced me to mapping (see example), and enabled me to set up MapMyBusiness.com.

Biography-

Ken O’Connor is an independent IT Consultant with almost 30 years of work experience. He specialises in Data: Data Migration, Data Population, Data Governance, Data Quality, and Data Profiling. His company is called Professional IT Personnel Ltd.

Ken started his blog (Ken O’Connor Data Consultant) to share his experience and to learn from the experience of others. Dylan Jones, editor of dataqualitypro, describes Ken as a “grizzled veteran” with almost 30 years’ experience across the full development lifecycle.

Interview Thomas C. Redman Author Data Driven

Here is an interview with Tom Redman, author of Data Driven. Among the first to recognize the need for high-quality data in the information age, Dr. Redman established the AT&T Bell Laboratories Data Quality Lab in 1987 and led it until 1995. He is the author of four books, holds two patents, and leads his own consulting group. In many respects, the “Data Doc”, as he is nicknamed, is also the father of data quality evangelism.

[Photo: Tom Redman]

Ajay- Describe your career as a science student to an author of science and strategy books.
Redman: I took the usual biology, chemistry, and physics classes in college. And I worked closely with oceanographers in graduate school. More importantly, I learned directly from two masters. First was Dr. Basu, who was at Florida State when I was. He thought more deeply and clearly about the nature of data and what we can learn from them than anyone I’ve met since. And second were the people in the Bell Labs community who were passionate about making communications better. What I learned there was that you don’t always need “scientific proof” to move forward.


Ajay- What kind of bailout do you think the Government can give to the importance of science education in this country?

Redman: I don’t think the government should bail out science education per se. Science departments should compete for students just like the English and anthropology departments do. At the same time, I do think the government should support some audacious goals, such as slowing global warming or achieving energy independence. These could well have the effect of increasing demand for scientists and science education.

Ajay- Describe your motivations for writing your book Data Driven-Profiting from your most important business asset.

Redman: Frankly, I was frustrated. I’ve spent the last twenty years on data quality, and organizations that improve their data gain enormous benefit. But so few do. I set out to figure out why that was and what to do about it.

Ajay- What can various segments of readers learn from this book: a college student, a manager, a CTO, a financial investor, and a business intelligence vendor?

Redman: I narrowed my focus to the business leader and I want him or her to take away three points.  First, data should be managed as aggressively and professionally as your other assets.  Second, they are unlike other assets in some really important ways and you’ll have to learn how to manage them.  Third, improving quality is a great place to start.

Ajay- Garbage in, garbage out. How much money and time do you believe is given to data quality in data projects?

Redman:   By this I assume you mean data warehouse, BI, and other tech projects.  And the answer is “not near enough.”  And it shows in the low success rate of those projects.

Ajay- Consider a hypothetical scenario: instead of creating and selling fancy algorithms, a business intelligence vendor uses the simple Pareto principle to focus on data quality and design during data projects. How successful do you think that would be?

Redman: I can’t speak to the market, but I do know that organizations are loaded with problems and opportunities. They could make great progress on the most important ones if they could clearly state the problem and bring high-quality data and simple techniques to bear. But there are a few that require high-powered algorithms. Unfortunately, those require high-quality data as well.

Ajay- How and when did you first earn the nickname “Data Doc”? Who gave it to you, and would you rather be known by some other name?

Redman: One of my clients started calling me that about a dozen years ago.  But I felt uncomfortable and didn’t put it on my business card until about five years ago.  I’ve grown to really like it.

Ajay- The pioneering work at AT&T Bell Laboratories and the Palo Alto laboratory: who do you think are the 21st century successors of these laboratories? Do you think lab work has become too commercialized, even in respected laboratories like Microsoft Research and Google’s research in mathematics?

Redman: I don’t know. It may be that the circumstances of the 20th century were conducive to such labs and they’ll never happen again. You have to remember two things about Bell Labs. First was the cross-fertilization that stemmed from having leading-edge work in dozens of areas. Second, the goal was not just invention, but innovation: the end-to-end process which starts with invention and ends with products in the market. AT&T, Bell Labs’ parent, was quite good at turning invention into product. These points lead me to think that the commercial aspect of laboratory work is so much the better.

Ajay- What does “The Data Doc” do to relax and maintain a work-life balance? How important do you think work-life balance is for creative people and researchers?

Redman: I think everyone needs a balance, not just creative people.  Two things have made this easier for me.  First, I like what I do.  A lot of days it is hard to distinguish “work” from “play.”  Second is my bride of thirty-three years, Nancy.  She doesn’t let me go overboard too often.

Biography-

Dr. Thomas C. Redman is President of Navesink Consulting Group, based in Little Silver, NJ.  Known by many as “the Data Doc” (though “Tom” works too), Dr. Redman was the first to extend quality principles to data and information.  By advancing the body of knowledge, his innovations have raised the standard of data quality in today’s information-based economy.

Dr. Redman conceived the Data Quality Lab at AT&T Bell Laboratories in 1987 and led it until 1995.  There he and his team developed the first methods for improving data quality and applied them to important business problems, saving AT&T tens of millions of dollars. He started Navesink Consulting Group in 1996 to help other organizations improve their data, while simultaneously lowering operating costs, increasing revenues, and improving customer satisfaction and business relationships.

Since then – armed with proven, repeatable tools, techniques and practical advice – Dr. Redman has helped clients in fields ranging from telecommunications, financial services, and dot coms, to logistics, consumer goods, and government agencies. His work has helped organizations understand the importance of high-quality data, start their data quality programs, and also save millions of dollars per year.

Dr. Redman holds a Ph.D. in statistics from Florida State University.  He is an internationally renowned lecturer and the author of numerous papers, including “Data Quality for Competitive Advantage” (Sloan Management Review, Winter 1995) and “Data as a Resource: Properties, Implications, and Prescriptions” (Sloan Management Review, Fall 1998). He has written four books: Data Driven (Harvard Business School Press, 2008), Data Quality: The Field Guide (Butterworth-Heinemann, 2001), Data Quality for the Information Age (Artech, 1996) and Data Quality: Management and Technology (Bantam, 1992). He was also invited to contribute two chapters to Juran’s Quality Handbook, Fifth Edition (McGraw Hill, 1999). Dr. Redman holds two patents.

About Navesink Consulting Group (http://www.dataqualitysolutions.com/ )

Navesink Consulting Group was formed in 1996 and was the first company to focus on data quality.  Led by Dr. Thomas Redman, “the Data Doc” and former AT&T Bell Labs director, we have helped clients understand the importance of high-quality data, start their data quality programs, and save millions of dollars per year.

Our approach is not a cobbling together of ill-fitting ideas and assertions – it is based on rigorous scientific principles that have been field-tested in many industries, including financial services (see more under “Our clients”).  We offer no silver bullets; we don’t even offer shortcuts. Improving data quality is hard work.

But with a dedicated effort, you should expect order-of-magnitude improvements and, as a direct result, an enormous boost in your ability to manage risk, steer a course through the crisis, and get back on the growth curve.

Ultimately, Navesink Consulting brings tangible, sustainable improvement in your business performance as a result of superior quality data.

Interview Timo Elliott SAP

Here is an interview with Timo Elliott, Senior Product Director, SAP BusinessObjects.

Ajay- Describe your career in science from school to Senior Director in SAP to blogger/speaker. How do you think we can convince students of the benefits of learning science and maths?

Timo- I studied economics with statistics in the UK, but I had always been a closet geek and had dabbled with computers ever since I was a kid, starting with Z80 assembler code. I started my career doing low-level computer consulting in Hong Kong, and worked on a series of basic business intelligence projects at Shell in New Zealand, cobbling together a solution based on a mainframe HR system, floppy-disk transfers, and Lotus 1-2-3 macros. When I returned to Europe, I stumbled across a small French startup that provided exactly the “decision support systems” that I had been looking for, and enthusiastically joined the company.

Over the last eighteen years, I’ve worked with hundreds of companies around the world on their BI strategy and my job today is to help evangelize what works and what doesn’t, to help organizations avoid the mistakes that others have made.

When it comes to BI initiatives, I see the results of one fundamental problem almost on a daily basis: 75% of project success depends on people, process, organization, culture, and leadership, but we typically spend 92% of our time on data and technology.

BI is NOT about technology – it’s about helping people do their jobs. So when it comes to education, we need to teach our technologists more about people, not science!

Ajay- You were the 8th employee of SAP BusinessObjects. What are the key turning points or transition stages in the BI industry that you remember seeing in the past 18 years, and how has SAP BusinessObjects responded to them?

Timo- Executive information systems and multidimensional databases have been around since at least the 1970s, but modern business intelligence dates from the early 1990s, driven by the widespread use of relational databases, graphical user interfaces, and the invention of the “semantic layer”, pioneered by BusinessObjects, that separated business terms from technical logic. For the first time, non-expert business people had self-service access to data.
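The idea of a semantic layer can be shown with a toy sketch: business terms are mapped to technical SQL fragments, and queries are assembled from the business terms alone. The mappings and table names below are invented for illustration and are not the BusinessObjects universe format itself.

```python
# Invented mappings from business terms to physical SQL, for illustration only.
SEMANTIC_LAYER = {
    "Revenue":       "SUM(order_lines.qty * order_lines.unit_price)",
    "Customer Name": "customers.name",
    "Order Year":    "EXTRACT(YEAR FROM orders.order_date)",
}

def build_query(measures, dimensions, from_clause):
    """Assemble SQL from business terms so users never touch the physical schema."""
    select_items = [f'{SEMANTIC_LAYER[t]} AS "{t}"' for t in dimensions + measures]
    group_by = [SEMANTIC_LAYER[d] for d in dimensions]
    return (f"SELECT {', '.join(select_items)}\n"
            f"FROM {from_clause}\n"
            f"GROUP BY {', '.join(group_by)}")

print(build_query(["Revenue"], ["Customer Name", "Order Year"],
                  "orders JOIN order_lines USING (order_id) JOIN customers USING (customer_id)"))
```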

This was followed by a period of rapid expansion, as leading vendors combined reporting, multidimensional, and dashboard approaches into fully-fledged suites. During this period, BusinessObjects acquired a series of related technology companies to complete the existing offer (such as the leader in operational reporting, Crystal Reports) and extend into enterprise information management and financial performance management.

Finally, the theme of the last few years has clearly been consolidation – according to Gartner, the top four “megavendors” (SAP, IBM, Microsoft, and Oracle) now make up almost two-thirds of the market, and accounted for fully 83% of the growth since last year. Perhaps as a result, user deployments are accelerating, with usage growth rates doubling last year.

Ajay- How do you think Business Intelligence would be affected by the following

a) Predictive Analytics.

Timo- Predictive analytics has been the “next big thing in BI” for at least a decade. It has been extremely important in some key areas, such as fraud detection, but the dream of “no longer managing by looking out of the rear-view mirror” has proved hard to achieve, notably because business conditions are forever changing.

We offer predictive analytics with our Predictive Workbench product – but I think the real opportunity for this technology in the future is “power analytics”, rather than “prediction”. For example, helping business people automatically cluster similar values, spot outliers, determine causal factors, and detect trend inflection points, using the data that they already have access to with traditional BI.
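The kind of “power analytics” Timo describes can be sketched with very little code. The example below assumes nothing about any SAP product: it flags outliers with a simple z-score test and finds the points where a trend reverses direction.

```python
import statistics

def outliers(values, z_threshold=2.5):
    """Flag values more than z_threshold standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values) or 1.0
    return [v for v in values if abs(v - mean) / stdev > z_threshold]

def inflection_points(values):
    """Return indices where the series switches between rising and falling."""
    diffs = [b - a for a, b in zip(values, values[1:])]
    return [i + 1 for i, (d1, d2) in enumerate(zip(diffs, diffs[1:])) if d1 * d2 < 0]

monthly_sales = [100, 104, 103, 108, 115, 460, 118, 117, 112, 120]
print(outliers(monthly_sales))           # the 460 spike stands out
print(inflection_points(monthly_sales))  # positions where growth reverses direction
```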

b) Cloud Computing.

Timo- In terms of architecture, it’s clearly not about on-demand OR on-premise: it’s about having a flexible approach that combines both approaches. You can compare information to money: today, we tend to keep our money in the bank rather than under our own mattress, because it’s safer, more convenient, and more cost-efficient. At the same time, there are situations where the convenience of cash is still essential.

Companies should be able to choose a BI strategy, and decide how to deploy it later. This is what we offer with our BI on-demand solutions, which use the same technology as on-premise. You can start to build on-premise and move it to on-demand, or vice-versa, or have a mix of both.

In terms of data, “cloud intelligence” is still a work in progress. As with modern financial instruments, we can expect to see the growth of new information services, such as our “information on-demand” product that provide data feeds from Reuters, Thompson Financial, and other providers to augment internal information systems. Looking further into the future, we can imagine new information marketplaces that would pay us “interest” to store our data in the cloud, where it can be adapted, aggregated and sold to others.

c) Social Media.

Timo- Conversations and collaboration are an essential part of effective business intelligence. We often talk about the notion of a “single view of the truth” in this industry, but that’s like saying we can have “a single view of politics” – while it’s vital to try to give everybody access to the same data, there will always be plenty of room for interpretation and discussion. BI platforms need to support this collaborative decision-making.

In particular, there are many, many studies that show up our all-too-human limitations when it comes to analyzing data. For example, did you know that children with bigger feet have better handwriting?

It’s absolutely true — because the children are older! Mixing up correlation and causality is a common issue in business intelligence, and one answer to the problem is to add more people: the more reviewers there are of the decision-making process, the better the decisions will be.
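Timo’s foot-size example is easy to reproduce. In the hypothetical simulation below, both foot size and handwriting score are driven only by age; they correlate strongly overall, yet the correlation largely vanishes once age is held fixed.

```python
import random
import statistics

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (statistics.pstdev(xs) * statistics.pstdev(ys) * len(xs))

random.seed(1)
ages = [random.randint(6, 12) for _ in range(1000)]
# Both variables depend on age (the confounder), not on each other.
foot_size   = [15 + 1.2 * a + random.gauss(0, 1) for a in ages]
handwriting = [20 + 5.0 * a + random.gauss(0, 4) for a in ages]

print(round(pearson(foot_size, handwriting), 2))  # strong overall correlation
nine_year_olds = [i for i, a in enumerate(ages) if a == 9]
print(round(pearson([foot_size[i] for i in nine_year_olds],
                    [handwriting[i] for i in nine_year_olds]), 2))  # close to zero
```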

Analysis is also critical to the development of social media, such as analyzing sentiment trends in Twitter — a functionality we offer with SAP CRM — or tracking social communities. For example, Jive, the leader in Enterprise 2.0 platforms, offers our BI products as part of their solution, to help their customers analyze and optimize use of the system. Administrators can track if usage is trailing off in a particular department, for example.

d) Social Network Analysis.

Timo- Over the last twenty years, partly as a result of extensive automation of operational tasks with systems such as SAP, there has been a huge shift from “routine” to “non-routine” work. Today, fully 90% of business users say that their work involves decision making, problem solving, and the creation of new analysis and insight.

To help support this new creativity, organizations are becoming more porous as we work closer with our ecosystem of customers, partners, and suppliers, and we work in ever-more matrixed environments and cross-functional teams.

We’ve developed a Social Network Analyzer prototype that combines BI and social networking to create a “single view of relationships”. It can gather information from multiple different systems, such as HR, CRM, email distribution lists, project teams, Twitter, etc., to create a multi-layered view of how people are connected, across and beyond the enterprise. For more information, see the SAP Web 2.0 blog post, and you can try it yourself on our ondemand.com web site.
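As a rough sketch of what such a “single view of relationships” might look like under the hood (my own illustration using the networkx library, not SAP’s prototype), each source system contributes one layer of edges to a single multigraph:

```python
import networkx as nx  # pip install networkx

# Each source system contributes one "layer" of relationships between people.
edges_by_source = {
    "hr":      [("alice", "bob")],                      # e.g. reporting lines
    "crm":     [("bob", "carol"), ("alice", "carol")],  # shared accounts
    "email":   [("alice", "dave")],                     # distribution lists
    "twitter": [("dave", "carol")],                     # follower links
}

graph = nx.MultiGraph()  # multigraph: two people may be linked in several layers
for source, edges in edges_by_source.items():
    graph.add_edges_from(edges, layer=source)

# A "single view of relationships": every layer that connects two people.
for u, v, data in graph.edges(data=True):
    print(f"{u} <-> {v} via {data['layer']}")
```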

Ajay- What is the area that SAP BusinessObjects is very good at (strength)? What are the key areas that you are currently seeking to improve (opportunities)?

Timo- Companies evaluating BI solutions should look at four things: product functionality for their users’ needs, fit with the overall IT architecture, the vendor’s reputation and ecosystem, and (of course) price. SAP BusinessObjects is the clear leader in the BI industry, and I’d say that SAP BusinessObjects has the best overall solution if you’re a large organization (or looking to become one) with a variety of user needs, multiple data sources, and a heterogeneous IT infrastructure.

In terms of opportunities, we have high expectations for new interfaces for casual users, and in-memory processing, which we have combined in our SAP BusinessObjects Explorer product. Initial customer feedback has been excellent, with quotes such as “finding information is as easy as using the internet” and “if you can use a computer, you can use Explorer”.

In terms of future directions, we’re taking a very transparent, Web 2.0 approach. The SAP Business Objects innovation center is modeled on Google Labs and we share our prototypes (including the Social Network Analyzer mentioned above) with anybody who’s interested, and let our customers give us early feedback on what directions we should go.

Ajay- What does Timo Elliott do for work life balance when not writing, talking, and evangelizing about Business Intelligence?

Timo- I’m a keen amateur photographer – see www.timoelliott.com/personal for more!

Biography- http://timoelliott.com/blog/about

Timo Elliott is Senior Director of Strategic Marketing for SAP BusinessObjects. For the last twenty years he has been a thought leader and conference speaker in business intelligence and performance management.

A popular and engaging speaker, Elliott presents regularly to IT and business audiences at international conferences, drawing on his experience working with enterprise customers around the globe. Topics include the latest developments in BI/PM technology, how best to succeed with BI/PM projects, and future trends in the industry.

Prior to Business Objects, Elliott was a computer consultant in Hong Kong and led analytics projects for Shell in New Zealand. He holds a first-class honors degree in Economics with Statistics from Bristol University, England.

Additional websites: http://www.sapweb20.com —  web 2.0 technology by, with, and at SAP

Email: telliott@timoelliott.com or timo.elliott@sap.com

LinkedIn: http://www.linkedin.com/in/timoelliott

Twitter: http://twitter.com/timoelliott

Flickr: http://www.flickr.com/photos/timoelliott/

Facebook: http://www.facebook.com/people/Timo-Elliott/544744135

For an earlier interview with Charlie Berger of Oracle Data Mining Product Management, see https://decisionstats.wordpress.com/2009/09/02/oracle/

Interview Karen Lopez Data Modeling Expert

[Image: Zachman Framework, via Wikipedia]

Here is an interview with Karen Lopez, who has worked in data modeling for almost three decades and is a renowned data management expert.

Data professionals need to know about the data domain in addition to the data structure domain – Karen Lopez

Ajay- Describe your career in science. How would you persuade younger students to take more science courses?

Karen- I’ve always had an interest in science and I attribute that to the great science teachers I had. I studied information systems at Purdue University through a unique program that focuses on systems analysis and computer technologies. I’m one of the few who studied data and process modeling in an undergraduate program 25+ years ago.

I believe that it is very important that we find a way of attracting more scientists to teach. In both the natural and computer sciences, it’s difficult for institutions to tempt scientists away from professional positions that offer much greater compensation. So I support programs that find ways to make that happen.

Ajay- If you had to give a young person starting their career in BI advice in just three points, what would they be?

Karen- Wow. It’s tough to think of just three things, but these are recommendations that I make often:

– Remember that every design decision should be made based on cost, benefit, and risk. If you can’t clearly describe these for every side of a decision, then you aren’t doing design; you are guessing.

– No one besides you is responsible for advancing your skills and keeping an eye on emerging practices. Don’t expect your employer to lay out a career plan that is in your best interest. That’s not their job. Data professionals need to know about the data domain in addition to the data structure domain. The best database or data warehouse design in the world is worse than useless if how the data is processed is wrong. Remember to expand your knowledge about data, not just the data structures and tools.

– All real-world work involves collaboration and negotiation. There is no one right answer that works for every situation. Building your skills in these areas will pay off significantly.

Ajay- What do you think is the best way for a technical consultant and client to be on the same page regarding requirements? Which methodology or template have you used, and which has given you the most success?

Karen- While I’m a huge fan of modeling (data modeling and other modeling), I still think that giving clients a prototype or mockup of something that looks real to them goes a long way. We need to build tools and competencies to develop these prototypes quickly. It’s a lost art in the data world.

Ajay- What are the special incentives that make Canada a great place for tech entrepreneurs, rather than, say, the United States? (Disclaimer: I have family in Canada and study in the US.)

Karen- I prefer not to think of this as an either-or decision. I immigrated to Canada from the US about 15 years ago, but most of our business is outside of Canada. I have enjoyed special incentives here in Canada for small businesses as well as special programs that allowed me to work in Canada as a technical professional before I moved here permanently.

Overall, I have found Canadian employers more open to sponsoring foreign workers and it is easier for them to do so than what my US clients experience. Having said that, a significant portion of my work over the last few years has been on global projects where we leverage online collaboration tools to meet our goals. The advent of these tools has made it much easier to work from wherever I am and to work with others regardless of their visa statuses.

Where a company forms is less tied to where one lives or works these days.

Ajay- Could you tell us more about the Zachman framework (apart from the wikipedia reference)? A practical example on how you used it on an actual project would be great.

Karen- Of course the best resource for finding out about the Zachman framework is from John Zachman himself http://www.zachmaninternational.com/index.php/home-article/13 . He offers some excellent courses and does a great deal of public speaking at government and DAMA events. I highly recommend anyone interested in the Framework to hear about it directly from him.

There are many misunderstandings about John’s intent, such as the myth that he requires big upfront modeling (he doesn’t), that the Framework is a methodology (it isn’t), or that it can only be used to build computer systems (it can be used for more than that).

I have used the Zachman Framework to develop a joint Business-IT Strategic Information Systems Plan as well as to inventory and track progress of multi-project programs. One interesting use was a paper I authored for the Canadian Information Processing Society (CIPS) on how various educational programs, specializations, and certifications map to the Zachman Framework. I later developed a presentation about this mapping for a Zachman conference.

For a specific project, the Zachman Framework allows a business to understand where their enterprise assets are being managed – and how well they are managed. It’s not an IT thing; it’s an enterprise architecture thing.

Ajay- What does Karen Lopez do for fun when not at work, traveling, speaking or blogging?

Karen- Sometimes it seems that’s all I do. I enjoy volunteering for IT-related organizations such as DAMA and CIPS. I participate in the accreditation of college and university educational programs in Canada and abroad. As a member of data-related standards bodies, namely the Association for Retail Technology Standards and the American Dental Association, I help develop industry standard data models. I’ve also been a spokesperson for a CIPS program to encourage girls to take more math and science courses throughout their student careers so that they may have access to great opportunities in the future.

I like to think of myself as a runner; last year I completed my first half marathon, which I’d never thought was possible. I am studying Hindi and Sanskrit. I’m also addicted to reading and am thankful that some of it I actually get paid to do.

Biography

Karen López is a Senior Project Manager at InfoAdvisors, Inc. Karen is a frequent speaker at DAMA conferences and DAMA Chapters. She has 20+ years of experience in project and data management on large, multi-project programs. Karen specializes in the practical application of data management principles. Karen is also the ListMistress and moderator of the InfoAdvisors Discussion Groups at http://www.infoadvisors.com. You can reach her at www.twitter.com/datachick

Interview Jim Harris Data Quality Expert OCDQ Blog

Here is an interview with one of the chief evangelists of data quality in the field of Business Intelligence, Jim Harris, who has a renowned blog at http://www.ocdqblog.com/. I asked Jim about his experiences in the field of data quality, including cases where poor data quality messed up big-budget BI projects, and some tips and methodologies to avoid them.

No one likes to feel blamed for causing or failing to fix the data quality problems- Jim Harris, Data Quality Expert.

[Photo: Jim Harris]

Ajay- Why the name OCDQ? What drives your passion for data quality? Name any anecdotes where bad data quality really messed up a big BI project.

Jim Harris – Ever since I was a child, I have had an obsessive-compulsive personality. If you asked my professional colleagues to describe my work ethic, many would immediately respond: “Jim is obsessive-compulsive about data quality…but in a good way!” Therefore, when evaluating the short list of what to name my blog, it was not surprising to anyone that Obsessive-Compulsive Data Quality (OCDQ) was what I chose.

On a project for a financial services company, a critical data source was applications received by mail or phone for a variety of insurance products. These applications were manually entered by data entry clerks. Social security number was a required field and the data entry application had been designed to only allow valid values. Therefore, no one was concerned about the data quality of this field – it had to be populated and only valid values were accepted.

When a report was generated to estimate how many customers were interested in multiple insurance products by looking at the count of applications per social security number, it appeared as if a small number of customers were interested in not only every insurance product the company offered, but also thousands of policies within the same product type. More confusion was introduced when the report added the customer name field, which showed that this small number of highly interested customers had hundreds of different names. The problem was finally traced back to data entry.

Many insurance applications were received without a social security number. The data entry clerks were compensated, in part, based on the number of applications they entered per hour. In order to process the incomplete applications, the data entry clerks entered their own social security number.
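The report Jim describes is essentially a data profiling query. A minimal sketch with pandas (toy data and hypothetical column names) groups applications by social security number and flags numbers that carry implausibly many applications under different customer names:

```python
import pandas as pd

# Toy applications data; in the real project this came from the data entry system.
apps = pd.DataFrame({
    "ssn":     ["111-22-3333", "111-22-3333", "111-22-3333", "444-55-6666"],
    "name":    ["A. Smith",    "B. Jones",    "C. Brown",    "D. White"],
    "product": ["life",        "auto",        "home",        "life"],
})

profile = apps.groupby("ssn").agg(
    applications=("ssn", "size"),
    distinct_names=("name", "nunique"),
)
# An SSN shared by many applications under different names is a red flag
# for the "clerk entered their own SSN" pattern described above.
suspicious = profile[(profile["applications"] > 2) & (profile["distinct_names"] > 1)]
print(suspicious)
```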

On a project for a telecommunications company, multiple data sources were being consolidated into a new billing system. Concerns about postal address quality required the use of validation software to cleanse the billing address. No one was concerned about the telephone number field – after all, how could a telecommunications company have a data quality problem with telephone number?

However, when reports were run against the new billing system, a high percentage of records had a missing telephone number. The problem was that many of the data sources originated from legacy systems that only recently added a telephone number field. Previously, the telephone number was entered into the last line of the billing address.

New records entered into these legacy systems did start using the telephone number field, but the older records already in the system were not updated. During the consolidation process, the telephone number field was mapped directly from source to target and the postal validation software deleted the telephone number from the cleansed billing address.
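One defensive fix, sketched below with invented field names, is to check whether the last billing address line looks like a telephone number and move it into the phone field before the postal validation software cleanses the address:

```python
import re

PHONE_RE = re.compile(r"^\(?\d[\d\s\-()]{6,}$")  # crude "looks like a phone number" test

def backfill_phone(record):
    """If the phone field is empty but the last address line looks like a phone
    number, move it there BEFORE postal validation strips it out."""
    if not record.get("phone"):
        last_line = record["address_lines"][-1].strip()
        if PHONE_RE.match(last_line):
            record["phone"] = last_line
            record["address_lines"] = record["address_lines"][:-1]
    return record

legacy = {"phone": "", "address_lines": ["1 Main St", "Springfield", "555-014-2367"]}
print(backfill_phone(legacy))
```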

Ajay- Data Quality – Garbage in, Garbage out for a project. What percentage of a BI project do you think gets allocated to input data quality? What percentage of final output is affected by the normalized errors?

Jim Harris- I know that Gartner has reported that 25% of critical data within large businesses is somehow inaccurate or incomplete and that 50% of implementations fail due to a lack of attention to data quality issues.

The most common reason for this lack of attention is that people doubt that data quality problems could be prevalent in their systems. This “data denial” is not necessarily a matter of blissful ignorance, but is often a natural self-defense mechanism from the data owners on the business side and/or the application owners on the technical side.

No one likes to feel blamed for causing or failing to fix the data quality problems.

All projects should allocate time and resources for performing a data quality assessment, which provides a much needed reality check for the perceptions and assumptions about the quality of the data. A data quality assessment can help with many tasks including verifying metadata, preparing meaningful questions for subject matter experts, understanding how data is being used, and most importantly – evaluating the ROI of data quality improvements. Building data quality monitoring functionality into the applications that support business processes provides the ability to measure the effect that poor data quality can have on decision-critical information.
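Even a very small amount of code can provide the kind of monitoring Jim mentions. The sketch below (field names are hypothetical) computes a completeness percentage for each required field in a batch of records, a metric that can be tracked over time:

```python
def dq_metrics(records, required_fields):
    """Basic data quality monitoring: completeness per required field, as a percentage."""
    total = len(records) or 1
    return {
        field: round(100 * sum(1 for r in records if r.get(field) not in (None, "")) / total, 1)
        for field in required_fields
    }

batch = [
    {"ssn": "111-22-3333", "phone": ""},
    {"ssn": "",            "phone": "555-014-2367"},
]
print(dq_metrics(batch, ["ssn", "phone"]))  # {'ssn': 50.0, 'phone': 50.0}
```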

Ajay- Companies talk of paradigms like Kaizen, Six Sigma and LEAN for eliminating waste and defects. What technique would you recommend for a company just about to start a major BI project for a standard ETL and reporting project to keep data aligned and clean?

Jim Harris- I am a big advocate for methodology and best practices and the paradigms you mentioned do provide excellent frameworks that can be helpful. However, I freely admit that I have never been formally trained or certified in any of them. I have worked on projects where they have been attempted and have seen varying degrees of success in their implementation. Six Sigma is the one that I am most familiar with, especially the DMAIC framework.

However, a general problem that I have with most frameworks is their tendency to adopt a one-size-fits-all strategy, which I believe is an approach that is doomed to fail. Any implemented framework must be customized to adapt to an organization’s unique culture. In part, this is necessary because implementing changes of any kind will be met with initial resistance, but an attempt at forcing a one-size-fits-all approach almost sends a message to the organization that everything they are currently doing is wrong, which will of course only increase the resistance to change.

Starting with a framework as a reference provides best practices and recommended options of what has worked for other organizations. The framework should be reviewed to determine what can best be learned from it and to select what will work in the current environment and what simply won’t. This doesn’t mean that the selected components of the framework will be implemented simultaneously. All change comes gradually and the selected components will most likely be implemented in phases.

Fundamentally, all change starts with changing people’s minds. And to do that effectively, the starting point has to be improving communication and encouraging open dialogue. This means more of listening to what people throughout the organization have to say and less of just telling them what to do. Keeping data aligned and clean requires getting people aligned and communicating.

Ajay- What methods and habits would you recommend to young analysts starting in the BI field for a quality checklist?

Jim Harris- I always make two recommendations.

First, never make assumptions about the data. I don’t care how well the business requirements document is written or how pretty the data model looks or how narrowly your particular role on the project has been defined. There is simply no substitute for looking at the data.

Second, don’t be afraid to ask questions or admit when you don’t know the answers. The only difference between a young analyst just starting out and an expert is that the expert has already made and learned from all the mistakes caused by being afraid to ask questions or admitting when you don’t know the answers.

Ajay- What does Jim Harris do to have quality time when not at work?

Jim- Since I enjoy what I do for a living so much, it sometimes seems impossible to disengage from work and make quality time for myself. I have also become hopelessly addicted to social media and spend far too much time on Twitter and Facebook. I have also always spent too much of my free time watching television and movies. I do try to read as much as I can, but I have so many stacks of unread books in my house that I could probably open my own book store. True quality time typically requires the elimination of all technology by going walking, hiking or mountain biking. I do bring my mobile phone in case of emergencies, but I turn it off before I leave.

Biography-

Jim Harris is the Blogger-in-Chief at Obsessive-Compulsive Data Quality (OCDQ), an independent blog offering a vendor-neutral perspective on data quality.

He is an independent consultant, speaker, writer and blogger with over 15 years of professional services and application development experience in data quality (DQ) and business intelligence (BI).

Jim has worked with Global 500 companies in finance, brokerage, banking, insurance, healthcare, pharmaceuticals, manufacturing, retail, telecommunications, and utilities. Jim also has a long history with the product that is now known as IBM InfoSphere QualityStage. Additionally, he has some experience with Informatica Data Quality and DataFlux dfPower Studio.

Jim can be followed at twitter.com/ocdqblog and contacted at http://www.ocdqblog.com/contact/


Interview Peter J Thomas - Award-Winning BI Expert

Here is an in-depth interview with Peter J Thomas, one of Europe’s top Business Intelligence experts and influential thought leaders. Peter talks about BI tools, data quality, science careers, cultural transformation and BI, and the key focus areas.

I am a firm believer that the true benefits of BI are only realised when it leads to cultural transformation. -Peter James Thomas

 

Ajay- Describe your early career, from college to the present.

Peter –I was an all-rounder academically, but at the time that I was taking public exams in the 1980s, if you wanted to pursue a certain subject at University, you had to do related courses between the ages of 16 and 18. Because of this, I dropped things that I enjoyed such as English and ended up studying Mathematics, Further Mathematics, Chemistry and Physics. This was not because I disliked non-scientific subjects, but because I was marginally fonder of the scientific ones. In a way it is nice that my current blogging allows me to use language more.

The culmination of these studies was attending Imperial College in London to study for a BSc in Mathematics. Within the curriculum, I was more drawn to Pure Mathematics and Group Theory in particular, and so went on to take an MSc in these areas. This was an intercollegiate course and I took a unit at each of King’s College and Queen Mary College, but everything else was still based at Imperial. I was invited to stay on to do a PhD. It was even suggested that I might be able to do this in two years, given my MSc work, but I decided that a career in academia was not for me and so started looking at other options.

As sometimes happens a series of coincidences and a slice of luck meant that I joined a technology start-up, then called Cedardata, late in 1988; my first role was as a Trainee Analyst / Programmer. Cedardata was one of the first organisations to offer an Accounting system based on a relational database platform; something that was then rather novel, at least in the commercial arena. The RDBMS in question was Oracle version 5, running on VAX VMS – later DEC Ultrix and a wide variety of other UNIX flavours. Our input screens were written in SQL*Forms 2 – later Oracle Forms – and more complex processing logic and reports were in Pro*C; this was before PL/SQL. Obviously this environment meant that I had to become very conversant with SQL*Plus and C itself.

When I joined Cedardata, they had 10 employees, 3 customers and annual revenue of just £50,000 ($80,000). By the time I left the company eight years later, it had grown dramatically to having a staff of 250, over 300 clients in a wide range of industries and sales in excess of £12 million ($20 million). It had also successfully floated on the main London Stock Exchange. When a company grows that quickly the same thing tends to happen to its employees.

Cedardata was probably the ideal environment for me at the time; an organisation that grew rapidly, offering new opportunities and challenges to its employees; that was fiercely meritocratic; and where narrow, but deep, technical expertise was encouraged to be rounded out by developing more general business acumen, a customer-focused attitude and people-management skills. I don’t think that I would have learnt as much, or progressed anything like as quickly in any other type of organisation.

It was also at Cedardata that I had my first experience of the class of applications that later became known as Business Intelligence tools. This was using BusinessObjects 3.0 to write reports, cross-tabs and graphs for a prospective client, the UK Foreign and Commonwealth Office (State Department). The approach must have worked as we beat Oracle Financials in a play-off to secure the multi-million pound account.

During my time at Cedardata, I rose to become an executive and filled a number of roles including Head of Development and also Assistant to the MD / Head of Product Strategy. Spending my formative years in an organisation where IT was the business and where the customer was King had a profound impact on me and has influenced my subsequent approach to IT / Business alignment.

Ajay- How would you convince young people to take maths and science more? What advice would you give to policy makers to promote more maths and science students?

Peter- While I have used little of my Mathematics directly in my commercial career, the approach to problem-solving that it inculcated in me has been invaluable. On arriving at University, it was something of a shock to be presented with Mathematical problems where you couldn’t simply look up the method of solution in a textbook and apply it to guarantee success. Even in my first year I had to grapple with challenges where you had no real clue where to start. Instead what worked, at least most of the time, was immersing yourself in the general literature, breaking down the problem into more manageable chunks, trying different techniques – sometimes quite recherché ones – to make progress, occasionally having an insight that provides a short-cut, but more often succeeding through dogged determination. All of that sounds awfully like the approach that has worked for me in a business context.

Having said that, I was not terribly business savvy as a student. I didn’t take Mathematics because I thought that it would lead to a career, I took it because I was fascinated by the subject. As I mentioned earlier, I enjoyed learning about a wide range of things, but Science seemed to relate to the most fundamental issues. Mathematics was both the framework that underpinned all of the Sciences and also offered its own world where astonishing and beautiful results could be found, independent of any applicability; although it has to be said that there are few branches of Mathematics that have not been applied somewhere or other.

I think you either have this appreciation of Science and Mathematics or you don’t and that this happens early on.

Certainly my interest was supported by my parents and a variety of teachers, but a lot of it arose from simply reading about Cosmology, or Vulcanism, or Palaeontology. I watched a YouTube video of Stephen Jay Gould recently, saying that when he was a child in the 1950s all children were “in” to Dinosaurs, but that he actually got to make a career out of it. Maybe all children aren’t “in” to dinosaurs in the same way today; perhaps the mystery and sense of excitement has gone.

In the UK at least there appear to be fewer and fewer people taking Science and Mathematics. I am not sure what is behind this trend. I read pieces that suggest that Science and Maths are viewed as being “hard” subjects, and people opt for “easier” alternatives. I think creative writing is one of the hardest things to do, so I’m not sure where this perspective comes from.

Perhaps some things that don’t help are the twin images of the Scientist as a white-coated boffin and the Mathematician as a chalk-covered recluse, neither of whom have much of a grasp on the world beyond their narrow discipline. While of course there is a modicum of truth in these stereotypes, they are far from being wholly accurate in my experience.

Perhaps Science has fallen off the pedestal that it was placed on in the 1950s and 1960s. Interest in Science had been spurred by a range of inventions that had improved people’s lives and often made the inventors a lot of money. Science was seen as the way to a better tomorrow, a view reinforced by such iconic developments as the discovery of the structure of DNA, our ever deepening insight about sub-atomic physics and the unravelling of many mysteries of the Universe. These advances in pure science were supported by feats of scientific / engineering achievement such as the Apollo space programme. The military importance of Science was also put into sharp relief by the Manhattan Project; something that also maybe sowed the seeds for later disenchantment and even fear of the area.

The inevitable fallibility of some Scientists and some scientific projects burst the bubble. High-profile problems included the Thalidomide tragedy and the outcry, however ill-informed, about genetically modified organisms. Also the poster child of the scientific / engineering community was laid low by the Challenger disaster. On top of this, living with the scientifically-created threat of mutually-assured destruction probably began to change the degree of positivity with which people viewed Science and Scientists. People arrived at the realisation that Science cannot address every problem; how much effort has gone into finding a cure for cancer for example?

In addition, in today’s highly technological world, the actual nuts and bolts of how things work are often both hidden and mysterious. While people could relatively easily understand how a steam engine works, how many have any idea about how their iPod functions? Technology has become invisible and almost unimportant, until it stops working.

I am a little wary of Governments fixing issues such as these, which are the result of major generational and cultural trends. Often state action can have unintended and perverse results. Society as a whole goes through cycles and maybe at some future point Science and Mathematics will again be viewed as interesting areas to study; I certainly hope so. Perhaps the current concerns about climate change will inspire a generation of young people to think more about technological ways to address this and interest them in pertinent Sciences such as Meteorology and Climatology.

Ajay- How would you rate the various tools within the BI industry, as in a SWOT analysis (briefly and individually)?

Peter- I am going to offer a Politician’s reply to this. The really important question in BI is not which tool is best, but how to make BI projects successful. While many an unsuccessful BI manager may blame the tool or its vendor, this is not where the real issues lie.

I firmly believe that successful BI rests on four mutually reinforcing pillars:

  • understand the questions the business needs to answer,
  • understand the data available,
  • transform the data to meet the business needs and
  • embed the use of BI in the organisation’s culture.

If you get these things right then you can be successful with almost any of the excellent BI tools available in the marketplace. If you get any one of them wrong, then using the paragon of BI tools is not going to offer you salvation.

I think about BI tools in the same way as I do the car market. Not so many years ago there were major differences between manufacturers.

The Japanese offered ultimate reliability, but maybe didn’t often engage the spirit.

The Germans prided themselves on engineering excellence, slanted either in the direction of performance or luxury, but were not quite as dependable as the Japanese.

The Italians offered out-and-out romance and theatre, with mechanical integrity an afterthought.

The French seemed to think that bizarrely shaped cars with wheels as thin as dinner plates were the way forward, but at least they were distinctive.

The Swedes majored on a mixture of safety and aerospace cachet, but sometimes struggled to shift their image of being boring.

The Americans were still in the middle of their love affair with the large and the rugged, at the expense of convenience and value-for-money.

Stereotypically, my fellow-countrymen majored on agricultural charm, or wooden-panelled nostalgia, but struggled with the demands of electronics.

Nowadays, the quality and reliability of cars are much closer to each other. Most manufacturers have products with similar features and performance and economy ratings. If we take financial issues to one side, differences are more likely to relate to design, or how people perceive a brand. Today the quality of a Ford is not far behind that of a Toyota. The styling of a Honda can be as dramatic as an Alfa Romeo. Lexus and Audi are playing in areas previously the preserve of BMW and Mercedes and so on.

To me this is also where the market for BI tools is at present. It is relatively mature and the differences between product sets are less than before.

Of course this doesn’t mean that the BI field will not be shaken up by some new technology or approach (in-memory BI or SaaS come to mind). This would be the equivalent of the impact that the first hybrid cars had on the auto market.

However, from the point of view of implementations, most BI tools will do at least an adequate job and picking one should not be your primary concern in a BI project.

Ajay- SAS Institute Chief Marketing Officer Jim Davis (interviewed on this blog) points to the superiority of business analytics over business intelligence, which he regards as an over-hyped term. What numbers, statistics and graphs would you quote, rather than semantics, to help redirect those perceptions?

I myself use SAS, SPSS and R, and find that decision management (as James Taylor calls it) is much better enabled by these than by the simple ETL, reporting and graph-aggregation tools found in many BI suites.

Peter- I have expended quite a lot of energy and hundreds of words on this subject. If people are interested in my views, which are rather different to those of Jim Davis, then I’d suggest that they read them in a series of articles starting with Business Analytics vs Business Intelligence [URL http://peterthomas.wordpress.com/2009/03/28/business-analytics-vs-business-intelligence/ ].

I will however offer some further thoughts and to do this I’ll go back to my car industry analogy. In a world where cars are becoming more and more comparable in terms of their reliability, features, safety and economy, things like styling, brand management and marketing become more and more important.

As the true differences between BI vendors narrow, expect more noise to be made by marketing departments about how different their products are.

I have no problem in acknowledging SAS as a leader in Business Analytics, too many people I respect use their tools for me to think otherwise. However, I think a better marketing strategy for them would be to stick to the many positives of their own products. If they insist on continuing to trash competitors, then it would make sense for them to do this in a way that couldn’t be debunked by a high school student after ten seconds’ reflection.

Ajay- In your opinion, what is the average RoI that a small, medium or large enterprise gets by investing in a business intelligence platform? What advice would you give to such firms (separately) to help them make up their minds?

Peter- The question is pretty much analogous to “What are the benefits of opening an office in China?” The answer is going to depend on what the company does; what their overall strategy is and how a China operation might complement this; whether their products and services are suitable for the Chinese market; how their costs, quality and features compare to local competitors; and whether they have already cracked markets closer to home.

To put things even more prosaically, “How long is a piece of string?”

Taking to one side the size and complexity of an organisation, BI projects come in all shapes and sizes.

Personally I have led Enterprise-wide, all-pervasive BI projects which have had a profound impact on the company. I have also seen well-managed and successful BI projects targeted on a very narrow and specific area.

The former obviously cost more than the latter, but the benefits are commensurately greater. In fact I would argue that the wider a BI project is spread, the greater its payback. Maybe lessons can be learnt and confidence built in an initial implementation to a small group, but to me the real benefit of BI is realised when it touches everything that a company does.

This is not based on a self-interested boosting of BI. To me, if what we want to do is take better business decisions, then the greater the number of such decisions that are impacted, the better this is for the organisation.

Also there are some substantial up-front investments required for BI. These would include: building the BI team; establishing the warehouse and a physical architecture on which to deliver your application. If these can be leveraged more widely, then costs come down.

The same point can be made about the intellectual property that a successful BI team develops. This is one reason why I am a fan of the concept of BI Competency Centres [URL http://peterthomas.wordpress.com/2009/05/11/business-intelligence-competency-centres/ ].

I have been lucky enough to contribute to an organisation turning round from losing hundreds of millions of dollars to recording profits of twice that magnitude. When business managers cite BI as a major factor behind such a transformation, then this is clearly a technology that can be used to dramatic effect.

Nevertheless both estimating the potential impact of BI and measuring its actual effectiveness are non-trivial activities. A number of different approaches can be taken, some of which I cover in my article:

Measuring the benefits of Business Intelligence [URL http://peterthomas.wordpress.com/2009/02/26/measuring-the-benefits-of-business-intelligence/ ]. As ever there is no single recipe for success.

Ajay- Which BI tool / code are you most comfortable with and what are its salient points?

Peter –Although I have been successful with elements of the IBM-Cognos toolset and think that this has many strong points, not least being relatively user-friendly, I think I’ll go back to my earlier comments about this area being much less important than many others for the success of a BI project.

Ajay- How do you think cloud computing will change BI? What percentage of BI budgets goes to data quality, and what is the eventual impact of data quality on results?

Peter –I think that the jury is still out on cloud computing and BI. By this I do not mean that cloud computing will not have an impact, but rather that it remains unclear what this impact will actually be.

Given the maturity of the market, my suspicion is that the BI equivalent of a Google is not going to emerge from nowhere. There are many excellent BI start-ups in this space and I have been briefed by quite a few of them.

However, I think the future of cloud computing in BI is likely to be determined by how the likes of IBM-Cognos, SAP-BusinessObjects and Oracle-Hyperion embrace the area.

Having said this, one of the interesting things in computing is how easy it is to misjudge the future and perhaps there is a potential titan of cloud BI currently gestating in the garage so beloved of IT mythology.

On data quality, I have never explicitly split out this component of a BI effort. Rather data quality has been an integral part of what we have done. Again I have taken a four-pillared approach:

  • improve how the data is entered;
  • make sure your interfaces aren’t the problem;
  • check how the data has been entered / interfaced;
  • and don’t suppress bad data in your BI.

The first pillar consists of improved validation in front-end systems – something that can be facilitated by the BI team providing master data to them – and also a focus on staff training, stressing the importance to the organisation of accurately recording certain data fields.
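To make this first pillar concrete, here is a minimal sketch, in Python purely for illustration, of validating a front-end entry against master data supplied by the BI team. The field names and reference codes below are hypothetical and not taken from any particular system.

# Pillar one (sketch): validate an entry at the point of capture against
# master data provided by the BI team. Names and codes are illustrative.
MASTER_CURRENCIES = {"EUR", "GBP", "USD"}   # hypothetical master data
MASTER_COUNTRIES = {"IE", "GB", "DE", "FR"}

def validate_entry(record):
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    if record.get("country_code") not in MASTER_COUNTRIES:
        errors.append("unknown country code: %r" % record.get("country_code"))
    if record.get("currency_code") not in MASTER_CURRENCIES:
        errors.append("unknown currency code: %r" % record.get("currency_code"))
    if record.get("amount") is None or record["amount"] < 0:
        errors.append("amount must be present and non-negative")
    return errors

# Example: the bad entry is rejected where it is keyed in, rather than
# surfacing later as a data quality problem in the warehouse.
problems = validate_entry({"country_code": "XX", "currency_code": "EUR", "amount": 100})
if problems:
    print("Entry rejected: " + "; ".join(problems))

Reusing the same master data downstream helps keep the front-end rules and the warehouse rules from drifting apart.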

The second pillar is more to do with the general IT Architecture and how this relates to the Information Architecture. Again master data has a role to play, but so does ensuring that the IT culture is one in which different teams collaborate well and are concerned about what happens to data when it leaves “their” systems.

The third pillar is the familiar world of after-the-fact data quality reports and auditing, something that is necessary, but not sufficient, for success in data quality.

Finally there is what I think can be one of the most important pillars: ensuring that the BI system takes a warts-and-all approach to data. This means that bad data is highlighted, rather than being suppressed. In turn this creates pressure for the problems to be addressed where they arise, creating a virtuous circle.
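The third and fourth pillars can be illustrated with a similarly minimal sketch: an after-the-fact audit that flags suspect records and keeps them visible in reporting, rather than filtering them out. Again the column names and rules are hypothetical, and Python with pandas is used only for brevity.

import pandas as pd

# Pillars three and four (sketch): audit data after the fact and keep the
# problem rows visible, rather than quietly dropping them from reports.
def audit(df):
    """Return a copy of df with a 'quality_issue' column; no rows are removed."""
    out = df.copy()
    out["quality_issue"] = ""
    out.loc[out["customer_id"].isna(), "quality_issue"] += "missing customer_id; "
    out.loc[out["amount"] < 0, "quality_issue"] += "negative amount; "
    out.loc[~out["currency"].isin({"EUR", "GBP", "USD"}), "quality_issue"] += "unknown currency; "
    return out

sales = pd.DataFrame({
    "customer_id": [101, None, 103],
    "amount": [250.0, 99.0, -40.0],
    "currency": ["EUR", "XXX", "GBP"],
})

audited = audit(sales)
# A warts-and-all view: bad rows are highlighted in the output, creating
# pressure to fix them at source rather than hiding them from the business.
print(audited[audited["quality_issue"] != ""])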

For those who might be interested in this area, I expand on it more in Using BI to drive improvements in data quality [URL http://peterthomas.wordpress.com/2009/02/11/using-bi-to-drive-improvements-in-data-quality/ ].

Ajay- You are well known in England’s rock climbing and bouldering community. A fun question- what is the similarity between a BI implementation/project and climbing a big boulder?

Peter –I would have to offer two minor clarifications.

First, it is probably my partner who is better known in climbing circles, via her blog [URL http://77jenn.blogspot.com/ ] and the articles and reviews that she has written for the climbing press; though I guess I can take credit for most of the photos and videos.

Second, particularly given the fact that a lot of our climbing takes place in Wales, I should acknowledge the broader UK climbing community and also mention our most mountainous region of Scotland.

Despite what many inhabitants of Sheffield might think to the contrary, there is life beyond Stanage Edge [URL http://en.wikipedia.org/wiki/Stanage ].

I have written about the determination and perseverance that are required to get to the top of a boulder, or indeed to the top of any type of climb [URL http://peterthomas.wordpress.com/2009/03/31/perseverance/ ].

I think those same qualities are necessary for any lengthy, complex project. I am a firm believer that the true benefits of BI are only realised when it leads to cultural transformation. Certainly the discipline of change management has many parallels with rock climbing. You need a positive attitude and a strong belief in your ultimate success, despite the inevitable setbacks. If one approach doesn’t yield fruit then you need to either fine-tune or try something radically different.

I suppose a final similarity is the feeling that you get having completed a climb, particularly if it is at the limit of your ability and has taken a long time to achieve. This is one of both elation and deep satisfaction, but is quickly displaced by a desire to find the next challenge.

This is something that I have certainly experienced in business life and I think the feelings will be familiar to many readers.

Biography-


Peter Thomas has led all-pervasive, Business Intelligence and Cultural Transformation projects serving the needs of 500+ users in multiple business units and service departments across 13 European and 5 Latin American countries. He has also developed Business Intelligence strategies for operations spanning four continents. His BI work has won two industry awards including “Best Enterprise BI Implementation”, from Cognos in 2006 and “Best use of IT in Insurance”, from Financial Sector Technology in 2005. Peter speaks about success factors in both Business Intelligence and the associated Change Management at seminars across both Europe and North America and writes about these areas and many other aspects of business, technology and change on his blog [URL http://peterthomas.wordpress.com ].