Gartner BI and Information Management Summit 2011: 30-Minute One-on-Ones

From the land Down Under, where Gartner gathers business summit thunder.

http://www.gartner.com/technology/summits/apac/business-intelligence/index.jsp

Gartner Business Intelligence
& Information Management Summit 2011

February 22 – 23 • Sydney, AUSTRALIA
gartner.com/ap/bi

Register Now

From Information to Intelligence:

Evaluate, Execute and Evolve

At the Gartner Business Intelligence & Information Management Summit 2011 you will experience a unique mix of Gartner research presentations, guest keynote addresses, real-life case studies and interactive panel discussions that provide a holistic view of the business intelligence and performance management landscape. Information, insight and advice are channeled through an increasingly targeted and focused approach, taking you from the high-level strategic view all the way down to your specific issues.

Click here to view the full agenda or download the brochure.

AGENDA HIGHLIGHTS

Guest Keynote Address

Future Thinking – Global Trends and Thinking that are Upending your Business

Anders Sorman-Nilsson
Creative Director, Thinque

Click here to read more about this session.

Best Practice Workshops:

  • How to Become an Effective Data Warehouse Modeler
  • Analytics – Business Intelligence and Performance Management ITScore

Analyst User Roundtables:

  • Enterprise Information Management – Focusing on What Matters to the Business
  • SharePoint – thin edge of the wedge to the MS family
  • Preparing for the 2020 workplace

Worldwide Expertise at Your Fingertips!
Your questions on Business Intelligence and Performance Management answered. Meet the Gartner analysts presenting at the Summit and book an exclusive 30-minute one-on-one with the analysts of your choice.

Using SAS/IML with R


SAS has just released updated documentation for the SAS/IML language, with a special chapter devoted to calling R.

Here is an example; note that the name on the R side is passed as a quoted string:

CALL ExportMatrixToR( IMLMatrix, "RMatrix" ) ;   /* copy an IML matrix to an R matrix */

CALL ImportMatrixFromR( IMLMatrix, "RExpr" ) ;   /* copy the value of an R expression back to IML */

If you have existing SAS licences, existing hardware and lots of data, this may be the best of both worlds, without getting into the mess of learning MKL threads/BLAS/premium packages/cloud setups.

Another thought: it is a good, professional-looking help book, which is something more R packages could emulate (by improving the ease of use of their help files and keeping vignettes updated).

 

Link: http://support.sas.com/documentation/cdl/en/imlug/63541/HTML/default/viewer.htm#r_toc.htm

 

Calling Functions in the R Language


SAS Lawsuit against WPS- Application Dismissed

I saw Phil Rack (http://twitter.com/#!/PhilRack), whom I have interviewed before at https://decisionstats.com/2009/02/03/interview-phil-rack/ and with whom I have not talked since Obama won the election.

Well, Phil (creator of Bridge to R, the first SAS-language-to-R-language interface) mentioned this judgment and link.

 

Phil should probably revise the documentation of Bridge to R, lest he be sued himself!

Conclusion
It was for these reasons that I decided to dismiss SAS’s application.

From:

http://www.bailii.org/cgi-bin/markup.cgi?doc=/ew/cases/EWHC/Ch/2010/3012.html

 

Neutral Citation Number: [2010] EWHC 3012 (Ch)
Case No: HC09C03293

IN THE HIGH COURT OF JUSTICE
CHANCERY DIVISION
Royal Courts of Justice
Strand, London, WC2A 2LL
22 November 2010

B e f o r e :

THE HON MR JUSTICE ARNOLD
____________________
Between:
SAS INSTITUTE INC. Claimant
– and –

WORLD PROGRAMMING LIMITED Defendant

____________________

Michael Hicks (instructed by Bristows) for the Claimant
Martin Howe QC and Isabel Jamal (instructed by Speechly Bircham LLP) for the Defendant
Hearing date: 18 November 2010
____________________

HTML VERSION OF JUDGMENT
____________________

Crown Copyright ©

MR. JUSTICE ARNOLD :

Introduction
By order dated 28 July 2010 I referred certain questions concerning the interpretation of Council Directive 91/250/EEC of 14 May 1991 on the legal protection of computer programs, which was recently codified as European Parliament and Council Directive 2009/24/EC of 23 April 2009, and European Parliament and Council Directive 2001/29/EC of 22 May 2001 on the harmonisation of certain aspects of copyright and related rights in the information society to the Court of Justice of the European Union under Article 267 of the Treaty on the Functioning of the European Union. The background to the reference is set out in full in my judgment dated 23 July 2010 [2010] EWHC 1829 (Ch). The reference is presently pending before the Court of Justice as Case C-406/10. By an application notice issued on 11 October 2010 SAS applied for the wording of the questions to be amended in a number of respects. I heard that application on 18 November 2010 and refused it for reasons to be given later. This judgment contains those reasons.

The questions and the proposed amendments
I set out below the questions referred with the amendments proposed by SAS shown by strikethrough and underlining:

“A. On the interpretation of Council Directive 91/250/EEC of 14 May 1991 on the legal protection of computer programs and of Directive 2009/24/EC of the European Parliament and of the Council of 23 April 2009 (codified version):
1. Where a computer program (‘the First Program’) is protected by copyright as a literary work, is Article 1(2) to be interpreted as meaning that it is not an infringement of the copyright in the First Program for a competitor of the rightholder without access to the source code of the First Program, either directly or via a process such as decompilation of the object code, to create another program (‘the Second Program’) which replicates by copying the functions of the First Program?
2. Is the answer to question 1 affected by any of the following factors:
(a) the nature and/or extent of the functionality of the First Program;
(b) the nature and/or extent of the skill, judgment and labour which has been expended by the author of the First Program in devising and/or selecting the functionality of the First Program;
(c) the level of detail to which the functionality of the First Program has been reproduced in the Second Program;
(d) if, the Second Program includes the following matters as a result of copying directly or indirectly from the First Program:
(i) the selection of statistical operations which have been implemented in the First Program;
(ii) the selection of mathematical formulae defining the statistical operations which the First Program carries out;
(iii) the particular commands or combinations of commands by which those statistical operations may be invoked;
(iv) the options which the author of the First Program has provided in respect of various commands;
(v) the keywords and syntax recognised by the First Program;
(vi) the defaults which the author of the First Program has chosen to implement in the event that a particular command or option is not specified by the user;
(vii) the number of iterations which the First Program will perform in certain circumstances;
(e)(d) if the source code for the Second Program reproduces by copying aspects of the source code of the First Program to an extent which goes beyond that which was strictly necessary in order to produce the same functionality as the First Program?
3. Where the First Program interprets and executes application programs written by users of the First Program in a programming language devised by the author of the First Program which comprises keywords devised or selected by the author of the First Program and a syntax devised by the author of the First Program, is Article 1(2) to be interpreted as meaning that it is not an infringement of the copyright in the First Program for the Second Program to be written so as to interpret and execute such application programs using the same keywords and the same syntax?
4. Where the First Program reads from and writes to data files in a particular format devised by the author of the First Program, is Article 1(2) to be interpreted as meaning that it is not an infringement of the copyright in the First Program for the Second Program to be written so as to read from and write to data files in the same format?
5. Does it make any difference to the answer to questions 1, 2, 3 and 4 if the author of the Second Program created the Second Program without access to the source code of the First Program, either directly or via decompilation of the object code by:
(a) observing, studying and testing the functioning of the First Program; or
(b) reading a manual created and published by the author of the First Program which describes the functions of the First Program (“the Manual”) and by implementing in the Second Program the functions described in the Manual; or
(c) both (a) and (b)?
6. Where a person has the right to use a copy of the First Program under a licence, is Article 5(3) to be interpreteding as meaning that the licensee is entitled, without the authorisation of the rightholder, to perform acts of loading, running and storing the program in order to observe, test or study the functioning of the First Program so as to determine the ideas and principles which underlie any element of the program, if the licence permits the licensee to perform acts of loading, running and storing the First Program when using it for the particular purpose permitted by the licence, but the acts done in order to observe, study or test the First Program extend outside the scope of the purpose permitted by the licence and are therefore acts for which the licensee has no right to use the copy of the First Program under the licence?
7. Is Article 5(3) to be interpreted as meaning that acts of observing, testing or studying of the functioning of the First Program are to be regarded as being done in order to determine the ideas or principles which underlie any element of the First Program where they are done:
(a) to ascertain the way in which the First Program functions, in particular details which are not described in the Manual, for the purpose of writing the Second Program in the manner referred to in question 1 above;
(b) to ascertain how the First Program interprets and executes statements written in the programming language which it interprets and executes (see question 3 above);
(c) to ascertain the formats of data files which are written to or read by the First Program (see question 4 above);
(d) to compare the performance of the Second Program with the First Program for the purpose of investigating reasons why their performances differ and to improve the performance of the Second Program;
(e) to conduct parallel tests of the First Program and the Second Program in order to compare their outputs in the course of developing the Second Program, in particular by running the same test scripts through both the First Program and the Second Program;
(f) to ascertain the output of the log file generated by the First Program in order to produce a log file which is identical or similar in appearance;
(g) to cause the First Program to output data (in fact, data correlating zip codes to States of the USA) for the purpose of ascertaining whether or not it corresponds with official databases of such data, and if it does not so correspond, to program the Second Program so that it will respond in the same way as the First Program to the same input data.
B. On the interpretation of Directive 2001/29/EC of the European Parliament and of the Council of 22 May 2001 on the harmonisation of certain aspects of copyright and related rights in the information society:
8. Where the Manual is protected by copyright as a literary work, is Article 2(a) to be interpreted as meaning that it is an infringement of the copyright in the Manual for the author of the Second Program to reproduce or substantially reproduce in the Second Program any or all of the following matters described in the Manual:
(a) the selection of statistical operations which have been described in the Manual as being implemented in the First Program;
(b) the mathematical formulae used in the Manual to describe those statistical operations;
(c) the particular commands or combinations of commands by which those statistical operations may be invoked;
(d) the options which the author of the First Program has provided in respect of various commands;
(e) the keywords and syntax recognised by the First Program;
(f) the defaults which the author of the First Program has chosen to implement in the event that a particular command or option is not specified by the user;
(g) the number of iterations which the First Program will perform in certain circumstances?
9. Is Article 2(a) to be interpreted as meaning that it is an infringement of the copyright in the Manual for the author of the Second Program to reproduce or substantially reproduce in a manual describing the Second Program the keywords and syntax recognised by the First Program?”

Jurisdiction
It was common ground between counsel that, although there is no direct authority on the point, it appears that the Court of Justice would accept an amendment to questions which had previously been referred by the referring court. The Court of Justice has stated that “national courts have the widest discretion in referring matters”: see Case 166/73 Rheinmühlen-Düsseldorf v Einfuhr- und Vorratsstelle für Getreide und Futtermittel [1974] ECR 33 at [4]. If an appeal court substitutes questions for those referred by a lower court, the substituted questions will be answered: Case 65/77 Razanatsimba [1977] ECR 2229. Sometimes the Court of Justice itself invites the referring court to clarify its questions, as occurred in Interflora Inc v Marks & Spencer plc (No 2) [2010] EWHC 925 (Ch). In these circumstances, there does not appear to be any reason to think that, if the referring court itself had good reason to amend its questions, the Court of Justice would disregard the amendment.

Counsel for WPL submitted, however, that, as a matter of domestic procedural law, this Court had no jurisdiction to vary an order for reference once sealed unless either there had been a material change of circumstances since the order (as in Interflora) or it had subsequently emerged that the Court had made the order on a false basis. He submitted that neither of those conditions was satisfied here. In those circumstances, the only remedy of a litigant in the position of SAS was to seek to appeal to the Court of Appeal.

As counsel for WPL pointed out, CPR rule 3.1(7) confers on courts what appears to be a general power to vary or revoke their own orders. The proper exercise of that power was considered by the Court of Appeal in Collier v Williams [2006] EWCA Civ 20, [2006] 1 WLR 1945 and Roult v North West Strategic Health Authority [2009] EWCA Civ 444, [2010] 1 WLR 487.

In Collier Dyson LJ (as he then was) giving the judgment of the Court of Appeal said:

“39. We now turn to the third argument. CPR 3.1(7) gives a very general power to vary or revoke an order. Consideration was given to the circumstances in which that power might be used by Patten J in Lloyds Investment (Scandinavia) Limited v Christen Ager-Hanssen [2003] EWHC 1740 (Ch). He said at paragraph 7:
‘The Deputy Judge exercised a discretion under CPR Part 13.3. It is not open to me as a judge exercising a parallel jurisdiction in the same division of the High Court to entertain what would in effect be an appeal from that order. If the Defendant wished to challenge whether the order made by Mr Berry was disproportionate and wrong in principle, then he should have applied for permission to appeal to the Court of Appeal. I have been given no real reasons why this was not done. That course remains open to him even today, although he will have to persuade the Court of Appeal of the reasons why he should have what, on any view, is a very considerable extension of time. It seems to me that the only power available to me on this application is that contained in CPR Part 3.1(7), which enables the Court to vary or revoke an order. This is not confined to purely procedural orders and there is no real guidance in the White Book as to the possible limits of the jurisdiction. Although this is not intended to be an exhaustive definition of the circumstances in which the power under CPR Part 3.1(7) is exercisable, it seems to me that, for the High Court to revisit one of its earlier orders, the Applicant must either show some material change of circumstances or that the judge who made the earlier order was misled in some way, whether innocently or otherwise, as to the correct factual position before him. The latter type of case would include, for example, a case of material non-disclosure on an application for an injunction. If all that is sought is a reconsideration of the order on the basis of the same material, then that can only be done, in my judgment, in the context of an appeal. 
Similarly it is not, I think, open to a party to the earlier application to seek in effect to re-argue that application by relying on submissions and evidence which were available to him at the time of the earlier hearing, but which, for whatever reason, he or his legal representatives chose not to employ. It is therefore clear that I am not entitled to entertain this application on the basis of the Defendant’s first main submission, that Mr Berry’s order was in any event disproportionate and wrong in principle, although I am bound to say that I have some reservations as to whether he was right to impose a condition of this kind without in terms enquiring whether the Defendant had any realistic prospects of being able to comply with the condition.’
We endorse that approach. We agree that the power given by CPR 3.1(7) cannot be used simply as an equivalent to an appeal against an order with which the applicant is dissatisfied. The circumstances outlined by Patten J are the only ones in which the power to revoke or vary an order already made should be exercised under 3.1(7).”
In Roult Hughes LJ, with whom Smith and Carnwath LJJ agreed, said at [15]:

“There is scant authority upon Rule 3.1(7) but such as exists is unanimous in holding that it cannot constitute a power in a judge to hear an appeal from himself in respect of a final order. Neuberger J said as much in Customs & Excise v Anchor Foods (No 3) [1999] EWHC 834 (Ch). So did Patten J in Lloyds Investment (Scandinavia) Ltd v Ager-Hanssen [2003] EWHC 1740 (Ch). His general approach was approved by this court, in the context of case management decisions, in Collier v Williams [2006] EWCA Civ 20. I agree that in its terms the rule is not expressly confined to procedural orders. Like Patten J in Ager-Hanssen I would not attempt any exhaustive classification of the circumstances in which it may be proper to invoke it. I am however in no doubt that CPR 3.1(7) cannot bear the weight which Mr Grime’s argument seeks to place upon it. If it could, it would come close to permitting any party to ask any judge to review his own decision and, in effect, to hear an appeal from himself, on the basis of some subsequent event. It would certainly permit any party to ask the judge to review his own decision when it is not suggested that he made any error. It may well be that, in the context of essentially case management decisions, the grounds for invoking the rule will generally fall into one or other of the two categories of (i) erroneous information at the time of the original order or (ii) subsequent event destroying the basis on which it was made. The exigencies of case management may well call for a variation in planning from time to time in the light of developments. There may possibly be examples of non-procedural but continuing orders which may call for revocation or variation as they continue – an interlocutory injunction may be one. But it does not follow that wherever one or other of the two assertions mentioned (erroneous information and subsequent event) can be made, then any party can return to the trial judge and ask him to re-open any decision…..”
In the present case there has been no material change of circumstances since I made the Order dated 28 July 2010. Nor did counsel for SAS suggest that I had made the Order upon a false basis. Counsel for SAS did submit, however, that the Court of Appeal had left open the possibility that it might be proper to exercise the power conferred by rule 3.1(7) even if there had been no material change of circumstances and it was not suggested that the order in question had been made on a false basis. Furthermore, he relied upon paragraph 1.1 of the Practice Direction to CPR Part 68, which provides that “responsibility for settling the terms of the reference lies with the English court and not with the parties”. He suggested that this meant that orders for references were not subject to the usual constraints on orders made purely inter partes.

In my judgment PD68 paragraph 1.1 does not justify exercising the power conferred by rule 3.1(7) in circumstances falling outside those identified in Collier and Roult. I am therefore very doubtful that it would be a proper exercise of the power conferred on me by CPR r. 3.1(7) to vary the Order dated 28 July 2010 in the present circumstances. I prefer, however, not to rest my decision on that ground.

Discretion
Counsel for WPL also submitted that, even if this Court had jurisdiction to amend the questions, I should exercise my discretion by refusing to do so for two reasons. First, because the application was made too late. Secondly, because there was no sufficient justification for the amendments anyway. I shall consider these points separately.

Delay
The relevant dates are as follows. The judgment was handed down on 23 July 2010, a draft having been made available to the parties a few days before that. There was a hearing to consider the form of the order, and in particular the wording of the questions to be referred, on 28 July 2010. Prior to that hearing both parties submitted drafts of the questions, and the respective drafts were discussed at the hearing. Following the hearing I settled the Order, and in particular the questions. The Order was sealed on 2 August 2010. The sealed Order was received by the parties between 3 and 5 August 2010. At around the same time the Senior Master of the Queen’s Bench Division transmitted the Order to the Court of Justice. On 15 September 2010 the Registry of the Court of Justice notified the parties, Member States and EU institutions of the reference. On 1 October 2010 the United Kingdom Intellectual Property Office advertised the reference on its website and invited comments by interested parties by 7 October 2010. The latest date on which written observations on the questions referred may be filed at the Court of Justice is 8 December 2010 (two months from the date of the notification plus 10 days extension on account of distance where applicable). This period is not extendable in any circumstances.

As noted above, the application was not issued until 11 October 2010. No justification has been provided by SAS for the delay in making the application. The only explanation offered by counsel for SAS was that the idea of proposing the amendments had only occurred to those representing SAS when starting work on SAS’s written observations.

Furthermore, the application notice requested that the matter be dealt with without a hearing. In my view that was not appropriate: the application was plainly one which was likely to require at least a short hearing. Furthermore, the practical consequence of proceeding in that way was to delay the hearing of the application. The paper application was put before me on 22 October 2010. On the same day I directed that the matter be listed for hearing. In the result it was not listed for hearing until 18 November 2010. If SAS had applied for the matter to be heard urgently, I am sure that it could have been dealt with sooner.

As counsel for WPL submitted, it is likely that the parties, Member States and institutions who intend to file written observations are now at an advanced stage of preparing those observations. Indeed, it is likely that preparations would have been well advanced even on 11 October 2010. To amend the questions at this stage in the manner proposed by SAS would effectively require the Court of Justice to re-start the written procedure all over again. The amended questions would have to be translated into all the EU official languages; the parties, Member States and EU institutions would have to be notified of the amended questions; and the time for submitting written observations would have to be re-set. This would have two consequences. First, a certain amount of time, effort and money on the part of those preparing written observations would be wasted. Secondly, the progress of the case would be delayed. Those are consequences that could have been avoided if SAS had moved promptly after receiving the sealed Order.

In these circumstances, it would not in my judgment be proper to exercise any discretion I may have in favour of amending the questions.

No sufficient justification
Counsel for WPL submitted that in any event SAS’s proposed amendments were not necessary in order to enable the Court of Justice to provide guidance on the issues in this case, and therefore there was no sufficient justification for making the amendments.

Before addressing that submission directly, I think it is worth commenting more generally on the formulation of questions. As is common ground, and reflected in paragraph 1.1 of PD68, it is well established that the questions posed on a reference under Article 267 are the referring court’s questions, not the parties’. The purpose of the procedure is for the Court of Justice to provide the referring court with the guidance it needs in order to deal with the issues before it. It follows that it is for the referring court to decide how to formulate the questions.

In my view it is usually helpful for the court to have the benefit of the parties’ comments on the wording of the proposed questions, as envisaged in paragraph 1.1 of PD68. There are two main reasons for this. The first is to try to ensure that the questions are sufficiently comprehensive to enable all the issues arising to be addressed by the Court of Justice, and thus avoid the need for a further reference at a later stage of the proceedings, as occurred in the Boehringer Ingelheim v Swingward litigation. In that case Laddie J referred questions to the Court of Justice, which were answered in Case C-143/00 [2002] ECR I-3759. The Court of Appeal subsequently concluded, with regret, that the answers to those questions did not suffice to enable it to deal with the case, and referred further questions to the Court of Justice: [2004] EWCA Civ 575, [2004] ETMR 65. Those questions were answered in Case C-348/04 [2007] ECR I-3391. The second main reason is to try to ensure that the questions are clear and free from avoidable ambiguity or obscurity.

In my experience it is not uncommon for parties addressing the court on the formulation of the questions to attempt to ensure that the questions are worded in a leading manner, that is to say, in a way which suggests the desired answer. In my view that is neither proper nor profitable. It is not proper because the questions should so far as possible be impartially worded. It is not profitable because experience shows that the Court of Justice is usually not concerned with the precise wording of the questions referred, but with their legal substance. Thus the Court of Justice frequently reformulates the question in giving its answer.

As counsel for WPL pointed out, and as I have already mentioned, in the present case the parties provided me with draft questions which were discussed at a hearing. In settling the questions I took into account the parties’ drafts and their comments on each other’s drafts, but the final wording is, for better or worse, my own.

As counsel for WPL submitted, at least to some extent SAS’s proposed amendments to the questions appear designed to bring the wording closer to that originally proposed by SAS. This is particularly true of the proposed amendment to question 1. In my judgment it would not be a proper exercise of any discretion that I may have to permit such an amendment, both because it appears to be an attempt by SAS to have the question worded in a manner which it believes favours its case and because its proper remedy if it objected to my not adopting the wording it proposed was to seek to appeal to the Court of Appeal. In saying this, I do not overlook the fact that SAS proposes to move some of the words excised from question 1 to question 5.

In any event, I am not satisfied that any of the amendments are necessary either to enable the parties to present their respective arguments to the Court of Justice or to enable the Court to give guidance on any of the issues arising in this case. On the contrary, I consider that the existing questions are sufficient for these purposes. By way of illustration, I will take the biggest single amendment, which is the proposed insertion of new paragraph (d) in question 2. In my view, the matters referred to in paragraph (d) are matters that are encompassed within paragraphs (b) and/or (c); or at least can be addressed by the parties, and hence the Court of Justice, in the context provided by paragraphs (b) and/or (c). When I put this to counsel for SAS during the course of argument, he accepted it.

Other amendments counsel for SAS himself presented as merely being minor matters of clarification. In my view none of them amount to the elimination of what would otherwise be ambiguities or obscurities in the questions.

It is fair to say that SAS have identified a small typographical error in question 2 (“interpreting” should read “interpreted”), but in my view this is an obvious error which will not cause any difficulty in the proceedings before the Court of Justice.

Conclusion
It was for these reasons that I decided to dismiss SAS’s application.

Quantifying Analytics ROI


I had a brief Twitter exchange with Jim Davis, Chief Marketing Officer, SAS Institute, on return on investment for business analytics projects. I interviewed Jim Davis last year: https://decisionstats.com/2009/06/05/interview-jim-davis-sas-institute/

Now, Jim Davis is a big guy, and he is rushing from the launch of SAS Institute’s Social Media Analytics in Japan, through some arguably difficult flying conditions, to be home in America in time for Thanksgiving. That, and I have not been much of a good blog boy recently, swayed more by love of open source than love of software per se. I love both equally, given I am bad at both equally.

Anyway, Jim’s contention (http://twitter.com/Davis_Jim) was that customers should invest in business analytics only if there is a positive return on investment. I am quoting him here:

What is important is that there be a positive ROI on each and every BA project. Otherwise don’t do it.

That’s not the marketing I was taught in my business school- basically it was sell, sell, sell.

However, most BI sales vendors also operate in “let me meet my sales quota for this quarter” mode, and quantifying customer ROI is simpler maths than predictive analytics, yet there seems to be some information asymmetry in it.
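The “simple maths” point can be made concrete. Here is a minimal sketch of the arithmetic; the cost and benefit figures are hypothetical, purely for illustration:

```python
def roi(total_benefit, total_cost):
    """Return on investment: net benefit as a fraction of cost."""
    return (total_benefit - total_cost) / total_cost

# Hypothetical BA project: $200,000 in licences and services,
# $260,000 in estimated benefit over the same period.
project_roi = roi(260_000, 200_000)
print(f"ROI: {project_roi:.0%}")  # 30%
```

By Jim’s rule, a project clears the bar only if this number is positive; the information asymmetry lies in who gets to estimate the benefit figure.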

Here is a paper from Northwestern University on ROI in IT projects.

But overall, it would be in the interest of both customers and business analytics vendors to publish aggregated ROI numbers.

The opponents of this transparency in ROI would be the market-share leaders, who have trapped their customers through high migration costs (due to complexity) or contractually.

A recent study found Oracle had a large percentage of unhappy customers who would still renew! SAP had problems when it arbitrarily raised licensing prices (that CEO is now CEO of HP and dodging legal notices from Oracle).

Indeed Jim Davis’s famous, unsettling call to focus on Business Analytics- since Business Intelligence is dead- has been implemented more aggressively by IBM, through analytical acquisitions, than by SAS itself, which has been conservative about inorganic growth. Quantifying ROI should theoretically aid open source software the most (since it is cheapest in upfront licensing) or newer technologies like MapReduce/Hadoop (since they are so fast)- but I think the market has a way of factoring these things in, and customers are neither as foolish nor as unaware of the costs versus benefits of migration as that would imply.

The counter-argument is that Business Analytics and Business Intelligence are imperfect markets, with duopolies or a few big players thriving in the absence of customer regulation.

You get more protection as the customer of a $20 bag of potato chips than as the customer of $200,000 software. Regulators are wary of stepping in to ensure ROI fairness (since most bright techies are either working for the private sector, running their own startup, or invested in startups)- who in government understands Analytics and Intelligence well enough to ensure vendor lock-ins are prevented and market flexibility preserved? Ensuring ROI on enterprise software is also a lower priority for embattled regulators, unlike the aggressiveness they have shown in retail or online software.

Who will analyze the analysts, and who can quantify the value of quants (or penalize them for shoddy quantitative analytics)? That is an interesting question we expect to see more of.

 

 

Why do bloggers blog ?


Step 1 is to create internal motivation to create a blog in the first place

Step 2 is to find what to write

Reasons Bloggers Blog-

Basic- Ranting


Examples- I hate the Facebook Platform team- it treats me badly with waits, and breaks my code.

SAS Marketing won’t give me a big discount to make me look good in front of my boss.

Companies won’t give me their software for free- even though I will use it to make money (and not to play Xbox).

I want my vendors to be FOSS but my customers to switch to SaaS.

Google won’t do this- Apple won’t do that- Microsoft won’t do those.

Revolution would give me 4 great packages but not the open source for RevoScaleR (which only 300 people would understand in the first place).

Safety-

I had better kiss up to the Professor and give him a turkey for dinner, as he sits on my thesis committee.

I will recommend Prof X’s lousy book in the hope he recommends my lousy book as a textbook too.

It is safe to laugh when the boss is making a joke- I should comment on her corporate blog, and retweet her.

Belonging-

I belong to this great online community of smart people. Let me agree to what they say.

I really believe in EVERYTHING that ALL the 2 MILLION members of the community have to say ALL the TIME.

I belong to this online community because all my friends are on my computer.

Egoistic-

My blog page rank is now X plus delta tau because of sugary key words (2004)

My technorati numbers rise (2005)

I was once on Digg (2007)

I have Z * exp N followers on Twitter and even more on Facebook (2008)

My Klout is increasing on Twitter, my Stack Overflow reputation’s cup floweth over. (2009)

My Karma on Reddit is more important than my Karma in real life (2010)

Self Actualization-

I got time to kill- and I think I may learn more, meet interesting people and discover something wandering on the internet.

All those who wander are not lost- Wikiquote

I got a story to tell, poems to write, code to give away. A free blog is something a Chinese, an Iranian and a North Korean really, really know the value of.

But after all that, WHY Do Bloggers Blog?

  • Because we are still waiting for Facebook to create the Blog Killer.
  • It’s better than saying I am unemployed and a social loner
  • Reddit Karma feels good. Any Karma of any kind.

Reputation on Social Networks


Classical economics talks of the value of utility- and of diminishing marginal utility when the same thing is repeated again and again (like spam in an online community).
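For the curious, diminishing marginal utility is easy to see with a toy calculation- here sketched with a logarithmic utility function, a common textbook choice (the specific function is my own illustrative assumption, not anything the communities below actually use):

```python
# Diminishing marginal utility, sketched with log utility:
# each extra unit of the same thing adds less than the last did.
import math

def utility(n):
    """Total utility from n units of the same good."""
    return math.log(1 + n)

# Marginal utility of the 1st, 2nd, ... 5th unit
marginal = [utility(n + 1) - utility(n) for n in range(5)]
print([round(m, 3) for m in marginal])
# [0.693, 0.405, 0.288, 0.223, 0.182] - each repeat is worth less
```

The same curve is, roughly, why the hundredth identical post in a forum earns far less reputation or karma than the first one did.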

StackOverflow has a great way of measuring reputation- and thus allows intangible benefits/awards, similar to Wikipedia badges or Reddit karma. Utility scores are also auto-generated, like @klout on Twitter or list memberships. Other successful open source communities online, including the Ubuntu forums, have ways to create hierarchies even in classless, utopian communities.

Basically it then acts as a motivating game, as the mostly male population races to run up the numbers.

 

In Stack Overflow you can get buddies to upvote you, so it basically acts as a role playing game too.

—–From http://stackoverflow.com/faq#reputation

To gain reputation, post good questions and useful answers. Your peers will vote on your posts, and those votes will cause you to gain (or, in rare cases, lose) reputation:

  • answer is voted up: +10
  • question is voted up: +5
  • answer is accepted: +15 (+2 to acceptor)
  • post is voted down: -2 (-1 to voter)

A maximum of 30 votes can be cast per user per day, and you can earn a maximum of 200 reputation per day (although accepted answers and bounty awards are immune to this limit). Also, please note that votes for any posts marked “community wiki” do not generate reputation.
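As a toy illustration of the quoted rules, here is a sketch of how a day’s reputation might be tallied- the event names and the exact treatment of the cap are my own simplifications, not Stack Overflow’s actual implementation:

```python
# Toy model of the Stack Overflow reputation rules quoted above:
# vote-based gains are capped at 200/day, while accepted answers
# (and bounties) bypass the cap. Event names are my own invention.

DELTAS = {
    "answer_upvote": 10,
    "question_upvote": 5,
    "answer_accepted": 15,   # immune to the daily cap
    "downvote_received": -2,
}
CAP_IMMUNE = {"answer_accepted"}
DAILY_CAP = 200

def daily_reputation(events):
    capped, uncapped = 0, 0
    for event in events:
        if event in CAP_IMMUNE:
            uncapped += DELTAS[event]
        else:
            capped += DELTAS[event]
    return min(capped, DAILY_CAP) + uncapped

# 25 answer upvotes would be 250 points, but the cap holds that to
# 200; two accepted answers add 30 on top, beyond the cap.
events = ["answer_upvote"] * 25 + ["answer_accepted"] * 2
print(daily_reputation(events))  # 230
```

This is what makes the system work as a game- the cap limits grinding on raw votes, while accepted answers (a scarcer signal of quality) keep paying out.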

Amass enough reputation points and Stack Overflow will allow you to go beyond simply asking and answering questions:

  • 15 reputation- Vote up
  • 15 reputation- Flag offensive
  • 50 reputation- Leave comments
  • 100 reputation- Edit community wiki posts
  • 125 reputation- Vote down (costs 1 rep)
  • 200 reputation- Reduced advertising

PAWCON -This week in London

Watch out for the twitter hashtag news on PAWCON and the exciting agenda lined up. If you’re in the City, you may want to just drop in.

http://www.predictiveanalyticsworld.com/london/2010/agenda.php#day1-7

Disclaimer- PAWCON has been a blog partner with DecisionStats (since the first PAWCON). It is vendor neutral and features open source as well as proprietary software, as well as case studies from academia and industry, for a balanced view.

 

A little birdie told me some exciting product enhancements may be in the works, including a not-yet-announced R plugin 😉 and the latest SAS product using embedded analytics- plus Dr Elder’s full day data mining workshop.

Citation-

http://www.predictiveanalyticsworld.com/london/2010/agenda.php#day1-7

Monday November 15, 2010
All conference sessions take place in Edward 5-7

8:00am-9:00am

Registration, Coffee and Danish
Room: Albert Suites


9:00am-9:50am

Keynote
Five Ways Predictive Analytics Cuts Enterprise Risk

All business is an exercise in risk management. All organizations would benefit from measuring, tracking and computing risk as a core process, much like insurance companies do.

Predictive analytics does the trick, one customer at a time. This technology is a data-driven means to compute the risk each customer will defect, not respond to an expensive mailer, consume a retention discount even if she were not going to leave in the first place, not be targeted for a telephone solicitation that would have landed a sale, commit fraud, or become a “loss customer” such as a bad debtor or an insurance policy-holder with high claims.

In this keynote session, Dr. Eric Siegel will reveal:

  • Five ways predictive analytics evolves your enterprise to reduce risk
  • Hidden sources of risk across operational functions
  • What every business should learn from insurance companies
  • How advancements have reversed the very meaning of fraud
  • Why “man + machine” teams are greater than the sum of their parts for enterprise decision support

 

Speaker: Eric Siegel, Ph.D., Program Chair, Predictive Analytics World



IBM
9:50am-10:10am

Platinum Sponsor Presentation
The Analytical Revolution

The algorithms at the heart of predictive analytics have been around for years – in some cases for decades. But now, as we see predictive analytics move to the mainstream and become a competitive necessity for organisations in all industries, the most crucial challenges are to ensure that results can be delivered to where they can make a direct impact on outcomes and business performance, and that the application of analytics can be scaled to the most demanding enterprise requirements.

This session will look at the obstacles to successfully applying analysis at the enterprise level, and how today’s approaches and technologies can enable the true “industrialisation” of predictive analytics.

Speaker: Colin Shearer, WW Industry Solutions Leader, IBM UK Ltd



Deloitte
10:10am-10:20am

Gold Sponsor Presentation
How Predictive Analytics is Driving Business Value

Organisations are increasingly relying on analytics to make key business decisions. Today, technology advances and the increasing need to realise competitive advantage in the market place are driving predictive analytics from the domain of marketers and tactical one-off exercises to the point where analytics are being embedded within core business processes.

During this session, Richard will share some of the focus areas where Deloitte is driving business transformation through predictive analytics, including Workforce, Brand Equity and Reputational Risk, Customer Insight and Network Analytics.

Speaker: Richard Fayers, Senior Manager, Deloitte Analytical Insight



10:20am-10:45am

Break / Exhibits
Room: Albert Suites


10:45am-11:35am
Healthcare
Case Study: Life Line Screening
Taking CRM Global Through Predictive Analytics

While Life Line is successfully executing a US CRM roadmap, they are also beginning this same evolution abroad. They are beginning in the UK where Merkle procured data and built a response model that is pulling responses over 30% higher than competitors. This presentation will give an overview of the US CRM roadmap, and then focus on the beginning of their strategy abroad, focusing on the data procurement they could not get anywhere else but through Merkle and the successful modeling and analytics for the UK.

Speaker: Ozgur Dogan, VP, Quantitative Solutions Group, Merkle Inc.

Speaker: Trish Mathe, Life Line Screening



11:35am-12:25pm
Open Source Analytics; Healthcare
Case Study: A large health care organization
The Rise of Open Source Analytics: Lowering Costs While Improving Patient Care

Rapidminer and R were number 1 and 2 in this year’s annual KDnuggets data mining tool usage poll, followed by Knime in place 4 and Weka in place 6. So what’s going on here? Are these open source tools really that good, or is their popularity strongly correlated with lower acquisition costs alone? This session answers these questions based on a real-world case for a large health care organization and explains the risks & benefits of using open source technology. The final part of the session explains how these tools stack up against their traditional, proprietary counterparts.

Speaker: Jos van Dongen, Associate & Principal, DeltIQ Group



12:25pm-1:25pm

Lunch / Exhibits
Room: Albert Suites


1:25pm-2:15pm
Keynote
Thought Leader:
Case Study: Yahoo! and other large on-line e-businesses
Search Marketing and Predictive Analytics: SEM, SEO and On-line Marketing Case Studies

Search Engine Marketing is a $15B industry in the U.S. growing to double that number over the next 3 years. Worldwide the SEM market was over $50B in 2010. Not only is this a fast growing area of marketing, but it is one that has significant implications for brand and direct marketing and is undergoing rapid change with emerging channels such as mobile and social. What is unique about this area of marketing is a singularly heavy dependence on analytics:

 

  • Large numbers of variables and options
  • Real-time auctions/bids and a need to adjust strategies in real-time
  • Difficult optimization problems on allocating spend across a huge number of keywords
  • Fast-changing competitive terrain and heavy competition on the obvious channels
  • Complicated interactions between various channels and a large choice of search keyword expansion possibilities
  • Profitability and ROI analysis that are complex and often challenging

 

The size of the industry, its growing importance in marketing, its upcoming role in Mobile Advertising, and its uniquely heavy reliance on analytics make it particularly interesting as an area for predictive analytics applications. In this session, not only will you hear about some of the latest strategies and techniques to optimize search, you will also hear case studies that illustrate the important role of analytics from industry practitioners.

Speaker: Usama Fayyad, Ph.D., CEO, Open Insights



SAS
2:15pm-2:35pm

Platinum Sponsor Presentation
Creating a Model Factory Using in-Database Analytics

With the ever-increasing number of analytical models required to make fact-based decisions, as well as increasing audit compliance regulations, it is more important than ever that these models can be created, monitored, retuned and deployed as quickly and automatically as possible. This paper, using a case study from a major financial organisation, will show how organisations can build a model factory efficiently using the latest SAS technology that utilizes the power of in-database processing.

Speaker: John Spooner, Analytics Specialist, SAS (UK)



2:35pm-2:45pm

Session Break
Room: Albert Suites


2:45pm-3:35pm

Retail
Case Study: SABMiller
Predictive Analytics & Global Marketing Strategy

Over the last few years SABMiller plc, the second largest brewing company in the world operating in 70 countries, has been systematically segmenting its markets in different countries globally in order to optimize its portfolio strategy and align it to its long-term, country-specific growth strategy. This presentation talks about the overall methodology followed and the challenges that had to be overcome, both from a technical and from a change management standpoint, in order to successfully implement a standard analytics approach to diverse markets and diverse business positions in a highly global setting.

The session explains how country specific growth strategies were converted to objective variables and consumption occasion segments were created that differentiated the market effectively by their growth potential. In addition to this the presentation will also provide a discussion on issues like:

  • The dilemmas of static vs. dynamic solutions and standardization vs. adaptable solutions
  • Challenges in acceptability, local capability development, overcoming implementation inertia, cost effectiveness, etc
  • The role that business partners at SAB and analytics service partners at AbsolutData together play in providing impactful and actionable solutions

 

Speaker: Anne Stephens, SABMiller plc

Speaker: Titir Pal, AbsolutData



3:35pm-4:25pm

Retail
Case Study: Overtoom Belgium
Increasing Marketing Relevance Through Personalized Targeting

 

For many years, Overtoom Belgium – a leading B2B retailer and division of the French Manutan group – has focused on an extensive use of CRM. In this presentation, we demonstrate how Overtoom has integrated Predictive Analytics to optimize customer relationships. In this process, they employ analytics to answer the key question: “which product should we offer to which customer via which channel”. We show how Overtoom gained a 10% revenue increase by replacing the existing segmentation scheme with accurate predictive response models. Additionally, we illustrate how Overtoom succeeds in delivering more relevant communications by offering personalized promotional content to every single customer, and how these personalized offers positively impact Overtoom’s conversion rates.

Speaker: Dr. Geert Verstraeten, Python Predictions



4:25pm-4:50pm

Break / Exhibits
Room: Albert Suites


4:50pm-5:40pm
Uplift Modelling:
Case Study: Lloyds TSB General Insurance & US Bank
Uplift Modelling: You Should Not Only Measure But Model Incremental Response

Most marketing analysts understand that measuring the impact of a marketing campaign requires a valid control group so that uplift (incremental response) can be reported. However, it is much less widely understood that the targeting models used almost everywhere do not attempt to optimize that incremental measure. That requires an uplift model.

This session will explain why a switch to uplift modelling is needed, illustrate what can and does go wrong when they are not used and the hugely positive impact they can have when used effectively. It will also discuss a range of approaches to building and assessing uplift models, from simple basic adjustments to existing modelling processes through to full-blown uplift modelling.

The talk will use Lloyds TSB General Insurance & US Bank as a case study and also illustrate real-world results from other companies and sectors.

 

Speaker: Nicholas Radcliffe, Founder and Director, Stochastic Solutions



5:40pm-6:30pm

Consumer services
Case Study: Canadian Automobile Association and other B2C examples
The Diminishing Marginal Returns of Variable Creation in Predictive Analytics Solutions

 

Variable Creation is the key to success in any predictive analytics exercise. Many different approaches are adopted during this process, yet there are diminishing marginal returns as the number of variables increase. Our organization conducted a case study on four existing clients to explore this so-called diminishing impact of variable creation on predictive analytics solutions. Existing predictive analytics solutions were built using our traditional variable creation process. Yet, presuming that we could exponentially increase the number of variables, we wanted to determine if this added significant benefit to the existing solution.

Speaker: Richard Boire, BoireFillerGroup



6:30pm-7:30pm

Reception / Exhibits
Room: Albert Suites


Tuesday November 16, 2010
All conference sessions take place in Edward 5-7

8:00am-9:00am

Registration, Coffee and Danish
Room: Albert Suites


9:00am-9:55am
Keynote
Multiple Case Studies: Anheuser-Busch, Disney, HP, HSBC, Pfizer, and others
The High ROI of Data Mining for Innovative Organizations

Data mining and advanced analytics can enhance your bottom line in three basic ways, by 1) streamlining a process, 2) eliminating the bad, or 3) highlighting the good. In rare situations, a fourth way – creating something new – is possible. But modern organizations are so effective at their core tasks that data mining usually results in an iterative, rather than transformative, improvement. Still, the impact can be dramatic.

Dr. Elder will share the story (problem, solution, and effect) of nine projects conducted over the last decade for some of America’s most innovative agencies and corporations:

    Streamline:

  • Cross-selling for HSBC
  • Image recognition for Anheuser-Busch
  • Biometric identification for Lumidigm (for Disney)
  • Optimal decisioning for Peregrine Systems (now part of Hewlett-Packard)
  • Quick decisions for the Social Security Administration
    Eliminate Bad:

  • Tax fraud detection for the IRS
  • Warranty Fraud detection for Hewlett-Packard
    Highlight Good:

  • Sector trading for WestWind Foundation
  • Drug efficacy discovery for Pharmacia & UpJohn (now Pfizer)

Moderator: Eric Siegel, Program Chair, Predictive Analytics World

Speaker: John Elder, Ph.D., Elder Research, Inc.

Also see Dr. Elder’s full-day workshop

 



9:55am-10:30am

Break / Exhibits
Room: Albert Suites


10:30am-11:20am
Telecommunications
Case Study: Leading Telecommunications Operator
Predictive Analytics and Efficient Fact-based Marketing

The presentation describes the major topics and issues that arise when you introduce predictive analytics and build a fact-based marketing environment. The introduced tools and methodologies proved to be highly efficient in terms of improving the overall direct marketing activity and customer contact operations for the involved companies. Generally, the introduced approaches have great potential for organizations with large customer bases like Mobile Operators, Internet Giants, Media Companies, or Retail Chains.

Main Introduced Solutions:

  • Automated Serial Production of Predictive Models for Campaign Targeting
  • Automated Campaign Measurements and Tracking Solutions
  • Precise Product Added Value Evaluation

Speaker: Tamer Keshi, Ph.D., Long-term contractor, T-Mobile

Speaker: Beata Kovacs, International Head of CRM Solutions, Deutsche Telekom



11:20am-11:25am

Session Changeover


11:25am-12:15pm
Thought Leader
Nine Laws of Data Mining

Data mining is the predictive core of predictive analytics, a business process that finds useful patterns in data through the use of business knowledge. The industry standard CRISP-DM methodology describes the process, but does not explain why the process takes the form that it does. I present nine “laws of data mining”, useful maxims for data miners, with explanations that reveal the reasons behind the surface properties of the data mining process. The nine laws have implications for predictive analytics applications: how and why it works so well, which ambitions could succeed, and which must fail.

 

Speaker: Tom Khabaza, khabaza.com

 



12:15pm-1:30pm

Lunch / Exhibits
Room: Albert Suites


1:30pm-2:25pm
Expert Panel: Kaboom! Predictive Analytics Hits the Mainstream

Predictive analytics has taken off, across industry sectors and across applications in marketing, fraud detection, credit scoring and beyond. Where exactly are we in the process of crossing the chasm toward pervasive deployment, and how can we ensure progress keeps up the pace and stays on target?

This expert panel will address:

  • How much of predictive analytics’ potential has been fully realized?
  • Where are the outstanding opportunities with greatest potential?
  • What are the greatest challenges faced by the industry in achieving wide scale adoption?
  • How are these challenges best overcome?

 

Panelist: John Elder, Ph.D., Elder Research, Inc.

Panelist: Colin Shearer, WW Industry Solutions Leader, IBM UK Ltd

Panelist: Udo Sglavo, Global Analytic Solutions Manager, SAS

Panel moderator: Eric Siegel, Ph.D., Program Chair, Predictive Analytics World


2:25pm-2:30pm

Session Changeover


2:30pm-3:20pm
Crowdsourcing Data Mining
Case Study: University of Melbourne, Chessmetrics
Prediction Competitions: Far More Than Just a Bit of Fun

Data modelling competitions allow companies and researchers to post a problem and have it scrutinised by the world’s best data scientists. There are an infinite number of techniques that can be applied to any modelling task but it is impossible to know at the outset which will be most effective. By exposing the problem to a wide audience, competitions are a cost effective way to reach the frontier of what is possible from a given dataset. The power of competitions is neatly illustrated by the results of a recent bioinformatics competition hosted by Kaggle. It required participants to pick markers in HIV’s genetic sequence that coincide with changes in the severity of infection. Within a week and a half, the best entry had already outdone the best methods in the scientific literature. This presentation will cover how competitions typically work, some case studies and the types of business modelling challenges that the Kaggle platform can address.

Speaker: Anthony Goldbloom, Kaggle Pty Ltd



3:20pm-3:50pm

Breaks /Exhibits
Room: Albert Suites


3:50pm-4:40pm
Human Resources; e-Commerce
Case Study: Naukri.com, Jeevansathi.com
Increasing Marketing ROI and Efficiency of Candidate-Search with Predictive Analytics

InfoEdge, India’s largest and most profitable online firm with a bouquet of internet properties has been Google’s biggest customer in India. Our team used predictive modeling to double our profits across multiple fronts. For Naukri.com, India’s number 1 job portal, predictive models target jobseekers most relevant to the recruiter. Analytical insights provided a deeper understanding of recruiter behaviour and informed a redesign of this product’s recruiter search functionality. This session will describe how we did it, and also reveal how Jeevansathi.com, India’s 2nd-largest matrimony portal, targets the acquisition of consumers in the market for marriage.

 

Speaker: Suvomoy Sarkar, Chief Analytics Officer, HT Media & Info Edge India (parent company of the two companies above)

 



4:40pm-5:00pm
Closing Remarks

Speaker: Eric Siegel, Ph.D., Program Chair, Predictive Analytics World



Wednesday November 17, 2010

Full-day Workshop
The Best and the Worst of Predictive Analytics:
Predictive Modeling Methods and Common Data Mining Mistakes

Click here for the detailed workshop description

  • Workshop starts at 9:00am
  • First AM Break from 10:00 – 10:15
  • Second AM Break from 11:15 – 11:30
  • Lunch from 12:30 – 1:15pm
  • First PM Break: 2:00 – 2:15
  • Second PM Break: 3:15 – 3:30
  • Workshop ends at 4:30pm

Speaker: John Elder, Ph.D., CEO and Founder, Elder Research, Inc.