A thing that struck me when I was a student of statistics is that most theories of sampling, hypothesis testing, and modeling were built in an age when data was predominantly scarce, computation was inherently manual, and tests were aimed at detecting large differences.
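That last point can be made concrete with a quick simulation. The sketch below (stdlib Python only; the `z_stat` helper and the 0.01 effect size are my own illustrative choices, not from any particular textbook) shows that with a million observations per group, even a trivially small difference in means clears the classical 1.96 significance threshold, whereas with a hundred observations it typically does not.

```python
import math
import random

random.seed(0)

def z_stat(a, b):
    # two-sample z statistic for a difference in means,
    # with variances estimated from the data
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# tiny true effect: group means differ by 0.01 standard deviations
effect = 0.01

small_a = [random.gauss(0, 1) for _ in range(100)]
small_b = [random.gauss(effect, 1) for _ in range(100)]

big_a = [random.gauss(0, 1) for _ in range(1_000_000)]
big_b = [random.gauss(effect, 1) for _ in range(1_000_000)]

print(abs(z_stat(small_a, small_b)))  # typically below 1.96: not significant
print(abs(z_stat(big_a, big_b)))      # typically well above 1.96: "significant"
```

With enough data, statistical significance stops being informative on its own; what matters is whether the effect size is practically meaningful, which is exactly the kind of question classical theory was not forced to confront.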
I look now at the explosion of data, at the on-demand processing power enabled by cloud computing, and at the competitive dynamics of business, and I venture my opinion:
1) We now have large, even excessive, amounts of data compared with what statisticians had a generation ago.
2) We now have extremely powerful computing devices, provided our algorithms can be run in parallel.
3) Even a slight uptick in modeling efficiency or a mild gain in business insight can yield huge monetary savings.
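The second point deserves a small illustration. Many classical statistics decompose into independent per-chunk computations, which is precisely what makes them parallelizable in the map-reduce style that cloud platforms reward. The sketch below (a toy example of my own, not any specific framework's API) computes a mean over a large dataset as chunk-wise partial sums combined at the end.

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in for a large dataset
data = list(range(10_000_000))
chunks = [data[i:i + 1_000_000] for i in range(0, len(data), 1_000_000)]

def partial_sums(chunk):
    # "map" step: each worker returns (sum, count) for its chunk
    return sum(chunk), len(chunk)

# Note: Python threads won't speed up pure-Python arithmetic because of
# the GIL; this shows the *decomposition* of the algorithm. Real speedups
# come from process pools or C-backed numerical libraries applied to the
# same map-reduce structure.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(partial_sums, chunks))

# "reduce" step: combine the partial results
total = sum(s for s, _ in results)
count = sum(c for _, c in results)
print(total / count)  # 4999999.5, identical to the serial mean
```

Statistics that do not decompose this way, such as exact medians or many iterative model fits, are exactly where new theory for approximate and distributed computation is needed.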
Call it High Performance Analytics, Big Data, or Cloud Computing: are we sure statisticians are creating enough mathematical theory, or are we taking it easy in our statistics classrooms only to be confronted with something completely different when we hit the analytics workplace?
Do we need more theorists as well? Is there any incentive for corporations with private R&D teams to share their latest cutting-edge theoretical work outside their corporate silos?
“a mathematician is a machine for turning coffee into theorems”