Continuing the DecisionStats series on trends for 2012, Timo Elliott, Technology Evangelist at SAP Business Objects, looks at the predictions he made at the beginning of 2011, follows up with the things that surprised him in 2011, and shares what he foresees in 2012.
You can read last year’s predictions by Mr Elliott at http://www.decisionstats.com/brief-interview-timo-elliott/
Timo: Here are my comments on the “top three analytics trends” predictions I made last year:
(1) Analytics, reinvented. New DW techniques make it possible to do sub-second, interactive analytics directly against row-level operational data. Now BI processes and interfaces need to be rethought and redesigned to make best use of this — notably by blurring the distinctions between the “design” and “consumption” phases of BI.
I spent most of 2011 talking about this theme at various conferences: how existing BI technology is rapidly becoming obsolete and how the changes are akin to the move from film to digital photography. Technology that has been around for many years (in-memory, column stores, data warehouse appliances, etc.) came together to create exciting new opportunities, and even generally skeptical industry analysts put out press releases such as “Gartner Says Data Warehousing Reaching Its Most Significant Inflection Point Since Its Inception.” Some of the smaller BI vendors had been pushing in-memory analytics for years, but the general market started paying more attention when megavendors like SAP started painting a long-term vision of in-memory becoming a core platform for applications, not just analytics. Database leader Oracle was forced to upgrade its in-memory messaging from “It’s a complete fantasy” to “we have that too”.
(2) Corporate and personal BI come together. The ability to mix corporate and personal data for quick, pragmatic analysis is a common business need. The typical solution to the problem — extracting and combining the data into a local data store (either Excel or a departmental data mart) — pleases users, but introduces duplication and extra costs and makes a mockery of information governance. 2011 will see the rise of systems that let individuals and departments load their data into personal spaces in the corporate environment, allowing pragmatic analytic flexibility without compromising security and governance.
The number of departmental “data discovery” initiatives continued to rise through 2011, and new tools do make it easier for business people to upload and manipulate their own information while staying within corporate standards. 2012 will see more development of “enterprise data discovery” interfaces for casual users.
(3) The next generation of business applications. Where are the business applications designed to support what people really do all day, such as implementing this year’s strategy, launching new products, or acquiring another company? 2011 will see the first prototypes of people-focused, flexible, information-centric, and collaborative applications, bringing together the best of business intelligence, “enterprise 2.0”, and existing operational applications.
2011 saw the rise of sophisticated, user-centric mobile applications that combine data from corporate systems with GPS mapping and the ability to “take action”, such as mobile medical analytics for doctors or mobile beauty advisor applications, and collaborative BI started becoming a standard part of enterprise platforms.
And one that should happen, but probably won’t: (4) Intelligence = Information + PEOPLE. Successful analytics isn’t about technology — it’s about people, process, and culture. The biggest trend in 2011 should be organizations spending the majority of their efforts on user adoption rather than technical implementation.
Unsurprisingly, there was still high demand for presentations on why BI projects fail and how to implement BI competency centers. The new architectures probably resulted in even more emphasis on technology than ever, while business people’s expectations skyrocketed, fueled by advances in the consumer world. The result was probably even more dissatisfaction than in the past, but the benefits of the new architectures should start becoming clearer during 2012.
What surprised me the most:
The rapid rise of Hadoop / NoSQL. The potential of the technology has always been impressive, but I was surprised by just how quickly it has been used to address real-life business problems (beyond the “big web” vendors where it originated), and how quickly it is becoming part of mainstream enterprise analytic architectures (e.g. Sybase IQ 15.4 includes native MapReduce APIs, Hadoop integration and federation, etc.).
Prediction for 2012:
As I sat down to gather my thoughts about BI in 2012, I quickly came up with the same long laundry list of BI topics as everybody else: in-memory, mobile, predictive, social, collaborative decision-making, data discovery, real-time, etc. All of these things are clearly important, and we’re going to continue to see great improvements in them this year. But I think the real “next big thing” in BI is what I’m seeing when I talk to customers: they’re using these new opportunities not only to “improve analytics” but also to fundamentally rethink some of their key business processes.
Instead of analytics being something that is used to monitor and eventually improve a business process, analytics is becoming a more fundamental part of the business process itself. One example is a large telco that has transformed the way it attracts customers. Instead of laboriously creating a range of rate plans, promoting them, and analyzing the results, it now uses analytics to automatically create hundreds of more complex, personalized rate plans. It then throws them out into the market, monitors them in real time, and quickly culls any that aren’t successful. It’s a way of doing business that would have been inconceivable in the past, and one that will be a lot more common in the future.
Timo Elliott is a 20-year veteran of SAP BusinessObjects, and has spent the last quarter-century working with customers around the world on information strategy.
He works closely with SAP research and innovation centers around the world to evangelize new technology prototypes.
His popular Business Analytics blog tracks innovation in analytics and social media, including topics such as augmented corporate reality, collaborative decision-making, and social network analysis.
His PowerPoint Twitter Tools lets presenters see and react to tweets in real time, embedded directly within their slides.
A popular and engaging speaker, Elliott presents regularly to IT and business audiences at international conferences, on subjects such as why BI projects fail and what to do about it, and the intersection of BI and enterprise 2.0.
Prior to Business Objects, Elliott was a computer consultant in Hong Kong and led analytics projects for Shell in New Zealand. He holds a first-class honors degree in Economics with Statistics from Bristol University, England.
Timo can be contacted via Twitter at https://twitter.com/timoelliott
Part 1 of this series was from James Kobielus, Forrester, at http://www.decisionstats.com/jim-kobielus-on-2012/
The slick website of the Oracle Public Cloud, coming soon to an office near you. It also includes the Oracle Social Network:
Oracle Social Network
A secure collaboration tool for everyone you work with.
While Tor remains the tool of choice for pseudo-techie hacker wannabes, there is enough juice and smoke and mirrors on the market to confuse your average Joe.
For a secure browsing experience on mobile, do NOT use either Apple’s or Microsoft’s OS.
Use Android, and in particular an app called Orbot.
Orbot is easy to install by simply scanning the following QR code with your Android Barcode scanner.
Orbot is available in the Android Market.
If you have a Dell PC, just use PeerNet to configure and set up your own network around the neighbourhood. This is particularly applicable if you are in a country that is both repressive and not so technologically advanced. Won’t work in China or the USA.
What is a peer network?
A peer network is a network in which one computer can connect directly to another computer. This capability is accomplished by enabling access point (AP) functionality on one of the computers. Other computers can then connect to this computer in the same way that they would connect to a physical AP. If Internet Connection Sharing is enabled on the computer that has the AP functionality, computers that connect to that computer have Internet connectivity as well.
A basic peer network, which requires no networking knowledge or experience to set up, should meet the needs of most home users and small businesses. By default, a basic peer network is configured with the strongest available security (see How do I set up a basic peer network?).
For users who are familiar with wireless networking technology, advanced configuration features are available to do the following:
• Change security settings (see How do I configure my peer network?)
• Choose the method (push button or PIN) by which computers with Wi-Fi Protected Setup™ capability can join your peer network (see How do I allow peer devices to join my peer network using Wi-Fi Protected Setup technology?)
• Change the DHCP server IP address (see How do I configure my peer network?)
• Change the channel on which to operate your peer network (see How do I configure my peer network?)
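The relay idea described above (peers connect to one node, which forwards their traffic, the same way Internet Connection Sharing passes connectivity along) can be sketched as a toy TCP hub in Python. This is a hypothetical illustration, not Dell's PeerNet implementation; the `PeerHub` class and its behavior are invented for the example.

```python
import socket
import threading

class PeerHub:
    """Toy 'access point': peers connect to this one node, which
    relays every message it receives to all the other peers."""

    def __init__(self, host="127.0.0.1", port=0):
        self.server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self.server.bind((host, port))   # port 0: let the OS pick one
        self.server.listen()
        self.port = self.server.getsockname()[1]
        self.peers = []
        self.lock = threading.Lock()

    def start(self):
        threading.Thread(target=self._accept_loop, daemon=True).start()

    def _accept_loop(self):
        while True:
            conn, _ = self.server.accept()
            with self.lock:
                self.peers.append(conn)
            threading.Thread(target=self._relay_loop, args=(conn,),
                             daemon=True).start()

    def _relay_loop(self, conn):
        while True:
            data = conn.recv(4096)
            if not data:
                break
            with self.lock:
                others = [p for p in self.peers if p is not conn]
            for p in others:   # forward to every other connected peer
                p.sendall(data)
```

Two clients that connect to the hub can then exchange data without any direct connection to each other, which is the essence of the peer-network setup the FAQ describes.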
Create a separate Linux (Ubuntu, for ease) virtual disk, then download the Tor Browser Bundle from https://www.torproject.org/projects/torbrowser.html.en for surfing, and use a peer network (above) or a disposable, one-time-use prepaid mobile wireless card. It is also quite easy to delete your virtual disk in an emergency (but it is best to use encryption even when in Ubuntu: https://help.ubuntu.com/community/EncryptedHome).
IRC chat is less secure than you think it is, thanks to bot trawlers, so I am hoping someone in the open source community updates WASTE again for encrypted chats: http://wasteagain.sourceforge.net/
What is “WASTE again”?
“WASTE again” enables you to create a decentralized and secure private mesh network over an insecure network, such as the internet. Once the public encryption keys are exchanged, sending messages, creating group chats and transferring files is easy and secure.
Creating a mesh
To create a mesh you need at least two computers with “WASTE again” installed. During installation, a unique pair of public and private keys is generated for each computer. Before the first connection can be established, you need to exchange these public keys. These keys enable “WASTE again” to authenticate every connection to other “WASTE again” clients.
After exchanging the keys, you simply type in the IP address of the computer to connect to. If that computer is located behind a firewall or a NAT router, you have to create a portmap first to enable incoming connections.
At least one computer in your mesh has to be able to accept incoming connections, making it a “public node”. If no direct connection between two firewalled computers can be made, “WASTE again” automatically routes your traffic through one or more of the available public nodes.
Every new node simply has to exchange keys with one of the connected nodes and then connect to it. All the other nodes will exchange their keys automatically over the mesh.
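The exchange-keys-first handshake described above can be illustrated with a toy in Python. A caveat: WASTE uses public/private key pairs, while Python's standard library has no public-key crypto, so this sketch substitutes a symmetric key exchanged out-of-band and authenticates messages with HMAC; the `Node` class and its methods are invented for the example.

```python
import hmac
import hashlib
import secrets

class Node:
    """Toy node: trusts only peers whose keys were exchanged first."""

    def __init__(self, name):
        self.name = name
        self.key = secrets.token_bytes(32)   # stand-in for a key pair
        self.trusted = {}                    # peer name -> peer key

    def exchange_keys(self, other):
        # The manual step WASTE requires before the first connection:
        # both sides learn each other's key.
        self.trusted[other.name] = other.key
        other.trusted[self.name] = self.key

    def send(self, text):
        # Tag the message so receivers can authenticate the sender.
        tag = hmac.new(self.key, text.encode(), hashlib.sha256).digest()
        return self.name, text, tag

    def receive(self, sender, text, tag):
        key = self.trusted.get(sender)
        if key is None:
            return False                     # unknown node: reject
        expected = hmac.new(key, text.encode(), hashlib.sha256).digest()
        return hmac.compare_digest(tag, expected)
```

A node that never went through the key exchange cannot authenticate anyone's messages, which mirrors why every new WASTE node must exchange keys with at least one connected node before joining the mesh.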
I use Windows 7 on my laptop (it came pre-installed) and Ubuntu using the VMware Player. What are the advantages of using VM Player instead of creating a dual-boot system? Well, I can quickly shift from Ubuntu to Windows and back again without restarting my computer every time. This approach lets me use software that runs only on Windows alongside software like Rattle, the R data mining GUI, which is much easier to install on Linux.
However, if your statistical software is on your virtual disk and your data is on your Windows disk, you need a way to move data from Windows to Ubuntu.
The solution, as per the VMware community forums, is here: http://communities.vmware.com/thread/55242
Open My Computer and browse to the folder you want to share. Right-click on the folder and select Properties, then the Sharing tab. Select the “Share this Folder” radio button. Change the default generated name if you wish, and add a description if you wish. Click the Permissions button to modify the security settings of which users can read/write to the share.
On the Linux side, it depends on the distro, the shell, and the window manager.
Ubuntu makes it really easy to configure the Linux steps to move data between the Windows and Linux partitions.
VMware makes it easy to share between your Windows (host) and Linux (guest) OS.
Step 2: start the sharing wizard.
When you finish the wizard and share a drive or folder, where do you see the shared items?
In this folder in Linux: /mnt/hgfs (bingo!)
Hacker HW: make the folder /mnt/hgfs a shortcut in Places in your Ubuntu startup.
Hacker HW 2:
Upload your VM dark data to Ubuntu One using an anonymous email.
Purge it using software XX.
Reinstall the VM and bring back the backup.
Note the time it takes to do this.
General sharing in Windows:
Just open the Network tab in Ubuntu (see screenshots below).
Windows will now ask your Ubuntu user to log in.
Once logged in to Windows from within the Ubuntu VM, this is what happens:
You see an item called “users on <windows username>-pc” appear on your Ubuntu desktop (see top right of the screenshot).
If you double-click it, you see your Windows path.
You can now just click and drag data between your Windows and Linux partitions, just the way you do it in Windows.
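If you would rather script the transfer than drag files around, a small Python helper can copy files out of the shared folder. The share name "windata" and the destination folder below are made-up examples for illustration; /mnt/hgfs is the mount point mentioned above.

```python
import shutil
from pathlib import Path

# The VMware shared folder appears under /mnt/hgfs on the Ubuntu guest.
# "windata" is a hypothetical share name; use whatever name you gave
# the share in the wizard.

def pull_from_host(share="windata", pattern="*.csv",
                   hgfs=Path("/mnt/hgfs"), dest=Path.home() / "data"):
    """Copy matching files from the Windows share into the guest."""
    dest.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in (hgfs / share).glob(pattern):
        shutil.copy2(f, dest / f.name)   # preserves timestamps
        copied.append(f.name)
    return copied
```

This keeps your R working directory on the Linux side populated with the latest CSVs from Windows before you start a Rattle session.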
So based on this, if you want to build decision trees, artificial neural networks, regression models, and even time series models for zero capital expenditure, you can use Ubuntu/R without compromising an IT policy of Windows-only machines (there is a shortage of Ubuntu-trained IT administrators in the enterprise world).
Revised Installation Procedure for utilizing both Ubuntu /R/Rattle data mining on your Windows PC.
Using VMWare to build a free data mining system in R, as well as isolate your analytics system (thus using both Linux and Windows without overburdening your machine)
- Download and install VMware Player: http://downloads.vmware.com/d/info/desktop_end_user_computing/vmware_player/4_0
- Download Ubuntu (download only): http://www.ubuntu.com/download/ubuntu/download
- Create New Virtual Image in VM Ware Player
- Applications > Terminal > sudo apt-get install r-base (to download and install R)
- sudo R (to open R)
- Once R is open, type install.packages("rattle") to install Rattle
- library(rattle) will load Rattle
- rattle() will open the GUI
Getting data from host to guest VM (next time):
- Go to VM Player
- Open the VM
- sudo R in terminal to bring up R
- library(rattle) within R
- rattle()
At this point, even if you don't know any Linux and don't know any R, you can create data mining models using the Rattle GUI (and time series models using the E pack in the R Commander GUI). What can Rattle do in data mining? See this slideshow: http://www.decisionstats.com/data-mining-with-r-gui-rattle-rstats/ If Google Docs is banned per your enterprise IT policy of having Windows Explorer only, you can see these screenshots instead: http://rattle.togaware.com/rattle-screenshots.html
Social gaming is slightly different from arcade gaming and the heavy-duty PS3, Xbox, Wii world of gaming. Some observations from my research on social gaming across the internet follow.
There are broadly three types of social games:
1) Quest: build a town/area/farm to earn in-game money or points
2) Fight: fight other people/players/pigs to earn in-game money or points
3) Puzzle: stack up, make three of a kind, etc.
Most successful social games are a crossover between the above three kinds (so build and fight, or fight and puzzle, etc.).
In addition, most social games have in-game incentives that are peculiar to social networks: mostly in-game cash to build, energy to fight others, or shortcuts in puzzle games. These social gaming incentives are:
1) Some incentive to log in daily/regularly/visit the game site more often
2) Some incentive to invite other players on the social network
A characteristic of this domain is blatant me-too copying, ripping creative ideas (but not the creative itself) from other social games. In general, the successful game that is the early leader gets most of the players, but other game studios can and do build up a substantial long-tail network of players by copying games. Thus there is a huge variety of games.
However, massive hits like FarmVille and Angry Birds prove that a single well-executed social game can be very valuable and profitable both to itself and to the primary social network hosting it.
Accordingly, the leading game studios are Zynga, Electronic Arts and (yes) Microsoft, while Google has mostly been an investor in these.
As you can see below, AppData is a formidable data gatherer here (though I find the top app, Static HTML, both puzzling and a sign of uncorrected automated data gathering), but I expect more competition in this very lucrative segment.