Tag Archives: website
If you, like me, hate to get down and dirty in HTML, CSS, and jQuery (notwithstanding the excellent Codecademy HTML/CSS tutorials and jQuery track) and want to create a fairly simple website for yourself, Jetstrap helps you build sites using the popular, very minimalistic Twitter Bootstrap design.
And it’s free! Just point, click, and paste in your content, and you get clean HTML and CSS. It even lets you download the HTML to drop into your existing site!
Here is one I created in 5 minutes!
So lose your old website! Because not every website needs WordPress!
Try Jetstrap for Bootstrap!
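For a sense of what you end up with, here is a minimal Bootstrap page of the kind a tool like Jetstrap produces. This is a hand-written sketch, not Jetstrap's actual output, and the CDN URL and Bootstrap version are assumptions from the Bootstrap 2 era:

```html
<!DOCTYPE html>
<html>
<head>
  <title>My Simple Site</title>
  <!-- Bootstrap 2.x stylesheet from a public CDN (URL/version assumed) -->
  <link rel="stylesheet"
        href="https://netdna.bootstrapcdn.com/twitter-bootstrap/2.3.2/css/bootstrap-combined.min.css">
</head>
<body>
  <div class="container">
    <!-- hero-unit is Bootstrap 2's big landing banner -->
    <div class="hero-unit">
      <h1>My Simple Site</h1>
      <p>Built with Bootstrap classes, no hand-written CSS required.</p>
      <a class="btn btn-primary btn-large" href="#">Learn more</a>
    </div>
  </div>
</body>
</html>
```

All the styling comes from the `container`, `hero-unit`, and `btn` classes; you never touch a stylesheet yourself.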
UPDATED POST: Some models I use for business strategy, to analyze the huge reams of qualitative and uncertain data that businesses generate.
- Porter’s Five Forces Model: to analyze industries
- BCG Matrix: to analyze product portfolios
- Porter’s Diamond Model: to analyze locations
- McKinsey 7S Model: to analyze teams
- Greiner Theory: to analyze the growth of organizations
- Herzberg’s Hygiene Theory: to analyze the soft aspects of individuals
- Marketing Mix Model: to analyze the marketing mix
I really liked this 2002 presentation on website hacks at blackhat.com/presentations/bh-asia-02/bh-asia-02-shah.pdf. It explains some common fundamentals of hacking websites in an easy manner. Take time to go through it; it’s a good example of how hacking tutorials need to be written if you want to expand the pool of motivated hackers.
However, a more recent list of hacks is here:
The Top Ten
- CRIME (1, 2, 3, 4) by Juliano Rizzo and Thai Duong
- Pwning via SSRF (memcached, php-fastcgi, etc) (2, 3, 4, 5)
- Chrome addon hacking (2, 3, 4, 5)
- Bruteforce of PHPSESSID
- Cross-Site Port Attacks
- Permanent backdooring of HTML5 client-side application
- CAPTCHA Re-Riding Attack
- XSS: Gaining access to HttpOnly Cookie in 2012
- Attacking OData: HTTP Verb Tunneling, Navigation Properties for Additional Data Access, System Query Options ($select)
But a more widely used ranking method for website hacking is here. Note that it is a more formal, but probably more recent, document than the PDF above. If only it were turned into an easier-to-read tutorial, it would do far more to strengthen website security.
The Release Candidate for the OWASP Top 10 for 2013 is now available here: OWASP Top 10 – 2013 – Release Candidate
The OWASP Top 10 – 2013 Release Candidate includes the following changes as compared to the 2010 edition:
- A1 Injection
- A2 Broken Authentication and Session Management (was formerly A3)
- A3 Cross-Site Scripting (XSS) (was formerly A2)
- A4 Insecure Direct Object References
- A5 Security Misconfiguration (was formerly A6)
- A6 Sensitive Data Exposure (merged from former A7 Insecure Cryptographic Storage and former A9 Insufficient Transport Layer Protection)
- A7 Missing Function Level Access Control (renamed/broadened from former A8 Failure to Restrict URL Access)
- A8 Cross-Site Request Forgery (CSRF) (was formerly A5)
- A9 Using Known Vulnerable Components (new but was part of former A6 – Security Misconfiguration)
- A10 Unvalidated Redirects and Forwards
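To make A1 (Injection) concrete, here is a minimal illustration of the difference between an injectable query and a parameterized one. This is my own sketch using Python's built-in sqlite3 module, not an example from the OWASP document:

```python
import sqlite3

# A toy database with one user and one secret.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: user input concatenated straight into the SQL string.
# The injected OR '1'='1' makes the WHERE clause always true.
leaked = conn.execute(
    "SELECT secret FROM users WHERE name = '" + attacker_input + "'"
).fetchall()
print(leaked)  # [('s3cret',)] -- every row leaks

# Safe: a parameterized query treats the input as a literal value.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(safe)  # [] -- no user is literally named "nobody' OR '1'='1"
```

The fix is one character of API discipline: bind values with `?` placeholders instead of string concatenation.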
Once again, I am presenting this as an example of how lucid documentation can spread technological awareness to people held back by technical ignorance and lacking the savvy and chops for self-learning. If you need better cyber security, you need better documentation and tutorials on hacking: they improve the quantity and quality of the pool of available hackers and bring in the young blood that sharpens your cyber security edge.
The lovely diagram at https://developer.linkedin.com/documents/oauth-overview is worth a thousand words (and a thousand errors).
Very useful if you are trying to coax RCurl into doing the job for you.
There is also a great SlideShare in Japanese (no, Google Translate doesn’t work on PDFs, SlideShares, and Scribd documents - why?!) that is still very lucid on using OAuth with R for Twitter.
Why use OAuth? You get 350 API calls per hour for authenticated sessions instead of 150.
I tried, but failed, to use registerTwitterOAuth.
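When a helper like registerTwitterOAuth fails, it helps to understand what the OAuth 1.0a handshake actually signs. Here is a minimal Python sketch (standard library only) of how an HMAC-SHA1 request signature is built per the OAuth 1.0a spec; the key and parameter values are dummies, not real credentials:

```python
import base64
import hashlib
import hmac
import urllib.parse


def oauth1_signature(method, url, params, consumer_secret, token_secret=""):
    """Build an OAuth 1.0a HMAC-SHA1 signature for one request."""
    # OAuth requires percent-encoding with "~" left unescaped.
    enc = lambda s: urllib.parse.quote(str(s), safe="~")
    # 1. Sort parameters by name and join them as key=value pairs.
    pairs = "&".join(f"{enc(k)}={enc(v)}" for k, v in sorted(params.items()))
    # 2. Signature base string: METHOD & encoded URL & encoded parameter string.
    base = "&".join([method.upper(), enc(url), enc(pairs)])
    # 3. Signing key: consumer secret & token secret (token secret may be empty).
    key = f"{enc(consumer_secret)}&{enc(token_secret)}"
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()


# Dummy example values only (not real keys or tokens):
sig = oauth1_signature(
    "GET",
    "https://api.twitter.com/1.1/statuses/home_timeline.json",
    {
        "oauth_consumer_key": "dummy_consumer_key",
        "oauth_nonce": "dummy_nonce",
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_timestamp": "1318622958",
        "oauth_token": "dummy_token",
        "oauth_version": "1.0",
    },
    "dummy_consumer_secret",
    "dummy_token_secret",
)
print(sig)  # a base64-encoded HMAC-SHA1 signature
```

The resulting signature goes into the `oauth_signature` parameter of the Authorization header; libraries like RCurl or twitteR are doing exactly this under the hood.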
There is a real need for a single page where you can go and see which social network/website uses which kind of OAuth, which URL within that website holds your API keys, and the accompanying R code for each. Google Plus, LinkedIn, Twitter, and Facebook can all be scraped better with OAuth. Something like this-
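Until such a page exists, here is a rough sketch of what that mapping might look like as a data structure. The endpoint URLs below are from memory circa 2013 and are illustrative assumptions; verify each against the network's developer documentation before use:

```python
# Hypothetical lookup table: social network -> OAuth flavour and key URLs.
# URLs are illustrative and may have changed; check each developer portal.
OAUTH_DIRECTORY = {
    "Twitter": {
        "oauth_version": "1.0a",
        "api_keys_url": "https://dev.twitter.com/apps",
    },
    "LinkedIn": {
        "oauth_version": "1.0a",
        "api_keys_url": "https://www.linkedin.com/secure/developer",
    },
    "Facebook": {
        "oauth_version": "2.0",
        "api_keys_url": "https://developers.facebook.com/apps",
    },
    "Google Plus": {
        "oauth_version": "2.0",
        "api_keys_url": "https://code.google.com/apis/console",
    },
}

for network, info in sorted(OAUTH_DIRECTORY.items()):
    print(f"{network}: OAuth {info['oauth_version']} (keys at {info['api_keys_url']})")
```

Each entry would ideally also link the matching R snippet (RCurl or twitteR) for that network's flow.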