Top 10 Web Security Threats

The Open Web Application Security Project (OWASP) has released its list of top web security threats for 2010. These are the threats that caused the most damage to the cyber world in 2010.
Around the world, governments and the telecommunications, defense, and financial industries are increasingly targeted by cyber attacks from criminals.

10. Unvalidated Redirects and Forwards

Web applications frequently redirect and forward users to other pages and websites, and use untrusted data to determine the destination pages. Without proper validation, attackers can redirect victims to phishing or malware sites, or use forwards to access unauthorized pages.
Consider anyone who can trick your users into submitting a request to your website; any site or HTML feed your users visit could do this. An attacker links to an unvalidated redirect and tricks victims into clicking it, and victims are likely to click because the link points to a legitimate site. An attacker can also target an unsafe forward to bypass security checks. Because the destination page is sometimes specified in an unvalidated parameter, the attacker can choose where the user ends up: such redirects may attempt to install malware or trick victims into disclosing passwords or other sensitive information, and unsafe forwards may allow access control bypass. Consider the business value of retaining your users' trust.
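The usual defense is to validate every redirect target against an allow-list before sending the user there. Below is a minimal sketch; the `ALLOWED_HOSTS` set and function names are illustrative, not part of any particular framework.

```python
# Illustrative allow-list check for redirect targets (names are hypothetical).
from urllib.parse import urlparse

ALLOWED_HOSTS = {"example.com", "www.example.com"}

def safe_redirect_target(url, default="/"):
    """Return url only if it is a relative path or points at an allowed host;
    otherwise fall back to a safe default."""
    parsed = urlparse(url)
    # A relative path (no scheme, no host) stays on our own site.
    if not parsed.scheme and not parsed.netloc:
        return url
    # Absolute URLs must use http(s) and an allowed host. This also rejects
    # protocol-relative URLs like //evil.example.net/, which have no scheme.
    if parsed.scheme in ("http", "https") and parsed.netloc in ALLOWED_HOSTS:
        return url
    return default

print(safe_redirect_target("/account"))                        # kept as-is
print(safe_redirect_target("https://evil.example.net/phish"))  # replaced by "/"
```

A forward should be validated the same way, typically against a fixed map of permitted internal page names rather than a raw path taken from the request.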


9. Insufficient Transport Layer Protection

Insufficient transport layer protection allows communication to be exposed to untrusted third parties, providing an attack vector to compromise a web application and/or steal sensitive information. Websites typically use Secure Sockets Layer / Transport Layer Security (SSL/TLS) to provide encryption at the transport layer. However, unless the website is configured to use SSL/TLS properly, it may be vulnerable to traffic interception and modification.

When the transport layer is not encrypted, all communication between the website and client is sent in clear-text which leaves it open to interception, injection and redirection (also known as a man-in-the-middle/MITM attack).

Historically, high-grade cryptography was restricted from export outside the United States. Because of this, websites were configured to support weak cryptographic options for those clients that were restricted to using weak ciphers. Weak ciphers are vulnerable to attack because of the relative ease of breaking them; less than two weeks on a typical home computer and a few seconds using dedicated hardware.
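On the client side, modern TLS libraries let you refuse those legacy options outright. As a rough sketch, Python's standard `ssl` module can build a context that verifies certificates and rejects protocol versions older than TLS 1.2:

```python
# Minimal sketch: a client-side TLS context that rejects legacy protocols.
import ssl

ctx = ssl.create_default_context()  # certificate and hostname checks on by default
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse SSLv3 / TLS 1.0 / TLS 1.1

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # certificates must validate
print(ctx.check_hostname)                    # hostname must match the cert
```

On the server side the equivalent hardening is done in the web server or load balancer configuration (disabling old protocol versions and weak cipher suites, and sending an HSTS header so browsers insist on HTTPS).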


8. Failure to Restrict URL Access

The only protection for a URL is that links to that page are not presented to unauthorized users. However, a motivated, skilled, or just plain lucky attacker may be able to find and access these pages, invoke functions, and view data. Security by obscurity is not sufficient to protect sensitive functions and data in an application. Access control checks must be performed before a request to a sensitive function is granted, which ensures that the user is authorized to access that function.
The primary attack method for this vulnerability is called “forced browsing”, which encompasses guessing links and brute force techniques to find unprotected pages. Applications frequently allow access control code to evolve and spread throughout a codebase, resulting in a complex model that is difficult to understand for developers and security specialists alike. This complexity makes it likely that errors will occur and pages will be missed, leaving them exposed.
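The fix is to run an explicit authorization check every time a sensitive function is invoked, instead of assuming the page is safe because it is unlinked. A minimal sketch, using a hypothetical role-checking decorator and a dictionary standing in for a real user object:

```python
# Illustrative access check performed before a sensitive function runs,
# rather than relying on the URL being hidden (security by obscurity).
from functools import wraps

class AccessDenied(Exception):
    pass

def require_role(role):
    """Decorator: refuse to run the wrapped function unless the user has the role."""
    def decorator(func):
        @wraps(func)
        def wrapper(user, *args, **kwargs):
            if role not in user.get("roles", ()):
                raise AccessDenied(role + " role required")
            return func(user, *args, **kwargs)
        return wrapper
    return decorator

@require_role("admin")
def view_admin_report(user):
    return "report data"

admin = {"name": "alice", "roles": ["admin"]}
guest = {"name": "bob", "roles": []}

print(view_admin_report(admin))  # authorized: returns the report
try:
    view_admin_report(guest)     # forced browsing attempt: check still fires
except AccessDenied as exc:
    print("denied:", exc)
```

Because the check lives on the function itself, it applies no matter how the attacker reached the URL, which also keeps the access control logic in one place instead of scattered across the codebase.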


7. Insecure Cryptographic Storage

Protecting sensitive data with cryptography has become a key part of most web applications. Simply failing to encrypt sensitive data is very widespread. Applications that do encrypt frequently contain poorly designed cryptography, either using inappropriate ciphers or making serious mistakes using strong ciphers. These flaws can lead to disclosure of sensitive data and compliance violations.
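A classic example of "serious mistakes using strong ciphers" is storing passwords as a plain hash. The standard remedy is a salted, slow key-derivation function; a minimal sketch using Python's built-in PBKDF2 (function names here are illustrative):

```python
# Sketch: salted PBKDF2-HMAC-SHA256 password storage instead of a raw hash.
import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=200_000):
    """Derive a key from the password; store salt, iterations, and digest."""
    salt = salt or os.urandom(16)  # unique random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, digest

def verify_password(password, salt, iterations, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, iters, stored = hash_password("s3cret")
print(verify_password("s3cret", salt, iters, stored))  # True
print(verify_password("wrong", salt, iters, stored))   # False
```

The salt defeats precomputed rainbow tables, and the high iteration count makes brute-forcing each individual password expensive.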
A lot of sensitive information (passwords, account numbers, salaries, and so on) is stored in web applications, so the data is encrypted to protect it. Encrypting sensitive data is certainly something we have to keep in mind when developing applications, but programmers sometimes overlook other issues, assuming that encryption alone is enough to ensure security.

We have seen several vulnerabilities that involve modifying parameters sent between pages. For this reason, a good way to protect web applications against these kinds of attacks is to encrypt the information sent via POST or GET.
Basically, the process consists of encrypting the parameters just before the page is submitted.
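A closely related (and often simpler) variant of this idea is to sign the parameters with an HMAC so that any tampering is detected, layering encryption on top only when the values must also stay secret. A sketch using only the Python standard library; the secret and parameter names are hypothetical:

```python
# Illustrative sketch: HMAC-signing query parameters so tampering is detectable.
import hashlib
import hmac
from urllib.parse import urlencode

SECRET = b"server-side-secret"  # assumed to exist only on the server

def sign_params(params):
    """Serialize params in a canonical order and append an HMAC signature."""
    query = urlencode(sorted(params.items()))
    sig = hmac.new(SECRET, query.encode(), hashlib.sha256).hexdigest()
    return query + "&sig=" + sig

def check_params(signed):
    """Recompute the HMAC over the query part and compare in constant time."""
    query, _, sig = signed.rpartition("&sig=")
    expected = hmac.new(SECRET, query.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

q = sign_params({"user": "42", "role": "member"})
print(check_params(q))                             # True: untouched
print(check_params(q.replace("member", "admin")))  # False: parameter tampered
```

Either way, the server must never trust the client-supplied values without verifying them, since anything sent to the browser can be altered before it comes back.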

