Microsoft releases library to help mitigate cross-site scripting

Many Web applications today exhibit security vulnerabilities because they lack proper input validation and output encoding. Though numerous development platforms exist, none has a foolproof way to provide complete protection from attacks such as parameter manipulation or cross-site scripting (XSS). Even modern, robust frameworks such as Microsoft .NET are no exception.

However, Web applications written with .NET, in a language such as C#, can take advantage of several built-in approaches to addressing input and output vulnerabilities. The validateRequest attribute, for example, forces a .NET application to check incoming requests for script-based attack patterns.
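
Request validation can be enabled per page or site-wide. As a minimal sketch, assuming an ASP.NET Web Forms application:

    <%@ Page Language="C#" ValidateRequest="true" %>

    <!-- Or, for every page in the application, in web.config: -->
    <configuration>
      <system.web>
        <pages validateRequest="true" />
      </system.web>
    </configuration>

Request validation is on by default in ASP.NET 1.1 and later, so these settings matter mostly as documentation of intent, or when re-enabling validation that was previously turned off.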

The validateRequest functionality checks for the presence of patterns containing an angle bracket followed by an alpha character. Under many circumstances, this will prevent an XSS attack. However, when values are written dynamically into existing HTML (inside an attribute value, for instance), angle brackets are not needed and an exploit remains possible. Developers may also choose to disable validateRequest entirely, in which case there is no default protection against XSS attacks.
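
To illustrate the dynamic-write gap, consider this hypothetical C# page fragment (not from any real application):

    // Vulnerable sketch: the "name" parameter is echoed into an existing
    // attribute, so no angle brackets are needed to exploit it.
    string name = Request.QueryString["name"];
    Response.Write("<input type=\"text\" value=\"" + name + "\">");

    // A request such as:
    //   page.aspx?name=" onmouseover="alert(document.cookie)
    // closes the value attribute and adds an event handler. Script executes
    // without a single angle bracket, so the validateRequest pattern never fires.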

To aid in mitigating these threats, Microsoft recently released a programming library designed to prevent XSS vulnerabilities. The Microsoft Anti-Cross Site Scripting Library transforms certain special characters into their HTML entity equivalents, or into URL-encoded equivalents for values that must be passed in a URL. For example, the less-than sign (<), when run through the HtmlEncode() method, is rendered harmlessly by the browser as the character reference &#60; (decimal 60 is the character code for the less-than sign) rather than being interpreted as the start of a tag.
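
A minimal usage sketch, assuming the AntiXss class and namespace documented for the library:

    using Microsoft.Security.Application;

    // Encode user-supplied data before writing it into the HTML body.
    string comment = Request.QueryString["comment"];
    Response.Write(AntiXss.HtmlEncode(comment));

    // Input:  <script>alert('xss')</script>
    // Output (illustrative): the markup characters come back as character
    // references such as &#60;, so the browser renders text instead of a tag.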

Even with the library in place, some scenarios will still permit XSS attacks if the wrong encoding is applied. Developers should use the UrlEncode() method for information that will be sent via a URL, such as links. It is therefore critical to treat the library as one more layer of data validation and output encoding, not as your only defense.
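
A corresponding sketch for URL contexts, again assuming the documented method names (the link target and parameter are hypothetical):

    using Microsoft.Security.Application;

    // Encode a user-supplied search term before building a link from it.
    string term = Request.QueryString["q"];
    Response.Write("<a href=\"results.aspx?q=" + AntiXss.UrlEncode(term) +
                   "\">Search again</a>");
    // UrlEncode neutralizes characters such as quotes before they reach the
    // query string, so an attacker cannot break out of the href attribute.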

.NET programmers who wish to use the library in their applications as part of a defense-in-depth approach can obtain it for free from the Microsoft website.

Web application attacks on the rise

According to statistics gathered by the Web Application Security Consortium and reported by Information Week, attacks against Web applications are on the rise. In fact, if the trend continues through the end of the year, 2006 will be the worst year on record for Web application security breaches. The article attributes this to two reasons:

1. The prevalence and availability of tools that make it easier to find and exploit vulnerabilities in Web applications.
2. The tendency for Web applications not to be designed with security in mind.

The trend has more causes than those covered in the article, such as the emergence of worms and other automated attacks that target vulnerabilities in Web applications. Furthermore, knowledge of Web application attack techniques is becoming commonplace, reducing the average attacker's reliance on tools; many attackers now need only a browser to wreak havoc in a poorly designed Web application.

The latter point, however, is the crux of the problem. Web applications that weren't designed with security in mind are far more likely to have problems later on. Even if those problems are discovered before they hit the news, retrofitting an application with security controls is costly and difficult. On the other hand, when security is incorporated into the software development lifecycle from the beginning, the application has fewer vulnerabilities and is much less likely to end up in the news because of an intrusion.

Google spider deletes application content

A recent item in the news (http://www.thedailywtf.com/forums/65974/ShowPost.aspx) reminds us of two important Web application security tips:
1. Don’t fail into an insecure mode by default.
2. Be careful running automated spidering software on your applications.

This story took place during development of a Web content management application. One morning the development team came in to find that all of the site’s content had been erased. An investigation traced the deletions to an IP address associated with one of Google’s Web spidering servers. Logs revealed that the spidering software was indexing the site when it came upon a link for content editing. Like a good spider, it followed the link.

Application access controls should have required authentication at this point, stopping the spider from anonymously changing anything on the site. Instead, this particular application assigned a cookie parameter named “isLoggedOn” with a default value of “false”. Once a user authenticated, the app changed this value to “true”. Unfortunately, the application denied access only if the value was set to “false”; any other value, or the absence of the cookie altogether, would permit the requested operation, as the sketch below illustrates.
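
A reconstruction of the flawed check in C# (the cookie name comes from the story; the surrounding code is our guess):

    // Fails open: access is denied only when the cookie is literally "false".
    HttpCookie cookie = Request.Cookies["isLoggedOn"];
    if (cookie != null && cookie.Value == "false")
    {
        Response.Redirect("login.aspx");
        return;
    }
    // A missing cookie -- exactly what a spider presents -- falls through
    // to the page editing and deletion logic below.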

As you may have guessed, the Google spider was able to enter page editing mode because it never accepted the original cookie from the application, and it thus passed the badly written authorization test. Once in edit mode, it dutifully continued following all links, including the “Delete Page” option. Any curious hacker could have done the same thing.

Obviously the problem could have been prevented by better programming. Applications should authorize only requests that are accompanied by legitimate authentication credentials (like a unique session ID), never requests distinguished merely by the absence of a value.
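
A fail-closed version of the same check might look like this (a sketch; the session key is illustrative and would be set only by the login code):

    // Fail closed: the default is "not authorized" unless a server-side
    // session, established at login, proves otherwise.
    if (Session["AuthenticatedUserId"] == null)
    {
        Response.Redirect("login.aspx");
        return;
    }
    // Only requests carrying a valid, authenticated session reach this point.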

The story is also a good reminder that spidering a Web application can have unintended consequences. Automated vulnerability scanning software that spiders content during testing can cause the same kind of damage. This is why Security PS assessments include time for us to manually walk through applications and identify dangerous links, such as one that logs the user out or deletes a user account. These links can then usually be placed in an exception list so that subsequent spidering avoids them.

Which is nice, because it allows you to spend time doing something more exciting than running to grab the latest backup tape for your server.