[RndTbl] slowing httpd access to cgi-bin scripts

Tim Lavoie tim at fractaldragon.net
Fri Sep 17 16:40:29 CDT 2010


On Fri, Sep 17, 2010 at 04:08:00PM -0500, Gilles Detillieux wrote:
> Every once in a while, some doofus points a web crawler at our web site 
> and, ignoring the disallowed areas in our robots.txt file, starts 
> crawling through some of our cgi-bin scripts at a rate of 4 to 8 hits a 
> second.  This is particularly annoying with some of the more processor 
> and disk intensive CGI programs, such as man2html, which also happens to 
> generate lots of links back to itself.
> 
> Is there anything I can set up in Apache to throttle back and slow down 
> remote hosts when they start hitting hard on cgi-bin?  I don't want to 
> do anything that would adversely affect legitimate users, nor make 
> important things like the manual pages hard to find by removing any 
> public links to them.  But when a client starts making 10 or more GET 
> requests on /cgi-bin in a 5 second period, it would be nice if I could 
> get the server to progressively add longer and longer delays before 
> servicing these requests, to keep the load down and prevent the server 
> from thrashing.

You could implement mod_security rules that enforce a delay after a
certain number of requests. The article below does something similar
based on failed logins; the main difference is that in your case you
don't care about the response going back to the client.

http://www.packtpub.com/article/blocking-common-attacks-using-modsecurity-2.5-part3
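
For instance, something along these lines ought to be close (untested;
the data directory, counter name, thresholds and pause length are just
placeholders to tune, and newer ModSecurity releases also want an id:
action on every rule):

    SecDataDir /var/cache/modsecurity

    # Track each client IP in a persistent collection.
    SecAction "phase:1,nolog,pass,initcol:ip=%{REMOTE_ADDR}"

    # Count hits on /cgi-bin; decay the counter by 10 every 5 seconds.
    SecRule REQUEST_URI "@beginsWith /cgi-bin" \
        "phase:1,nolog,pass,setvar:ip.cgihits=+1,deprecatevar:ip.cgihits=10/5"

    # Past ~10 requests in the window, stall each further request for
    # 3 seconds before serving it.
    SecRule IP:CGIHITS "@gt 10" "phase:1,log,pass,pause:3000"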


If you use OSSEC (a spiffy package in its own right), you could get it
to block the IP for a while, though that may be a little harsh in that
it wouldn't tell the client *why* it is being rejected. In any case, the
approach would be to have OSSEC watch the Apache access log, with a rule
configured to punt an IP for a while if it hits /cgi-bin too often
within a set period.
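
Roughly, in local_rules.xml (untested; the 1001xx IDs are arbitrary, and
31100 is OSSEC's generic "access log messages grouped" rule, which may
differ between versions):

    <group name="apache,local,">
      <rule id="100100" level="3">
        <if_sid>31100</if_sid>
        <url>/cgi-bin</url>
        <description>Request for a /cgi-bin script.</description>
      </rule>

      <!-- One source IP making 10 such requests within 5 seconds. -->
      <rule id="100101" level="10" frequency="10" timeframe="5">
        <if_matched_sid>100100</if_matched_sid>
        <same_source_ip />
        <description>Client hammering /cgi-bin.</description>
      </rule>
    </group>

and then in ossec.conf, an active response to drop the offending IP for,
say, ten minutes:

    <active-response>
      <command>firewall-drop</command>
      <location>local</location>
      <rules_id>100101</rules_id>
      <timeout>600</timeout>
    </active-response>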

  Tim

