I've been getting a lot of timeouts lately. A fair chunk of connections
are from the 'grub' distributed search engine spider, which connects too
often and doesn't play with robots.txt as nicely as I'd like. Some time
ago I put them in the 403 rejection list, but they don't get the hint
and keep trying to connect.
This doesn't touch the database, but does eat up some apache connections.
Although I have now explicitly banned grub in robots.txt and filled out
their little update form, they're still connecting -- even to the banned-
for-all /w subdirectory. It's pissing me off.
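For reference, the robots.txt ban looks roughly like this -- a minimal
sketch, assuming grub's crawler identifies itself with a user-agent
string containing "grub" (the exact token may differ):

```
# robots.txt -- ban the grub spider entirely, and the /w
# subdirectory for everyone (user-agent token is an assumption)
User-agent: grub-client
Disallow: /

User-agent: *
Disallow: /w
```

Of course, this only helps with crawlers that actually honor
robots.txt, which grub apparently does not.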
I've upped the max connections on apache from 175 to 260, and on mysql
from 400 to 560. Lee or Magnus: if we get mysql too-many-connections
errors, either bring down the apache limit or move up the mysql limit.
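For anyone adjusting these later, the relevant knobs are roughly as
below -- a sketch only; the exact directive names and file locations
depend on which apache and mysql versions are installed:

```
# httpd.conf -- cap on simultaneous apache child processes/connections
MaxClients 260

# my.cnf, in the [mysqld] section -- keep this comfortably above the
# apache limit, since each apache child may hold a mysql connection
max_connections = 560
```

Both daemons need a restart (or at least a reload for mysql via SET
GLOBAL, if the running version supports it) for the change to take
effect.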
-- brion vibber (brion @ pobox.com)