Re: htdig: geocities robots.txt

Geoff Hutchison
Mon, 18 Jan 1999 15:56:30 -0500 (EST)

On Mon, 18 Jan 1999, Gilles Detillieux wrote:

> the code that fetches this file, to ignore any file size limit currently
> imposed by the RetrieveHTTP() function, or b) just set a nice, generous

One of the side benefits of imposing a max_doc_size is making the
indexing program more resistant to buffer overflows. I don't know how it
could be exploited in practice, but completely removing the limit for
robots.txt could cause overflow problems. What happens if the server never
stops sending the robots.txt file? What if you run out of memory?

I agree with Gilles: switching to a generous limit is the best fix.

-Geoff Hutchison
Williams Students Online

To unsubscribe from the htdig mailing list, send a message to containing the single word "unsubscribe" in
the body of the message.

This archive was generated by hypermail 2.0b3 on Wed Jan 20 1999 - 08:37:46 PST