Re: [htdig] Large Directory

Torsten Neuer
Tue, 13 Jul 1999 14:45:11 +0200

According to KHumpf@IITRI.ORG:
>On our webserver, we have a directory full of THOUSANDS of .html
>documents.  In fact, there are so many .html documents in this one
>directory, HT://Dig times out when it tries to index them all.  I tried
>accessing the whole directory through my web browser and that also times
>out eventually.  I can get to each individual document with no problem
>and I know the directory has the correct permissions etc... but I was
>wondering if there is any way to tell HT://Dig to keep trying for a
>longer period of time??

>Any advice would be greatly appreciated.

It is not the number of documents that makes ht://Dig time out, but
the slow response from your server.

If ht://Dig times out even though it runs on the same LAN as the
WWW server, and if even your browser times out on local pages, then
you should check the configuration and setup of your web server.

If you run Apache, I'd suggest increasing the number of initial
server processes (and maybe putting some more RAM into the machine,
too). If you serve a large number of virtual hosts, you might also
try splitting the service across several machines.
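For Apache 1.3, the relevant directives live in httpd.conf; the
values below are illustrative only and would need tuning to your
actual load:

```apacheconf
# httpd.conf -- example values only, tune for your own traffic
StartServers       10    # processes forked at startup
MinSpareServers     5    # keep at least this many idle children
MaxSpareServers    15    # kill idle children beyond this count
MaxClients        150    # hard cap on simultaneous requests
```

Raising StartServers and MinSpareServers helps when bursts of
requests (such as an indexer walking thousands of pages) arrive
faster than Apache can fork new children.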

You can raise ht://Dig's time-out value via the configuration
directive "timeout" (see the attribute documentation), but that
would not solve the time-out problem in general, i.e. other people
will still get those time-outs when accessing your pages, and the
htsearch cgi-bin executable will probably time out, too, if your
server is kept in that condition.
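In the ht://Dig configuration file the attribute takes a number of
seconds; the value below is just an example, not a recommendation:

```
# htdig.conf -- "timeout" is how many seconds htdig waits for a
# response from the web server before giving up on a document
timeout: 300
```

Again, this only papers over the symptom for the indexer itself; a
server that needs five minutes to answer will still time out for
ordinary visitors.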


InWise - Wirtschaftlich-Wissenschaftlicher Internet Service GmbH
Waldhofstraße 14                            Tel: +49-4101-403605
D-25474 Ellerbek                            Fax: +49-4101-403606
E-Mail:            Internet:

------------------------------------
To unsubscribe from the htdig mailing list, send a message to containing the single word "unsubscribe" in the SUBJECT of the message.

This archive was generated by hypermail 2.0b3 on Tue Jul 13 1999 - 05:15:50 PDT