Subject: [htdig] Indexing large amount of non-related files
From: Marcel Hicking (firstname.lastname@example.org)
Date: Tue May 23 2000 - 09:51:45 PDT
Does anyone have experience with indexing
a large number of files? I have used
ht://dig very successfully on
smaller sites, but this one is different.
I have about 200,000 plain text files
spread over a few hundred, maybe a thousand, directories.
File sizes range from a few bytes to, occasionally,
over 1 MB. All in all this adds up to 1.2 GB
of data, growing daily. The files do not
contain HTML code, and I need them to be
indexed at least daily (that is, nightly ;-).
Most of the files are static; only a few of them
change, say 100-200 a day.
We have been using glimpse so far, but it is
not working well; indexing takes much too long,
among other problems.
I wonder whether ht://dig could help here, or whether
anyone knows of a different solution.
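A minimal sketch of the kind of setup I was imagining, assuming the files are
served by a local web server (the host name, all paths, and the cron schedule
below are placeholders, not a tested configuration):

```
# htdig.conf -- sketch only; paths and host are assumptions
database_dir:    /var/lib/htdig/db
start_url:       http://localhost/textfiles/
# Map the URL prefix to the filesystem so htdig can read
# the files directly instead of fetching them over HTTP:
local_urls:      http://localhost/textfiles/=/data/textfiles/
# Raise the per-document limit so the occasional >1 MB file
# is indexed in full (the default cutoff is smaller):
max_doc_size:    2000000

# crontab entry -- rebuild the index nightly at 02:15
# 15 2 * * * /usr/local/bin/rundig -c /etc/htdig/htdig.conf
```

Since only 100-200 files change per day, an update run (htdig without a full
re-crawl from scratch) might keep the nightly window short, but I have not
tried this at 200,000 files.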
Thanks in advance,
This archive was generated by hypermail 2b28 : Tue May 23 2000 - 07:40:17 PDT