Re: [htdig] Limit to number of files

From: Robert Morse
Date: Wed Jul 12 2000 - 07:03:23 PDT


Thanks for the reply. When I try running htdig -vvv, it dumps core
before printing anything out. I think the problem is machine
resources: it seems to run out of memory with more than 2500 or so
files. I will stop some processes to free up memory and then try
running it again with all 3500 files. Thanks.
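If freeing memory is not enough, one workaround (a sketch, not from the original message; the paths and the sample URL list are made up) is to split the URL file into chunks of about 2500 lines, the size that is known to work, and dig one chunk at a time:

```shell
# Generate a stand-in list of 3500 URLs (placeholder for the real /tmp/all.urls).
seq 1 3500 | sed 's|^|http://example.org/file|' > /tmp/all.urls

# Split into chunks no larger than the 2500 lines that htdig handled fine.
split -l 2500 /tmp/all.urls /tmp/urls.part.

# Point start_url at one chunk at a time in htdig.conf, e.g.:
#   start_url: `/tmp/urls.part.aa`
ls /tmp/urls.part.*
```

With a 3500-line input this produces two chunks, one of 2500 lines and one of 1000, each small enough to index in a separate pass.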

Jim Cole wrote:
> Robert E Morse's bits of Mon, 10 Jul 2000 translated to:
> >I have 3500 files in several directories that I would like to index,
> >but they are not all referenced within an index.html. So I created
> >a file that listed all the URLs separately and put this line in the
> >htdig.conf file:
> >
> >start_url: `/tmp/all.urls`
> >
> >and also put this in:
> >
> >max_doc_size: 4000000
> >
> >It keeps dumping core when I run the rundig command. When
> >I trim the file down to about 2500 lines, it works fine. Is there
> >a limit to how many files can be listed in the external file?
> Hi - I use external files for some of the sites I maintain. One of the
> files currently has a little over 57000 lines, so I don't think the
> number of lines is the problem. Have you tried running the commands
> (htdig, htmerge, etc.) individually to see where it is dying? Maybe
> with -vvv to see what is being done when it dies?
> Jim
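Jim's suggestion could look something like the following sketch (the config path is an assumption, not from the original message; -i tells htdig to do an initial dig from scratch):

```shell
# Run the stages rundig normally drives, one at a time, with maximum
# verbosity, so it is obvious which one dies.
htdig   -vvv -i -c /etc/htdig/htdig.conf
htmerge -vvv    -c /etc/htdig/htdig.conf
```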

Bob Morse
System Administrator
American Mathematical Society
401 455 4162


This archive was generated by hypermail 2b28 : Wed Jul 12 2000 - 04:24:26 PDT