Re: problems with 3.0.b1

Bryan Mohr
Thu, 05 Feb 1998 00:30:21 -0800

At 09:30 AM 2/4/98 -0800, you wrote:
>> >>I'm having a problem with htDig that I'm hoping you'd be willing to
>> >>help me with. I have a list of 38 URLs that I want to include in the
>> >>index. If I create htdig.conf with up to 20 of those URLs, it works
>> >>fine, but with any more than that it never seems to get past counting
>> >>the tokens and starting the retrieval. I've let it run for over 48
>> >>hours with all of the URLs and nothing ever happens. CPU use
>> >>fluctuates between 85 and 96 percent, so I know it's trying to do
>> >>something, and since I run "rundig" in -verbose mode I know it never
>> >>even starts on the first URL. Any ideas?
>> >
>> >Actually, I just noticed that I can get up to 33 URLs, and the one
>> >it's sticking on is a bare IP address rather than a hostname. Would
>> >that make a difference?
>> OK, I've done a bit more digging and I think I've got the problem
>> narrowed down; now I just need a solution. :)
>> If the start_url value (in htdig.conf) is over 1000 characters, htdig
>> never calls the retriever; if it's under 1000 characters, it works
>> fine. My problem is that 1000 characters isn't enough: it cuts off 5
>> of the URLs that I want in the index.
>> So I see three possible solutions:
>> 1. Allow start_url to be longer than 1000 characters.
>> 2. Assign the URLs some other way and forget about start_url.
>> 3. Make 2 (or more) conf files, each with a start_url string under
>> 1000 characters, run rundig against each of those conf files, and
>> somehow combine the results into one database.
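For reference, option 3 would mean splitting the list across two conf
files along these lines. The URLs and database_dir paths below are just
placeholders, and I'm assuming ht://Dig's attribute syntax where a
trailing backslash continues a value onto the next line:

    # htdig-part1.conf -- first chunk of the start_url list
    database_dir:  /opt/htdig/db.part1
    start_url:     http://www.example.com/ \
                   http://www.example.org/ \
                   http://10.0.0.1/

    # htdig-part2.conf -- the rest, keeping start_url under 1000 characters
    database_dir:  /opt/htdig/db.part2
    start_url:     http://www.example.net/ \
                   http://www.example.edu/

The open question with that approach is still how to combine the two
resulting databases into one.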

>Change the declaration of buffer[1000] to buffer[10000] on line 265 in

OK, I did that and htdig now completes without a problem. But now
htmerge doesn't work: it hangs as soon as it starts. Any ideas here?
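For the record, the change itself is just bumping the size of a fixed
character array. Roughly like this (I'm paraphrasing the surrounding
code from memory, so only the array size is the point):

    // Before: the start_url value is copied into a fixed 1000-byte
    // buffer, so anything past that never reaches the retriever.
    char buffer[1000];

    // After: enough room for the full start_url list.
    char buffer[10000];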

Bryan Mohr
