Certain files download as a single big segment

Asked by zebub

Certain files (and always the same files!) are downloaded in a single segment, while for other files (the majority) segmented downloading works correctly. Some examples are "[WF]_School_Rumble_05.avi" (TTH: ZRCTY...) and "Bleach_006.avi" (TTH: K7HF3...).

DC++ version: 0.705-0.708
OS: Windows XP SP3

Question information

Language: English
Status: Solved
For: DC++
Assignee: No assignee
Solved by: eMTee

poy (poy) said: #1

This is automatic: DC++ adjusts the number of segments depending on the download speed.
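
(Purely for illustration, a minimal C++ sketch of the kind of speed-based heuristic poy describes; the thresholds and the function name are assumptions, not DC++'s actual logic:)

```cpp
// Illustrative sketch only: pick a segment count from the measured download
// speed, so faster transfers get split into more parallel segments.
// Thresholds and names are assumptions, not taken from the DC++ source.
#include <cstdint>
#include <iostream>

int segmentsForSpeed(std::int64_t bytesPerSecond) {
    if (bytesPerSecond <= 0)           return 1;  // speed still unknown: start with one segment
    if (bytesPerSecond < 512 * 1024)   return 2;  // below 512 KiB/s
    if (bytesPerSecond < 1024 * 1024)  return 3;  // below 1 MiB/s
    if (bytesPerSecond < 2048 * 1024)  return 4;  // below 2 MiB/s
    return 6;                                     // anything faster
}

int main() {
    for (long long speed : {0LL, 100 * 1024LL, 800 * 1024LL, 5 * 1024 * 1024LL})
        std::cout << speed << " B/s -> " << segmentsForSpeed(speed) << " segment(s)\n";
}
```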

zebub (zeb-mailinator) said: #2

Sorry, this doesn't seem to be it. It wouldn't explain why the same file is affected every time (bleach 006 and not bleach 007, even from the same sources!). Also, this keeps happening as I disconnect the active sources one by one, regardless of their speed. And DC++ clearly asks for the whole file well before it has any idea of the source's speed.

poy (poy) said: #3

What if you remove the file in question and add it again? Was it started with an older version of DC++?
Is that file smaller than the other ones?

zebub (zeb-mailinator) said: #4

The files in question ALWAYS try to download as a whole, even after they have been removed, the app closed, and even after a few weeks...

Both affected files are about 183,000,000 bytes, a standard anime episode size. The exact size doesn't seem to matter, as bleach006 (affected) is bigger and schoolrumble05 (affected) is smaller than, e.g., bleach007 (unaffected).

zebub (zeb-mailinator) said: #5

Hmm, I just checked and bleach 007 seems affected too. But there are other files with sizes in between that work correctly.

poy (poy) said: #6

Could it be that there just aren't enough available sources for these files? By "available" I mean not already in use for another download.
If you pause all the downloads except one that didn't want to download segmented, and make sure the speed is ridiculously low, does it work then? You also need available slots on the remote sources, of course.

zebub (zeb-mailinator) said: #7

Although I made a mistake above regarding bleach007, I checked again and e.g. bleach008 is definitely unaffected. The sources for bleach006 and bleach008 largely overlap, of course.

Now, with no other downloads running, if I start downloading bleach006 (with about 30 sources), it tries to download it as a whole from a single source; I can disconnect that source and it simply moves on to the next one, etc.

On the other hand, with no other downloads running, if I try this with bleach008 (also with about 30 sources), it downloads in 1-2 MB segments.

Regardless of the remote sources' slots, it visibly ASKS for small segments or the whole file, respectively. And in the whole-file case, if the download is stopped for any reason, it loses all the data it has downloaded so far.

IMO this is definitely about the particular files; that's why I included the TTHs in the original report.

eMTee (realprogger) said: #8

>And in the whole-file case, if the download is stopped for any reason, it loses all the data it has downloaded so far.
We have had a few reports of this behavior recently.

>that's why I included the TTHs in the original report
I can't see a full TTH string in any of your posts. The only way to solve this problem is if we have the TTHs and the hub address(es) where these files can be found. Filenames are of no use to us, so please provide just the relevant info, either here or in the DC++ Public Development hub. Thanks.

zebub (zeb-mailinator) said: #9

I picked those files since they are relatively easy to find; the best chances are at public.otaku-anime.net:555 (though any big hub from the public hublist should have a few sources too).

I don't see what's wrong with searching by keywords and verifying with the first few TTH characters, but here are the full strings as well:

ZRCTYV4G45YFKZ2MOSXHGCBR6EDWZUS2GPCYSHQ
K7HF3I3EC3F3TC3W7TTNSOJLKUQHMPMG3SA4WJI
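
(Side note for anyone checking these by hand: a DC++ TTH root is the 192-bit Tiger tree root encoded in base32, which always comes out as 39 characters from the A-Z / 2-7 alphabet. A small self-contained sketch of that sanity check, not code from DC++ itself:)

```cpp
// Sanity-check that a string is shaped like a TTH root: 39 base32 characters
// (192 bits / 5 bits per character, rounded up). Illustrative only.
#include <iostream>
#include <string>

bool looksLikeTTH(const std::string& s) {
    if (s.size() != 39) return false;
    for (char c : s)
        if (!((c >= 'A' && c <= 'Z') || (c >= '2' && c <= '7')))
            return false;
    return true;
}

int main() {
    std::cout << looksLikeTTH("ZRCTYV4G45YFKZ2MOSXHGCBR6EDWZUS2GPCYSHQ") << '\n';  // prints 1
    std::cout << looksLikeTTH("ZRCTY...") << '\n';                                  // prints 0 (truncated)
}
```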

Best eMTee (realprogger) said: #10

Both of these files are working normally for me.
There is a big chance that your hashdata entries for these files are corrupted. Remove the problematic files from your queue and THEN use the /rebuild command to remove their tree information from the hashdata.
(This may take a long time depending on your queue and share size, and DC++ will be unresponsive during the rebuild operation; do not close it until it finishes.) Then re-add the files and see if they work correctly.
Let us know if that solved your problem.
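
(To illustrate why a damaged tree entry would produce exactly the symptom above: segmented downloading needs the per-block leaf hashes so each segment can be verified on its own, so a client that can't load a usable tree has little choice but to request the whole file in one piece. A rough conceptual sketch, with made-up names rather than actual DC++ code:)

```cpp
// Conceptual sketch only (structure and names are assumptions, not DC++ code):
// with a usable leaf-hash tree the file can be split into independently
// verifiable segments; without one, the only safe plan is one whole-file
// request checked against the root hash alone.
#include <algorithm>
#include <cstdint>
#include <iostream>
#include <string>
#include <vector>

struct TigerTree {
    std::vector<std::string> leaves;   // per-block hashes loaded from the hashdata
    std::int64_t blockSize = 0;
};

struct Segment { std::int64_t start, size; };

std::vector<Segment> planSegments(std::int64_t fileSize, const TigerTree* tree) {
    if (!tree || tree->leaves.empty() || tree->blockSize <= 0)
        return { {0, fileSize} };       // corrupt/missing tree: fall back to the whole file
    std::vector<Segment> plan;
    for (std::int64_t pos = 0; pos < fileSize; pos += tree->blockSize)
        plan.push_back({pos, std::min(tree->blockSize, fileSize - pos)});
    return plan;
}

int main() {
    const std::int64_t fileSize = 183000000;             // roughly the size zebub mentions
    TigerTree good{std::vector<std::string>(175, "leaf"), 1024 * 1024};
    std::cout << "with tree:    " << planSegments(fileSize, &good).size() << " segments\n";
    std::cout << "without tree: " << planSegments(fileSize, nullptr).size() << " segment\n";
}
```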

zebub (zeb-mailinator) said: #11

Thanks, /rebuild fixed these files.

(What may still be interesting, though, is that I frequently download similar files (these two were downloaded months ago), and I sometimes run into this problem even with new files that I had never seen before...)

zebub (zeb-mailinator) said: #12

Thanks eMTee, that solved my question.

eMTee (realprogger) said: #13

>I sometimes run into this problem even with new files that I had never seen before
If some parts of your hashdata file are corrupted, this can happen. /rebuild has probably corrected all the problems now, but if you experience this again in the future you may need to delete your hashdata files by hand and let DC++ re-index your share to get rid of the problem completely.
More info: http://dcpp.wordpress.com/2006/03/09/what-do-hashindexxml-and-hashdatadat-do/
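
(If it ever comes to that, the manual cleanup amounts to: close DC++, delete HashIndex.xml and HashData.dat from its settings folder, then start it again so the share gets re-hashed. A minimal sketch of the deletion step; the settings path varies by installation, so the one below is only a placeholder:)

```cpp
// Minimal sketch of the manual hashdata cleanup eMTee describes: with DC++
// closed, remove HashIndex.xml and HashData.dat so the share is re-indexed on
// the next start. The settings path below is a placeholder assumption.
#include <filesystem>
#include <iostream>
#include <system_error>

int main() {
    namespace fs = std::filesystem;
    const fs::path settingsDir =
        "C:/Documents and Settings/<user>/Application Data/DC++";  // adjust to your install
    for (const char* name : {"HashIndex.xml", "HashData.dat"}) {
        std::error_code ec;
        const bool removed = fs::remove(settingsDir / name, ec);
        std::cout << name << (removed ? ": removed\n" : ": not found or not removable\n");
    }
}
```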