Autodownload Limit


(geppy) #1

For a long while, we’ve been having issues with autodownload on our custom map servers preventing people from downloading all of the custom maps in one go (after the twelfth pk4, the download would stop).

Now, I could set it to only have people download the maps in the current campaign, but for various reasons I’ve been asked to have it give you the full download list.

It was pointed out to me about an hour ago that the issue was not with the number of files, but with the length of the file list string. As it’s limited to 1024 characters, our (relatively long) base URL plus the large number of items results in the string being truncated midway through the thirteenth file.
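To put rough numbers on it, here’s a quick standalone mock-up (plain C++, nothing to do with the actual SDK source; the buffer size constant, base URL and pk4 names are all made up for illustration) of how a fixed 1024-character buffer runs out of room around the dozenth entry once every file gets the full base URL prepended:

```cpp
#include <cstdio>
#include <cstring>

// Illustrative only: 1024 matches the limit described above, but the real
// constant in the engine may well have a different name and home.
const int kListBufferSize = 1024;

int main() {
    char list[ kListBufferSize ];
    list[ 0 ] = '\0';
    size_t used = 0;

    // Hypothetical base URL and map pack names, roughly the sizes involved.
    const char *baseUrl = "http://downloads.example-community-server.net/etqw/custommaps/";

    for ( int i = 1; i <= 20; i++ ) {
        char entry[ 256 ];
        std::snprintf( entry, sizeof( entry ), "%scustom_map_pack_%02d.pk4 ", baseUrl, i );

        size_t len = std::strlen( entry );
        if ( used + len + 1 > sizeof( list ) ) {
            // With a base URL this long the buffer fills up after roughly a
            // dozen complete entries, matching the behaviour described above.
            std::printf( "list full after %d complete entries (%zu chars used)\n", i - 1, used );
            break;
        }
        std::strcat( list, entry );
        used += len;
    }
    return 0;
}
```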

I found (what could be considered) the offending line: a statically-sized string used to build the download list.

Simply raising the size of each static string would alleviate the issues on our end, though it seems like a stupid thing to do. I’m wondering if I should go through and replace the use of char* with a smart string (that is, one that knows its own length).

From a cursory look, I’m thinking that if I replace the idStr internals it’d be a proper fix for the issue.
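For comparison, here’s the same mock list built with a string that knows its own length and grows as needed, which is the sort of “smart string” behaviour I mean. This is only a generic illustration, not a claim about how idStr is actually implemented:

```cpp
#include <cstdio>
#include <string>

int main() {
    // A growable, length-aware string: appending never silently truncates.
    std::string list;

    const char *baseUrl = "http://downloads.example-community-server.net/etqw/custommaps/";

    for ( int i = 1; i <= 20; i++ ) {
        char entry[ 256 ];
        std::snprintf( entry, sizeof( entry ), "%scustom_map_pack_%02d.pk4 ", baseUrl, i );
        list += entry;   // the string tracks its own length and reallocates as needed
    }

    std::printf( "%zu characters, no truncation\n", list.size() );
    return 0;
}
```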

Anyway, if anyone was interested in this problem, this looks to be the cause. Uni starts up tomorrow, though, so we’ll see whether or not I’m able to get around to it in short order.


(geppy) #2

Ah, it looks like the limit isn’t a variable used internally by idStr itself; the idStr class can handle arbitrarily-sized strings (up to 2^32 characters).

Looks like different parts of the game are allocating fixed-size temporary buffers and then passing them to the idStr constructor.
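Something like this is the pattern I mean (the function and variable names are made up, it’s only a sketch of the pattern rather than a real call site, and it would only compile against the SDK’s idStr headers). The fixed-size scratch buffer is already full before idStr ever sees the data:

```cpp
#include <cstring>

// Names here are invented; this shows the *pattern*, not the actual SDK code.
idStr BuildDownloadList( const char **files, int numFiles, const char *baseUrl ) {
    char temp[ 1024 ];              // fixed-size scratch buffer
    temp[ 0 ] = '\0';

    for ( int i = 0; i < numFiles; i++ ) {
        // strncat-style appends silently stop adding once temp is full
        std::strncat( temp, baseUrl, sizeof( temp ) - std::strlen( temp ) - 1 );
        std::strncat( temp, files[ i ], sizeof( temp ) - std::strlen( temp ) - 1 );
        std::strncat( temp, " ", sizeof( temp ) - std::strlen( temp ) - 1 );
    }

    // By the time the idStr constructor runs, the list has already been truncated.
    return idStr( temp );
}
```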

I could go through and fix it everywhere or I could just fix it in the places that are causing me problems right now.

Any comments from any of the SD folks?


(geppy) #3

I may be mistaken, but unfortunately it looks like I don’t have access to all of the code needed to fix the issue.

As a workaround, I might just use a shorter base URL.


(jRAD) #4

The URL buffer does indeed live in the engine, so there’s nothing that can be changed in that respect. A shorter base URL will help with the issue, and is really the only option at this point.
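As a rough back-of-the-envelope estimate of how much that buys (the 1024-character limit is from earlier in the thread; the ~22-character pk4 name length is just an assumption), each entry costs the base URL plus the file name plus a separator, so shortening the base URL raises the number of files that fit:

```cpp
#include <cstdio>

int main() {
    const int listLimit  = 1024;  // limit described earlier in the thread
    const int pk4NameLen = 22;    // assumed average "name.pk4" length
    const int separator  = 1;     // space between entries

    // Each entry costs (base URL + file name + separator) characters.
    for ( int baseUrlLen = 20; baseUrlLen <= 70; baseUrlLen += 10 ) {
        int perEntry = baseUrlLen + pk4NameLen + separator;
        std::printf( "base URL of %2d chars -> about %d files fit\n",
                     baseUrlLen, listLimit / perEntry );
    }
    return 0;
}
```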


(Susefreak) #5

and it’s not an option to conveniently leak that engine code??? :smiley:


(geppy) #6

I have an idea that I think will fix it, but I’m having difficulty getting even an unchanged mod going from the SDK (using the tips from modwiki).


(Scrupus) #7

Nice find, it explains some mysteries - thanks for the info! :slight_smile:


(Azuvector) #8

Could maybe send multiple autodownload commands?
I dunno, random guess. I’ve not had the opportunity to work on QWTA in some weeks now. Busy busy.


(Susefreak) #9

Tried some testing on the current DL settings:
If you have a large download (14 files) and cancel it halfway through, then when you reconnect it fetches a new list of 14 files.

I’m not a programmer (yet), but maybe some code could write into that offending line how many downloads there actually are, rather than sticking with the current [max_string_char] situation?


(kamikazee) #10

A guy named “Simulation” posted in the thread on Doom3World that he would try cutting the URLs off at the right place, then see if he could somehow make it clear to the engine that more files follow. Still, seeing how this bug is present, none of the code monkeys seem to have thought of including some <continued…> marker URL…
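Something like this is what I imagine the <continued…> idea would look like. Purely a sketch: nothing in the current engine understands such a marker, and the chunk size and marker text are made up:

```cpp
#include <string>
#include <vector>

// Split the full list into chunks that each fit a 1024-character limit,
// appending a hypothetical "<continued...>" marker so a client (or a patched
// engine) could tell that more entries follow. The engine does not actually
// support anything like this today.
std::vector<std::string> SplitDownloadList( const std::vector<std::string> &urls,
                                            std::size_t limit = 1024 ) {
    const std::string marker = "<continued...>";
    std::vector<std::string> chunks;
    std::string current;

    for ( const std::string &url : urls ) {
        // Leave room for the marker in case this chunk has to be closed off.
        if ( current.size() + url.size() + 1 + marker.size() > limit ) {
            chunks.push_back( current + marker );
            current.clear();
        }
        current += url + " ";
    }
    if ( !current.empty() ) {
        chunks.push_back( current );   // last chunk, no marker needed
    }
    return chunks;
}
```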

EDIT: Well, why not include this with a patch should there ever be one?