Suggestion: QuakeWars silent/out-of-game/p2p updates


(mortis) #1

Okay, we all know that the ETQW map file sizes are going to be quite large (greater than 500 MB each, or so goes the speculation). As a mapper, I would still like to be able to create full-featured custom maps and distribute them, but people are not going to sit through a 550 MB download. I get impatient with 20 MB downloads in game. Out of game, I couldn’t care less; it would be great if I could set my game to “autoupdate all selected files”. I’d like to hear some other options, if anyone else has some bright ideas.

Suggestion #1

Central server / out of game autoupdate. Mappers submit new maps to SD, who places them on a central server after they have been virus checked. Game browser includes an update tab, where new files are listed by name, author, version and short summary. Users select the maps they desire, and then they are downloaded to the game silently when the game is not actively in use. Downloads are resumable.
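The resumable-download part of this suggestion is standard HTTP machinery; a minimal sketch in Python using range requests (the update server and file names here are hypothetical, not anything SD has announced):

```python
import os
import urllib.request

def resume_offset(path):
    """How many bytes of the file we already have from a previous session."""
    return os.path.getsize(path) if os.path.exists(path) else 0

def range_header(offset):
    """HTTP Range header value asking the server for the remaining bytes."""
    return "bytes=%d-" % offset

def resume_download(url, dest_path, chunk_size=65536):
    """Append the rest of the file to dest_path, picking up where a
    previous (interrupted) download left off."""
    req = urllib.request.Request(
        url, headers={"Range": range_header(resume_offset(dest_path))}
    )
    with urllib.request.urlopen(req) as resp, open(dest_path, "ab") as out:
        while True:
            chunk = resp.read(chunk_size)
            if not chunk:
                break
            out.write(chunk)
```

The same offset/Range bookkeeping works whether the client trickles the file in the background or the user kicks it off manually.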

Suggestion #2

Master server / P2P “QuakeWarsP2P”

Master server maintains a list of map versions located on P2P servers hosted by willing admins and players. The download browser functions as in #1, and players would be actively making the maximum amount of content available to the network. Less secure than #1, but probably more reliable overall.

Suggestion #3

Community Map Packs

SD holds a competition for best user maps every six to twelve months. The ten best maps are included on a DVD map pack that is an official game update and costs ~$10. Bug fixes could be distributed at the same time, not only fixing game exploits but also providing exciting new content.

Suggestion #4

Community Texture packs

If mappers wish to use a megatexture not included in the game, it must be submitted to SD as an open-source resource. Broadband users can choose to ‘silently’ download these textures, in game and out of game, at a trickle rate to avoid performance hits. Custom maps would not automatically include a built-in texture, thus reducing the in-game download size. Other mappers could also rely on these textures once they are completely downloaded.

Suggestion #5

Filler texture

Each map contains a built in low res ‘dummy texture’. Online users only get the dumbed down version on connect, but the game browser provides links to a full res version to be ‘downloaded out of game’ (or just keeps track of the names of needed texture packs).

–Mortis


(taken) #2

#1 and #2: This would require a lot of bandwidth and a laaaaarge server from the developer, something that would cost a lot. Auto-downloading/redirecting when connecting to a server is enough for me. :) (user, not mapper)


(McAfee) #3

Of all that you said, there are some things I agree with and some I don’t. I will formulate another suggestion based on a mix of ideas from your thread.

SD could add a section to their website listing the “good” maps. The list should be administered by the SD staff. These maps could be the ones that are very popular within the community, or those which have won some sort of monthly contest (or any other interval). Another option is to list all maps freely but use a rating system and download counters, with the addition of an “SD seal of approval”.

The download links should be of P2P nature (Torrents and/or ED2K links). This should remedy any stress on SD’s web server. In addition, md5sums could be used to guarantee freshness and taste.
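Publishing an md5sum next to each download link is cheap to do and cheap to check; a sketch in Python that streams the file through the hash, so even a 500 MB map never has to fit in memory at once:

```python
import hashlib

def md5sum(path, chunk_size=1 << 20):
    """Hash the file in 1 MB chunks and return the hex digest."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def verify(path, expected):
    """Compare against the checksum published next to the download link."""
    return md5sum(path) == expected
```

A client could run this after any P2P download to confirm the map matches what the listing site published.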

Personally, I don’t agree with the other ideas as they are:

  1. I wouldn’t like another background process on my computer. Every software company wants to place something in the background one way or another. I want to have control of what to download and when; the current and typical download apps take good care of that. BTW, it’s very unlikely SD will want to host all those files.

  2. P2P = OK

  3. Charging for community-based maps brings too many legal complications, and I doubt all the devs involved (in essence, the Publisher) will want to go through with that. And I doubt they will ship the DVD for free. This typically involves a closed map contest where all the maps are held private and the creator loses all legal possession of the map. In other words, the creation belongs to the Publisher.

  4. I’m sure a lot of mappers won’t like this part:

it must be submitted to SD

You can’t force mappers to use specific textures, or remove their freedom to create new content at their own will. In addition, users will start saying that all maps have the same boring textures over and over again. Because Megatextures are so big, they are also very specific. Example: We don’t want maps with the same road layout.

  5. This requires changing the anti-cheat system, especially on Pure servers. I guess maps could come in two PK4 files: one with all the map content and low-res textures, and one with the high-res textures. The HQ pk4 would be marked as optional, so players could play without it. But additional protection should be taken for those players who do have the file; we don’t want to allow players to have their own “customized” HQ textures. The 2nd PK4 file could actually have a new extension, just for labeling purposes.
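The optional-but-protected HQ pack could be enforced with a manifest check along these lines; a rough sketch only, with all file names and checksums made up for illustration:

```python
# Hypothetical server-side manifest: expected checksum per pack, plus
# the set of packs a client may legitimately be missing.
MANIFEST = {
    "mymap.pk4":    "aa11",  # required: gameplay data + low-res textures
    "mymap_hq.pk4": "bb22",  # optional: high-res textures only
}
OPTIONAL = {"mymap_hq.pk4"}

def pure_check(client_files, manifest=MANIFEST, optional=OPTIONAL):
    """client_files maps pack name -> checksum reported by the client.
    A missing optional pack is fine (the low-res fallback is used); a
    missing required pack, or any pack with the wrong checksum, fails
    the pure check."""
    for name, expected in manifest.items():
        got = client_files.get(name)
        if got is None:
            if name in optional:
                continue  # play without HQ textures
            return False  # required pack missing
        if got != expected:
            return False  # pack present but modified
    return True
```

This lets players without the big file join, while still rejecting anyone who ships a “customized” HQ pack.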

(B0rsuk) #4

Once upon a time I dabbled in map making. I sympathise with you.
I see the following problems: if map files are going to be THIS large, it is absolutely critical that a map is polished once it’s released to the wider public. But I suppose you’d have to test it with someone before you do that. And the map file size doesn’t help, does it?

While having a central official server would be problematic and costly, I think it’s more than worth it to have a reliable source of good maps, even if it requires additional staff just to review maps. It would pay off.
Wait a second. Why not just use BitTorrent? As far as I know it’s the most efficient way to share big files. All SD would have to do is keep a tracker running.


(carnage) #5

Whether SD or a third-party organisation hosts the maps, it’s likely that in-game map downloads just won’t be possible. I often download 300-400 MB files and it involves a lot of set-and-forget, so this is going to have to be the way it goes, I think.

With maps this much bigger and players this impatient, if you’re going to convince people that your map is worth the download then a good website reviewing all the aspects of your map is needed. It would be better to have some kind of central map website, so server admins looking for maps have an easier time.

For the review of each map, keeping the information structured would also be pretty nice, making comparisons easier:

-map name
-objectives
-command map
-screen shots
-mapper’s personal text
-independent reviews
-ranking for
*gameplay
*gameflow
*looks
*fps
*overall fun

As long as there is a structured template already made, it wouldn’t be too hard to put this stuff in for every new map, or at least the ones worth hosting online. The site wouldn’t need too many staff, really, depending on how you get the reviews. And I don’t think it’s really beyond the ability of this community to create and maintain such a site before running to SD.

…Anyway, what I’m trying to say, I guess, is that like it or not, if the downloads are big then it’s just going to be harder to shift maps along. So some kind of PR front is (imo) what is needed.

And the $10 disk idea is very nice, but the complexity of producing and shipping these disks will create a lot of problems, and it’s no comparison to knowing you can have a new map in five minutes. But it’s not impossible for a mapper with a few maps to do their own DVD distribution and charge only for the shipping and production costs; then I think you stay within the user agreement.

Which makes me think… if there was such a site, then they could offer all the good maps in the database on a DVD every year, so people can have a hard copy and don’t need to re-download if files are lost. Maybe even make a little profit to cover the site’s running costs.


(B0rsuk) #6

Ok, to clarify: I think there’s no possible, sane way of putting both silent updates and 300 MB (or more) in the same topic, let alone the same sentence. You can’t miss something so big. Automatic download of such behemoths is ridiculous, and won’t be implemented, because SD people are not that stupid.


(mortis) #7

Sure it’s possible. People illegally download multi-gigabyte movies every day with set-it-and-forget-it P2P programs. These files would be smaller, and legal, but still need a reliable distribution network. What would be ridiculous is not to consider at all including a simple way for add-ons to be implemented. SD people are smart enough (being former modders themselves) to take that into account. IMO, one of the biggest factors that made Quake and UT so successful was the incredible volume of user-created content available; a dozen game levels come with the game, but there are hundreds of user-made add-on mutators, antilag, maps, models, etc.


(carnage) #8

Automatic download of such behemoths is ridiculous, and won’t be implemented, because SD people are not that stupid.

I believe it will be possible, since they will also want to allow downloads for smaller mods like QWpro and will try to send the pk3 file. They might impose some kind of max download size, adjustable by the client, though that is useless without also considering the rate of download.


(Domipheus) #9

Sure it’s theoretically possible to do via P2P means, but do people really want several hundred megs of data slowly taking up their bandwidth in the background of their machine? If there is an update, I’d love to be told about it and then download it whenever I want, but I hate things which suddenly take up all of my bandwidth.

A dedicated ET tracker for torrents is a nice idea if someone wanted to set one up, but auto-updates of this extent are not for me.


(jah) #10

500mb maps? you’re insane tbh

just consider the time all the clients had to wait just to load a regular map while playing online.

and gtk would simply explode (500mb .bsp = what?.. 1.5gig .map?) :S

Splash Damage are one of the best developers that I know of (great support in and around the communities for the games they publish; they’re one of a kind in that regard) and I don’t really believe they would commit such an atrocity.

J


(Domipheus) #11

it’s the megatexture which is 500mb, not the actual map.

(i.e., it’s not a 500 MB bsp [there are no bsp’s anymore anyway], but most maps will still need a megatexture, hence the size)


(Sauron|EFG) #12

Procedurally generated megatextures then, as Ragnar suggested? It would still take some time to generate the file, but it beats downloading 500 MB. :slight_smile:


(jah) #13

I thought the megatexture was (bluntly) an oversized texture (hence the name…). If this is based on Doom 3 tech, then where will the bsp’s go? Did SD redesign the whole engine just to accommodate this megatexture thingy? And wouldn’t it be possible to texture maps the old way? (Forgive me, but I just don’t believe that this megatexture will provide texture info for the whole map: terrain, buildings, sprites, etc.)

Oh, and even if the megatexture supports JPEG… 500 MB is a lot (a 999k-pixel-sized JPEG?). Maybe if it only supported .tga’s… but even then, it’s a lot of MBs and a lot of RAM and a lot of CPU usage imo. I really doubt that things will work that way.

J


(mortis) #14

Procedurally generated megatextures then, as Ragnar suggested? It would still take some time to generate the file, but it beats downloading 500 MB.

I also thought about .kkrieger , but I don’t know enough about how the textures in that game are generated.

.kkrieger is 97KB. Yes, that’s not a typo. 97 kilobytes.


(Lanz) #15

Oh, and if the megatexture supports jpeg, still… 500mb’s is a lot (a 999k pixel sized jpeg?). Maybe if it would only support .tga’s… but even then… it’s a lot of Mb’s and a lot of RAM and a lot of CPU usage imo. i really doubt that things will work that way.

Why do you doubt that? It’s how it works in ET:QW for the original maps. They model a terrain in (I think) Maya, export the model to Radiant, and use the tool they call MegaGen to generate the megatexture, which results in multiple files taking up 500 megs of data. Then on top of that they use Radiant to make buildings, plop in additional models, etc. Why would it be any different for custom maps?

And yeah, I bet they still use bsp files; it would surprise me if they didn’t. Megatexture is not some revolutionary all-purpose thing; it just lets the game use a huge terrain texture, and that’s all.

The only thing is we don’t know how all this pans out. This is one of those things I hope SD will tell us about now, just so we know what to expect from the game when it comes to custom maps.


(Domipheus) #16

Afaik, there are tools that can generate the megatexture for you, a la procedural generation, but that would be for map makers, I guess. Until there is more information, I don’t think we can figure out whether they could somehow be made on the client, though that isn’t a bad idea, I guess :)

jah: D3 doesn’t use bsps? :nag: And the whole point of megatexture is that the RAM is used in such a way that it doesn’t use up as much as one would think, etc. There are plenty of threads about the tech, so give the search function a try.


(Sauron|EFG) #17

I don’t think you can compare it to the [relatively] simple textures in .kkrieger; it’s probably better to look at a landscape generator.


(Hakuryu) #18

This is still all speculation. We won’t know the real map file sizes until the game or demo comes out.

I’ll make an argument for smaller map files. One phrase used a lot in describing MegaTextures is “procedurally generated”. Most people assume this refers to how the texture is created, but it could just as easily mean that the texture is generated once when a map first loads (a la BF2 optimizing shaders on first map load) or every time a map loads.

Maybe the Mega Texture is not a huge image after all, but a set of components that describe how that texture should look. These components could then be parsed by Quake Wars, and the Mega Texture generated on the fly. A ‘grass’ texture is the same no matter where it is on the map, so instead of imagining all the ‘grass’ areas as being defined within one large image, why not imagine a raw data file with the ‘grass’ locations. Add dirt, rock, road, and other type of terrain texture raw files like the ‘grass’ example and procedurally generate the terrain texture from these small files.
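That “components instead of one huge image” idea can be sketched as blending small per-terrain coverage masks with small tiling detail colors at load time. Everything below is illustrative of the suggestion, not how the actual engine works:

```python
def compose_terrain(masks, tiles, width, height):
    """masks: {terrain type: 2D grid of 0..1 coverage weights}
    tiles: {terrain type: function (x, y) -> (r, g, b) detail color}
    Blends the small component inputs into one large texture,
    returned as a flat row-major list of (r, g, b) tuples."""
    out = []
    for y in range(height):
        for x in range(width):
            r = g = b = 0.0
            for kind, mask in masks.items():
                w = mask[y][x]
                if w:
                    tr, tg, tb = tiles[kind](x, y)
                    r += w * tr
                    g += w * tg
                    b += w * tb
            out.append((r, g, b))
    return out
```

The shipped data would then be the small masks and detail tiles rather than the composed result, which is exactly why the download could stay small.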


I’m against P2P. I don’t want any of my bandwidth being used to facilitate map transfers, no matter what size they are. ET does fine with auto-downloading and redirection. If the maps are huge, then redirection is fine with me… the only thing is, how do you know what maps are on a server? Maybe a popup should be introduced saying “This server uses the following custom maps: (list of maps). Download at http://…”


(Domipheus) #19

The assumption about it being ‘how it is generated’ is no assumption at all:

http://www.beyond3d.com/interviews/etqw/

It states that MegaGen is used by the mapper to create the megatexture.


(mortis) #20

quote from link above, originally posted by Mordenkainen

MegaTexture is probably the most well known improvement that ETQW will have over DOOM 3. The 32,000 by 32,000 6GB source texture you mentioned in previous interviews suggests you’re not using any procedural textures at run-time but considering the multiple DVDs implication, are the .MEGA files compressed, and if so, are you compressing by tile or as a whole (single file)?

Enemy Territory: QUAKE Wars features a tool suite for creating MegaTextures called MegaGen. Once the Artist or Level Designer has completed the artwork, MegaGen outputs two entirely unique 4GB textures - a diffuse map containing colour data, and a normal map - and then combines them into a single 5GB data file. This data file is then split into unique tiles suitable for streaming, and then compressed to reduce disk space usage.

The resulting unique MegaTexture is around 500MB in size. This represents a reasonable tradeoff between ETQW’s visual quality and disk space usage (maintaining a shippable size for the game).
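The interview’s numbers hold up as a back-of-envelope check, assuming 32 bits per texel for the source maps (that bit depth is my assumption, not stated in the interview):

```python
TEXELS = 32_000 * 32_000   # the 32,000 x 32,000 source texture
BYTES_PER_TEXEL = 4        # assumption: 32-bit RGBA / packed normal data
GB = 1_000_000_000

# ~4.1 GB per source map, matching the "two 4GB textures" in the answer
source_gb = TEXELS * BYTES_PER_TEXEL / GB

combined_gb = 5.0          # stated size of the merged diffuse+normal file
shipped_mb = 500           # stated size after tiling and compression

# roughly 10:1 from the combined file down to the shipped MegaTexture
ratio = combined_gb * 1000 / shipped_mb
```

So the 500 MB figure is the already-compressed, tiled form, not the raw artwork, which is what makes the distribution problem discussed above at all tractable.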