hi guys,
I’m trying to understand whether there is an easy relation between r_speeds and a video card’s fill-rate figures.
For instance:
I have a map that goes up to 15,000 tris drawn.
If I take some figures for a GF4 4600:
pixel fill rate: 1.24 Gigapixels/s
triangle transform rate: 69 M triangles/s
What framerate should I expect given the r_speeds?
I tried taking the simple approach:
15,000 tris have to be transformed per frame; at a rate of 69 M triangles/s that would give me a framerate of 69,000,000 / 15,000 = … :banghead: 4600 FPS. Guess that is a wrong interpretation.
Is there any rule that could give some estimate? Any interesting links related to this?
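Not a rule, but the naive calculation can at least be written down as an upper bound. A minimal sketch of it (the numbers are the ones from the post; the real framerate will be far lower because the quoted triangle rate is a theoretical peak, and the CPU, driver overhead, state changes and fill rate usually dominate):

```python
# Back-of-the-envelope FPS ceiling from triangle throughput alone.
# This is only an upper bound, not a prediction.

TRI_RATE = 69_000_000    # GF4 4600 peak triangle transform rate (tris/s)
TRIS_PER_FRAME = 15_000  # worst-case r_speeds figure from the map

fps_ceiling = TRI_RATE / TRIS_PER_FRAME
print(fps_ceiling)  # 4600.0
```

So the 4600 FPS figure is arithmetically right; it is just the answer to “how fast could the transform stage alone go”, not “how fast will the game run”.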
As to the pixel fill rate: is that the number of shader passes a card can apply to one single pixel? How would I go about estimating the number of pixels that have to be processed?
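One common rough estimate (assumptions, not a definitive method): pixels processed per frame is roughly the screen resolution times an average overdraw factor, i.e. how many times each pixel gets written. The resolution and overdraw values below are guesses picked for illustration:

```python
# Rough fill-rate-limited FPS estimate. Pixels per frame is
# approximated as resolution * average overdraw; both the
# resolution and the overdraw factor here are assumed values.

FILL_RATE = 1.24e9         # GF4 4600 pixel fill rate (pixels/s)
WIDTH, HEIGHT = 1024, 768  # assumed screen resolution
OVERDRAW = 3.0             # assumed average overdraw (a guess)

pixels_per_frame = WIDTH * HEIGHT * OVERDRAW
fps_fill_limited = FILL_RATE / pixels_per_frame
print(fps_fill_limited)  # roughly 525
```

The lower of the two bounds (triangle-limited vs. fill-limited) is the more relevant ceiling; with numbers like these the fill rate, not the triangle rate, is the tighter constraint.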
I hope this question doesn’t make me look uber-stupid.
[EDIT] Actually, I don’t care if I look uber-stupid :lol:
thnx
fraco