Recently I raised my resolution to 1280x1024 and turned on 8x anisotropic filtering and 2x AA, just because it looks so damn pretty, and it only costs me about 10 fps compared to my old, visibly interlaced and jaggedy 800x600.
So, while turning off shadows, dynamic lights, and so on to compensate for the fps drop, I wanted to see how much my 110 FOV hurts my fps compared to the default 90. What I've noticed, for some reason, is that my fps is 2-15 fps *better* at 110 FOV than at 90. This confuses me, because at 110 FOV more of the scene is on screen, and all logic says my fps should go down since more polys and surfaces have to be rendered. The only situations where 90 FOV ever comes out ahead are ones where at 90 I'm staring at mostly ground, while at 110 I can see more than just the ground.
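Just to be clear about what I mean by "more is rendered", here's a rough sketch of my reasoning (totally made-up scene, not from any actual engine, just counting what falls inside the horizontal view angle):

    import math
    import random

    def visible_count(fov_x_degrees, points, view_dir=(0.0, 1.0)):
        """Count points whose direction from the camera (at the origin)
        is within half the horizontal FOV of the view direction."""
        half_fov = math.radians(fov_x_degrees) / 2.0
        count = 0
        for (x, y) in points:
            # angle between the point's direction and the view direction
            dot = (x * view_dir[0] + y * view_dir[1]) / math.hypot(x, y)
            if math.acos(max(-1.0, min(1.0, dot))) <= half_fov:
                count += 1
        return count

    random.seed(1)
    # a fake scene: 10,000 objects scattered around the camera
    scene = [(random.uniform(-100, 100), random.uniform(-100, 100)) for _ in range(10000)]

    print("visible at  90 fov:", visible_count(90, scene))
    print("visible at 110 fov:", visible_count(110, scene))
    # at 110 the count is higher -- more stuff inside the view,
    # which is exactly why I'd expect fps to go DOWN, not up

That's the logic that isn't matching what I'm actually seeing.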
I'm just wondering what the technical explanation for this is. Is there some kind of scaling that takes a load off when textures are further away? What's the deal?
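The kind of scaling I'm guessing at is something like mipmapping or LOD, roughly like this (completely made-up numbers and function, just to show the idea I have in my head):

    import math

    def mip_level(distance, base_texture_size=1024, texel_size_at_1m=1.0):
        """Pick a mip level so roughly one texel covers one pixel.
        Farther surfaces get smaller (cheaper) copies of the texture."""
        # each mip level halves the texture resolution
        level = max(0, int(math.log2(max(distance, 1.0) / texel_size_at_1m)))
        resolution = max(1, base_texture_size >> level)
        return level, resolution

    for d in (1, 4, 16, 64):
        level, res = mip_level(d)
        print(f"distance {d:>3}m -> mip level {level}, {res}x{res} texture")
    # distance   1m -> mip level 0, 1024x1024 texture
    # ...farther away -> higher mip level -> smaller texture -> less work

Is it something like that going on, or is there more to it?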