They possibly could. Nvidia is better in OpenGL games, which ET is. Maybe they could do some major optimizations in OpenGL apps over DirectX games.
What SLI mode for Wolf: ET?
I’m sure they do, but not per game. And they don’t need profiles to detect if a game uses OpenGL or DX.
I’m guessing Ouroboro is correct. 
I’ve been doing a lot of timedemo’ing and head scratching, and here are my findings:
Running in full screen mode does make a difference
The Nvidia SLI profile for ET is useless - I completely deleted it from nvapps. I get better performance in single card mode.
Alternate frame rendering (both types) does nothing in ET - the load balancing bar shows some activity in menu screens, etc., but is totally blank during the game. This is what the NV profile seems to have been using.
SLI Antialiasing (8x, 16x) performs very poorly in this game
I achieved the best performance (by far) with 8x SSAA and split frame rendering
Hope this is useful
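For anyone who wants to reproduce this, a typical id Tech 3 timedemo run looks something like this from the console (mydemo is just a placeholder for one of your own recordings):

    /timedemo 1      // put demo playback into benchmark mode
    /demo mydemo     // play the demo back; the average fps is printed when it finishes
    /timedemo 0      // return to normal playback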
From the description “getting a solid 76 fps”, I assume that your cvar “com_maxfps” is set to 76.
While a single card can sustain 76 fps, SLI won’t boost it any further unless you raise that cvar limit.
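For example, something like this in your autoexec.cfg raises the cap (125 is just the value most players settle on; 0 should remove the cap entirely, if I remember the engine right):

    seta com_maxfps "125"   // lift the frame cap from 76 to 125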
However, isn’t 76 fps enough for smooth gameplay?
Not according to my eyes. I find 76 to be “decidedly less than buttery smooth.” 
85+ seems to be the sweet spot for me in W:ET. Some other games seem choppy to me with anything less than 100 fps.
I know other players who agree. I’d say for simplicity’s sake, cap at 125 fps if you can. It can’t hurt, and it’s definitely smoother, whether you can see it or not. It’s also a “jumping number”: id Tech 3 movement physics round differently at certain frame caps, so values like 76, 125, and 333 let you jump slightly higher.
It’s also important that refresh rate be higher than framerate, for optimal smoothness. Sadly, my monitor can’t do so at a resolution higher than 800x600. I played at that res for a while to see how an uber-refresh felt, and it was like butter. I actually got a little verklempt! :chef:
However, isn’t 76 fps enough for smooth gameplay?
Yes, but my problem was that I couldn’t even get a frame rate that high using 8x Antialiasing on my SLI setup. I worked out the best settings and it seems to be OK now, but it still goes below 76 sometimes on maps like Radar and Railgun.
With little exception, aren’t monitor refresh rates capped at 60 (LCD) or 75 (CRT) Hz??
i.e., you can’t SEE more than 60 or 75 frames per second. The only thing you could possibly detect would be a fluctuation if all of a sudden your fps dropped below the threshold of 60 or 75 and then went back up abruptly… but setting com_maxfps would fix this problem nicely.
Correct me if I’m wrong (which is quite likely on topics such as this), but isn’t having 125 fps useless compared to, say, 80?
Do you mean capped by W:ET? If so, no. r_displayrefresh can be set to any refresh rate that your monitor can achieve at a given resolution (r_mode).
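For example, from the console (the values here are just illustrative, pick whatever your monitor actually supports):

    /seta r_mode "6"               // 6 = 1024x768 in the stock id Tech 3 mode table
    /seta r_displayrefresh "100"   // request 100 Hz at that resolution
    /vid_restart                   // restart the renderer so the new mode takes effect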
It varies by the person asked. I’m usually not happy with anything below ~85 Hz/fps.
There’s an interesting read at http://www.daniele.ch/school/30vs60/30vs60_1.html
One thing is for certain though, the “humans can only see X fps” claims are complete bullshit.
All I mean is that your monitor has a given maximum refresh rate, which I imagine is what it runs at (i.e. 75 or 60 Hz), regardless of r_mode, r_displayrefresh, etc.
I agree with you about eyeball sensitivity - different people can detect all manner of different refresh rates.
Ah, I understand you now.
Most CRT monitors should do more than 75; it depends on the resolution. Mine shows 60-150 depending on the mode, although you can usually bump it up a bit higher. For example, I found my limit to be 119 Hz @ 1024x768, although 100 was the highest listed.
If you set a refresh rate in-game that your monitor cannot do at that resolution, you will get the same “signal timing out of range” error that you would get outside the game. If that ever happens, you can Alt+Enter your way out of it.
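If Alt+Enter doesn’t bring the picture back, the usual fallback (assuming the console still opens) is to blind-type a reset, or to edit the same cvars in your config file by hand:

    /seta r_displayrefresh "60"   // fall back to a rate any monitor can handle
    /vid_restart                  // reapply the video mode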