SLI/Crossfire Support


(master[mind]) #1

Is it just me, or do most modern games have poor multi-card support? So my question is this: could you guys at Splash Damage test different SLI and Crossfire setups with Brink? Test on these setups in both Crossfire and SLI:
16x/16x(/16x)
16x/8x(/8x)
16x/4x
8x/8x(/8x)
If there are issues, it would be awesome if y’all would fix them. Those are the most common setups, and the unbalanced ones in particular seem most likely to cause tearing.

I was testing the MoH open beta, and it had horrible artifacts from SLI; the only games that run well with SLI in my testing are Crysis and SH. Which is annoying, since NVIDIA and ATI constantly push the technology. I can’t wait till Brink is released, but I also don’t want to sacrifice my AA because I can only run a single card. Hope the development is going well.


(engiebenjy) #2

A few years back I bought two graphics cards (NVIDIA 8800 GTS) for use in SLI. After about a month I sold both cards and bought a single 8800 GTX instead - SLI just wasn’t worth the effort for me.

I think, like most things, it gets prioritized - the more people who have SLI/Crossfire, the more time they will spend making the game compatible with it. I don’t know many people who have SLI.


(master[mind]) #3

Yeah, sadly, any game that doesn’t run in SLI never needed it in the first place. But I hope Splash Damage will take a little time to try and get it working, it shouldn’t be too much trouble considering what they’ve already accomplished.


(Auzner) #4

You sure sound credible with those statements.

There’s no need to worry about PCI-E 1.0 vs 2.0 since all cards and controllers are backwards compatible with each other. You can run things at 4x 2.0 and there will be such a minute performance drop that it hardly matters. Here is Quake 4 (id Tech 4) for a closer comparison to how Brink may behave: http://www.techpowerup.com/reviews/AMD/HD_5870_PCI-Express_Scaling/15.html
Quake Wars 4x vs 16x gets you a 1fps drop! http://www.techpowerup.com/reviews/AMD/HD_5870_PCI-Express_Scaling/9.html

I don’t know what you’re talking about with SLI and Crossfire generally not being supported. Just about any site that reviews hardware will show the performance increase figures for running two cards. Games that are just crappy console ports may have issues, but they never cared about the PC market in the first place. When you add a third or fourth card, that’s usually so you can crank up AA or run PhysX. They don’t really net you a higher frame rate. http://www.guru3d.com/article/radeon-hd-5770-in-3way-crossfirex-review-test/7
http://www.guru3d.com/article/radeon-hd-5770-in-3way-crossfirex-review-test/10


(brbrbr) #5

A reminder that Brink uses higher-resolution textures [than ETQW does, for example], which means you may need a graphics card with more onboard memory, and the difference between PCIe 1.x and PCIe 2.x becomes more noticeable, as does the difference between x16 and x8 lanes per card [whether single or SLI/CF].
That’s why I strongly recommend to anyone: don’t buy a graphics card with only 1GB of onboard memory in 2010 or 2011.


(Slade05) #6

Brink virtualizes textures.


(brbrbr) #7

I know how MegaTexture works :wink:
But current GPUs’ shading performance isn’t enough to COMPLETELY offload texture generation to the GPU, so most of the bottlenecking data still has to be stored in memory.
So the key is the exact performance numbers [on given hardware], and whether things can or cannot fit in that memory.
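To illustrate roughly what I mean, here is a minimal sketch of a virtual-texturing streaming loop (my own simplified illustration, not Brink’s or id’s actual code; the cache size and helper names are made up, and a real engine does the visibility pass on the GPU and streams pages asynchronously):

```python
# Toy sketch of a MegaTexture-style streaming loop: keep only the texture
# pages the current view needs resident in a fixed-size physical cache.
from collections import OrderedDict

CACHE_PAGES = 4096                  # slots in the on-card physical page cache
page_cache = OrderedDict()          # virtual page id -> cached page data

def load_page_from_disk(page_id):
    # Stand-in for reading and decompressing one texture tile.
    return f"pixels-for-page-{page_id}"

def stream_frame(visible_page_ids):
    """Bring every page the current view needs into the physical cache."""
    for page_id in visible_page_ids:
        if page_id in page_cache:
            page_cache.move_to_end(page_id)     # mark as recently used
            continue
        if len(page_cache) >= CACHE_PAGES:
            page_cache.popitem(last=False)      # evict least recently used
        page_cache[page_id] = load_page_from_disk(page_id)
    # A real renderer would now update the indirection (page table) texture
    # so shaders can map virtual texture coordinates to cache slots.

stream_frame(range(100))             # toy usage: request the first 100 pages
print(len(page_cache), "pages resident")
```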


(DarkangelUK) #8

Virtual Texturing is not megatexture


(master[mind]) #9

Sorry, that was my personal experience.

This knowledge was already present in my cranium, but my point is that combining, say, 16x and 4x provides differing levels of bandwidth, which may throw the SLI out of sync. The point is for both cards to be perfectly in sync, producing the same amount of data at the same rate. Even a few milliseconds of drift would result in some visual tearing, especially at the framerates SLI reaches (remember it has to render a million-plus pixels 120 times a second; if it strays out of sync, the difference will be magnified, or SHOULD be magnified, though I figure NVIDIA and ATI have this figured out by now).
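To put rough numbers on that (back-of-the-envelope only: I’m assuming a 1920x1080 frame and a 120 fps target, so the figures are illustrative rather than measured):

```python
# Back-of-the-envelope numbers for the sync argument above.
# Assumes a 1920x1080 frame rendered at 120 fps; purely illustrative.

width, height, fps = 1920, 1080, 120

pixels_per_frame = width * height            # ~2.07 million pixels per frame
pixels_per_second = pixels_per_frame * fps   # ~249 million pixels per second
frame_time_ms = 1000 / fps                   # ~8.3 ms per frame

print(f"{pixels_per_frame:,} px/frame, {pixels_per_second:,} px/s")
print(f"frame budget: {frame_time_ms:.1f} ms, "
      f"so a few ms of skew is a big chunk of one frame")
```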

[QUOTE=Auzner;243566]
I don’t know what you’re talking about with SLI and Crossfire generally not being supported. Just about any site that reviews hardware will show the performance increase figures for running two cards. Games that are just crappy console ports may have issues, but they never cared about the PC market in the first place. When you add a third or fourth card, that’s usually so you can crank up AA or run PhysX. They don’t really net you a higher frame rate. http://www.guru3d.com/article/radeon-hd-5770-in-3way-crossfirex-review-test/7
http://www.guru3d.com/article/radeon-hd-5770-in-3way-crossfirex-review-test/10[/QUOTE]

I did my research ahead of time. Basically, I wanted something more powerful than a GTX 480 for the same price. I don’t expect to run every game on the market maxed, but I do expect it to work. Maybe it is just my motherboard, but I only had one choice (AM3 + DDR3 + SLI = MSI NF980-G65). In my testing, I have received visual artifacts and tearing in the following:
LOTRO
BF2142
MOH 2010
TF2

Oddly, the much MUCH older Tribes 2 works flawlessly.
BTW, games I tried that are fine with SLI:
Shattered Horizon
Crysis Demo
Cryostasis Demo
Alien Swarm
Any Nvidia Demo
Heaven Benchmark (flawless, and exactly double performance :smiley: )

Just Cause 2 only works in windowed mode; if I fullscreen it, I get tearing.

That is my personal experience. If I came across as pushy and sarcastic, that’s because your post seemed that way to me. Text is not the best purveyor of emotion.


(master[mind]) #10

Oooh, if this is true, it will be awesome. I do hope they’re using some nice shaders though. From the screenshots, the poly count is nice and high.


(brbrbr) #11

Sure, it’s kind of like saying “CoD graphics is not MoH graphics”.
Technically correct, but not a complete answer.
The point is, it’s based on and built around similar principles/ideas/math [shaders, potentially movable across the GPU/CPU stream], with some modifications/variations/augmentations/advancements/forks.

But returning to the topic, I repeat: spending money on a graphics card with only 1GB of onboard memory is [ironically] more a waste of money than a saving of it in 2010 and later.
Yeah, ATI cards with 2GB of onboard memory are ridiculously overpriced [about ~$150 over the 1GB versions], but that’s mainly because of poor sales of those versions and some over-promoted “features” in the ATI drivers, such as fast Z-buffer clearing and Hyper-Z/hierarchical Z-buffer [with tweakable Z-compression]. Together with the bottlenecked in-silicon design of the last two generations of GPUs, those only make ATI’s software and hardware behave more jerkily and less predictably, in exchange for marginally saved memory bandwidth in some apps. Hilariously and ironically, the apps that benefit most from these [over-promoted] PR features [from the 199x years] suffer the most visually from them, like ArmA/OFP:R or X3:R.

Or in short: go on, buy NVIDIA. It’s less cheap, but there’s no crap inside.
And you don’t need to spend extra money on FireGL/FirePro versions of a card to get access to [more] sensibly written drivers, let alone a proper OpenGL implementation.


(Auzner) #12

I don’t have issues running Crossfire. I provided links to others who didn’t either. The PCIe lanes do not need to match speed. Most setups running triple or quad will not have matching speeds. You don’t know what you’re talking about and are just overgeneralizing everything to speculate about your own issues. Pretending you know all about the architecture and have pinpointed design flaws doesn’t help anyone.

Your power supply is inadequate and/or you don’t have proper cooling, and running two video cards demands a lot of both. Overclocking will also cause artifacts. You should look into those things instead of posting threads insisting that SLI and Crossfire are broken and unsupported.


(master[mind]) #13

[QUOTE=Auzner;244019]I don’t have issues running Crossfire. I provided links to others who didn’t either. The PCIe lanes do not need to match speed. Most setups running triple or quad will not have matching speeds. You don’t know what you’re talking about and are just overgeneralizing everything to speculate about your own issues. Pretending you know all about the architecture and have pinpointed design flaws doesn’t help anyone.

Your power supply is inadequate and/or you don’t have proper cooling, and running two video cards demands a lot of both. Overclocking will also cause artifacts. You should look into those things instead of posting threads insisting that SLI and Crossfire are broken and unsupported.[/QUOTE]

I laughed, though only briefly, mostly at your rather confrontational manner. But the unrelated personal issues aside, I am not as clueless as you think. Allow me to expound:

A. I have spent enough time building/tweaking/overclocking/researching/programming/setting up/fixing computers to know that what I say comes from logical research and fact. Though you will most likely disagree at close to this exact moment, the further points should alleviate the doubt to some extent:

B. More PCIe lanes = more speed, here’s why:
1 PCIe 1.x lane = 250 MB/s (bytes, not bits; bits are considerably different, 8 times different :slight_smile: )
16 PCIe 1.x lanes = 4 GB/s
1 PCIe 2.0 lane = 500 MB/s
16 PCIe 2.0 lanes = 8 GB/s
Obviously, more lanes means a wider bus and greater throughput. Your point is that the cards do not saturate the bandwidth. You are correct. But there is still a slight difference in performance between 8x and 16x. When 500 billion FLOPs (floating-point operations) are executed each second, the difference can be magnified and has to be adjusted for, or at least I assume so. I have witnessed a few games exhibit some tearing at initial boot-up and then sync over time. I assume this isn’t a coincidence. I am actually trying to help those who may experience that issue, if it exists.
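To put those per-lane figures in code form, here is a quick sketch (it just multiplies out the commonly quoted effective rates of 250 MB/s per PCIe 1.x lane and 500 MB/s per PCIe 2.0 lane; real-world throughput is a bit lower because of protocol overhead):

```python
# Theoretical one-direction PCIe link bandwidth from the per-lane rates above.
# Effective per-lane rates after 8b/10b encoding: PCIe 1.x ~250 MB/s,
# PCIe 2.0 ~500 MB/s.

PER_LANE_MB_S = {"PCIe 1.x": 250, "PCIe 2.0": 500}

def link_bandwidth_gb_s(generation, lanes):
    """One-direction bandwidth in GB/s for an xN link."""
    return PER_LANE_MB_S[generation] * lanes / 1000

for gen in PER_LANE_MB_S:
    for lanes in (4, 8, 16):
        print(f"{gen} x{lanes}: {link_bandwidth_gb_s(gen, lanes):.1f} GB/s")

# Prints, among others: PCIe 1.x x16 = 4.0 GB/s and PCIe 2.0 x16 = 8.0 GB/s,
# matching the list above.
```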

C. Yes, the title is general - any suggestions? I am not a Critical Writing major, I am in CS.

D. I have far more power than I need, enough to drop in a 140W Phenom II X6 and a dedicated PhysX card of some form. I have tested my system at 100%, running the FurMark test and OCCT simultaneously. I have a Corsair 750W power supply; its voltage never dropped below 12.04V, which is far more than sufficient considering the default voltage is 12.14V. That was with both 465s and my 955 at 100%. I did the wattage math ahead of time: I have around 200W to spare under full load. FULL load. That includes a rough 150W for the motherboard, hard drives, optical drive and fans. You seem to be overgeneralizing my issues. I have done tests far more extensive than you can imagine. I have used the cards in a synced manner (16x/16x) and still had the same issues. My point about the uneven lane counts was for other people’s troubles, not mine. Mine lies with the fact that certain games will not sync their output, including almost all Source-based games. If you want any more info on my tests/burn tests/benchmarks, please ask. I did not state that SLI and Crossfire were broken; they work great for every NVIDIA demo, so the issues are NOT there. The issues are with the engines and the developers. Yes, I have run those benchmarks (from NVIDIA), and I have run FurMark, maxing my GPU and PSU, with NO TEARING. Please, you seem to have reduced my experience to a point on a timeline, when I have spent months of my time testing this stuff.

If this is not enough information for you, I will direct you to Wikipedia, as it is the only resource with a longer post than this. I simply wanted to make sure the developers of Brink were taking SLI into account. I am not sure how I offended you; that is left up to my imagination. I find it humorous that your experience counts as a point here when mine does not. There are many articles about both working and non-working SLI, so those articles are moot; I would not post any for the reverse either.

I hope this was clear, and I hope your response is not simply commenting on my grammar.

P.S. On a humorous note, there was a study stating that all internet arguments end up mentioning the Nazi party.


(master[mind]) #14

Quote:
Originally Posted by master[mind]
“SLI and crossfire don’t work because I say so.”

Because I definitely said that. See? I expected you to be credible, so I didn’t bother to check your quotes. Good thing a friend pointed out your error while he was perusing the thread.


(Nail) #15

Meh, I’d rather have the work done on multi-core utilization than on multi-card; more games are getting away from being solely GPU dependent. Hell, id Tech 5 will run on an iPhone at 60 fps [linky](http://www.engadget.com/2010/08/12/carmack-blows-minds-with-id-softwares-rage-running-on-iphone-a)


(LyndonL) #16

The link returns to the Engadget homepage, mate.


(master[mind]) #17

The megatexture/virtual texture stuff is awesome, though I still want to see shaders and lighting/shadowing done on the GPU. Cool link though.


(Nail) #18

bah, never been good at that link stuff, maybe

http://gamevideos.1up.com/video/id/30824

text from article

We're sorry, but the Palm Pixi's rendition of Need for Speed no longer impresses us -- we've just seen John Carmack show off Rage for iPhone. While of course it looks nothing like the PC graphical monstrosity that swept the E3 Games Critics Awards, it's safe to say the 60 frame-per-second tech demo at QuakeCon 2010 shoves the cell phone gaming envelope through a Juggernaut-class brick wall. Where Carmack originally called the iPhone "more powerful than a Nintendo DS and PSP combined," the id Software co-founder is now aiming squarely at the likes of PS2 and Xbox with iPhone 4 hardware. Not impressive enough? He says it still "runs great on an original 2G iPhone" as well. VG247, who liveblogged the event, reports the title will be available in the App Store later this year for a relatively inexpensive price, with a second game available in time for the PC game's 2011 launch. Sadly, there's as of yet no plans for Android owners to get the same megatexturing goodness. Don't miss the video after the break, because this screenshot doesn't do it justice.


(Nail) #19

bah, never been good at that link stuff

try
http://www.engadget.com/2010/08/12/carmack-blows-minds-with-id-softwares-rage-running-on-iphone-a


(master[mind]) #20

[QUOTE=Nail;244268]bah, never been good at that link stuff, maybe

http://www.engadget.com/2010/08/12/carmack-blows-minds-with-id-softwares-rage-running-on-iphone-a[/QUOTE]

Worked for me, maybe a DNS issue? As long as something works. So is Brink using id Tech 5 or 4?

Edit: Nvm, it’s 4.