What should I buy? (ATI or Nvidia)


(Ramsgaard [DK]) #21

I've got approx. $250

Should be enough for a 9700 Pro

Or is it?


(Garfo) #22

I am getting a Hercules 3D Prophet 9800 128MB DDR for £225 and sold my MSI FX5600 for £130 :banana: :clap: :banana:


(ScreamZ) #23

I have both the Radeon 9800 XT Pro 256MB and the Nvidia GeForce FX 5950 Ultra, and they're both killer cards. I run the ATI in my main rig and the GF in the LAN rig. Both are good cards; it's just up to you to pick a brand, that's all.


(Fusen) #24

god damn screamz you spent a bit on your rigs


HARLEY-DAVIDSON FXDX


(Ramsgaard [DK]) #25

Thanks for all the replies.

Now I am down to:

Geforce FX 5900 - 256 MB
Radeon 9800 Pro - 128 MB

The GeForce is approx. $25 cheaper than the Radeon in Denmark.

Is the 256 MB important in new games today vs. the 128 MB?

Is the 9800 Pro much faster than the 5900?

And has anyone had problems using the GeForce FX cards with Intel 865G chipsets?


(=ABS= SparhawK) #26

ATI all the way :clap:

QUOTE

By now I'm sure all of you know that Futuremark has released a new version of 3DMark03. This new version disables all of the cheats they knew of in any drivers to date. The info below shows some results for some NV and ATI cards.

GPU    - Patch - Driver  - 3DMark - GT1   - GT2  - GT3  - GT4
5950UL - 330   - 52.16   - 6412   - 205.7 - 46.6 - 37.0 - 37.3
5950UL - 333   - 52.16   - 5538   - 205.0 - 39.6 - 33.1 - 26.3
9800XT - 330   - Cat 3.8 - 6435   - 209.7 - 45.4 - 38.6 - 36.3
9800XT - 333   - Cat 3.8 - 6436   - 210.5 - 45.4 - 38.6 - 36.3

Note that NVIDIA's hardware and drivers lose 15-20%, while ATI's hardware and drivers lose 0%. At this point ATI seems to be completely clean, while NV seems to still have some artificially inflating optimizations in their drivers. This was taken from B3D… thanks to Dave and a few others for posting such info. Patch 333 is the release candidate for 340. I'm sure we'll see more scores pop up soon enough.
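If you want to sanity-check that claim, here is the arithmetic on the scores quoted above; nothing is measured here, it just recomputes the drop between the patch 330 and patch 333 runs:

```python
# Quick arithmetic on the posted scores: percentage drop from patch 330 to 333.
scores_330 = {"3DMark": 6412, "GT1": 205.7, "GT2": 46.6, "GT3": 37.0, "GT4": 37.3}
scores_333 = {"3DMark": 5538, "GT1": 205.0, "GT2": 39.6, "GT3": 33.1, "GT4": 26.3}

for test in scores_330:
    drop = (scores_330[test] - scores_333[test]) / scores_330[test]
    print(f"5950 Ultra {test}: {drop:.1%} lower with patch 333")
# Overall score: about 14% lower. GT1 is nearly unchanged; GT2-GT4 lose
# roughly 11-30%, with GT4 (the DX9 test) hit hardest.
```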

Full write up @

http://www.beyond3d.com/articles/3dmark03/340/

Statements from both companies

“With the introduction of the GeForce FX - we built a sophisticated real-time compiler called the Unified Compiler technology. This compiler does real-time optimizations of code in applications to take full advantage of the GeForce FX architecture.

Game developers LOVE this - they work with us to make sure their code is written in a way to fully exploit the compiler.

The end result - a better user experience.

One of the questions we always get is: what does this compiler do? The unified compiler does things like instruction reordering and register allocation. The unified compiler is carefully architected so as to maintain perfect image quality while significantly increasing performance. The unified compiler is a collection of techniques that are not specific to any particular application but expose the full power of GeForce FX. These techniques are applied with a fingerprinting mechanism which evaluates shaders and, in some cases, substitutes hand-tuned shaders, but increasingly generates optimal code in real-time.

Futuremark does not consider their application a “game”. They consider it a “synthetic benchmark”. The problem is that the primary use of 3DMark03 is as a proxy for game play. A website or magazine will run it as a general predictor of graphics application performance. So it is vital that the benchmark reflect the true relative performance of our GPUs versus competitors.

And, while they admit that our unified compiler is behaving exactly the way it behaves in games and that it produces accurate image quality, they do not endorse the optimizations for synthetic use. Hence, Futuremark released a patch that intentionally handicapped our unified compiler.

So, we advocate that when reviewers are using 3DMark as a game proxy, they must run with the unified compiler fully enabled. All games run this way. That means running with the previous version of 3DMark, or running with a version of our drivers that behave properly.”

Derek Perez
Director of NVIDIA PR
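The "fingerprinting" bit Perez describes is easier to picture with a sketch. This is purely illustrative Python, not NVIDIA's driver code; the table contents, hash choice and function names are all made up:

```python
# Illustrative sketch of the "fingerprinting" idea described above (not real
# driver code): hash incoming shader text and, if it is recognized, swap in a
# hand-tuned replacement; otherwise fall back to a generic optimizer.
import hashlib

# Hypothetical lookup table: fingerprint of a known shader -> hand-tuned version.
HAND_TUNED = {
    "0f6b3c5e...": "mul r0, v0, c0\nadd r0, r0, c1",   # placeholder entry
}

def optimize_generic(source: str) -> str:
    # Stand-in for the generic path: real-time instruction reordering and
    # register allocation that works on any shader.
    return source

def compile_shader(source: str) -> str:
    fingerprint = hashlib.md5(source.encode()).hexdigest()
    if fingerprint in HAND_TUNED:
        return HAND_TUNED[fingerprint]   # recognized shader: substitute tuned code
    return optimize_generic(source)      # anything else: generic optimization only
```

On that model, a truly generic compiler would not care what the shader text looks like, while the substitution path only fires when the exact shader is recognized, so reordering the source (as build 340 does) would silence it.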

Just today we received the following rebuttal from ATI:

“It’s been claimed that Futuremark’s changes have disabled compilers. This is complete nonsense. ATI has had a compiler since CATALYST 3.6 and it didn’t have any problems with Futuremark’s changes. Shader replacement and compilers are completely different operations.

ATI has had a compiler since CATALYST 3.6. We didn’t have any problems with Futuremark’s changes. The new build of 3DMark03 gives an honest picture of the relative DX9 game performance of graphics cards. It accurately reflects what gamers will see with titles such as Half-Life 2, and what they already see with today’s DX9 games, such as Tomb Raider: Angel of Darkness.

Secondly, it is disingenuous to claim that Shader replacement better reflects the performance in games. Only a tiny fraction of games get the attention that 3DMark03 has had from our competitors. It requires too much programming time. Over 300 PC games are launched a year, and 100 of these will really tax the graphics hardware. Maybe a half-dozen - the ones most used as benchmarks - will receive the gentle caress of the driver engineer. An honest run of 3DMark03 will give a true indication of performance for the overwhelming majority of DirectX 9 games. Gamers need to be able to play any game they want; they don’t want to be locked into the six that have had all their shaders replaced.

Even assuming that you somehow found the resources to replace all the shaders in every game, it’s still not a practical solution. Shader replacement is a massive step back for reliability and game compatibility. Every year you’ll be writing thousands of Shader programs that each have to be checked for image quality, taken through QA and supported. And changed whenever the developer issues a patch. Treating every game as a special case is a huge stability issue. Developers have come out against Shader replacement. John Carmack is on record as saying “Rewriting shaders behind an application’s back in a way that changes the output under non-controlled circumstances is absolutely, positively wrong and indefensible.” The opinions of Gabe Newell, Valve Software’s CEO, on Shader replacement are well-known. Developers hate it. What if they release a new level, the gamer downloads it and performance sucks? The hardware vendor isn’t going to get any grief, because all the user sees is the old levels working fine and the new one running like molasses in January. The problem’s obviously with the game, right? Developers are worried about the support nightmare this approach will generate and the damage to their own brand when they get blamed.”

Chris Evenden
PR Director
ATI Technologies

and then later on

"Luciano Alibrandi, European Product PR Manager for NVIDIA Corporation, has made a correction in regards previous information about NVIDIA’s Unified Compiler and 3DMark03 benchmark after getting into details with the company’s engineers. Apparently, the statement claiming that NVIDIA’s Unified Complier deployed to optimize pixel shader performance is disabled by the new version of 3DMark03 is not fully correct.

“I would like to inform you that a part of my response was not accurate. I stated that the compiler gets disabled by 3DMark, and that is in fact not true,” he said.

So, after all, NVIDIA denied the problems between the Unified Compiler technology and the latest version of the popular 3DMark03 benchmark. As a result, we may now conclude that the accusations in Futuremark's direction from Hans-Wolfram Tismer, a Managing Director for Gainward Europe GmbH, were not correct at all.

In October 2003, Santa Clara, California-based NVIDIA Corporation introduced its Unified Compiler, integrated in its ForceWare 52.16 drivers, to optimize Pixel Shader code for the NVIDIA GeForce FX architecture in an attempt to improve the performance of graphics cards powered by NVIDIA's latest GPUs in a variety of demanding applications.

NVIDIA said that the Unified Compiler technology tunes DirectX 9.0 execution on the GeForce FX GPUs, and can be used to correct any similar conflict that arises with future APIs. NVIDIA described the Unified Compiler as an automatic tuning tool that optimizes Pixel Shader performance in all applications, not just specific ones. Officials from NVIDIA again stressed today that one of the things the Unified Compiler does is reorder the lines of code in a shader. Simply doing this can increase performance dramatically, since the GeForce FX technology is very sensitive to instruction order. So, if the re-ordering is not happening, NVIDIA's GeForce FX parts take a performance penalty.

Since the compiler is still active with the new version of 3DMark03, there is currently no explanation for the performance drops of certain GeForce FX parts in the latest build 340 of the famous 3DMark03.

“The only change in build 340 is the order of some instructions in the shaders or the registers they use. This means that the new shaders are mathematically equivalent to the previous shaders. A GPU compiler should process the old and the new shader code with basically the same performance,” said Tero Sarkkinen, Executive Vice President of Sales and Marketing for Futuremark Corporation, the developer of 3DMark03.

His point was indirectly confirmed by an ATI official yesterday, who said: “ATI has had a compiler since CATALYST 3.6. We did not have any problems with Futuremark's changes.”
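To make the "mathematically equivalent" point concrete, here's a toy illustration; this is plain Python standing in for shader code, and both function names are made up:

```python
# Toy illustration only: real shaders are GPU assembly, not Python. Both
# functions compute the same result; only the instruction order and the
# temporaries ("registers") differ, which is the kind of change build 340 made.
def shader_old(a, b, c):
    t0 = a * b
    t1 = c + 1.0
    return t0 + t1

def shader_reordered(a, b, c):
    r1 = c + 1.0        # same math, different order, different temporaries
    r0 = a * b
    return r0 + r1

assert shader_old(1.5, 2.0, 0.5) == shader_reordered(1.5, 2.0, 0.5)
```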

full statement @

http://www.xbitlabs.com/news/video/display/20031114041519.html

SH


(Gump) #27

Supposedly the TI’s are better than the newer Nvidia cards.

I have a Ti4600 and run ET with all the jazz turned on and never have a problem (except for when my virus scan randomly kicks in!).


(RivrStyx) #28

I had a GF4 Ti4600… a really good card. Decided to get a new card. Got the ATI 9800 Pro 256. It was horrible. Framerates were lower than with the Ti4600. You could also see a white outline of the skyboxes on some maps, which they were supposed to fix a long time ago.

I took the card back and got the GF FX 5900 Ultra 256. Kicks ass and frames are never below 100 (I usually cap at 76 and it's steady even on customs)… even on large servers (32 plus players) with graphics maxed.
You need to have the 8x AGP bus to really see what the FX 5900 Ultra can do… if not, and you only have 4x AGP, get the Ti4600, because you may not get much more performance out of the FX 5900 on 4x AGP.

The FX 5200 blows chunks, but the high-end FX 5900 Ultra outdoes the new ATI 9800 Pro with no problem, having tried both. I have the new Intel chipset btw.

Just my opinion, and I like GF.


(=ABS= SparhawK) #29

ATI all the Way :clap:

At the moment an ATI is best, correctly installed and configured.

RivrStyx is brilliant at maps, not so hot on installing hardware :smiley:

Read reviews, learn.

The following generation of cards (in 1 year) will have ATI playing catch-up, as Nvidia have gone through certain knowledge barriers (e.g. 0.13 and 0.11 micron processes and the like).

8X vs 4X vs 1X AGP

Biggest marketing con in video cards ever: on a 9800 with 256 MB RAM, benchmark at 8X and at 1X and the difference will be about 2%.
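Rough numbers back that up. This is a back-of-the-envelope sketch with nominal AGP figures and an assumed ~340 MHz DDR / 256-bit memory config, so take the exact percentages loosely:

```python
# Back-of-the-envelope only: AGP figures are nominal, and the card's memory
# spec is assumed (roughly 340 MHz DDR on a 256-bit bus), not measured.
agp_mb_s = {"AGP 1x": 266, "AGP 4x": 1066, "AGP 8x": 2133}
local_mb_s = 256 / 8 * 340 * 2     # bus width in bytes * clock in MHz * DDR

print(f"Local video memory: ~{local_mb_s:,.0f} MB/s")
for mode, bw in agp_mb_s.items():
    print(f"{mode}: {bw} MB/s (~{bw / local_mb_s:.1%} of local bandwidth)")
# As long as textures and geometry stay resident in local memory, the AGP bus
# only carries comparatively small per-frame traffic, which is why 8x vs 1x
# barely shows up in benchmarks.
```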

Some reviews and info

http://www.hardocp.com/article.html?art=NTUyLDE=

http://www.envynews.com/index.php?ID=534

http://www.beyond3d.com/articles/3dmark03/340/

http://www.rage3d.com/board/showthread.php?s=6c476d816fedfeaa22d92827f76e96ab&threadid=33726125

http://www.rage3d.com/board/showthread.php?s=6c476d816fedfeaa22d92827f76e96ab&threadid=33726271

http://www.rage3d.com/board/showthread.php?s=6c476d816fedfeaa22d92827f76e96ab&threadid=33719852&highlight=agp+bus+speeds


(RivrStyx) #30

Lol… well that may be, but I have 5 systems at my house… the FX and the ATI were originally installed in one that only went up to 4x AGP… neither did too well (although the GF FX still had better frame rates) and the Ti4600 was actually better. I added them to my newer system that has 8x AGP and the GF FX kicked ass… the ATI was a bit better but still not really outdoing the Ti4600. I had a Radeon 9700 a while back and it also didn't live up to the hype, thus I got the Ti4600. I've also heard the AGP bus speed means nothing and maybe a different factor was at play… but the FX was much better than the ATI.

The ATI was installed and configured correctly on my system (I think I can handle that), but the GF FX 5900 Ultra blew it away. The reviews are great and I'm sure there are reviews that go the other way (though I don't care to search for them), but that just wasn't my experience. Nvidia is my preference from personal experience, twice now. Kinda like personal experience makes some like AMD and others like Intel.

Both are good cards and it's a hard choice… the GF FX does better in Doom 3 and the 9800 Pro supposedly does better in HL2.


(=ABS= SparhawK) #31

Agree to Disagree

Love the maps

P.S. have 12 systems @ home, is mucho sad.

9700
9700
9700
8500
8500
gf3ti200
gf3ti200
7500
gf4 4200
xfx 128MB on board thing x 3


(RivrStyx) #32

Thx
:eek3: …You have 12 systems at home? I thought having 5 was bad… :banana:


(=ABS= SparhawK) #33

Clan night is thursday. :clap:

Best part of 14 machines playing ET LAN match. :eek:

+cough+
“Its not just a game, its a way of life”
+cough+

He He

SH


(amazinglarry) #34

Well my friend, if one of your choices is a GeForce FX 5900 Ultra 256MB, why not just go with the best one out there currently? And it only takes up one slot!

The Asus V9950 Ultra! w00t. This card is amazing.

http://www.sharkyextreme.com/hardware/videocards/article.php/3079811

$369 here at NexCom Direct with free shipping.

http://www.nexcomdirect.com/itemdesc.asp?CartId=7607277BZWCS-ACCWARE-824&ic=VGAASUV9950U&cc=&tpc=

$406 at www.newegg.com if you want a more reputable vendor.

It's roughly 15-50 bucks more expensive than the 5900 Ultra 256MB, and it outperforms it. Check it out.

BTW I've heard of no such problems with any Nvidia cards on any chipsets. From my own experience I've never had problems with any Nvidia card, although a few friends of mine have recently had problems with their 9800… one has already returned it and is picking up a V9950 128MB card, and the other is doing the same when he gets a few more bucks.

Good luck with your decision… in the end, ATI and Nvidia are both excellent card manufacturers; it really all comes down to the very little things, such as: if you want a few more FPS at the lowest resolutions, go ATI; if you want higher FPS at higher resolutions, go Nvidia… things like that. IMHO they basically cancel each other out and both work out to be great cards. All comes down to personal taste I guess, like color. :smiley:


(duke'ku) #35

I recently got a card very similar to this (same chipset, different company), and I'm very satisfied with its performance… 43 fps steady on Radar at 1024x768 makes me leap with joy.


(Bokkem) #36

You might want to check out this review on Tom's Hardware's site:

In both 'low' and 'high' settings the Nvidia cards perform best with ET.

I recently bought an FX 5900 Ultra (€539,-) and I am very satisfied with it.
It's fast, cheaper than high-end ATI Radeon cards, and quiet.

However, you should be aware that there's an issue with the current version of PunkBuster and the latest (WHQL) ForceWare driver (52.16) used in the test.