GeForce Ti4800 vs. GeForce FX 5200/5600

Some Llama

Hey guys,

I'm going to be building a new system, and as far as video goes I just want a basic card with performance on par with a GeForce4 Ti 4600 or 4800. I noticed that for less than the price of a 4800 I can get an FX 5200 or 5600 with 256MB of RAM, so my question is:

Will I get better performance from an FX card or from the older GeForce4 4600/4800 card?

If so, what advantages does the FX have over the GeForce4 cards?

Keep in mind I know there are better video cards out there, but on my limited budget I can wait until prices come down before getting a 9800 Pro or the like.

Thanks in advance,
Some Llama
 
A Ti4600/4800, and even a Ti4200/4400/4800SE, will completely destroy the 5200/5600 performance-wise.
 
Then what is the point of releasing a "newer" product that performs worse than an "older" product? Even if it adds new functionality like pixel shaders, if it runs like crap, what's the point?


I ask to get some idea of what exactly this "newer" model is supposed to bring to the table...


Also if we throw this one in the mix, where would it come out?
GeForce FX5700 128MB DDR
 
Then what is the point of releasing a "newer" product that performs worse than an "older" product? Even if it adds new functionality like pixel shaders, if it runs like crap, what's the point?
Yeah, nVidia had some problems with those cards. The FX 5200 is basically a DX9-enabled MX440, the 5600 isn't great either, and there is no point in getting the 256MB version at all. If you want a midrange card, get a 9800 Pro; they're only about 50 bucks more than the 5700. Or maybe try a 5700 if the machine isn't going to be used much for gaming. That's my mindless yammer.
 
Originally posted by Some Llama
Then what is the point of releasing a "newer" product that performs worse than an "older" product? Even if it adds new functionality like pixel shaders, if it runs like crap, what's the point?


I ask to get some idea of what exactly this "newer" model is supposed to bring to the table...


Also if we throw this one in the mix, where would it come out?
GeForce FX5700 128MB DDR

Price, and phasing out the old product line. A 256MB card also looks more impressive than a 128MB one to technically ignorant buyers, even if the card with the larger memory is inferior.
 
Originally posted by V0ltage
5700 is worthless due to the 5900NU/SE/XT being so cheap.

"Being so cheap" means different things to different people. Even though it's under $200, it's still listed at upwards of $50 more than what I could get a 5700 for...

Thanks to everyone for their comments; this has helped me in my search for knowledge... Unless someone can give me a good reason otherwise, I will most likely get the best GeForce4 4800 I can on my limited budget...

cheers,
Llama
 
Don't - I tried it and didn't like it.

Problem with the Ti4600-8x (which is what the Ti4800 actually is: a Ti4600 with AGP 8x) is that its anti-aliasing performance is just awful. And it doesn't look so great, either.

Seriously, though, WHERE are you seeing a 5900XT for only $50 more than a 5700 Ultra? Looking on Newegg, they sell the 5900XTs (and SEs) pretty cheap. For example, a 5900XT, with video-in to boot, for $185.

Compare that to $174 for a 5700 Ultra. About $11 more for sometimes double the performance?

Now, if you were looking at 5700 non-Ultras, I can see your point. They come as low as $139, which isn't bad.

In any case, definitely DO NOT look at FX 5200s and FX 5600s.

I'd recommend an FX 5700 over a Ti4800, if you can afford it. The FX 5900 XT is the best bet, though.
 
To support my above argument, check out these charts over at Tom's.

You can get a pretty good feel for how the cards stack up here. Note that the previous pages are nearly useless - covering all kinds of performance with no anti-aliasing and no aniso-filtering. Who spends $100s on a graphics card and DOESN'T use anti-aliasing and aniso-filtering? Doesn't make sense! (It's worth noting that a Ti4600-8x does very well in those tests, but, again, who plays games like that now?)

The linked page, and the following two after it, cover the charts you'll want to pay attention to.

(It's also worth pointing out that, if you don't NEED nVidia, the 9600 XT can pull nearly-5900XT performance in some situations, and at least 5700 Ultra performance... and it's $149 on Newegg.)
 
It's also worth pointing out that, if you don't NEED nVidia, the 9600 XT can pull nearly-5900XT performance

No it can't. It has been proven about a billion times, in about 30 different threads and about 6 different benchmarks that I have seen.
 
Then perhaps you should look at the benchmarks I've just posted that prove that point.

Without FSAA and 8xAniso on, then no, it can't. But, again, who pays more than $100 for a graphics card and doesn't use those features?

ATI's FSAA and aniso algorithms are faster and look better than nVidia's, which allows the 9600 XT to *sometimes* pull even with the 5900 XT. Note that we are talking about the XTs of both - basically the highest, top-end 9600 core vs. the lowest, low-end 5900 core.

Please read my post again before posting knee-jerk responses. I noted that SOMETIMES the 9600 XT can put up NEARLY 5900 XT performance.

I'd be very surprised if you could find benchmarks to prove that wrong.
 
Yeah, I was looking at the 5700 non-Ultra, and I don't need or really care about AA. My budget is about $130-150, and that's the limit I will pay for a card. The 9600XT does look good, though, and it will outperform a GeForce4 4800 for 10 bucks more... I might just be getting a 9600XT...

What I care about is getting 60+ fps in any game running at 1024x768, and being able to play ANY game that comes out in the next year or so at those frame rates...

If the 9600XT can do this for under $150 then I'll get it. If I need to spend more than that to enjoy the latest graphics, I'd rather wait until a card that supports the latest DX9 features and such falls under $150...
 
What I care about is getting 60+ fps in any game running at 1024x768, and being able to play ANY game that comes out in the next year or so at those frame rates...

Wellll... keeping in mind that in-game graphics settings can always be lowered, that should be doable on a 9600XT with no FSAA or aniso. SHOULD being the operative word. Until Doom3 and HL2 hit, there is just no telling for sure.

I mean, a 9600XT is still pretty high-end (or high mid-range, anyway). Valve and id would not be so foolish as to make their engines so demanding that a 9600XT couldn't achieve 60fps at ALL.

Question is...what do you have to turn off to GET that 60fps at 1024x768? And, will it be 'too much' and make the card feel worthless?

The short answer is that '60fps at 1024x768 for all games in the next year' is always an unrealistic demand. Sometimes it works out (Radeon 9700); most of the time it doesn't (even the 9700 doesn't get that in Morrowind with max FSAA and some in-game settings cranked). And the 9700, at release, was definitely out of your budget.

Essentially, for $150-200, you can get a card that will play currently released games okay, and probably new ones for 6 months or so, but no more. And that won't change by waiting. Three months from now, when the budget versions of the NV40 and R420 generation are out, THEY will only be able to play current games okay, and maybe 6 months or so of new ones.

What you need to do - rather than say 'all games for 1 year' - is identify the one or two games you are actually interested in, wait till THOSE games come out, and see what graphics card you can afford that will play them.

Waiting for 'the next generation of graphics card that will play future games great' will always be a losing proposition, because graphics card technology will never catch UP with games, let alone get a year's head start on them.

Find what games you want to play, THEN find the graphics card for them. Not the other way around.
 
I don't agree. When I bought my GeForce4 Ti4200 (I have a "LAN room" with 4 comps and different hardware), I was able to play every game I had at maximum settings at 1024, full color, and always get 60+ fps (usually a lot more than that, e.g. 100+). Even with UT2003 coming out 6 months later, and with COD just a few months ago, I can still get away with 60+ at medium-to-high settings at 1024. At the time, the Ti4800 was one of the best cards out there and the 4200 was around $150 (plus being able to OC it to 4400+ speeds was a nice factor), so this card has lasted me 1 1/2+ years and still seems to be holding its own against any new game that comes out (although I wonder about HL2 and Doom3, because I will be hitting those bad boys for sure...).

Now, if we are talking WITH AA or aniso, then that's a different story, as my card kinda chugs when I use those options. But for me, AA looks lame and makes things look "fuzzy" (try it in Jedi Knight 2 or Jedi Academy with 2-4x AA and aniso) and not as crisp and clear... but I guess that is personal preference.

When I buy a card, I expect it to play every current game well (within my specs for playing games) plus be able to perform well on games released 1-2 years from then (with a realistic understanding that some functionality and/or quality might suffer).
I base these expectations on the previous performance and abilities of cards and software that I have bought and used... I don't think I'm being all that unrealistic to expect my card not to be obsolete the day I buy it (which is what you seem to be saying), and I don't think we should expect that.
 
(with a realistic understanding that some functionality and/or quality might suffer)

That was part of my point. Functionality and quality suffer.

For me, spending $100+ on a video card and not using FSAA doesn't make sense, so that '60fps for the next year of games' just isn't gonna happen.

As to your opinion on FSAA....well, I'm sure it'd change if you used a modern card.

The FSAA on the GF4 kinda sucked - it was okay for its time, but modern FSAA algorithms look MUCH better.

As I understand your posts, you HAVE a Ti4200 and are LOOKING AT a Ti4800? Keep in mind that a Ti4800 is nothing but a Ti4600 with AGP 8x (and the Ti4800 SE is just a Ti4400 with 8x), so it's not a huge step up from the card you already have.

Even so, I don't think you will find this a good upgrade. The Ti4600 just isn't THAT much faster than a Ti4200, and AGP 8x offers no performance increase at all.

Getting an FX 5700 (at least) or a Radeon 9600XT would be a much better improvement - noticeable, in fact.

And the thing with those cards is that 2xAA is almost 'free', and 4xAA is generally usable in all current games. Once you've seen 4xAA on a Radeon, you'll NEVER play games without it again. It just makes a night-and-day difference.
 
Well, I was interested in seeing TruForm on ATI's cards, as from the screenshots I've seen it looks pretty sweet, and I do plan on getting a Radeon 9800 Pro 256 or better eventually. But yeah, for now all I need is a card with basic functionality (like the GeForce4 or better) for under $150... better if $130 or less, since I am building a brand new system and the 20 bucks I save here and there will let me get lower-CAS RAM or a bigger HD, etc.

Thanks for your input, though; it has really helped me get some useful information :)
 
Originally posted by Some Llama
I don't agree. When I bought my GeForce4 Ti4200 (I have a "LAN room" with 4 comps and different hardware), I was able to play every game I had at maximum settings at 1024, full color, and always get 60+ fps (usually a lot more than that, e.g. 100+). Even with UT2003 coming out 6 months later, and with COD just a few months ago, I can still get away with 60+ at medium-to-high settings at 1024. At the time, the Ti4800 was one of the best cards out there and the 4200 was around $150 (plus being able to OC it to 4400+ speeds was a nice factor), so this card has lasted me 1 1/2+ years and still seems to be holding its own against any new game that comes out (although I wonder about HL2 and Doom3, because I will be hitting those bad boys for sure...).


Same. I love my Ti4200. Glad I spent the money all that time ago..
 
Looks like I'm gonna go for the 5900XT from nVidia; it outperforms the 9600XT at basically the same price. Now if only I can find it for under $150...
 