Get A 7800 GT/X now or wait for ATi...?

Someone show me some facts that prove ATi's adaptive AA is superior to nVidia's TSAA/TMSAA. Don't tell me "so-and-so reviewer thought it looked nicer," because that is just one reviewer's opinion based on his own configuration and tastes. I realize ATi has a better AF implementation (trylinear and their new angle-independent mode), but in MY opinion both are equal with respect to AA. So sure, if you want the best AF possible, get yourself an ATi card, but in the process you cheat yourself out of the better driver support (I'll get to that) and OpenGL performance that you get with nVidia cards.

Regarding driver support, people tend to think of drivers and video cards as a whole as confined to gaming performance, when they are not. You have to look at their functionality as a whole and judge which is better. The nVidia drivers give the user the ability to set custom timings for their display, which the ATi drivers do not. This is vital for a Dell 2005FPW user like me who enjoys 75 Hz at native resolution, because that is currently not possible with ATi drivers. There is also PVP, which is superior to ATi's implementation, as well as Digital Vibrance support (which I still like using for games) that a lot of ATi users wish they had. Also, and this is my opinion, nVidia drivers have a much better and cleaner interface than ATi drivers do.
 
Is it me, or is the quality thing a moot point? (No, not a "mute" point.) Hell, the 9800 had MUCH better quality than the 5900. I bet the x800 had better quality than the 6800. It's just really an ongoing battle. It's really no surprise that the 7800 doesn't quite have the high quality of the x1800.

But, it was THERE.

Now, could somebody straighten me out here...? HardOCP's review, which in my opinion is the most trusted source, showed the x1800xl was less able to play some games than the 7800 was. It was close, but less able to play, sometimes by 10 fps. Now, what good is quality if it's a slideshow?

I really have a problem spending $100 more on slightly better quality, yet LOWER performance.

Some of you say that the x1800 series is a humongous performer... maybe in benchmarks. Unless [H]ardOCP sucks at reviewing, or had terrible drivers (yeah, right), then in games, where it matters, the nVidia solution is your best bet, especially in price/performance.

Allow me to also demonstrate my marvelous math knowledge. Let the x1800xl be x and the x1800xt be x+1. The same for the 7800s, except with y.

Now, if x < y, then it follows that (x+1) < (y+1).

I may be totally wrong about that, but it would be nice to have it proved to me.
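For anyone who does want it proved: the step being used is just that adding the same number to both sides preserves a strict inequality. A quick sketch in plain algebra, using x and y as defined above:

```latex
x < y \;\Longrightarrow\; y - x > 0 \;\Longrightarrow\; (y+1) - (x+1) = y - x > 0 \;\Longrightarrow\; x + 1 < y + 1
```

So if the XL-tier comparison goes one way, the XT-tier comparison goes the same way, assuming "+1" really means the same size step up on both sides, which is the shaky part of the card analogy.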

 
PRIME1 said:
ATI FUD.

In the reviews I read, there was no mention of a big difference in IQ between the cards.
Hard to tell IQ when you are doing benchmarks under a time crunch. There is a different feel with ATi versus NV.
 
R1ckCa1n said:
Hard to tell IQ when you are doing benchmarks under a time crunch. There is a different feel with ATi versus NV.


I completely agree with you on that one. The ATI vs. NVIDIA feeling is hard to explain unless you have experienced it. In my experience, the IQ on ATI cards has been better in the past.
 
apHytHiaTe said:
Allow me to also demonstrate my marvelous math knowlege. Allow the x1800xl to be x and the x1800xt to x+1. The same for 7800's, except with a y.

Now, if X<Y, then it follows that (X+1)<(Y+1).

I may be totally wrong about that, but it would be nice to have it proved to me.

lol... that's algebra :p

Anyway... I just went ahead and ordered an X1800XL. The reason is that I prefer IQ.
I have a 6800 GT and that was a disappointment... shimmering.

I would have gone the 7800 route, but shimmering still exists on those cards.

I wouldn't have ordered one if it weren't for Newegg's no-interest-for-6-months deal.
 
Bigjohns97 said:
The fact is that ATI's AF and AA methods are still better than nvidia's, as seen above.

There is another pic out there that shows the AA.

Saying the AF difference noted above would be a choice is not really a strong argument, to put it nicely. You prefer your textures to be blurred by distance like above??? :confused:

Isn't that the way it looks in real life? The farther away something is, the blurrier it gets. It may be part of the reason Nvidia users don't seem to notice these things: because they look more realistic, you might not notice that another card would show them sharp. Not everything can be measured the same way.
 
apHytHiaTe said:
Is it me, or is the quality thing a moot point? (No, not a "mute" point.) Hell, the 9800 had MUCH better quality than the 5900. I bet the x800 had better quality than the 6800. It's just really an ongoing battle. It's really no surprise that the 7800 doesn't quite have the high quality of the x1800.

But, it was THERE.

Now, could somebody straighten me out here...? HardOCP's review, which in my opinion is the most trusted source, showed the x1800xl was less able to play some games than the 7800 was. It was close, but less able to play, sometimes by 10 fps. Now, what good is quality if it's a slideshow?

I really have a problem spending $100 more on slightly better quality, yet LOWER performance.

Some of you say that the x1800 series is a humongous performer... maybe in benchmarks. Unless [H]ardOCP sucks at reviewing, or had terrible drivers (yeah, right), then in games, where it matters, the nVidia solution is your best bet, especially in price/performance.

Allow me to also demonstrate my marvelous math knowledge. Let the x1800xl be x and the x1800xt be x+1. The same for the 7800s, except with y.

Now, if x < y, then it follows that (x+1) < (y+1).

I may be totally wrong about that, but it would be nice to have it proved to me.


Well, Anandtech's review showed the ATi cards in a better light for the most part, if you tally things up anyway. The ATi cards were almost always faster with AA. Without AA, a lot of games were still a touch faster on the ATi cards. In the cases of Doom 3 and Chronicles of Riddick, though, the ATi cards got HAMMERED by the 7 series.

BF2 w/AA=With AA the ATi cards were definitely faster, even at higher resolutions.
BF2 w/o AA=Practically a draw until you pass 1600x1200, where the 7800 is CLEARLY faster.

Overall=ATi rules.

Day of Defeat Source w/o AA= nVidia
Day of Defeat Source w/AA=Pretty even; some resolutions turn the tide one way or the other.

Overall=Equal

Doom 3 w/o AA=ATi gets raped.
Doom 3 w/AA=ATi gets raped.

Overall=nVidia. This game proves that nVidia is the OpenGL king, as if we didn't know that. The differences here are interesting, because they show that ATi's OpenGL deficiencies are a lot worse than any deficiencies in nVidia's DirectX implementation.

It's very odd here: if you look at the performance percentages lost by enabling AA, you can see that nVidia doesn't lose as much performance doing AA in OpenGL, just as ATi doesn't lose as much in Direct3D as nVidia does.
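The "performance percentage lost by enabling AA" comparison above is easy to make concrete. A minimal sketch; the frame rates below are made-up numbers for illustration, not figures from any review:

```python
def aa_cost_percent(fps_no_aa: float, fps_aa: float) -> float:
    """Percentage of frame rate lost when AA is switched on."""
    return (fps_no_aa - fps_aa) / fps_no_aa * 100.0

# Hypothetical numbers, purely illustrative:
card_a = aa_cost_percent(100.0, 80.0)  # drops 100 -> 80 fps
card_b = aa_cost_percent(90.0, 63.0)   # drops 90 -> 63 fps

print(f"Card A loses {card_a:.0f}% of its frame rate with AA on")
print(f"Card B loses {card_b:.0f}% of its frame rate with AA on")
```

Comparing relative loss like this is fairer than comparing raw fps deltas, since a 10 fps drop means very different things at 40 fps than at 120 fps.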

Far Cry w/o AA=Equal (a stone's throw from each other; testing variance could account for the scores swaying either way to favor one card or the other.)
Far Cry w/AA=ATi is again the victor here, but not by much at all. There is no practical difference here.

Overall=Doesn't matter, both are just about equal in this game now.

Chronicles of Riddick isn't shown with separate AA and no-AA data. However, it has the nVidia card completely manhandling the ATi cards, worse than even in Doom 3.

Overall=nVidia.

SplinterCell 3 w/o AA= ATi wins here, but not by much.
SplinterCell 3 w/AA= ATi wins again.

Overall=ATi. Not massive differences, but ATi clearly wins out.

What is interesting is that people fight about these things instead of looking at the data, so let's break it down logically.

The ATi cards are faster by a SMALL PERCENTAGE in most games. However, in the games where nVidia does better, it RAPES the ATi cards by a HUGE margin. (Not a lot of titles are like this, though.) What does this tell us? It tells us that ATi's OpenGL is worse than nVidia's Direct3D. Image quality shifts from driver version to driver version; what is true today may not be tomorrow. Neither has bad image quality. At least we don't have to have really crappy skies in games with nVidia cards just to get higher frame rates. Most of the time, most people won't notice differences in image quality; those that really do are few. Granted, the [H] population is made up of nothing but exceptions to the rule. Generally, though, nVidia's image quality is good, and ATi has had problems with IQ in some games from time to time as well.

The thing right now is that nVidia cards are cheaper, more readily available, and great overclockers. The nVidia cards can be used in SLi TODAY. You can even get new AA modes with SLi enabled; 16x AA is pretty damned impressive, although many newer titles can't run it at the super-high resolutions we usually game at, 1600x1200 for example.

Overall, what is important is that you choose the card that works best in the games you play, at the price you are willing to pay. Bang for the buck is with nVidia right now, no doubt, and Crossfire motherboards are still either MIA or damn close to it. So today nVidia still has the fastest solution on the planet that you can actually get; the X1800XT isn't available.

The performance of these cards is so damn close most of the time that I don't see why arguments about them even come up so much.

If you play OpenGL games, get nVidia. If you play mostly Direct3D games, go with ATi. It's pretty simple, and this is the EXACT same situation we've been dealing with for YEARS. It doesn't seem like it's going to change any time soon.

My advice is to buy the best you can afford when you are in the market and not feel bad about it. Something better is always just around the corner; get what you need when you need it. Right now, that's nVidia. Two months from now there might be more ATi cards available and the X1800XT might be easy to get, but I imagine the best bang for the buck is still going to be with nVidia for a while.
 
R1ckCa1n said:
hard to tell IQ when you are doing benchmarks and under a time crunch. there is a different feeling with ATi versus NV

I agree, most of the sites had no IQ analysis at all. Kudos to HardOCP taking screenshots and spending a few pages on IQ analysis.
 
Particleman said:
I agree, most of the sites had no IQ analysis at all. Kudos to HardOCP taking screenshots and spending a few pages on IQ analysis.

Amen.

And to Sir-Fragalot, thank you for the well-written post; points well taken. I thoroughly agree with you. I am somewhat surprised about ATI doing so well; I guess the [H]'s review threw me off. Not that it's a bad one, but I just took too much out of it.

 
apHytHiaTe said:
Amen.

And to Sir-Fragalot, thank you for the well-written post; points well taken. I thoroughly agree with you. I am somewhat surprised about ATI doing so well; I guess the [H]'s review threw me off. Not that it's a bad one, but I just took too much out of it.


Well, [H]ard|OCP does have unique testing methods. I am not saying they are wrong, but I don't have all the details about what they might have done vs. the other sites. What I did notice is that Anandtech probably did exactly what most sites do and compared exactly the same settings on each card, not leveraging the strengths of each card at all.

For example, in the Anandtech review they ran benches at your standard settings, like 1600x1200 with 4x Anti-Aliasing and 16x Anisotropic Filtering. Pretty standard; both cards can do it, so that's what they tested. But that doesn't really show you what each card can do, and it doesn't leverage the new image quality features added to the cards, like nVidia's supersampling and multisampling transparency AA modes, and it doesn't touch ATi's Adaptive AA at all either.

Keep this in mind: the [H] isn't comparing apples to apples here. More like oranges and tomatoes. This is not a bad thing. The [H] review is showing you each card in its best light, and pitting them together to try and tell you which card is the better experience. If you read this particular part of the article: "In Half-Life 2, the gameplay experience benefits tremendously if you can enable the highest form of anti-aliasing, especially if there is an option for anti-aliasing alpha tested textures. With both the ATI Radeon X1000 and GeForce 7 series of cards, there is that capability. There is another quality feature the X1000 series has that the GeForce 7 series does not and that is High Quality anisotropic filtering. In Half-Life 2, there is a large use of high quality textures, which can benefit from the best filtering quality.

We found that we could enable both Adaptive AA and High Quality Anisotropic filtering in this game with very fast performance with the Radeon X1800 XL. We did some testing to see how much High Quality AF hurt performance and we found only a difference of around a few FPS, so this isn’t a huge burden on this video card. We found that 4X AD AA and 16X HQ AF were very playable at 1600x1200. If you look at the frame rates, the average was very high.

The BFGTech GeForce 7800 GT OC allows 4X TR SSAA at 1600x1200 with 16X AF. Performance was higher on the BFGTech GeForce 7800 GT OC; however, image quality was better on the Radeon X1800 XL with its gamma correction AA and HQ AF, and the frame rates were more than playable."

Clearly, we can see here that they are showing you that in HL2 the 7800s are FASTER, but the ATi cards have better image quality. They are telling you something the other websites aren't, and the [H] is quantifying the results in a way other sites haven't.

A lot of people have attacked the methods Brent and Kyle use for video card reviews. Because they don't use the "apples to apples" method, a lot of people get confused by the review. Really, though, this is one case where you are getting extra, valuable information.

For example, if you are an HL2 nut, then you get to choose higher FPS with the GeForce 7 series, or better IQ while still having badass FPS with the ATi card. You get to see which is REALLY better and which offers the better gameplay experience.

This is something Kyle has repeatedly tried to beat into people's heads when their reviews are questioned, and it is why this review paints a different picture than Anandtech and Tom's Hardware do. The catch is that you have to read the information rather than just examine the numbers and compare them at face value.
 
OK, without all the hate floating around: pick whatever card you want. There are no huge differences between the cards except the price. The Nvidia 7800 series will cost up to 150 dollars less in the 7800GT vs. X1800XL matchup; for the 7800GTX vs. X1800XT the price difference isn't known, since the XT isn't out yet, so you can't really compare them...

So the best bang for your buck is the 7800GT or 7800GTX, period. If you want an ATI X1800XL, then buy it if you can find it. There isn't much difference between the brands; Nvidia's is more mature, yes, and has other pros as well. The IQ thing is more of an opinion, really, and comes down to what you feel is better.

Choose whichever you want: money-wise, Nvidia; if you want to go ATI, then go ATI.
 
It doesn't look like they used supersampling transparency AA for the Nvidia screenshot in that comparison. If it were on, the railings should look similar to the ATI one (and there would similarly be a very noticeable drop in FPS).

But what's the deal with Nvidia's AF quality as shown by that sloped wall? Isn't this something they can change with drivers?
 
Topweasel said:
Isn't that the way it looks in real life? The farther away something is, the blurrier it gets. It may be part of the reason Nvidia users don't seem to notice these things: because they look more realistic, you might not notice that another card would show them sharp. Not everything can be measured the same way.


No, that is not how it looks in real life with 20/20 vision.
 
5150Joker said:
Someone show me some facts that prove ATi's adaptive AA is superior to nVidia's TSAA/TMSAA. Don't tell me "so-and-so reviewer thought it looked nicer," because that is just one reviewer's opinion based on his own configuration and tastes. I realize ATi has a better AF implementation (trylinear and their new angle-independent mode), but in MY opinion both are equal with respect to AA. So sure, if you want the best AF possible, get yourself an ATi card, but in the process you cheat yourself out of the better driver support (I'll get to that) and OpenGL performance that you get with nVidia cards.

Regarding driver support, people tend to think of drivers and video cards as a whole as confined to gaming performance, when they are not. You have to look at their functionality as a whole and judge which is better. The nVidia drivers give the user the ability to set custom timings for their display, which the ATi drivers do not. This is vital for a Dell 2005FPW user like me who enjoys 75 Hz at native resolution, because that is currently not possible with ATi drivers. There is also PVP, which is superior to ATi's implementation, as well as Digital Vibrance support (which I still like using for games) that a lot of ATi users wish they had. Also, and this is my opinion, nVidia drivers have a much better and cleaner interface than ATi drivers do.

Don't forget the application profiles, and the ability to clamp LOD and force trilinear. Nvidia drivers allow much more control out of the box, and paired with nHancer the implementation is much better than ATI's.

My beef is that I enjoy AF more than any other form of IQ improvement, and it pisses me off that ATI's is so much better. I did load HL2 last night trying to find that tunnel, to see if I had the same issue. I know that if you don't set HQ and turn all opts off globally, some people have experienced this IQ loss.

PS - I'm not sure if your AA comment was directed at me, but I never mentioned those forms of AA; I am just referring to AA as selected in game (BF2, HL2) and what I have seen.
 
Asian Dub Foundation said:
get a GTX now and wait for the XT to come down in price :)

Get a GTX now, smile every day you see people waiting for the X5800XT to actually ship, then smile at what it costs, and then smile at Crossfire mechanics. When people say "My BF2 score is a little higher," reply, "Oh well. My Riddick, Doom3, Q4, QuakeWars, Prey, and RTCW2 scores are a little higher, and my multi-card setup doesn't need dongles, master and slave cards, or ATI motherboards with weak USB and no SATA." ;)
 
Rollo said:
Get a GTX now, smile every day you see people waiting for the X5800XT to actually ship, then smile at what it costs, and then smile at Crossfire mechanics. When people say "My BF2 score is a little higher," reply, "Oh well. My Riddick, Doom3, Q4, QuakeWars, Prey, and RTCW2 scores are a little higher, and my multi-card setup doesn't need dongles, master and slave cards, or ATI motherboards with weak USB and no SATA." ;)

Ouch. There is much truth in this statement, but damn that's harsh.
 
Rollo said:
Get a GTX now, smile every day you see people waiting for the X5800XT to actually ship, then smile at what it costs, and then smile at Crossfire mechanics. When people say "My BF2 score is a little higher," reply, "Oh well. My Riddick, Doom3, Q4, QuakeWars, Prey, and RTCW2 scores are a little higher, and my multi-card setup doesn't need dongles, master and slave cards, or ATI motherboards with weak USB and no SATA." ;)

Still trolling, I see. It's amazing that you are able to see into the future and determine Q4, QuakeWars, Prey, and RTCW2 performance across all cards :rolleyes: Too bad for you that initial driver work seems to have already yielded significant OpenGL gains for the x1800, with more work to be done. Rollo am cry :(

Nice to see you also conveniently gloss over facts. I take it back, you should stay on these boards as you and Kyle have a LOT in common when it comes to ATI.
 
Sir-Fragalot said:
Ouch. There is much truth in this statement, but damn that's harsh.

It is Rollo. I am firmly in the Nvidia camp at this point, but Rollo is still Rollo.

:)
 
riggs58 said:
Much, much better? I'm not sure I would go that far as they both have their pros and cons. Nvidia's being: single slot, cooler, better OpenGL drivers, and better SLi support (if you plan on going that route, that is). The 1800XT is a great card, no doubt about that, but to convey it's the better card on all fronts is a bit misleading.
10 bucks says ATI's X1800XL will outperform the 7800GTX in 6 months to a year. The drivers on it aren't optimized yet. How does the 7800GTX have better duel card support? Also better OpenGL drivers? I don't know about that; I haven't seen any reviews or tests done on OpenGL yet for the X1800XL or X1800XT. Single slot doesn't matter that much at all; who cares if it's duel slot? Most motherboards are set up so there's duel-slot room for cooling. I don't know, but I'm pretty sure the X1800XT will completely kill the 7800GTX after a few months of optimization, and it already kills it in most games.
 
pArTy said:
10 bucks says ATI's X1800XL will outperform the 7800GTX in 6 months to a year.
That's fan-fucking-tastic, considering NV is talking about a refresh of the G70 in that timeframe. :rolleyes:

Personally, I would probably go for a GT instead of the GTX, depending on the price of course. By the time ATI releases a better card (or gets Crossfire off its ass) you can think about getting one then; I would rather enjoy what I can get now.
 
Bigjohns97 said:
No, that is not how it looks in real life with 20/20 vision.

Didn't know you got to stand in the alley that the wall was based on ;) . I am not saying whether or not ATI has better IQ; all I am saying is that absolute perfection is something only 3D renderers and CAD users need. I want realism, and seeing perfect creases out of the corner of the screen on a brick wall 25-30 feet away isn't realistic either.

Most of this doesn't matter, because the game and overall graphics are so immersive that I don't really ever stop to, ummm, smell the roses.
 
ATI's High Quality AF has been shown in some benchmarks to take huge performance hits. Maybe they will improve performance here, but I highly doubt it.
 
pArTy said:
10 bucks says ATIs X1800XL will out proform 7800GTXs in 6 months to a year. The drivers on it arent optimizied yet. How does 7800GTX have beter duel card support? Also better opengl drivers? I dont know about that. I havent seen any reviews or tests done on opengl yet on the X1800XL or X1800XT. Single slot doesnt mater that much at all who cares if its duel slot? Most motherboards are set up so theres duel slot for cooling. I dont know but I'm pretty sure the X1800XT will completely kill the 7800GTX after a few months of optimization and it already kills it in most games.

Wow, just wow. Talk about a lot of talking out your ass.

For talking like that, you won't talk at all here for three days. Enjoy your vacation. Try something like this again, and your vacation will be permanent. Listen up people, we're serious about no flaming/name calling. - DL

It's dual, not duel. Look up the meanings of both those words.

Now let's take a look at some of the problems with your post:

1) Nvidia has dual core drivers out RIGHT now. In my books, that's better.
2) OpenGL drivers are significantly better. Look at any of the benchmarks that show an Nvidia card doing much better than the respective ATI card. If you don't know which games are OpenGL, educate yourself before you post again.
3) Some (a lot?) of people care about single-slot cooling solutions due to the limited space they may have in their case(s).
4) Who's to say who will be better after several months of driver optimization? Being "pretty sure" about something being better down the road reads a bit like "it'll be faster cuz I said so".
 
Stanley Pain said:
1) Nvidia has dual core drivers out RIGHT now. In my books, that's better.
2) OpenGL drivers are significantly better. Look at any of the benchmarks that show an Nvidia card doing much better than the respective ATI card. If you don't know which games are OpenGL, educate yourself before you post again.
3) Some (a lot?) of people care about single-slot cooling solutions due to the limited space they may have in their case(s).
4) Who's to say who will be better after several months of driver optimization? Being "pretty sure" about something being better down the road reads a bit like "it'll be faster cuz I said so".

All good points, just a shame he had to flame... :p
 
In light of the developments posted today, if you can wait a couple of weeks, I think it will be worth seeing what happens with the x1800XT.
 
Shocky said:
All good points, just a shame he had to flame... :p
Indeed, and this is precisely the right attitude. You don't need to direct negative commentary at other members, flame, or use name-calling to make a point. Actually, doing so detracts from your argument in the eyes of most people.
 
5150Joker said:
Regarding driver support, people tend to think of drivers and video cards as a whole as confined to gaming performance, when they are not. You have to look at their functionality as a whole and judge which is better. The nVidia drivers give the user the ability to set custom timings for their display, which the ATi drivers do not. This is vital for a Dell 2005FPW user like me who enjoys 75 Hz at native resolution, because that is currently not possible with ATi drivers.

If I'm not mistaken (I happen to own the same monitor), the 2005FPW only supports 60 Hz at native resolution (which is 1680x1050); if you want a higher scan rate you have to drop the resolution down. Forcing it higher really can damage the monitor (I've done this with an NEC I used to own), but it is possible within the driver.
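A rough sanity check on why 75 Hz at native is a problem on that panel: a single-link DVI connection tops out around a 165 MHz pixel clock, and the required clock scales with resolution × refresh × blanking overhead. A back-of-envelope sketch; the 25% blanking factor is a loose assumption (real timings come from the CVT/GTF formulas):

```python
def pixel_clock_mhz(width: int, height: int, refresh_hz: float,
                    blanking: float = 1.25) -> float:
    """Rough required pixel clock: active pixels x blanking overhead x refresh."""
    return width * height * blanking * refresh_hz / 1e6

native_60 = pixel_clock_mhz(1680, 1050, 60)  # comfortably under 165 MHz
native_75 = pixel_clock_mhz(1680, 1050, 75)  # right at/over the single-link limit

print(f"1680x1050@60: {native_60:.1f} MHz")
print(f"1680x1050@75: {native_75:.1f} MHz (single-link DVI tops out ~165 MHz)")
```

Reduced-blanking timings shrink that overhead factor, which is exactly why driver-level custom timings can sometimes make marginal modes work.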
 
All the OpenGL crap with ATi cards may change very soon. Some Hexus benchmarks show the X1800XT beating a 7800GTX at all resolutions of a Doom 3 benchmark, with and without AA/AF.
From the beginning I said ATi just had to tweak its software, and it looks like I was right. Their memory controller looks to be able to provide some insane free performance boosts.
 
Stanley Pain said:
Wow, just wow. Talk about a lot of talking out your ass.

For talking like that, you won't talk at all here for three days. Enjoy your vacation. Try something like this again, and your vacation will be permanent. Listen up people, we're serious about no flaming/name calling. - DL

It's dual, not duel. Look up the meanings of both those words.

Now let's take a look at some of the problems with your post:

1) Nvidia has dual core drivers out RIGHT now. In my books, that's better.
2) OpenGL drivers are significantly better. Look at any of the benchmarks that show an Nvidia card doing much better than the respective ATI card. If you don't know which games are OpenGL, educate yourself before you post again.
3) Some (a lot?) of people care about single-slot cooling solutions due to the limited space they may have in their case(s).
4) Who's to say who will be better after several months of driver optimization? Being "pretty sure" about something being better down the road reads a bit like "it'll be faster cuz I said so".
1) Duel core? I'm talking about SLI/Crossfire
2) I haven't seen any OpenGL benchmarks for the X1k series yet; if you know of one, let me know. I think ATI will do just as well as the 7800GTX in OpenGL, because the 7800GTX supports OpenGL 2.0 now, when the older Nvidia cards only supported 1.5, while ATI has supported 2.0 for some time.
3) My case and motherboard are 3 years old and it supports duel slots for AGP. Most cases/motherboards support duel-slot PCI-E and AGP.
4) I say this because of how ATI designed their card. ATI left a lot of headroom for optimization all over the card, and as of right now it's not far behind the 7800, if at all. On average they're just about even.

I think ATI will pull ahead in the race because of how much headroom they left for driver optimizations, like I said above.
 
pArTy said:
1) Duel core? I'm talking about SLI/Crossfire
2) I haven't seen any OpenGL benchmarks for the X1k series yet; if you know of one, let me know. I think ATI will do just as well as the 7800GTX in OpenGL, because the 7800GTX supports OpenGL 2.0 now, when the older Nvidia cards only supported 1.5, while ATI has supported 2.0 for some time.
3) My case and motherboard are 3 years old and it supports duel slots for AGP. Most cases/motherboards support duel-slot PCI-E and AGP.
4) I say this because of how ATI designed their card. ATI left a lot of headroom for optimization all over the card, and as of right now it's not far behind the 7800, if at all. On average they're just about even.

I think ATI will pull ahead in the race because of how much headroom they left for driver optimizations, like I said above.

http://anandtech.com/video/showdoc.aspx?i=2556&p=7

Same benchmarks from Anandtech; check the OpenGL numbers. Yes, it's partially a driver issue, but it's unlikely drivers will account for all of the performance gap. Also check the X1800XL results. There's no way, based on those results, that you can claim it's going to outperform a GTX in 6 months unless the ATI dev team can work miracles. :)
 
pArTy said:
2) I haven't seen any OpenGL benchmarks for the X1k series yet; if you know of one, let me know. I think ATI will do just as well as the 7800GTX in OpenGL, because the 7800GTX supports OpenGL 2.0 now, when the older Nvidia cards only supported 1.5, while ATI has supported 2.0 for some time.

Well, you're wrong about OpenGL 2.0 support. And if you haven't seen any OpenGL benchmarks, then you haven't read a single review of the X1800XT.

And to answer the thread question - don't spend $400+ on a 256MB card.
 
Shocky said:
unless the ATI dev team can work miracles. :)

Looks like this has happened. :) The X1800XT outperforms the 7800GTX in Doom 3 at higher resolutions with AA/AF.

bench-doom3-composite.png



The new fully programmable memory controller is ATi's secret weapon. :cool:
 
Apple740 said:
Looks like this happened. :) The X1800XT outperforms the 7800GTX in Doom 3 at higher resolutions with AA/AF.

bench-doom3-composite.png

Yep, and as soon as I see some XL numbers with the fix, I'll be assured that this isn't just a memory controller tweak that allows the XT to make good use of its 25% bandwidth advantage ;) Notice there are no noAA/noAF results with the fix for the X1800XT; if it were a general OpenGL fix, those numbers would be higher as well.
 
Wait and see how much ATI can tweak the X1800's memory controller... that's what I would do.

I too was going to buy a GTX after the initial reviews, but after reading about the 35% increase in Doom 3 from memory controller tweaks, I think waiting is the best thing to do. Let's see if Nvidia has something they can do.
 
DASHlT said:
Wait and see how much ATI can tweak the X1800's memory controller... that's what I would do.

I too was going to buy a GTX after the initial reviews, but after reading about the 35% increase in Doom 3 from memory controller tweaks, I think waiting is the best thing to do. Let's see if Nvidia has something they can do.
They don't need to do anything; just buy the overclocked version, 490/1300. It's fairly cheap too, compared even to an X1800XT 256MB version or even the 486/1350 version... I'd guess that with the 512MB GTX coming soon they might increase the memory speed, or at least some manufacturers might.
 
Shocky said:
They don't need to do anything; just buy the overclocked version, 490/1300. It's fairly cheap too, compared even to an X1800XT 256MB version.

Well, considering they are only now starting to tweak their memory controller, Nvidia overclocking a card will only work for so long; ATI is just getting started on squeezing out as much performance as they possibly can.

This is only in OpenGL, I might add. They also mentioned they will start tweaking it for D3D as well.
 