Which is better: 1900XT or 7800GTX 512mb?

HL2 even still has quake 1 libraries in it..

How could it? It's DirectX 9 based; it has no OpenGL port, and Quake used OpenGL. If you could give some links it would be helpful.
 
sabrewolf732 said:
Just because it's DX9 does not mean it cannot be based off of the HL1 engine...

Well, HL1 was OpenGL (with a DX layer); you tell me how a DX9 game (no layers) is based off an old OpenGL engine.
 
Source is a very popular engine, but at its core, it is essentially a series of Quake hacks. One can even see some original Quake documentation in some areas.

I can only review this from a modding point of view, but I have spoken to some developers who have a full licence. Their experience is similar to mine.

-Features:
Source engine has a nice set of features, complete with HDR and facial animation. It has a very advanced physics engine that has caused a large stir.

The Source engine uses a shader-based pipeline, and thus has some very nice effects. It has refraction, reflection and advanced specular highlights. The material system is very advanced and works itself into the shader system and the physics engine. Materials have "types" which determine the physical effects and how they interact with the world. The bumpmapping in Source is a major let-down and gives no real appearance of depth.
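To illustrate the idea (a toy sketch; the names and values here are made up, not Source's actual surface property schema), a type-keyed material table that the physics side consults on impact might look like this:

Code:
#include <iostream>
#include <map>
#include <string>

// Hypothetical physical properties keyed by a material "type".
struct SurfaceProps {
    float friction;           // resistance to sliding
    float elasticity;         // how bouncy impacts are
    std::string impactSound;  // sound event to fire on collision
};

int main() {
    // Toy table in the spirit of Source's material types.
    std::map<std::string, SurfaceProps> surfaceTable = {
        {"metal",    {0.8f, 0.25f, "metal_clang"}},
        {"wood",     {0.6f, 0.10f, "wood_thud"}},
        {"concrete", {0.9f, 0.05f, "concrete_crack"}},
    };

    // On a collision, the physics engine looks up the surface's type
    // to decide friction, bounce, and which sound to play.
    const SurfaceProps& hit = surfaceTable["wood"];
    std::cout << "friction=" << hit.friction
              << " sound=" << hit.impactSound << "\n";
}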

The lighting in Source is quite nice, especially the HDR. Shadows aren't dynamic, which is a let-down for the price. The engine uses soft, shadow-mapped shadows, which look quite nice but are unsuitable for many purposes.

The AI is quite nice, but not perfect. It has some issues and needs to be heavily scripted in most instances.

The facial animation system uses muscles, and looks very nice ingame. VALVe has provided tools to facilitate this and it can lead to some very nice sequences.

The netcode for Source is particularly good; even though the physics are server side, physics-based games manage to run lag-free even on basic DSL connections.
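For those wondering how server-side physics can still feel lag-free: the standard trick in snapshot-based netcode is to render entities a fixed interval in the past and interpolate between the last two server updates. A minimal sketch of that idea (illustrative only, not Valve's actual code):

Code:
#include <cstdio>

struct Snapshot { double time; float x, y, z; };

// Linearly interpolate an entity position between two server snapshots.
// Rendering ~100 ms "in the past" hides network jitter, which is the
// usual approach in snapshot-based netcode.
void Interpolate(const Snapshot& a, const Snapshot& b,
                 double renderTime, float out[3]) {
    double t = (renderTime - a.time) / (b.time - a.time);
    if (t < 0.0) t = 0.0;
    if (t > 1.0) t = 1.0;
    out[0] = a.x + (float)t * (b.x - a.x);
    out[1] = a.y + (float)t * (b.y - a.y);
    out[2] = a.z + (float)t * (b.z - a.z);
}

int main() {
    Snapshot s0{10.00, 0, 0, 0}, s1{10.05, 5, 0, 0};
    float pos[3];
    Interpolate(s0, s1, 10.025, pos);  // render halfway between ticks
    std::printf("x=%.2f\n", pos[0]);   // x=2.50
}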

Source provides a rounded feature set that is up there with the best. It is let down on some key issues such as shadows and bumpmapping.

-Ease of Use:
Source uses C++ with no scripting support. However, it provides a nice SDK.
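To give a flavour of what mod code looks like: a custom entity is a C++ class hooked into the engine with the SDK's registration macros, roughly like this (a from-memory sketch; it only compiles inside an SDK mod project, so check the actual headers for exact signatures):

Code:
// Minimal server-side entity sketch. CBaseAnimating, DECLARE_CLASS and
// LINK_ENTITY_TO_CLASS are real SDK constructs, but treat the details
// here as approximate -- shown for flavour, not copied from the SDK.
class CMyProp : public CBaseAnimating
{
public:
    DECLARE_CLASS(CMyProp, CBaseAnimating);

    virtual void Spawn()
    {
        Precache();
        SetModel("models/props_junk/watermelon01.mdl");
        SetSolid(SOLID_BBOX);
    }

    virtual void Precache()
    {
        PrecacheModel("models/props_junk/watermelon01.mdl");
    }
};

// Binds the C++ class to the classname used in Hammer.
LINK_ENTITY_TO_CLASS(my_prop, CMyProp);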

Hammer is a BSP-based world editor with support for advanced geometry and entities. The compile tools are great and make it easy to get content ingame. The Half-Life model viewer allows developers to view their models without compiling a scene for each attempt. A tool for scripting facial animation sequences allows for unprecedented control over lip sync and other aspects ingame.

The SDK works well with Microsoft Visual C++ 7, but is difficult to use under different IDEs. It is generally easy to use, but has little official documentation, though several modding sites offer tutorials which can help programmers.

-Stability & Performance:
The stability of Source is less than perfect. It has been known to crash and throw up compile errors for no particular reason.

Source can crash in several instances, and the engine is riddled with anomalies. For example, the engine will crash when the player in Half-Life 2 based games attempts to pick up the scout car with the super gravity gun. While this is a game-specific example, it shows the stability issues that Source suffers from for a commercial product.

Performance is decent, even on older machines. Source can be scaled back to DirectX 7, which is impressive for a shader-based engine. However, there are several things that hurt performance, such as the physics in multiplayer being calculated server side.
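The scaling works by inspecting the hardware caps at startup and choosing a matching material path. A rough D3D9-style sketch of that kind of decision (illustrative only; this is not Valve's actual dxlevel logic):

Code:
#include <d3d9.h>

// Sketch: query the device caps and pick the richest rendering path
// the hardware can handle, in the spirit of Source's dxlevel fallbacks.
int ChooseDxLevel(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return 90;  // full shader path ("dxlevel 90")
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return 80;  // DX8-class shader path
    return 70;      // fixed-function fallback ("dxlevel 70")
}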

-Support:
From a modding point of view, there is plenty of support from forums that are dedicated to Half-Life 2 modding. For a licenced developer, all accounts say that VALVe are very helpful and prompt with support, and provide a top-notch support service that can be accessed 24/7.

-Summary:
The Source Engine was obviously made for Half-Life 2 and is best suited for urban FPS games. It provides quite good features (4.5 if it were possible), and is quite easy to use. It suffers from stability issues, but performs quite well on older systems. From all accounts, the support is top of the range.

If you are looking for publicity on your licence, Source is the engine for you. However, as an engine for serious developers, it is let down by some key issues that make it inferior to other choices.

-Overall Mark: 4.2

This document describes the structure of the BSP file format used by Half-Life 2, and other Source engine games. The format is similar but not identical to the BSP file formats of the Half-Life 1 engine, which is in turn based on the Quake 1 and Quake 2 file formats, plus that of the later Quake 3: Arena. Because of this, Max McGuire's article, "Quake 2 BSP File Format" (http://flipcode.com/articles/article_q2bsp.shtml) has been of invaluable help in understanding the overall structure of the format and the parts of it that have remained the same or similar to its predecessors.

that's just some of the documentation I found online by typing source and quake :eek:
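For reference, the header that document describes is a small fixed struct followed by a directory of "lumps", much like its Quake ancestors. Roughly (based on the community documentation, so treat field details as approximate):

Code:
#include <cstdint>

// Approximate layout of a Source engine BSP header, per community docs.
// The lump directory is what ties it back to Quake: each lump is an
// (offset, length) slice of the file holding one kind of map data.
struct lump_t {
    int32_t fileofs;   // offset of the lump in the file
    int32_t filelen;   // length in bytes
    int32_t version;   // lump format version
    char    fourCC[4]; // ident code (usually zero)
};

struct dheader_t {
    int32_t ident;       // reads 'VBSP' in the file
    int32_t version;     // BSP version (19/20 for HL2-era maps)
    lump_t  lumps[64];   // directory of all data lumps
    int32_t mapRevision; // map revision from the VMF
};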
 
sabrewolf732 said:
that's just some of the documentation I found online by typing source and quake :eek:

afaik, HL2 is largely based on HL1's engine.

The Source engine was being developed at the same time the HL1 engine was nearly finished. Yes, there is some similar code in both of them, but to say the Source engine is *largely* based off the Quake 1 engine is a joke.
 
HL2 even still has quake 1 libraries in it..

How could it? It's DirectX 9 based; it has no OpenGL port, and Quake used OpenGL. If you could give some links it would be helpful.

Now you're just changing what you say :rolleyes:
 
sabrewolf732 said:
Now you're just changing what you say :rolleyes:

Not at all, I said it's not based off OpenGL or Quake 1. And it's not. Just because it shares some techniques doesn't mean it's based off an engine.
 
Source is a very popular engine, but at its core, it is essentially a series of Quake hacks. One can even see some original Quake documentation in some areas.

At first you said it wasn't based on Quake 1 at all; now you're saying it has some stuff. The above quote says it all.
 
I remember that someone was very interested in FP16 and FP32... Here's a site that actually documented these things... unlike the biased-as-hell American websites (HardOCP, AnandTech). http://216.239.39.104/translate_c?h...1900-crossfire-test.html&prev=/language_tools

Even though the X1900XT does not have these features, it still edges out the 7800GTX 512 when it is tested. :p Nvidia's top product costs a full 200 dollars more than ATI's and doesn't even perform on the same level. If you think the 7800GTX 512 is better (performance AND value wise) after all this, you're insane. FYI.
 
BBA said:
You forget HL2 is actually built on modifications of the original Quake engine Carmack made. (Maybe you weren't around back then to know?)

And what if it is? That doesn't change anything about the quality and features of the HL2 engine.

BBA said:
Right there...you showed your complete lack of understanding. Doom3 engine has not yet been CPU limited, not even on the SLi 7800GT's.

I think you're the one not understanding something. Namely, that it is very CPU-heavy because of its software shadow volume and skinning algorithms. Which makes it CPU-limited when you put in a card that is 'too fast', like a 9600Pro in an 1800+ system. The same card performs much better on a faster CPU. Ergo, CPU-limited.
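To be concrete about where the CPU time goes: the engine recomputes silhouette edges for the stencil shadow volumes every frame, per light, per mesh. The core of that work is something like this (a simplified sketch of the general technique, not id's actual code):

Code:
#include <vector>

// Each edge stores the two triangles sharing it. Seen from the light,
// a silhouette edge is one whose triangles face opposite ways.
struct Edge { int v0, v1, tri0, tri1; };

std::vector<Edge> FindSilhouette(const std::vector<Edge>& edges,
                                 const std::vector<bool>& facesLight)
{
    std::vector<Edge> silhouette;
    for (const Edge& e : edges) {
        // The edge lies on the silhouette if exactly one adjacent
        // triangle faces the light.
        if (facesLight[e.tri0] != facesLight[e.tri1])
            silhouette.push_back(e);
    }
    // These edges are then extruded away from the light to build the
    // shadow volume. Doing this per light, per mesh, per frame on the
    // CPU is why a fast GPU can still be held back by a slow CPU.
    return silhouette;
}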

BBA said:
I guess you go on folklore instead of logic. Look, if it was CPU limited, then a 20% increase in CPU speed would produce more than maybe 3-6% (5-10 FPS) difference from a 3800 to an FX60 on an SLi 7800GT rig. On a 9600, you're video-card limited no matter what the CPU; even an Athlon 1GHz would do as well as your 1800+.

Well, get an 1800+ and a 9600, and then test it at stock speed, and clocked at 1 GHz. Let's see who's right. I already know the answer.
Also, you can't compare low-end hardware with high-end SLI setups just like that. There is no direct linear relation between a single card and an SLI setup anyway. Same goes for single-core versus dual-core CPUs.

BBA said:
BUT: Since you stated that was your hardware, I believe you have NO experience with new hardware, so your whole argument thus far is based on second-hand knowledge, whereas people here actually have the hardware and have first-hand knowledge (myself included).

That is my own hardware at home, yes. But that's not the only hardware I have access to, obviously. I also have access to various high-end ATi and NV cards, and I have actually developed Direct3D software on those. I think I have a better understanding of what these cards are capable of than pretty much anyone here in this thread. Which makes it all the more frustrating that people just ignore my comments and basically just say "but ATi gets higher framerates". Well, that's not the point I'm trying to make, is it? There's so much more than framerates. There are effects that I can implement on a 6800 which are still virtually impossible on an X1900. Which is why I find it strange that people say a 7800 is 'older technology'.
 
What Direct3D software did you develop?
The 9600Pro would be the greater bottleneck in Doom 3 than the 1800XP.

http://www.hardocp.com/article.html?art=NjQ0LDIx
AMD came out ahead in DOOM 3 performance with the strongest CPU in our tests, the Athlon 64 FX-53 processor. The Athlon 64 series of CPU is undoubtedly a powerhouse when it comes to overall gaming. Thankfully though, DOOM 3 is terribly forgiving to those of you that do not use the latest CPUs. DOOM 3 runs just fine on an Intel 1.5GHz Pentium 4 or an AMD Athlon XP 1800+. I came away from our testing at the id Software offices thinking that id has done a great job optimizing for both Intel and AMD platforms, and that DOOM 3 would run great on either platform readily and without issue. With that said, in our high end system testing, you can see where DOOM 3 and its image quality are allowed to scale upward with stronger CPUs from both Intel and AMD. Another standout was the ABIT IC7 system (i875) at 3GHz. The IC7 showed us that our previous thoughts about the aging i875 are indeed correct. The i875 may be a little long in the tooth as silicon goes, but when measured on performance there is little reason to replace the technology if you already own it. While the world of CPUs is changing a bit in terms of "better=faster," at this point in time when gaming is considered, more MHz are still welcomed. In the case of DOOM 3 though, the latest high-dollar CPUs are hardly a necessity as Athlon XPs and Northwood core Pentium 4s still bring more than enough of the needed power to the table.
 
Shadow27 said:
The 9600Pro would be the greater bottleneck in Doom 3 than the 1800XP.

If you disable features like shadows, yes. But why do you have to disable those? Right, because it's CPU-limited otherwise.
And why do you take the word of some journalist who has no idea how Doom3 works and has never actually tried an 1800+ with a 9600Pro, over someone who is an actual D3D developer, knows what goes on inside the Doom3 engine, and has actually played the game on such a configuration?
 
Scali said:
If you disable features like shadows, yes. But why do you have to disable those? Right, because it's CPU-limited otherwise.
And why do you take the word of some journalist who has no idea how Doom3 works and has never actually tried an 1800+ with a 9600Pro, over someone who is an actual D3D developer, knows what goes on inside the Doom3 engine, and has actually played the game on such a configuration?

Could you tell me what you've developed please? I would love to see your work as I'm also interested in getting into it. I take their word as they are a respected review source. I personally had a 9600XT on a 2.53 P4, and also tried it on my 2.6GHz A64 (when I had my Asus A8V Deluxe). Now that is quite an improvement CPU-wise, and I saw almost NO improvement framerate-wise with shadows enabled. There is a point where the CPU simply can't help a shit card. My point is that if you had an 1800XP with a 9600Pro, an upgrade to the graphics card would improve your framerates more than a CPU upgrade.
 
Scali,

I give up. Fact is, next-gen games that are already here (as both FEAR and SC:CT have an ALU:TEX ratio greater than 6:1) will give the end user a better experience on hardware that has more shader processing power. No one here is saying that the 1900 missing some features is a good thing.

People are asking which card is better. And when you compare them, you need to compare all the aspects of the cards and how the user will use that card. Fact is, to developers different things will be more important (like these missing features). But at the end of the day, users here don't give a flying crap about developers' needs; they only care about their own needs. So things like price, AA, AF, and performance in current and next-gen games become much more important. Then you tack on features that they can use today, like AVIVO, HDR+AA, etc., and it makes a very strong case. Finally, when you look at all of these and consider that these cards are for gamers and not developers, it's a very simple choice which one is better.

The fact that you're getting lots of backlash from others here should indicate to you which one the gamers like better, as it has nothing to do with playing IHV favorites...
 
Jbirney said:
No one here is saying that the 1900 missing some features is a good thing.

You could have fooled me. Most people didn't even seem to realize it at first, and after I pointed it out, just pretended that it wasn't true.

Jbirney said:
People are asking which card is better.

I was just saying that the 7800 may have some things going for it as well. I never said it was the better card, let alone that everyone should buy a 7800.
I was just saying that it has some features that the 1900 lacks, and that may be used for either more realistic graphics or more performance in current and future games.
But I get the feeling that nobody is even listening, or even trying to understand what I'm saying. Which is why I rarely post on such forums anymore.
 
{NG}Fidel said:
Page 2 by Scali.

Yes, except in that context it was purely feature-wise. Not the 'better' card in this context where it means it's the faster card, or the card to buy, or whatever.
 
Scali said:
Yes, except in that context it was purely feature-wise. Not the 'better' card in this context where it means it's the faster card, or the card to buy, or whatever.

Scali, your original response TO THE QUESTION "Which is better: 1900XT or 7800GTX 512mb?" was this:

Scali said:
I'd go with the 7800GTX (or 7900 when it's out).
The 1900XT still lacks some features that the 7800 has, which put it at a disadvantage with rendering techniques such as HDR. There were also some limitations with vertex texturing on the 1900XT, I believe.
I don't care too much about the speed in current games. Both cards are extremely fast, and there isn't a game that can slow them down yet. I just think the 7800 is better prepared for the future.

So, you have come down on the side of the 7800GTX 512mb because you believe it to be better prepared for the future. You have since vigorously defended that position. That is fine with me, because that is how debate works, and without debate this whole topic would be boring.

The problem with you, Scali, lies not in your opinion, or your desire to defend that opinion, but in your refusal to acknowledge ANYTHING anyone says that refutes your opinion, and your insistence that this topic is not about the overall cards, but about non-resolution-based features (which is actually hilarious, because one of the BIGGEST benefits of the feature you are touting should be increased performance at higher resolutions). You point out 4 features the 7800 has that the 1900 doesn't. Someone else points out 2 that the 1900 has that the 7800 doesn't. You say "Those don't count because I am talking about something else".

WTF is that about? This isn't ABOUT your limited non-high-res view of things, this is about which is BETTER OVERALL. Period. If you want to argue about non-resolution-based image rendering abilities, then start your own damn thread. If you want your opinion to count in this thread, though, then you need to start accepting ALL the facts about these cards as valid, not just the 7800's facts.

Also, I find it hilarious that you are still defending your point when your original post has so many inaccuracies in it. After getting those corrected you would think you would accept that you didn't know enough when you posted it, and you would correct it to something more realistic.
 
oooooooooooooooooooooo

you got served!!

:p

btw that is the dumbest phrase since gag me with a spoon.
 
spaceman said:
oooooooooooooooooooooo

you got served!!

:p

btw that is the dumbest phrase since gag me with a spoon.

Lol, nah, I think that's pretty funny. Crap I missed a phone call cause it's on silent!! GRrrrrrrrrrrr...
 
arentol said:
The problem with you, Scali, lies not in your opinion, or your desire to defend that opinion, but in your refusal to acknowledge ANYTHING anyone says that refutes your opinion

Funny, I feel like it's the exact opposite.

arentol said:
and your insistence that this topic is not about the overall cards, but about non-resolution-based features (which is actually hilarious, because one of the BIGGEST benefits of the feature you are touting should be increased performance at higher resolutions). You point out 4 features the 7800 has that the 1900 doesn't. Someone else points out 2 that the 1900 has that the 7800 doesn't. You say "Those don't count because I am talking about something else".

You seem to have misunderstood what I said. I didn't say 'this topic', I meant strictly my opinion on the cards, and my arguments supporting those.

arentol said:
Also, I find it hilarious that you are still defending your point when your original post has so many inaccuracies in it. After getting those corrected you would think you would accept that you didn't know enough when you posted it, and you would correct it to something more realistic.

'So many inaccuracies'... AFAIK only the FP16 blending was wrong. The cards still lack FP filtering and vertex texture fetch, so that doesn't change my point of view on these cards. The 7800 still has the more attractive featureset in my opinion. And obviously my opinion is always realistic, to me anyway.
 
Obviously you think the 512MB GTX is better. Put your money where your mouth is and buy one over a X1900XT. Or create a poll and see how many people agree with you. My guess is, not very many. Very few would spend $750+ on a card that is slower than a sub-$500 card in a lot of tests, not to mention behind in the features department. Go ahead, buy one, and make a poll.
 
Scali said:
'So many inaccuracies'... AFAIK only the FP16 blending was wrong. The cards still lack FP filtering and vertex texture fetch, so that doesn't change my point of view on these cards. The 7800 still has the more attractive featureset in my opinion. And obviously my opinion is always realistic, to me anyway.

Maybe I am wrong, but the X1900 does FP32 regardless of what is called for, to try and create a better image, regardless of any performance hit.

Maybe you need to get an X1800 or X1900 and see the IQ when enabling Adaptive AA and HQ AF, something Nvidia cannot come close to.

I still like your original response of "I'd go with the 7800GTX (or 7900 when it's out)." which does not answer the question and inserts pure NV marketing. You know what they say about people who do that?
 
Yawn. Sorry for reiterating, but it's *SIMPLE*:

1. Price - X1900XT is MUCH cheaper
2. Availability - not seeing much 7800GTX 512 stock anywhere
3. Performance - on the few occasions that the GTX is better, it's a close call. Okay, there's B&W2, but that's an exception (maybe it uses all those features the X1900 series lacks :D). There are several games (note: new games) where the X1900XT is a heck of a lot faster.
4. HDR with AA
5. 6xAA (seriously: how often can you use nVidia's 8xS mode?)
6. HQ AF

I've always found ATi's drivers to be better, although the interface is w@nky and bloated compared to what nVidia offer.

Don't worry too much about Crossfire. The current chipset is really for X850 cards - RD580 is intended for the X1800/X1900 cards. Hopefully that will bring CF on par with SLI. ATi needs it too, as the dual-card solution is where nV is shining bright.

I own an X1900XT and, despite the fact that it doesn't want to O/C, I'm loving it. And that's coming from someone whose last two cards were a 7800GTX and a 6800GT :p

Can't wait for the green team's response either.
 
R1ckCa1n said:
Maybe I am wrong, but the X1900 does FP32 regardless of what is called for, to try and create a better image, regardless of any performance hit.

The pipeline is FP32, but the texture fetch units don't support any kind of filtering on FP textures, which is why it has to be emulated in a pixel shader. As I tried to say before, 3DMark06 does this emulation, which is why the X1800 performs 'badly', at least compared to other games and NV cards.
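To make the emulation point concrete: without hardware filtering on FP16 textures, the shader has to take four point samples and blend them itself, which is roughly the following per fetch (sketched in C++ for readability; the real thing would be pixel shader code):

Code:
// Emulated bilinear filtering: four nearest texels blended by the
// fractional texel coordinates. Without FP16 filtering in hardware,
// this costs four fetches plus the math, instead of one filtered fetch.
struct Texel { float r, g, b, a; };

Texel Bilinear(const Texel* tex, int w, int h, float u, float v)
{
    float x = u * w - 0.5f, y = v * h - 0.5f;
    int x0 = x < 0 ? 0 : (int)x;
    int y0 = y < 0 ? 0 : (int)y;
    int x1 = x0 + 1 < w ? x0 + 1 : x0;   // clamp at the texture edge
    int y1 = y0 + 1 < h ? y0 + 1 : y0;
    float fx = x - x0, fy = y - y0;      // blend weights

    auto at  = [&](int xi, int yi) { return tex[yi * w + xi]; };
    auto mix = [](Texel a, Texel b, float t) {
        return Texel{ a.r + t * (b.r - a.r), a.g + t * (b.g - a.g),
                      a.b + t * (b.b - a.b), a.a + t * (b.a - a.a) };
    };
    return mix(mix(at(x0, y0), at(x1, y0), fx),
               mix(at(x0, y1), at(x1, y1), fx), fy);
}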

R1ckCa1n said:
Maybe you need to get an X1800 or X1900 and see the IQ when enabling Adaptive AA and HQ AF, something Nvidia cannot come close to.

I know exactly how an X1800XT 512 looks. I recommended one to my brother. Good card. And my own Radeon 9600XT still looks quite sharp as well.

R1ckCa1n said:
I still like your original response of "I'd go with the 7800GTX (or 7900 when it's out)." which does not answer the question and inserts pure NV marketing. You know what they say about people who do that?

Excuse me, but that is just my opinion. Just because it isn't the choice of most people here doesn't mean I'm marketing NV stuff. I don't care who makes it. I just said that because of a few features on the 7800 and 7900, that is the card I would buy next. And I have the right to express my opinion on this forum. I even bother to explain my reasons.
As I say, it's the card *I* would buy; obviously I don't recommend it to everyone, else my brother would not have his X1800XT right now.
 
rincewind said:
5. 6xAA (seriously: how often can you use nVidia's 8xS mode?)

That is an excellent point, actually. As you get more and more per-pixel bumpmapping/lighting, less and less of the aliasing happens on the edges, and more and more happens on the rest of the polygons. Which means that MSAA will do less and less to reduce aliasing. Ever wondered why e.g. Doom3 still has those speckly floors even though you have MSAA turned up?
I wonder what the next step in AA is going to be. Will MSAA be replaced by a newer algorithm which handles per-pixel aliasing as well, or will it be pushed off into the software, with developers writing their own AA routines?
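The reason MSAA leaves those floors speckly: it takes several coverage samples per pixel but runs the pixel shader only once, so the shaded value itself (specular, bumpmap lighting) is never supersampled. A toy illustration of that resolve step (conceptual sketch only):

Code:
#include <cstdio>

// Conceptual MSAA resolve for one pixel: N coverage samples, but a
// single shaded color. Edges get smoothed by coverage; any aliasing
// inside the shading (specular sparkle, bumpmap shimmer) remains.
float ResolvePixel(float shadedColor, const bool covered[], int samples)
{
    int hits = 0;
    for (int i = 0; i < samples; ++i)
        if (covered[i]) ++hits;
    // One shader result weighted by coverage -- geometry edges only.
    return shadedColor * hits / samples;
}

int main()
{
    bool cov[4] = { true, true, false, false };  // half-covered edge pixel
    std::printf("%.2f\n", ResolvePixel(1.0f, cov, 4));  // prints 0.50
}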
 
Anyway, you people are really lame for not being able to think past your own brand of preference. I'm just trying to give some insights in the technology and possibilities of both cards, and the thanks I get is that I'm being accused of being 'the green team', and marketing etc.
Bunch of ignorant ingrates, really.
 
Scali said:
Anyway, you people are really lame for not being able to think past your own brand of preference. I'm just trying to give some insights in the technology and possibilities of both cards, and the thanks I get is that I'm being accused of being 'the green team', and marketing etc.
Bunch of ignorant ingrates, really.

Not at all, you have contradicted yourself several times in this thread. Do what you want, say what you want. As pointed out, the X1K series can in fact get the same results as vertex fetch with a little extra programming. You stated that you would buy a 7800GTX, then you didn't and you were just saying that the 7800 has some features the X1K series doesn't, and now you're back to saying yes, you would buy a 7800.

I was just saying that the 7800 may have some things going for it as well. I never said it was the better card, let alone that everyone should buy a 7800.
I don't care who makes it. I just said that because of a few features on the 7800 and 7900, that is the card I would buy next.

If it isn't better, that would mean it is inferior; why would you buy an inferior card?
 
sabrewolf732 said:
As pointed out, the X1K series can in fact get the same results as vertex fetch with a little extra programming.

Which simply isn't true, but I wouldn't expect a bunch of lousy gamers to understand anything like that. This thread has made that pretty clear.

sabrewolf732 said:
You stated that you would buy a 7800GTX, then you didn't and you were just saying that the 7800 has some features the X1K series doesn't, and now you're back to saying yes, you would buy a 7800.

I have always said that I would buy a 7800. What might confuse you is that I wouldn't recommend that card to everyone. Then again, you are even confused by vertex texture fetch, so there's really no point in arguing.

sabrewolf732 said:
If it isn't better, that would mean it is inferior; why would you buy an inferior card?

You must be American... "If you're not with us, you're against us". There's also something like being neutral. 7800 has some advantages, so does the X1900. It isn't as simple as being able to say which one is better. That depends a lot on what you want to do with it. I just said that in my case the 7800 would be best. That doesn't mean that it is best for everyone, let alone that it is a better card.
But well, people of limited mental capacity can only think in black and white, or in this case red and green.
 
Hi, I'm Joe Builder.

I've owned both the 7800 GTX 256 and the X1900XT.

I prefer the X1900XT for runnin' games and dvds.

Have a nice day. ;)
 
Scali said:
Which simply isn't true, but I wouldn't expect a bunch of lousy gamers to understand anything like that. This thread has made that pretty clear.



I have always said that I would buy a 7800. What might confuse you is that I wouldn't recommend that card to everyone. Then again, you are even confused by vertex texture fetch, so there's really no point in arguing.



You must be American... "If you're not with us, you're against us". There's also something like being neutral. 7800 has some advantages, so does the X1900. It isn't as simple as being able to say which one is better. That depends a lot on what you want to do with it. I just said that in my case the 7800 would be best. That doesn't mean that it is best for everyone, let alone that it is a better card.
But well, people of limited mental capacity can only think in black and white, or in this case red and green.

*sigh*. My response to you would get me banned. Just to let you know, pretty much 3/4 of the cards I have owned are Nvidia (probably 12+). Also, everyone says that it has the same results as vertex fetch: B3D, Carmack, etc. You're the only one I have seen so far that hasn't. Instead of calling people "lousy gamers", explain it if you think everyone else is wrong, which you haven't really done; all you've done is state over and over that ATi does not have vertex fetch and that Nvidia does. And I'm actually not American; just assuming that proves how ignorant you are. You're a hypocrite (not to mention how many times you contradicted your previous statements). You say everyone else is ignorant because they refer to things like reviews, devs, and reliable established sites instead of a single person on a single forum. I look at video cards as inferior or superior. One choice is better, thus superior; there are no in-betweens when I'm spending my money on a graphics card. Also, just to let you know, it's the lousy gamers that buy the cards, for the most part based on performance. So you can thank the lousy gamers for these cards even existing.
 
sabrewolf732 said:
Also, everyone says that it has the same results as vertex fetch: B3D, Carmack, etc.

Uhhh, where exactly did Carmack say that? So far I only saw a vague comment which said he could see a use for both. That's not the same as the technologies being equivalent.

sabrewolf732 said:
explain it if you think everyone else is wrong

There's really no point in explaining, if you have no clue what vertex fetch is about in the first place. And if you do have a clue, then it should be obvious that rendering to a vertex buffer is a completely different concept from reading a texture in a vertex shader, so there's nothing to explain, is there?
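For anyone who does want the short version: vertex texture fetch means the vertex shader itself samples a texture (vs_3_0 style), while render-to-vertex-buffer writes a pixel shader's output into a buffer that is then re-bound as vertex data. Binding a texture to the vertex stage in D3D9 looks roughly like this (the sampler constants are real D3D9; the surrounding code is a sketch):

Code:
#include <d3d9.h>

// Bind a displacement texture to vertex texture sampler 0 so a
// vs_3_0 vertex shader can sample it (e.g. for heightfield water).
// D3DVERTEXTEXTURESAMPLER0 is the D3D9 constant for this stage.
void BindVertexTexture(IDirect3DDevice9* dev, IDirect3DTexture9* heightMap)
{
    dev->SetTexture(D3DVERTEXTEXTURESAMPLER0, heightMap);

    // Vertex texture fetch on this hardware generation supports only
    // point sampling, so filtering must be disabled (or done manually
    // in the shader).
    dev->SetSamplerState(D3DVERTEXTEXTURESAMPLER0,
                         D3DSAMP_MINFILTER, D3DTEXF_POINT);
    dev->SetSamplerState(D3DVERTEXTEXTURESAMPLER0,
                         D3DSAMP_MAGFILTER, D3DTEXF_POINT);
}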

sabrewolf732 said:
So you can thank the lousy gamers for these cards even existing.

I was happy writing software renderers. In fact, those were blissful days when you didn't have to worry about which video cards were available, and which brand supported what, and blah blah. You just coded your routine and that was that.
 
7800 GTX 512, because I think ATI makes some pretty crappy products and has horrible, horrible drivers. They don't even support Linux.

Oh, and I've owned this 7800GTX 512 for 3 months now, and paid $150 to step up from a 7800GTX + free motherboard. These forums are like a pack of wild monkeys; only on [H]ard forum will people cry and bad-mouth companies over a 2-3 fps difference.
 
peacetilence said:
7800 GTX 512, because I think ATI makes some pretty crappy products and has horrible, horrible drivers. They don't even support Linux.

Oh, and I've owned this 7800GTX 512 for 3 months now, and paid $150 to step up from a 7800GTX + free motherboard. These forums are like a pack of wild monkeys; only on [H]ard forum will people cry and bad-mouth companies over a 2-3 fps difference.

I think we should all follow this guy's advice.
 
Scali said:
You seem to have misunderstood what I said. I didn't say 'this topic', I meant strictly my opinion on the cards, and my arguments supporting those.

See, the thing is that your opinion (see below for my interpretation of it; correct it if wrong) is not supported by your "arguments", which are mostly opinions themselves, or by actual REAL WORLD gaming results.


Scali's Opinion????...

"The 7800 has 64bit Texture filtering and blending, and UltraShadow II, and those things are SO much more important to the future of gaming than any features the 1900 has that there is no way to ever consider the 1900 a better card."

The problem with that argument is that it's predicated on YOU knowing 100% for certain what the future will hold. Which I am pretty sure you don't. The other problem with it is that it COMPLETELY ignores any current real-world performance results. Those just don't seem to count, and yet, at the end of the day, those should be the ONLY thing that counts.

BTW, off the top of my head, 5 "features" the 1900 has that the 7800 doesn't. I am sure I could come up with more if I put some effort into it.

Extended H-Z cache
Double Dual-link DVI ports
HDR+AA
Fetch 4
3Dc
 