hungryduck ([H]ard|Gawd) - Joined Mar 29, 2005 - Messages: 1,530
I love video card fights.
|CR|Constantine said:Once again, do you have any sources showing that Nvidia will in fact have a refresh out by the time the R600 is released? Moreover, what kind of refresh could be done within the confines of 90 nm? Surely if they are to get more power they would need to move to 80 nm, which is my contention they cannot do within 3 months.
Hold onto rumors? You're the one claiming Nvidia will have a refresh by the time the R600 comes out, with absolutely no evidence or even hearsay to put up. The R600 is not a "rumor" and will be out early next year.
Secondly, how on Earth does the 8800GTX make the ATI R600 look crappy? When on paper the R600 is a more advanced card and will be released when it can actually be used.
Moreover, I would not be so quick to brag about Nvidia and their apparent DX10 goodness when NO ONE has benchmarked the bloody thing in an actual DX10 environment, especially against another competitor.
Do I believe the Nvidia 8800GTX is a great card? Sure, of course it is. Was it rushed? No. Do I think it will beat the R600 (its competitor)? No.
Throw all the jokes and crap you want but when the R600 is released I will be sure to look you up in the forums and gloat happily.
|CR|Constantine said:First, I was not talking to you.
|CR|Constantine said:Silus said:
It better, because it won't be competing with the 8800 GTX anyway. It will be competing with the refreshed version.
What *sources* do you have about this phantom NVIDIA refresh part?
Moreover, if ATi releases this card in the January/February time frame, you think Nvidia is going to have a refresh ready (which would have to be an 80 nm part) in 3 months ..... RIIIIIIGGGGGGHHHHHTTTT.
|CR|Constantine said:Second, the person I was talking to makes a point that the R600 will be competing against a refreshed G80 not the current model. To which I called bullsh1t.
I never said the G80 was not impressive, but with ZERO evidence on how it does in a DX10 environment against another competitor, I believe waving the OH MY GOD WE WIN TEH DX10 GOD SPOT flag is a bit premature. Especially when there are NO BLOODY DX10 GAMES OUT NOW.
So if people with ZERO evidence are going to claim the G80 is the best at DX10 with no proof, then I will happily submit that the R600, with its SECOND GENERATION UNIFIED ARCHITECTURE, TRUE 512-BIT, 1GB GDDR4, 80 NANOMETER card, will do better than the G80 in DX10.
Can I prove that, of course not because I have no hard data, and neither do you.
JDAdams said:Personally I'm not interested in the 8800 since it uses a software layer to emulate the DX10 unified shader architecture instead of actually having unified shader support in hardware; this is a little too similar to the Voodoo5 emulating DX7 hardware transform and lighting for my liking.
I'll wait for a true DX10 part before I buy, whether that be from nVidia or ATi. In the meantime my dual X1900s can take care of Oblivion or anything else at 1600x1200 with decent AA/AF on.
nobody_here said:i still want to see where Constantine got the "G80 runs DX10 through a software emulation"
i can point to threads where people that should know have said Nvidia may very well have a refresh ready, but it's pointless
my point is, everything about R600 is speculation right now, nothing about G80 is speculative other than DX10 performance, but the fact that M$ gave it the "DX10" compliant blessing tells me one thing, that it's done right
Silus said:I made the same mistake. It wasn't him that said that. It was JDAdams.
Any D3D10 accelerator must support all base API features without any exceptions, developers requiring a level playing field in order to further the PC as a platform and fruitful playground for graphics application development, be they games or GPGPU or the next generation of DCC tools, or whatever the world feels like building. Performance is the differentiator with D3D10, not the feature check list. There's no caps bit minefield to navigate, nor a base feature to be waived. So the big new inflection point probably should have been a big hint.
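The quoted point about the "caps bit minefield" can be sketched in a few lines. This is an illustrative model only, not the real Direct3D API: the dictionary and function names are made up to show the contrast between D3D9-style per-device capability probing and D3D10's mandatory base feature set.

```python
# Illustrative sketch (not the actual Direct3D API): under D3D9 an engine
# had to probe per-device capability bits and branch around the gaps;
# under D3D10 every base feature is mandatory on a compliant part, so the
# runtime check collapses to a constant.

D3D9_CAPS = {                      # hypothetical caps report for one device
    "vertex_texture_fetch": False,
    "mrt_count": 4,
}

def can_use_vtf_d3d9(caps: dict) -> bool:
    # D3D9 style: the feature may simply be absent, so code must check
    # the caps bit and ship a fallback path for hardware that lacks it.
    return caps.get("vertex_texture_fetch", False)

def can_use_vtf_d3d10() -> bool:
    # D3D10 style: base features are guaranteed on every compliant device,
    # so the "caps bit minefield" disappears and only performance differs.
    return True

print(can_use_vtf_d3d9(D3D9_CAPS), can_use_vtf_d3d10())
```

The design point the quote makes is exactly this collapse: with D3D10, developers write one code path and vendors compete on speed, not on which boxes of the feature checklist they ticked.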
SX2233 said:...nVidia could've perfected the card better instead of rushing it out...
|CR|Constantine said:The bottom line is this: the 8800GTX is a fantastic card and in no way was rushed into the marketplace. It has arrived in the appropriate time frame, with Vista about to be released. ATi (once again) is late to the party. However, since DX10 games have also been delayed, ATi will not suffer as much as one would think.
Secondly, for the record, NO ONE HERE has any clue how the 8800 will work under Vista, let alone under DX10. Odds are it should work fine, but in the world of high-end parts "just working" does not cut it. The real proof of the pudding is which card, the 8800 or the R600, will be faster and better-looking in DX10 titles. Until then, all we have is that the 8800 is a monster of a DX9 card with DX10 capability.
It's up to each individual consumer to decide what is best for them. Personally I am tickled pink that Nvidia FINALLY got their act together in terms of IQ. Now that you can get uber-quality IQ from both camps, and any top-end card can play games at playable frame rates, all that is really left is availability, price and driver support.
But make no mistake about it: when Crysis was demo'd the other day it ran like ass on an 8800, well below 60FPS. Anyone thinking they will be kicking some serious ass in Crysis with this card had better think again.
What has always held true is that whoever comes out with the latest card right before a monster resource-hogging title will win the day. So far it looks as though the R600 will be released when Crysis is released, and I DOUBT Nvidia will have something ready (in volume) at that time.
nobody_here said:they both said it, Constantine said it here
http://www.hardforum.com/showpost.php?p=1030207758&postcount=70
|CR|Constantine said:The bottom line is this: the 8800GTX is a fantastic card and in no way was rushed into the marketplace. [...]
JDAdams said:Personally I'm not interested in the 8800 since it uses a software layer to emulate the DX10 unified shader architecture instead of actually having unified shader support in hardware;
Chernobyl1 said:So come on then.
How has the 8800 been rushed out like you claim in post #1?
Silus said:It better, because it won't be competing with the 8800 GTX anyway. It will be competing with the refreshed version.
What specs are you referring to anyway? 80 nm fab, 512-bit memory interface? That's absolutely no indication of final performance, and the only area where it surpasses the G80 (if true) is memory bandwidth. Speculation is fine. I myself started a thread last week about the refreshed G80 - the G81 - but at this point, there's simply nothing that can even be compared to G80.
nobody_here said:there's nothing else to compare it to, both companies highest end products always get compared, even when there's a generation leap on one side but not the other
remember A64 vs. P4?
same thing happens every cycle with video cards, the newest and fastest from one side can only be compared/contrasted to the newest and the fastest from the other side
so now, why don't you address the points i made in my post instead of taking the cheap way out, little boy. here, i will repost it for you since you seem to have overlooked it before
what specs? actually, the only things i have seen so far that differ is the use of GDDR4 and a supposed 512bit memory bus
GDDR4 is not anything to write home about (remember when DDR2 system memory first came out, it sucked), it's still not polished; GDDR3 is mature and easy to get in large quantities. i think NV did the right thing with that choice
as far as a 512-bit bus, i think it is overkill honestly. i don't think there is a game or CPU that is going to push a card so hard that a 512-bit bus will provide any advantage over a 384-bit bus. time will tell
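For context on the bus-width argument above, peak memory bandwidth can be worked out directly from bus width and memory clock. The 8800 GTX figures below match its shipping spec (384-bit, 900 MHz GDDR3); the R600 inputs are purely illustrative rumor-era numbers, not confirmed specs.

```python
# Peak theoretical bandwidth = (bus width in bytes) x (effective data rate).
# GDDR3/GDDR4 are double-data-rate, so effective rate = 2 x memory clock.

def bandwidth_gb_s(bus_bits: int, mem_clock_mhz: float) -> float:
    """Peak theoretical bandwidth in GB/s for a DDR-type memory bus."""
    bytes_per_transfer = bus_bits // 8
    effective_rate_mhz = 2 * mem_clock_mhz   # double data rate
    return bytes_per_transfer * effective_rate_mhz * 1e6 / 1e9

# 8800 GTX: 384-bit bus, 900 MHz GDDR3
gtx = bandwidth_gb_s(384, 900)       # 86.4 GB/s
# Rumored R600: 512-bit bus, hypothetical 1100 MHz GDDR4
r600 = bandwidth_gb_s(512, 1100)     # 140.8 GB/s (speculative inputs)

print(f"8800 GTX: {gtx:.1f} GB/s, rumored R600: {r600:.1f} GB/s")
```

The point of contention in the thread is whether any 2006-era game could actually consume the extra headroom a wider bus provides, not whether the wider bus has more raw bandwidth on paper.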
SX2233 said:So based on your theory GDDR4 > GDDR3? A 512-bit bus is not overkill; look at Crysis and many other DX10 games such as Turok.
MrGuvernment said:please explain what is not perfect about it and what needs to be improved, assuming you own one of course...
Astrogiblet said:Just give up already. You're obviously not going to win this battle.
SX2233 said:how about giving [the 8800 GTX] 1gb memory and a faster gpu clock
This means you've confirmed that Crysis uses ~one gigabyte of surface data?
nobody_here said:did you read their G80 architecture article?
phide said:At your earlier request, yessir. There's no question that Beyond3D's staff writers are some of the most savvy out there. The number of forum members who don't subscribe to reality are plenty, unfortunately.
It depends on sub-forum, but you're liable to run into some of these strange folk at the B3D forums.
Brent_Justice said:Do you have any facts, or even opinions, to back up why you think the card is not "perfected" as you put it? What is it missing in your opinion?
You made a very blanket statement without any proof to back it up.
5150Joker said:They could've spent more time figuring out how to make it shorter for a start. Second they could've waited a bit longer so they could build it on a smaller process as well. It really is pointless to get an 8800 GTX right now if you already have a 7900/X1900 class card because there really isn't any game the latter cards can't handle. Having an uber high end DX10 card right now is just for the sake of a bigger e-penis and not much else. Who knows how well it will perform when actual games that demand its power like Crysis, Quake Wars, Bioshock etc surface?
5150Joker said:Who knows how well it will perform when actual games that demand its power like Crysis, Quake Wars, Bioshock etc surface?
Why would you assume that performance in these games will be sub-par? I assume that performance in the titles you've listed will be quite excellent, going from what I've seen. G80 is more than two times faster than the already capable G71 in a number of scenarios, especially those nasty pixel-shader-intensive scenarios we're going to become very familiar with.
5150Joker said:Having an uber high end DX10 card right now is just for the sake of a bigger e-penis and not much else.
It couldn't possibly be for unparalleled performance and image quality in current games, of course. Why, buying a $650 graphics card for anything other than increasing the size of your so-called "e-penis" would be quite pointless.
5150Joker said:Second they could've waited a bit longer so they could build it on a smaller process as well.
Had they built it on a smaller process, clock speeds would be higher, and we could shrug off this totally pointless 2x+ performance boost over the previous architecture. Only two times faster? What a sham indeed!
You sound like someone who just got out of a cold pool.
SX2233 said:how about giving it 1gb memory and a faster gpu clock
SX2233 said:I never claimed anything; it was simply a question, a hypothesis. It's too bad you can't see that.