4870 512MB vs 9800 GX2 WOW

It is interesting. If it's true then Big Green may be "boned" for the moment, but I'll stay tuned throughout the summer to see how they answer. I'm not quite sure what the current price points you speak of are, but I'm sure since it's ATi, it will be a decent deal.
 
...comparing ATI's unreleased card with a new architecture to nVidia's 18-month-old architecture?
 
Newsflash: the 9800 GX2 came out 2 months ago.:rolleyes:

Newsflash, G92 is a die shrink of G80, which came out 18 months ago. A die shrink is not an architecture change; it simply reduces power usage, reduces production costs, and enables higher clocking.:rolleyes:

G71 -> G80 was an architecture change: pixel pipelines to a unified shader architecture. Know what you're talking about before you post.
 
according to the link.. they are comparing a single-core video card against a dual-GPU one (9800 GX2)...

that says a lot..
time will tell if it's true..

and nVidia has had a chance to introduce something new
but they haven't.. and it looks like that will not happen until July..

which raises the question of what they were doing all this time
 
Newsflash, G92 is a die shrink of G80, which came out 18 months ago. A die shrink is not an architecture change; it simply reduces power usage, reduces production costs, and enables higher clocking.:rolleyes:

G71 -> G80 was an architecture change: pixel pipelines to a unified shader architecture. Know what you're talking about before you post.

ript
 
Yeah, heaven forbid we compare a card coming out with Nvidia's latest release... Whose fault is it that Nvidia hasn't updated their architecture? Is that AMD's fault?

Unless Nvidia lets slip info on their new card, there is nothing else for them to compare it to. Hello? So I say who gives a sh!t what year the G92 came out or what changes are in there; it's Nvidia's latest and greatest and thus the best thing to compare it to.

That said, Nvidia hasn't lost this battle in years, so we'll see how this pans out.
 
Newsflash, G92 is a die shrink of G80, which came out 18 months ago. A die shrink is not an architecture change; it simply reduces power usage, reduces production costs, and enables higher clocking.:rolleyes:

G71 -> G80 was an architecture change: pixel pipelines to a unified shader architecture. Know what you're talking about before you post.

newsflash: using "newsflash" became unpopular 5 posts ago.

The G92 is not a shrink; the G92b is a shrink. A shrink is a reduction in the size of the transistors and, if the fab process requires it, a slight reorganization of the transistor layout. The total transistor count never changes by more than a couple thousand. The G92 improved on the G80's architecture in many ways, including a very new and very different TMU design. :rolleyes: Furthermore, the name G92 implies that at one point there was a G91 and/or G90, which probably would have been an architecture shift much like the one from the NV45 to the G70.

Also, ending with "know what you're talking about before you post" only makes you look like a jackass.
 
Not to mention it's the fastest card we have atm.

I hope this turns out to be true... so according to the article it would drop at $300 or something? I'm getting 2 if so.
 
Not to mention it's the fastest card we have atm.

I hope this turns out to be true... so according to the article it would drop at $300 or something? I'm getting 2 if so.

yeah.... GDDR5 is currently pretty scarce, so the recent rumors expect the May versions of the HD4850 and HD4870 to be GDDR3. 0.8ns means it should easily clock to 1.1GHz (2.2 effective) though. GDDR5 due in June? Perhaps they'll use the GDDR5 version to rain on Nvidia's Nvision08 parade?
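For what it's worth, the arithmetic behind that clock claim checks out; here's a quick sketch (the 0.8 ns cycle time and 1.1 GHz target are the figures from the post above, everything else is just the standard cycle-time-to-clock conversion and DDR doubling):

```python
# Sanity check: 0.8 ns rated cycle time -> rated clock in MHz
cycle_time_ns = 0.8                         # rated cycle time of the GDDR3 chips
rated_clock_mhz = 1000 / cycle_time_ns      # 1 / 0.8 ns = 1250 MHz rated ceiling

claimed_clock_mhz = 1100                    # the "easily 1.1 GHz" claim
effective_rate_mhz = 2 * claimed_clock_mhz  # DDR transfers twice per cycle -> "2.2 effective"

print(rated_clock_mhz, effective_rate_mhz)  # 1250.0 2200
```

So 1.1 GHz sits comfortably under the 1.25 GHz the chips are rated for, which is why the "easily" isn't a stretch.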
 
Newsflash, G92 is a die shrink of G80, which came out 18 months ago. A die shrink is not an architecture change; it simply reduces power usage, reduces production costs, and enables higher clocking.:rolleyes:

G71 -> G80 was an architecture change: pixel pipelines to a unified shader architecture. Know what you're talking about before you post.

G92 isn't just a die shrink. G92 is not 18 months old.

By your logic, the new ATI chip isn't an architecture change either:rolleyes: It's an R600 times 1.5:confused:
 
The 9800 GX2 just dropped, so it is valid.

The GX2 is two *slightly* tweaked G80s bolted together. It *is* 18 month old tech. Just because Nvidia hasn't bothered to release anything better doesn't change that. There is nothing new about the G92.
 
The GX2 is two *slightly* tweaked G80s bolted together. It *is* 18 month old tech. Just because Nvidia hasn't bothered to release anything better doesn't change that. There is nothing new about the G92.


So what bearing does that have on this thread? What's the relevance? Is this thread about Nvidia's architecture, or are we off topic?
 
Let's hope this is true so nVidia gets some competition and has a reason to make a real card.
 
Any benches of the mystical 9900GT or 9900GTX, etc?
I think it's obvious that Nvidia has had a secret weapon up their sleeves this whole time. The "9800 series" was just a marketing decoy.
 
It'd be nice to go back to ATi again, heck, even AMD.

I loved my x850xt and opteron 170.
and
I loved my 9500pro and barton 2500+.
 
Came from the x800GT to the x1800GTO and then to the 3870 when the 8800GT was nowhere to be found under $300, and I don't see myself going to Nvidia anytime soon. I gotta say Nvidia has been pretty crappy on pricing when I'm looking to upgrade. Every single time I get ready to upgrade, it seems Nvidia's midrange is too hard to find or too expensive compared to ATi's counterpart. Seems like it's gonna be so again if Nvidia doesn't release anything before the 4870.

Also, people, who gives a crap? You guys were comparing the x1xxx series to the 8800 series and saying the same stuff then; it doesn't change. People will always compare to the competitor's top card no matter how old the product is, and yes, you people are correct that it is not ATi's fault Nvidia hasn't been doing anything groundbreaking with their releases. But does that really matter? Whoever has the best is what people are going to want, correct? Then why do we care if it's old or new? Is it wrong to compare X2s to Core 2 Duos? Then why does it matter that we are comparing Nvidia's current best, the 9800 GX2 (even though it's old in architecture, so to speak), to ATi's new card?
 
I honestly hope this card rips all of nVidia's current line up a new asshole. Then maybe nVidia will go back to work and put out some truly upgraded hardware and not these bullshit die-shrinks and recycled cards using different names. I've been using nVidia hardware since the GeForce 2 days and if they don't quit fucking around, I'll switch to ATI. While I can honestly say I prefer nVidia over ATI I am not a loyalist when it comes to hardware, I'll use whatever gives me the most bang for the buck. This rumored GT200 shit better rock my socks or I'll be sticking with my 8800GTX or jumping ship to the best that ATI has to offer.
 
I honestly hope this card rips all of nVidia's current line up a new asshole.
Not likely; this has fake written all over it and is more than likely part of the R700 scam, a blueprint of the R600 scam they pulled before.
Why would Crysis use the extra shaders in the 4870 when it doesn't use the normal ones in the 3870?
 
Not likely; this has fake written all over it and is more than likely part of the R700 scam, a blueprint of the R600 scam they pulled before.
Why would Crysis use the extra shaders in the 4870 when it doesn't use the normal ones in the 3870?

Maybe because there were architectural issues that prevented the R600 from using all the shaders, and that has been addressed and fixed in the R7xx series GPU?

IDK, just a thought.
 
Maybe because there were architectural issues that prevented the R600 from using all the shaders, and that has been addressed and fixed in the R7xx series GPU?

IDK, just a thought.
Yes, but the RV700 is most likely in the 15-17k area in the useless 3DMark06, and the rumored price of $330 says it is not a monster card, just another midrange 3870; not even $500 for the 2GB 4870X2 whispers monster card.
21K might be possible with the 4870X2 IF it is powered by a 4GHz CPU and a faked driver for Crysis@1024x768 made to do what 169.04 accidentally did, which was corrected in the next release with no performance loss.

Besides, the site is confirmed fake and ATI-biased:
http://www.bilgiustam.com/ati-hd2900xt-8800gtxa-karsi-crysis-demo-ve-3dmark06-testleri/
 
Seriously, what does it matter if the G80 and G92 are old? If NV can't release something to compete, then that's what will get compared.
No one had any problems comparing the G80 to the R580/1900/1950 series cards.
 
This is excellent news. It means NV will HAVE to roll out their new chips. I may jump on the 9900 bandwagon this summer.

Though, the 4870 might get my pennies if it's cheaper and the price/performance ratio is in that lusted-after 'sweet spot.'
 
Sounds from this thread like the 4870 is out there and kicking the shit out of the 9800 GX2. There's nothing to see here folks, not yet. I think rumors like this are really meant to manipulate the market. I was this close to getting another 9800 GX2, but now I'm in wait-and-see mode, not for the 4800 but for the nVidia 9900 or whatever.

And since I'm actually waiting, that means the 4800 will not be a killer part and there will be no 9900; that's my luck. I never seem to get this right. But it's only supposed to be a month, right?
 
I don't trust benches anymore.

Had a GTS, then a 3870X2, and now have an 8800GTX,

and I'm far happier with the 8800GTX. The 3870X2 was good, but when playing an unoptimised game (like FSX), or a level that was unoptimised, performance was appalling.

Just so inconsistent.. one level of Crysis would be perfect, the next could be dreadful.

Much prefer the 8800GTX.
 
But that is a different situation - 2 cores.

This one, with 1 core, is good enough.

Honestly, I still trust benches because they are benches. Even though they don't translate to real-world performance, they give a good sneak peek at what the card can deliver.

I'm no fan, but I am happy if this is true because in the end all of us consumers will profit from lower prices due to stiff competition. Nvidia fanbois, shoo! Unless you guys want a price monopoly like MS has over us.

Let's wait and see... I'll definitely buy one once it gets released and performs as claimed. It's worth giving my money to ATI to keep it alive... I'd rather have them annoy nVidia than not be there at all.
 
I can't believe that site is getting so much exposure based on some made-up graphs with absolutely nothing to back them up. This thing is all over the internet, and it's basically just graphs that anybody could do in Excel.....
 
I can't believe that site is getting so much exposure based on some made-up graphs with absolutely nothing to back them up. This thing is all over the internet, and it's basically just graphs that anybody could do in Excel.....

I think it's Underdog Syndrome. I think everybody wants to see AMD pull out a showstopper. Intel and nVidia have been tearing AMD a new one non-stop for almost two years now. It would be nice to see some balance. But AMD will have to tear nVidia a new one for me to go back to Radeons, or nVidia will have to screw up.
 
Price and performance is what I want. If the ATI 4870 can somehow perform like that at $300, I'm snatching it up first day. Though I'm wary about the companies deploying ATI cards, VisionTek looks to be the best bet.
 
Visiontek looks to be the best bet.
Aaaack! Avoid Visiontek like the plague! They don't honor warranties, their customer service is almost non-existent, and they are flat out incompetent in their service department.

I have a Visiontek 3870 that kept on corrupting and crashing in shader intensive DX10 games. I first thought it was the fan issue and updated the BIOS, then I plugged in the fan directly to 12V for full speed -- no dice. Finally I sent it back to Visiontek. After nearly four weeks with no emails from them, it came back with a letter saying they could find no problem "after extensively testing it for 36 hours." Of course the exact same problems remained, so I sent several, much more emphatic emails where they finally agreed to exchange the card. I ship it back. Three weeks later, again with no email communication from them, I get the same card back, same bullshit letter, only this time they "upgraded the BIOS." Arrrrrgh! Assholes!!! $36 in shipping/handling fees and my card still doesn't work.

Finally I took matters in my own hands, voided the warranty, and replaced the heatsink/fan with a VF900-cu. Guess what? No more problems! Plus, instead of a lousy 800/1200 overclock, I now get 891/1261 and temps stay cool.

Avoid Visiontek like the plague! Go with a reliable manufacturer like ASUS or Gigabyte. Even though their warranty period is only three years, at least they honor it.
 
I'm no fan, but I am happy if this is true because in the end all of us consumers will profit from lower prices due to stiff competition. Nvidia fanbois, shoo! Unless you guys want a price monopoly like MS has over us.


They wish. PC sales are good, Mac sales are better, and Linux isn't doing too bad either.
 
Aaaack! Avoid Visiontek like the plague! They don't honor warranties, their customer service is almost non-existent, and they are flat out incompetent in their service department.

Go with a reliable manufacturer like ASUS or Gigabyte. Even though their warranty period is only three years, at least they honor it.

Damn. Haven't had to work with them yet, but I've sold a couple of their cards. That's horror story #3 I've heard about them in the past month. That's it: if you buy an ATI card from me, it won't be from VisionTek (or Sapphire, or HIS).

I've gotta give PowerColor a shot. The only problem with them is they never seem too interested in updating their prices. NCIX still has one of their X1900s listed at $500. Um, guys, we've moved on. Sorry. How about GeCube or Club3D?

I heard rumors that BFG was in financial trouble at the same time I heard rumors that Nvidia was cutting one of their board partners. Please, for the love of god, let it be BFG so they can switch to ATI chipsets. At that point the only ATI cards I would sell would be from BFG.

Gigabyte and ASUS both won't bullshit you on the RMA, true, but they will take the 2-3 weeks of turnaround time.
 
GeCube, HIS, and Powercolor are all relatively small, Taiwan or Hong Kong based OEMs, Sapphire is a branch of a Chinese company (PC Partner, but headquarters are in Hong Kong), and VisionTek (Impero Electronics, general equipment wholesaler) and Diamond (Best Data Products, modems) are nameplate companies owned by U.S. corporations. ASUS, Gigabyte, and (sometimes) MSI are the only ones who actually build their own cards rather than slap a label on a BBA (built by ATI) card.

You know, for all the problems I had with eVga (they RMAed the wrong card back, then lost the returned card -- I waited 6 weeks for a resolution), at least they made an effort, poor as it was. I just wish nVidia had a better architecture for gpgpu applications. Even though I'll be paying a premium, I'll stick with ASUS, Gigabyte, or MSI in the future for my ATI cards.

[Edit - fact correction]
 
Let's wait and see... I'll definitely buy one once it gets released and performs as claimed. It's worth giving my money to ATI to keep it alive... I'd rather have them annoy nVidia than not be there at all.

I never really understood this line of reasoning...
My money is hard-earned and I'll only spend it on a product that is worth it. I will definitely NOT buy something just because it's from the underdog and I want to support it. That makes no sense to me. Also, AMD/ATI does not want to be the underdog; the competition is just releasing better products than them.
 
I can't believe that site is getting so much exposure based on some made-up graphs with absolutely nothing to back them up. This thing is all over the internet, and it's basically just graphs that anybody could do in Excel.....

It's no different than any other product launch, especially in the graphics card market. There's always at least one site claiming to have tested the cards first that shows some graphs. I'm actually surprised there aren't more. They all link to the same site...

Before the long-awaited R600 was released, there were tons of sites with what were later confirmed to be fake R600 performance numbers.
 