Could the 4870x2 be the next 8800GTX?

Yeah, but at what cost?

Two 280s under load already draw 340-380W depending on how you measure. As the 9800GTX+ showed, the 55nm process offers nVidia little improvement (I'd say it shaves at most 10% off peak power).

Even then, you'd have a card drawing 310-350W at current GTX280 clocks that still can't manage a decisive win against the X2, especially at uber-high-end settings (8xAA, no?).

nVidia has to swallow this one up for their own good.

Well, NVIDIA doesn't necessarily need to score a decisive win. They just need to be more competitive than they are right now. If they can't do that on overall performance, they need to do it by providing more bang for the buck.
 
Well, NVIDIA doesn't necessarily need to score a decisive win. They just need to be more competitive than they are right now. If they can't do that on overall performance, they need to do it by providing more bang for the buck.

They ARE competitive right now; price/perf is quite good, and the nVidia-slanted crowd are still happily buying GTX 200s. That's an indicator.

As for the margins, they have cash. My prediction for GT200b: the 270 beats the 4870 and the 290 beats the 4850X2, albeit at higher load consumption (their coolers can handle it).

The GX2 card? You'd need 3 PCIe connectors and a cooling system that doesn't exist yet.

I think they need to focus better on other things if they want their "blue ocean" strategy to work.
Rumour has it that DX11's compute shaders and even OpenCL are modelled very closely on CUDA, which would let developers trained on CUDA switch from nVidia-only to everyone without much hassle, instantly killing CUDA off for every task besides PhysX before it even gets a foothold.

The current G80-based technology is running out of spots to be dominant in, even in the mid-to-low range, where they've previously sold a lot just by being competitive enough. RV730 should make sure that doesn't happen this round, so I guess they have quite a bit to worry about.

On the bright side, stock buyback! :D :D
 
:confused:I can't imagine the 4870X2 staying on top for as long as the 8800GTX did, but I would like to point out:

A- At launch, a single 8800GTX couldn't run Oblivion at 2560x1600 with max settings outdoors; in fact, one of the very first real-life justifications for 8800GTX SLI was a review of the Dell 3007WFP;). The 4870X2, on the other hand, can handle ANY game at ANY resolution.

B- At launch the 8800GTX was WAY more expensive than the 4870X2:D

C- At launch, the 8800GTX was a single-card solution for 16x10 and 19x12 gaming that pretty much destroyed the competition. Today, though, 16x10 and 19x12 gaming have a sub-$200 option, so unless you are a 30" owner (or plan to be one in the next 2 months:p) or are building a whole new system, there is simply not much reason to dump a brand new and shiny 4850/260GTX and go for a 4870X2. Just wait for prices to go down, or for games that these mid-range cards can't run at all at 19x12.

I will once again ask people to stop using Crysis as an example of a "game of the future that every system HAS to be able to run at 100 fps".

If the Crysis engine wasn't a POS, then why would the creators of the engine acknowledge that the sequel WILL NOT use the same engine, since performance optimizations are NECESSARY:confused:

And it will be a terrible thing for us all if the 4870X2 stays on top for too long: prices won't go down, and we may run into the same BS NVIDIA tried to dump on us in recent months: 768MB 8800GTX = 512MB 8800GTS = 512MB 9800GTX:mad:
 
For the record, the 4870 X2 beats the GTX 280 and 260 in the majority of Crysis tests across all the X2 reviews.

I went through and counted myself, go ahead and look if you don't believe me.
 
To be honest, most new games at moderate settings and resolutions run just fine on 8800 GTXs, so the 4870x2 will probably do well for some time to come.

However, this is going to be a short product cycle because the competition between nVidia and AMD is stiffer now. I would be shocked if nVidia doesn't have a decent 55nm part out within the month that's well priced and pretty close to a 4870x2 at $400.
 
Crysis is the new 3dMark06.:rolleyes:

No. 3DMark06 is abstract and meaningless. The variables that affect 3DMark06 scores aren't necessarily the variables that affect game performance. Crysis, however (regardless of what you personally think of the game), is at least an actual game, and a demanding one at that. So right now it is one of the best yardsticks for measuring system performance relative to playing games.
 
I can't believe how popular a $500+ card is with the gaming community. The HD4870 X2 is really hard to come by.

BTW, Ewiz has it for $534 with free shipping on their front page.
 
Yes, I don't think it was badly coded; it was just optimized for Nvidia hardware.

But that's also why I feel the benchmark is a bit misleading: it's kind of like Lost Planet, or CoJ for ATI, in that the results are skewed.

Yeah, it runs really well on High with my 280. I don't see the fuss. It won't run like that at 2560x1600, but the game looks good at 1080p.
 
Yes, I second this notion, and even if there is valid evidence that it's poorly coded in some way, I highly doubt that Crytek themselves would state that it is. Until someone can provide us with proof that it's coded poorly, I'm under the assumption that hardware just isn't mature enough for this game yet, but I repeat, it's merely an assumption. And FFS, stop saying that Crysis is one of the worst games ever; that's just getting ridiculous.

Amen.

I can run Crysis at Very High (custom config) at 1600x1200 with 16x AF and 2x Edge AA at 40-60 FPS.

Poorly coded my ass.

If all it takes is a custom config to make it look and perform better, then you can't categorically assert that the entire thing is poorly coded; that is just an ignorant and sensational stance to take.

Oh, and at the aforementioned settings, it's consistently fluid and smooth as silk, while managing to make every single other game on the market look noticeably outdated by comparison.

Jesus H. Christ, Cevat Yerli, chief bullshitter at Crytek, has said exactly that. Dan posted something about the quote a while ago too. Of course most of us knew it was poor optimizing to begin with.

I'm calling BS on this.
Prove it or cease your incessant yapping about this fabricated point.

And for what it's worth, Crysis received an average score of 91% on Metacritic and was heralded as one of the best games of 2007 by virtually every critic out there, bar none.

Just because you either:
a) don't know how to run it properly or
b) have unrealistic expectations

Does not mean that Crysis doesn't run great or play great. :rolleyes:
 
If Crysis was designed to be run at medium settings, why does it look so bad there? Unless you play at High (which, when Crysis came out, required the best card out there in SLI if you had a decent monitor), it looks worse than most AAA titles that have come out in the last few years.

To answer the question, no, I don't think the 4870X2 will be the next 8800GTX. The 8800GTX being king of the hill for so long was a once-in-half-a-decade thing; generally it doesn't take that long for new cards to come out. The 4870X2 will be to the 5870 (or whatever it's going to be named) what the 3870X2 is to the 4870 today, which means worse, or at least not better.

That's a very ignorant thing to say. Even at Medium (despite me being spoiled with a custom Very High config of my own), most critics still claimed that it looked better than any other game on the market, and it continues to be so.
 
When is this tool going to be banned? I start a thread that's completely unrelated, it's thread-crapped into a Crysis discussion, and then the standard bullshit from this Hamidxa ruins any hope of the thread recovering.:rolleyes: This douche is too lazy to read the Crysis: Warhead interviews in which Yerli admits that Crysis was poorly optimized. Hamidxa, for the love of God, please realize that game reviews aren't factual; they are opinion-based. Take your opinions and terrible ability to use a thesaurus and find another thread in which to take a mind-dump.
 
I'm just curious....

What were Crytek's hardware specs on the dev boxes used to QA test their game on Ultra High settings @ max res? Surely it had to be "playable" on their rigs. Did they test at every resolution? If so, why would they release a game "unplayable" on Ultra High settings @ max res?

I mean, the game has been out for 2+ years and no card can run it on Ultra High @ max res? What were the devs using to write the game on Ultra High @ max res?
 
I'm just curious....

What were Crytek's hardware specs on the dev boxes used to QA test their game on Ultra High settings @ max res? Surely it had to be "playable" on their rigs. Did they test at every resolution? If so, why would they release a game "unplayable" on Ultra High settings @ max res?

I mean, the game has been out for 2+ years and no card can run it on Ultra High @ max res? What were the devs using to write the game on Ultra High @ max res?

Crysis has been out for < 1 year. It looked good at playable settings when it came out and it looks even better on today's high end hardware. It will look fantastic maxed out on the next gen hardware.

Far Cry was the same story.

Can we all please stop bitching about Crysis performance? Please?
 
Here's an idea, why don't you take your stupid Crysis argument to the "Gaming" forum where they discuss "Gaming". Give me, and the rest of us, a break.

As for the original question, no, this won't be the next 8800GTX or the next 9700Pro (both of which I owned and loved). Yeah, it's fast and all, but what you have to realize is that when those cards launched, the competition was utterly blown away and had nothing to compete with. Not so in this case. As good as the 4870X2 is, it isn't an nVidia killer. All nVidia needs to do is release a GT280X2 and they're back on top for single-card performance. I don't think anyone is going to get to rest on their laurels this time around. Good for us. :D
 
LOL.

Crysis is already optimized, for the most part. The only "optimization" that can be done now is removing shader instructions (i.e., making it look worse), changing the MIP setting to 1 instead of 0 (i.e., halving texture dimensions), etc.
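To put a rough number on the MIP-setting point above: starting the mip chain at level 1 instead of 0 halves each texture dimension, which cuts memory per texture by roughly 4x. This is an illustrative sketch with hypothetical texture sizes, not measurements from Crysis:

```python
# Illustrative sketch: memory footprint of a mipmapped texture when the
# base level is skipped (MIP setting 1 instead of 0). The 2048x2048 RGBA
# texture below is a hypothetical example, not data from the game.

def texture_bytes(width, height, bytes_per_texel=4, base_mip=0):
    """Total bytes for a full mip chain starting at the given MIP level."""
    w, h = width >> base_mip, height >> base_mip
    total = 0
    while w >= 1 and h >= 1:
        total += w * h * bytes_per_texel
        if w == 1 and h == 1:
            break
        w, h = max(w // 2, 1), max(h // 2, 1)
    return total

full = texture_bytes(2048, 2048)                 # full chain from MIP 0
biased = texture_bytes(2048, 2048, base_mip=1)   # skip the largest level

print(full / 2**20)    # ~21.3 MB
print(biased / 2**20)  # ~5.3 MB
print(full / biased)   # close to 4x
```

So the knob trades a quarter of the texture memory (and bandwidth) for visibly blurrier textures, which is why it counts as "making it look worse" rather than a free optimization.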
 
I was on the fence with the GTX280, and I think I still am with the X2. But it's close, so I think the next round of cards (maybe whatever the next 280 refresh turns out to be) will tip me over the edge.

Ultimately, I hope nothing is the next 8800GTX, and we don't have to sit and wait for years on end while nVidia twiddles their thumbs, occasionally putting out a minor update series like the 9s, wondering when or if ATI will catch up. At least there's some serious competition now.
 
By Crytek's own admission, Crysis isn't coded or optimized as well as it should be. The thread I started dealt with that statement. I staunchly defended Crysis' coding, comparing it with Far Cry's, until Crytek admitted that it wasn't coded as well as it could have been. That said, at some point our hardware will be so fast it won't matter; when the next-generation cards roll around, we'll probably see exactly that. As it is, high-end cards like the GeForce GTX 280 and Radeon 4870 X2 can play Crysis up to 1920x1200 with everything on Very High in DX10 mode. The next generation will probably give us 2560x1600 with everything on Very High with some AA and AF, at least in CrossFire and SLI configurations.
 
I'm just curious....

What were Crytek's hardware specs on the dev boxes used to QA test their game on Ultra High settings @ max res? Surely it had to be "playable" on their rigs. Did they test at every resolution? If so, why would they release a game "unplayable" on Ultra High settings @ max res?

I mean, the game has been out for 2+ years and no card can run it on Ultra High @ max res? What were the devs using to write the game on Ultra High @ max res?

Ya, didn't the OG Quake or something of that era have "insane" settings that were unplayable at the time? Thus prolonging the life of the game.
 
I can't believe how popular a $500+ card is with the gaming community. The HD4870 X2 is really hard to come by.

BTW, Ewiz has it for $534 with free shipping on their front page.

I will once again restate what should be pretty obvious by now: the 4870X2 is geared towards 1920x1200 gamers using 8xAA and HIGHER antialiasing methods, and towards 2560x1600 gamers (in the latter scenario with UNHEARD-OF levels of AA :))

Just look at the reviews of the last couple of days: most sites showed results only at 1920x1200 or 2560x1600. Any site that doesn't display 2560x1600 results for this card simply DOES NOT deserve respect :eek: the 4870X2 was meant for 30-inchers!!!:cool:

This card has a market niche among 30" owners that is pretty much undisputed: it is faster and cheaper than any other solution on the market today. Sure, you can use a single 260 GTX at 2560x1600, but you will have to disable AA and live below the 60fps mark 90% of the time:(, not to mention the frustration of knowing that for 200 bucks more you could have a solution that would be futureproof :rolleyes:

Another part of the market for the 4870X2 is the thousands of people with cards at or below the 8800GT level who already game at 1920x1200 and are not happy with their cards, or who plan to move to 30" in the near future.

But I agree that at 1920x1200, if you have anything at the 8800 Ultra / 8800GTS 512MB / 9800GTX level, the 4870X2 isn't that attractive.
 
If we don't compare new cards' performance in Crysis... what else do we have to go off of? Nothing else is graphically demanding enough (bad coding or not).

An 8800GT can play most games at max settings at an average resolution (1680x1050 and below). AoC may be a different story, but that's a niche; most people play FPS games.
 
Ya, didn't the OG Quake or something of that era have "insane" settings that were unplayable at the time? Thus prolonging the life of the game.

Everquest 2 couldn't be maxed out when it was released either.
But I can't see how people would believe that Crysis was released 2 years ago... :confused::eek:
 
When is this tool going to be banned? I start a thread that's completely unrelated, it's thread-crapped into a Crysis discussion, and then the standard bullshit from this Hamidxa ruins any hope of the thread recovering.:rolleyes: This douche is too lazy to read the Crysis: Warhead interviews in which Yerli admits that Crysis was poorly optimized. Hamidxa, for the love of God, please realize that game reviews aren't factual; they are opinion-based. Take your opinions and terrible ability to use a thesaurus and find another thread in which to take a mind-dump.

Well, someone decided it was funny to say "Crysis" in the 2nd post in this thread. Then of course someone else had to reply saying that Crysis is an unoptimized piece of **** and that he does not understand why people still use it to benchmark overall system performance, in essence triggering a Crysis-centered flame war.
 
But I agree that at 1920x1200, if you have anything at the 8800 Ultra / 8800GTS 512MB / 9800GTX level, the 4870X2 isn't that attractive.

Wrong. Play Age of Conan with "Full Bloom" enabled at 1920x1200. It will crush any 8-series Nvidia card. I upgraded to a 4870 from an 8800GTX and could actually maximize my draw distance and enable the bloom effects. From other review sites I've read, it's a killer card for AoC. Now I'd like to see what the upcoming Warhammer MMO will do graphics-wise.

And to the OP: Nope, it's not an 8800GTX type of card. Nvidia is hurting after making both a speed and a pricing mistake with the 200-series cards, having expected nothing from ATI's next-gen part. ATI responded, and Nvidia had to drop prices. I think Nvidia is hard at work creating an "Ultra" version of the GTX280 to counter ATI's current salvo across the bow.
 
I had the 9800 Pro All-in-Wonder... what an outstanding card; that thing played Doom 3 nicely, BTW. I also had an 8800GTX, and eventually a couple of 8800 Ultras.

While I'd like to believe the 4870x2 is going to have the same staying power as the 8800GTX/8800 Ultra as king of the hill... I just don't see it happening. The 4870x2 does have all the ingredients, though: 1GB frame buffer, GDDR5, 55nm, and the black PCB to distinguish it from the rest. I think we can all agree it has the potential; I've read reviews where it essentially edges out a GTX280 SLI/tri-SLI setup.

What's going to be critical is drivers and getting the card to scale well in all the newer games that come out. The fact that this thing has two GPUs is what's going to hold it back. Plus, the GTX280 isn't really too far behind in some instances. What's really going to distinguish this card from the rest is when games like Far Cry 2 and Warhead come out; then we'll see if this card has staying power.

I'm also going to point out that the 4870x2 has an extra 5.0 GB/s of bandwidth that has yet to be unlocked by drivers. Check out this quote:

"One novel feature of the HD 4870 X2 is that it has an additional direct-GPU-to-GPU interconnect called CrossFire Sideport (XSP). The XSP offers an additional 5 GB/s interlink bandwidth between the GPUs but is not enabled at this time. Yes, you read correctly. The official reason why the XSP is disabled at this time is because that much bandwidth is not required with current applications and it will be enabled at some point in the future via driver update.
One major advantage of the XSP that I see is that transfers between the GPUs would have a lower latency. The Gen2 PCI-E bridge will certainly be fast, but it will take a short time to process the incoming data and send it out to the other GPU (<140 ns). With the XSP's point-to-point interlink this delay is eliminated. So my speculation is that AMD is working on using the XSP feature but the driver support simply doesn't work as intended at this time."

Source: http://www.techpowerup.com/reviews/Sapphire/HD_4870_X2/

I found that interesting...and it makes me think that maybe ATI is waiting for Nvidia's response...and as a counter...unlock that extra bandwidth :confused:
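For a rough sense of what 5 GB/s of extra interlink bandwidth buys, here's a back-of-the-envelope sketch comparing the time to push one uncompressed rendered frame between GPUs against a 60 fps frame budget. The frame sizes and the assumption of uncompressed 32-bit transfers are mine for illustration; the review only quotes the 5 GB/s figure:

```python
# Back-of-the-envelope sketch: one-frame GPU-to-GPU transfer time over the
# XSP's quoted 5 GB/s link, versus the ~16.7 ms budget of a 60 fps frame.
# Assumes uncompressed 32-bit pixels, which is a simplification.

XSP_BANDWIDTH = 5e9          # bytes per second, as quoted in the review
FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

def transfer_ms(width, height, bytes_per_pixel=4):
    """Milliseconds to move one uncompressed frame over the XSP link."""
    return width * height * bytes_per_pixel / XSP_BANDWIDTH * 1000

for res in [(1920, 1200), (2560, 1600)]:
    ms = transfer_ms(*res)
    pct = 100 * ms / FRAME_BUDGET_MS
    print(f"{res}: {ms:.2f} ms ({pct:.1f}% of a 60 fps frame)")
```

Even at 2560x1600 a full frame fits in a few milliseconds at 5 GB/s, so the bigger win from a dedicated point-to-point link is plausibly the latency saving over routing through the PCIe bridge, as the quote speculates.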
 
Not to mention the 9700 Pro came out of nowhere.

Not quite out of nowhere. ATI bought a company called ArtX. That's where they got the technology that allowed them to build the 9700Pro in the first place. Had ATI not purchased ArtX, the Radeon 9700Pro we know would never have been.
 
I found that interesting...and it makes me think that maybe ATI is waiting for Nvidia's response...and as a counter...unlock that extra bandwidth :confused:

Exactly what I was thinking. So does that mean the 290, or whatever, will have its own hidden turbo switch, which nVidia will flip as soon as ATI enables XSP... and so on? On the one hand, it's good that there's potential to squeeze even more out of what's already a very impressive card; on the other, I'd be a bit irritated if I owned one, was gaming on a 30", and knew ATI were holding back. And on yet another hand, you have to be skeptical of claims that they'll be able to improve performance substantially in some hypothetical future driver update.
 
My question really boils down to a few factors. The 8800GTX remained the king of video cards for nearly 2 years (barring revisions and small increases in performance therein). It was such a step up from previous cards that it was able to hold its ground and still is a viable option for PC enthusiasts.
There is not that big a gap between the 4870X2 and a GTX280 not to mention we are talking about 2 GPUs using crossfire, but go on...
Do you project or estimate that the 4870x2 could be the next big upgrade for the time being? I figure that a few factors would make it possible.
Not really as the 55nm refresh of the GTX280 is a month or so away.

1.) Performance vs. last-generation video cards. The 4870x2 looks to be faster than 8800GTX SLI in many instances, much like the 8800GTX beat out the 7950GX2 when it was released.
It's not a huge difference between the 4870X2 and the 9800GX2, nothing like what was seen with the 8800GTX.
2.) It seems to be able to smash every game available at ultra-high resolutions with max detail, much like the 8800GTX was able to do in its time. (Crysis doesn't count because, as admitted by the developers, it's poorly coded and generally a piece of shit.)
hmm. Now I think you are just trying to stir crap up.
3.) There are no games on the horizon that will bring this card to its knees.
What crystal ball are you using that can see this horizon so well?

Opinions?
/delete thread
 
Wrong. Play Age of Conan with "Full Bloom" enabled on 1920x1200. It will crush any 8 series Nvidia card. I upgraded to a 4870 from an 8800GTX and I could actually maximize my draw distance and enable the bloom effects. From other review sites I've read, its a killer card for AoE. Now, I'd like to see what the upcoming Warhammer MMO will do graphics wise.

I've heard more about AOC being poorly optimised than Crysis. :p

If the current screenshots are anything to go by, Warhammer Online looks to have relatively crap graphics compared to the likes of Crysis and AoC, so should probably run better on 8 series cards.
 
Crysis has been out for < 1 year. It looked good at playable settings when it came out and it looks even better on today's high end hardware. It will look fantastic maxed out on the next gen hardware.

Far Cry was the same story.

Can we all please stop bitching about Crysis performance? Please?
Why do people keep saying this BS? The 6800 cards came out right after Far Cry and could run the game at maximum settings at 1600x1200. Also remember that 1600x1200 was a large resolution for 2004. Hell, I played it on a 6600GT at max settings at 1280x1024, so stop with the comparisons to Far Cry, because it doesn't fly. :rolleyes:

http://www.anandtech.com/showdoc.aspx?i=2277&p=8
 
There is not that big a gap between the 4870X2 and a GTX280 not to mention we are talking about 2 GPUs using crossfire, but go on...

Not really as the 55nm refresh of the GTX280 is a month or so away.


It's not a huge difference between the 4870X2 and the 9800GX2, nothing like what was seen with the 8800GTX.

hmm. Now I think you are just trying to stir crap up.

What crystal ball are you using that can see this horizon so well?


/delete thread


What the fuck is wrong with you? How old are you, really? This was a "speculate all you want" thread, and you take it as some sort of personal insult? Yeah I'm trying to stir crap (SARCASM).:rolleyes:

/ban

PS: I mentioned Crysis as an exception precisely because I didn't want people talking about it, because it is a programming failure. Ironically it turned into a thread about it.:rolleyes:
 
What the fuck is wrong with you? How old are you, really? This was a "speculate all you want" thread, and you take it as some sort of personal insult? Yeah I'm trying to stir crap (SARCASM).:rolleyes:

/ban

PS: I mentioned Crysis as an exception precisely because I didn't want people talking about it, because it is a programming failure. Ironically it turned into a thread about it.:rolleyes:
Your mature response furthers my point.
 
Your mature response furthers my point.

What point? It was a post to open speculation and see what people's opinions were. You took it completely out of context, and now you're ruining the thread AGAIN. Just when I thought the Crysis monkeys were gone, a child who can't even manage to ascertain the proper context of a post joins in the charade. Go away.
 