Has Nvidia Ceded the High-End Gaming Segment to ATI?

You're really saying that $379 is a high price for the fastest single GPU card available? It's a bargain.

My 8800GTX launched at $679.

Even the 5850 outperforms the 285 by a fairly wide margin at a price of $260. Plus it's cooler, quieter and FAR less power hungry than the 5870 from what I've seen.

But in the end, I understand the argument that people are making against the need for even faster cards. The market has changed. Graphics technology progression has slowed, and you no longer need to upgrade every 6 months to play the latest and greatest games. Even brand-new, visually stunning games (Batman: AA is the best example of this) run at like 100+ FPS on 260/4870 hardware that costs under $150.
 
Hopefully Nvidia won't continue the refresh game and will have a suitable response in a timely fashion (e.g., 2 weeks after the 5800 series launch date), or they might as well call it quits.
Someone should probably sig this to remind us that things aren't always so black and white.

Reminds me of the whole "PC gaming is dead" thing.

Nothing is dead, things just evolve, until companies go bankrupt of course...
 
The GTX295 is still the fastest card on the market. I guess you might want to ask ATI the same question. :confused:
 
Get used to higher prices; it's how the market works. Not to mention the price is very reasonable anyway.

^this

The 5870's die is much bigger than the 4870's was. Bigger die = lower yields = higher prices. Not to mention the dollar's inflation: $250 now is worth less than $250 last year and the year before (I'll save my Federal Reserve rant for somewhere more appropriate).
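If you want to put rough numbers on that, here's a minimal back-of-the-envelope sketch using the textbook Poisson yield model. Every figure in it (wafer cost, defect density, die areas) is an assumed, illustrative value, not an actual TSMC or Cypress/GT200 number:

import math

WAFER_DIAMETER_MM = 300   # standard 300 mm wafer
WAFER_COST = 5000         # assumed cost per processed wafer, in USD
DEFECT_DENSITY = 0.003    # assumed killer defects per mm^2

def cost_per_good_die(die_area_mm2):
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    # Crude dies-per-wafer estimate; ignores edge loss and scribe lines.
    dies_per_wafer = wafer_area / die_area_mm2
    # Poisson yield: probability a die has zero killer defects.
    yield_rate = math.exp(-DEFECT_DENSITY * die_area_mm2)
    return WAFER_COST / (dies_per_wafer * yield_rate)

for name, area in [("~260 mm^2 die", 260), ("~330 mm^2 die", 330)]:
    print(f"{name}: roughly ${cost_per_good_die(area):.0f} per good die")

Under those made-up assumptions, a die about 27% bigger costs around 55-60% more per good chip, because you lose on dies per wafer and on yield at the same time.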
 
The GTX295 is still the fastest card on the market. I guess you might want to ask ATI the same question. :confused:

First off, I don't think it was clearly better than the 4870X2, but more to the point, you're talking about a card that's just under $500 new; for a few more bucks you can get CF 5850s and dance on a GTX 295. In this price range, the fact that it's SLI on a card vs. 2 separate cards doesn't really mean much except in very specific situations (like super-SFF builds).
 
For the people complaining about the price of the 5870: when has a flagship card from either company ever been released at a comparable price? The only one I can remember is the 4870, and that was probably due to AMD trying to gain much-needed market share.
 
AMD is nearly bankrupt, so I'm assuming they pulled out all the stops for the 58XX. That being said, this card is not the GTX 295 killer it should be... AMD needed to bust out the space-alien technology and sell it for $450-$500, in conjunction with the 5850/5870 cards. As far as flagships go, the 5870s are more "Defiant" than "Enterprise F".
 
AMD is nearly bankrupt, so I'm assuming they pulled out all the stops for the 58XX. That being said, this card is not the GTX 295 killer it should be... AMD needed to bust out the space-alien technology and sell it for $450-$500, in conjunction with the 5850/5870 cards. As far as flagships go, the 5870s are more "Defiant" than "Enterprise F".

AMD, while under financial stress, isn't going to run out of money. They have a very patient and rich backer in the Abu Dhabi investors.
 
AMD is nearly bankrupt, so I'm assuming they pulled out all the stops for the 58XX. That being said, this card is not the GTX 295 killer it should be... AMD needed to bust out the space-alien technology and sell it for $450-$500, in conjunction with the 5850/5870 cards. As far as flagships go, the 5870s are more "Defiant" than "Enterprise F".

I must have missed where the 5870 was touted as a 295 killer. I always assumed that, as a single-GPU card, it was meant to be the proverbial 285 killer.
 
Interesting. I thought ATI was doing well financially after the 4800 series.
 
I must have missed where the 5870 was touted as a 295 killer. I always assumed that, as a single-GPU card, it was meant to be the proverbial 285 killer.

I thought the 5850 was the 285 killer...since it beats the 285.
 
nVidia is taking more time with Fermi because they are doing something much more difficult and complicated than ATi did with the 5800 series. In fact, nVidia isn't even going up against ATi this round. They're going up against Intel's Larrabee. ATi may end up winning this war while the next war is being fought.
 
nVidia is taking more time with Fermi because they are doing something much more difficult and complicated than ATi did with the 5800 series. In fact, nVidia isn't even going up against ATi this round. They're going up against Intel's Larrabee. ATi may end up winning this war while the next war is being fought.

nVidia had better hope, for their own sake, that they don't hurt Intel's bottom line... Look what's been happening to AMD the last couple of years after they gave Intel a bloody nose in the A64 vs. P4 days.
 
Interesting. I thought ATI was doing well financially after the 4800 series.

I don't think AMD will survive the recession.

Here are a few articles of interest:

1st up, Marketwatch from last week...

http://www.marketwatch.com/story/does-amd-really-pose-a-risk-for-bankruptcy-2009-09-29

"AMD is showing signs of financial distress," said Jack Zwingli, chief executive of Audit Integrity, a research firm based in Los Angeles that compiled the list. "It's not anything that people following the company don't know."

Zwingli pointed out that AMD's high debt level of nearly $3.9 billion is a cause for concern, along with the steady stream of net losses. "Are they throwing off enough money to support the business going forward?" he asked.

Next, see #8: http://finance.yahoo.com/tech-ticke...ard-Bankruptcy?tickers=AMD,LVS,S,M,GT,MYL,HTZ


Courtesy of EE Times
(09/28/2009 1:07 PM EDT)


Advanced Micro Devices -- AMD's manufacturing arm is in better shape, thanks to the completion of the spin out and the fact that its result, GlobalFoundries Inc., has hit the ground running (and spending). But the early momentum of GlobalFoundries does nothing to improve AMD's competitive position versus Intel. In the second quarter, the company lost more market share, and Intel's portion of the microprocessor market hit a four-year high, according to iSuppli. The trend is ominous for AMD, which appears destined to continue to lose relevance. "Intel has no significant competition," IC Insights Inc. President Bill McClean said recently. While the deep pockets of the Advanced Technology Investment Co., AMD's partner in GlobalFoundries, may keep AMD in business, it is unlikely to make AMD a winner, according to McClean.
 
Don't bother arguing with PRIME1.
His green glasses are way too thick for any constructive discussion that doesn't somehow put down AMD (read his posts).

lol @ the "I don't think the 4800s sold well" comment
 
The GTX285 still averages out to about $350 online... the 5850 is a bargain at $259 considering it's even faster.
 
Someone should probably sig this to remind us that things aren't always so black and white.

Reminds me of the whole "PC gaming is dead" thing.

Nothing is dead, things just evolve, until companies go bankrupt of course...

There's no doubt PC gaming is in a wasteland right now; 91% of Activision's PC games division revenue last year was from World of Warcraft. For any other developer out there, the game is not the same as it was a couple of years ago, when consoles weren't even capable of running PC games at low settings (400-600 MHz consoles in the previous gen).
 
I wouldn't disagree with your argument, and I didn't mean to indicate that they couldn't possibly have a multi-purpose chip. If I sounded like that, well, it's because I'm not holding my breath for nVidia. I should have the cash for an upgrade in the next month, and I'll be grabbing a 5850. Considering that I'm upgrading from an 8600 GTS, I don't think it's a leap of faith to say that it will blow my mind in comparison.

Maybe NV does have some beast on the horizon, but even if I wanted to play the waiting game, I have too many unanswered questions: will the added computational power also add latency, limit clock speeds, or otherwise compromise Fermi's gaming ability? Will it make it a power-hungry beast or drive up the price? Are yields going to make it hard to find and even more pricey? And is it true they won't release a reference design, meaning that all their partners will have to figure out PCB design (and solve any remaining problems for NV)?

Then there's Eyefinity. I have a 24" monitor now, and even though I'm not ready to buy 2 more, I am excited to see ATI supporting triple-monitor gaming because it is something I've wanted to do for some time. If it's actually a supported feature right now, I'm willing to get on board with the capability early. I know ATI has a reputation for not getting all their features supported right away, but if it's not obvious, I hold on to my hardware for years, not months.

You won't know what hit you :D

I wish Nvidia had gone both routes this time, though. I still think that a simple die shrink of the GT200 would have been at least a good stopgap if they were that far behind, and it would have been a better option for gamers than a largely undefined / nonexistent marketplace.
 
There will be a quantum leap in our gaming experience, but not before at least 2 years languishing in the status quo.

Keep the faith....................
 
If you're hurting for performance right now, buy one of AMD's new cards. They are great cards, period. However, if you aren't hurting and (heaven forbid around here) have a little patience, wait a couple of months to see what Nvidia has to offer.

Trust me, guys, you won't drop dead, and 2 or 3 months in the scheme of your life isn't a long time. Although if you're a kid, I guess it seems like an eternity. For the rest of us adults, I'm pretty sure we'll make it through this tough, tough time of 150 FPS in games instead of 250 FPS. Go figure, I know it's a hard time.

/sarcasm off.

Seriously though, it's simply prudent to wait. Even if Nvidia's card isn't faster, what do you think will happen to ATI's current card prices? That's right: they will drop, thanks to the competition.
 
Hmm, I don't think Nvidia is out of gaming or has given up in any way. Their main focus with Fermi is not entirely gaming, but they didn't exactly give up on it; I'm sure it will pack close to twice the punch of their last gen in gaming, if not better. Though I just bought the HD 5870, I have always respected Nvidia's cards and have owned them as well. I don't think they're going to give up the gaming industry that easily, but they have to compete in other industries as well because of Intel: Larrabee will be more of a general-purpose parallel processor than a gaming powerhouse, so Nvidia needs to do something in that market, since they can't make a CPU because of x86 licensing.
 
Hmm, I don't think Nvidia is out of gaming or has given up in any way. Their main focus with Fermi is not entirely gaming, but they didn't exactly give up on it; I'm sure it will pack close to twice the punch of their last gen in gaming, if not better. Though I just bought the HD 5870, I have always respected Nvidia's cards and have owned them as well. I don't think they're going to give up the gaming industry that easily, but they have to compete in other industries as well because of Intel: Larrabee will be more of a general-purpose parallel processor than a gaming powerhouse, so Nvidia needs to do something in that market, since they can't make a CPU because of x86 licensing.

^^
Exactly.
 
Honestly, PhysX would be the only reason to upgrade right now. Eyefinity? A novelty. DirectX 11? A total bust until it gets into a console box. God damn, Microsoft has gutted PC gaming; all XNA did was make developers lazy, and most development now targets the lowest common denominator.
 
Honestly, PhysX would be the only reason to upgrade right now. Eyefinity? A novelty. DirectX 11? A total bust until it gets into a console box. God damn, Microsoft has gutted PC gaming; all XNA did was make developers lazy, and most development now targets the lowest common denominator.

Interesting. I think PhysX is the most useless feature.
 
Honestly, PhysX would be the only reason to upgrade right now. Eyefinity? A novelty. DirectX 11? A total bust until it gets into a console box. God damn, Microsoft has gutted PC gaming; all XNA did was make developers lazy, and most development now targets the lowest common denominator.

I don't understand how people keep regurgitating the word "novelty" about Eyefinity when everyone is talking about it and/or trying to get all the facts straight to get it working for them... honestly, Eyefinity is the biggest thing hardware has brought to gaming since AF was introduced...

Honestly, ask Kyle what he thinks about Eyefinity.
 
I don't understand how people keep regurgitating the word "novelty" about Eyefinity when everyone is talking about it and/or trying to get all the facts straight to get it working for them... honestly, Eyefinity is the biggest thing hardware has brought to gaming since AF was introduced...

Honestly, ask Kyle what he thinks about Eyefinity.

Seriously? Even though Matrox and Quadro cards have already done this? Even bigger than actually being able to view your games in 3D?

I guess when your cards have been feeding you dog food, an old french fry tastes good. :D
 
Eyefinity will only be worth it if developers give us control of the FOV, and I'm not holding my breath. They still have a tendency to crop 4:3 images to 16:9. Eyefinity is a feature that will only interest a small fraction of people. Nothing like a ubiquitous option like AA/AF.
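To put numbers on the FOV point: proper widescreen support (Hor+) keeps the vertical FOV fixed and widens the horizontal FOV with the aspect ratio, rather than cropping. A minimal sketch of that math, assuming a typical 60-degree vertical FOV (an illustrative number, not from any particular engine):

import math

def horizontal_fov(vertical_fov_deg, aspect):
    # Hor+ scaling: hfov = 2 * atan(tan(vfov / 2) * aspect)
    half_v = math.radians(vertical_fov_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half_v) * aspect))

VFOV = 60  # assumed vertical FOV, in degrees
for label, aspect in [("4:3", 4 / 3), ("16:10", 16 / 10), ("3x 16:10 span", 48 / 10)]:
    print(f"{label}: ~{horizontal_fov(VFOV, aspect):.0f} degrees horizontal")

Done right, a triple-wide span works out to roughly 140 degrees of horizontal view; done the lazy cropping way, you just lose the top and bottom of the picture instead.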
 
Seriously? Even though Matrox and Quadro cards have already done this? Even bigger than actually being able to view your games in 3D?

I guess when your cards have been feeding you dog food, an old french fry tastes good. :D

Matrox has a nice feature set but never had the performance to back it up. This really is the first time we've had the complete package.
 
Eyefinity will only be worth it if developers give us control of the FOV, and I'm not holding my breath. They still have a tendency to crop 4:3 images to 16:9. Eyefinity is a feature that will only interest a small fraction of people. Nothing like a ubiquitous option like AA/AF.

AA and AF can only take the image so far; they're still limited to what the monitor can give us, and monitors are starting to show their limits. If Nvidia gets into the game with multi-monitor support like this, we'll probably see some big support from devs.

Limited use? A lot of people have dual monitors. This isn't just about maxing out your resolution; it's about giving you a new, wider view of games. I use a secondary monitor to view my desktop and my primary for gaming, and now I'm really wanting to buy 2 more monitors.
 
Matrox has a nice feature set but never had the performance to back it up. This really is the first time we've had the complete package.

Performance? You think ATI is doing something special to their new line of cards to magically make them perform better at 5000 res? LOL. The card is a powerful card on its own, and it goes to show exactly why, when I say we don't need it, we don't need it. It's overkill for today's and tomorrow's games. So much so that ATI has all this extra power to spare and can throw it at something like Eyefinity with all that resolution. Not to mention good luck getting the devs to code the engines/games with the correct aspect ratios and, more importantly, the correct FOV for said resolution. I don't recall Kyle showing us any framerates in that video review. It was a nice review and all, but why didn't we see any FPS? Just thought of that now, actually.
 
Performance? You think ATI is doing something special to their new line of cards to magically make them perform better at 5000 res? LOL. The card is a powerful card on its own, and it goes to show exactly why, when I say we don't need it, we don't need it. It's overkill for today's and tomorrow's games. So much so that ATI has all this extra power to spare and can throw it at something like Eyefinity with all that resolution. Not to mention good luck getting the devs to code the engines/games with the correct aspect ratios and, more importantly, the correct FOV for said resolution. I don't recall Kyle showing us any framerates in that video review. It was a nice review and all, but why didn't we see any FPS? Just thought of that now, actually.

While Kyle didn't give us a specific number on framerates, his video did show Call of Duty: World at War running smoothly, even if just for a brief time. If he was getting horrible performance, I'm sure he would have noted it. But I do believe that the 5870X2 will be required to run the majority of games at these resolutions. They just have to get Eyefinity working in Crossfire first...
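Just to put the workload in perspective, here's a quick pixel-count comparison; the three-panel 1920x1200 setup is an assumption based on typical Eyefinity demos, not Kyle's exact configuration:

# Rough pixel counts: triple-wide Eyefinity vs. common single monitors.
single_24 = 1920 * 1200
single_30 = 2560 * 1600
triple_24 = 3 * 1920 * 1200

print(f"1x 1920x1200: {single_24:>9,} pixels")
print(f"1x 2560x1600: {single_30:>9,} pixels")
print(f"3x 1920x1200: {triple_24:>9,} pixels ({triple_24 / single_30:.1f}x a 30-inch panel)")

That's about 1.7x the pixels of a 30-inch screen, so a single 5870 needing help from a second GPU in the heaviest games wouldn't be surprising.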
 
I was under the impression that AMD made a couple million on the 4000 series. I know I made a poll, and more people bought the 4000 series than the GTX/GTS 2xx series this time around. I think it's been uphill for them since the 3870 got its ass handed to it by the 8800 series.

To answer your question, I think nVidia won the 3870 wars and ATI won the 4870 wars, and we shall see what happens. I think nVidia is gonna beat up on the 5000 series, but it may be too little, too late depending on when this card is released. A post-holiday release will devastate them.
 