Has Nvidia Seceded to ATI in the high end gaming segment?

Performance? You think ATI is doing something special to their new line of cards to magically make them perform better at 5000 res? LOL. The card is a powerful card on its own, which goes to show exactly why, when I say we don't need it, we don't need it. It's overkill for today's and tomorrow's games. So much so that ATI has all this extra power to spare and can throw it at something like Eyefinity with all that resolution. Not to mention good luck getting the devs to code the engines/games with the correct aspect ratios and, more importantly, the correct FOV for said resolution. I don't recall Kyle showing us any framerates in that video review. It was a nice review and all, but why didn't we see any fps? Just thought of that now, actually.

I'm not really sure what your point is here? You say it's more power than we need, and then you say they can add 3rd monitor support because they have the power. So is it really more than we need? Obviously they're putting the power to good use, so why so bitter?
 
I was under the impression that AMD made a couple million on the 4000 series.
Nope, they lost money.

I know I made a poll and more people bought the 4000 series than the GTX/GTS2xx series this time around.
Yes but if you recall your poll was stupid as it left out a lot of cards.

To answer your question, I think nVidia won the 3870 wars, and ATI won the 4870 wars.
Wrong again.
http://www.bit-tech.net/news/hardware/2009/04/30/nvidia-increases-market-share/1
 
you won't know what hit you:D

a big red bolt of awesome :D

Even though Matrox and Quadro cards have already done this? Even bigger than actually being able to view your games in 3D?

do you actually think about what you type, or did you sign your fingers' power of attorney over to nvidia?

the matrox TH2G only supported very low resolution monitors, and it wasn't exactly a cheap piece of hardware. the NVS 400 series quadros aren't designed for gaming, but even if you throw support by the side of the road, they're not exactly using any cutting edge technology. the newest NVS 420 and 450 consist of... another G80 rehash (G98) on a $400+ card in late 2009... sorry, drinking hasn't done me quite enough drain bamage for that to make any sense as a viable alternative to what ATI has brought us.

as someone who's been interested in multi-monitor gaming for some time, i think EyeFinity is a big deal. that's certainly just my opinion. what's not opinion is that it's a huge improvement over the previous options, which came down to a shitty, expensive hardware adapter prone to failure, or amateur-programmed middleware with no official support from anyone. ATI brings official support of a major hardware developer, which creates a precedent, however small, for game developers to include good multi-monitor support and opens it up as something reviewers can legitimately comment on--i think multi-monitor gaming will only improve from here, even if it never catches on as the norm.

still i wouldn't underestimate its popularity. knowing that multi-monitor setups are great for productivity, and not just gaming, EyeFinity already has a consumer base ready to jump on board that i'd be willing to bet is more prolific than the multi-GPU crowd has ever been -- once ATI gets official DP to DVI converters supported, which may take time but will happen.

lastly, personally speaking, yes, gaming at 5760x1200 or possibly even 6000x1920 someday is a much more appealing possibility than being able to view my games in 3D on a shitty little 22", 1680x1050 120Hz monitor, because that's what you're stuck with using nvidia's 3D vision.

i got nothing against nvidia as a company, and i hope they do well. but seriously, enough stupidity from the nvidiots.
 
Yes but if you recall your poll was stupid as it left out a lot of cards.

Steam hardware survey shows the 4800 series having a much higher market share than the GT200 series.

Even if you think this isn't indicative of real market tendencies, you'd have to be really stubborn to refuse to believe a cheaper card will sell more than a more expensive one when both perform more or less the same.
 
I love how PRIME1's link to market share numbers contains information from Q3 and Q4 2008... So the numbers are almost a year old, and yet somehow reflect the current market share? Simple googling reveals AMD has gained back 4% of the market share in Q2 2009, so while Nvidia is technically still 'top dog', their lead has decreased (ATI did gain back more market share in Q3, but there are no concrete numbers). The HD 4000 series was in fact also profitable for AMD. This is also revealed by simple googling.

This thread is dumb.
 
I'm not really sure what your point is here? You say it's more power than we need, and then you say they can add 3rd monitor support because they have the power. So is it really more than we need? Obviously they're putting the power to good use, so why so bitter?

Just because I am critical of something, why do you assume I'm bitter? I'm not bitter at all; I compare and contrast, which is why these forums exist, no? To discuss and compare the latest and greatest hardware, for better or for worse. Just because I'm not singing Eyefinity's praises from the mountaintops doesn't mean I'm a jealous, bitter hater, as you seem to think. Adults don't always have to love or hate something to disagree with it.

Yes, power is being put to use; I never said it wasn't. What I meant was that for the average gamer here, it's not needed. Especially if you're on a current-gen card (or cards) setup. I also don't see the big deal about it either way, to be honest. It's so been-there-done-that. The TH2G has been around for ages, and while it's nice, it's nothing to drool over, imo. On the other hand, if devs were really supportive of the PC platform like years ago, then perhaps this would be a bigger deal. As it stands, I don't think it's a big deal. See? No hate, my man. Just opinion. In fact, if you look back at my earlier posts, you will see me complimenting the 5800 series of cards.
 
Just because I am critical of something, why do you assume I'm bitter? I'm not bitter at all; I compare and contrast, which is why these forums exist, no? To discuss and compare the latest and greatest hardware, for better or for worse. Just because I'm not singing Eyefinity's praises from the mountaintops doesn't mean I'm a jealous, bitter hater, as you seem to think. Adults don't always have to love or hate something to disagree with it.

Yes, power is being put to use; I never said it wasn't. What I meant was that for the average gamer here, it's not needed. Especially if you're on a current-gen card (or cards) setup. I also don't see the big deal about it either way, to be honest. It's so been-there-done-that. The TH2G has been around for ages, and while it's nice, it's nothing to drool over, imo. On the other hand, if devs were really supportive of the PC platform like years ago, then perhaps this would be a bigger deal. As it stands, I don't think it's a big deal. See? No hate, my man. Just opinion. In fact, if you look back at my earlier posts, you will see me complimenting the 5800 series of cards.

Since when has a top of the line card been targeted towards the "average gamer?" I find it funny when someone with a TRI-SLI setup is complaining (or in your own words, criticizing) another card for having too much power. :rolleyes:
 
Since when has a top of the line card been targeted towards the "average gamer?" I find it funny when someone with a TRI-SLI setup is complaining (or in your own words, criticizing) another card for having too much power. :rolleyes:

Yeah I noticed this as well. Tri-sli and highly overclocked i7, yet complaining about no use for all this power. Kind of hypocritical, no?
 
Christ, this has gone on for three pages and not one person asked the simple question.

Is the OP a complete fucking moron? :rolleyes:
 
Just seems like it, since they haven't responded as quickly as they usually would. Usually both these big card companies will release their new generation of cards not too long after the competitor releases their new cards.

This is why I don't think Nvidia expected the 5800 series to be quite this good, which is why they don't have a set release date for their gt300 series of cards.

Do I like this? No, I don't want ATI to dominate the market like they are because I don't want to pay high prices. We are seeing this trend already with ATI cards. 5850 is $259, as opposed to the 4850 which launched at $199. 5870 launched at $379, as opposed to the 4870 which launched at $299.

This is not looking good for consumers. Hopefully Nvidia won't continue the refresh game and will have a suitable response in a timely fashion (e.g., 2 weeks after the 5800 series launch date), or they might as well call it quits.


Seriously, are you blind or ignorant? Considering how cheap video cards have gotten lately, stop complaining about the price hike. It comes with the territory and R&D; ATI was the one who basically forced Nvidia to go lower on its pricing schemes. As I recall, a while ago high-end cards such as the ATI 9800 XT or the 8800 GTX were in the ballpark of $499+ when first released. So it's not anything to throw a fit about; in fact, you should praise how they're keeping it at a reasonable price for a high-end card.
 
Christ, this has gone on for three pages and not one person asked the simple question.

Is the OP a complete fucking moron? :rolleyes:

well, I don't think the question was phrased as well as it might have been, but it is an interesting topic. Nvidia has put its money where its mouth is. it has really changed what the chip is meant to do and could very likely not compete on the top-end gaming cards. a better question would be: has Nvidia abandoned the current high-end strategy for a product that is only mid-range in the gaming sense but has a much broader market?

I don't see this as a moron-level topic.
 
We'll find out once the cards are out.
It could have been their RV770, but they are being more ambitious and simultaneously conservative, putting off a shrink/optimization pass until the next revision and opting instead to rearchitect pieces.

It looks like they want to absorb the pain upfront now of switching the architecture yet again, rather than do a generation that is merely a refinement/optimization, followed later by bigger changes.

If they had simultaneously made all of the architectural changes, and also concentrated on tweaking all of the units for size, the card might have been delayed further.

Of course, there are limits to how dense they can go, but I wouldn't say they can't do better. Really, this is not much different from software development, where whenever you introduce major architectural changes, you end up going for correctness first, and then later you go back and optimize everything.

Their hand may have been forced. Looking at Larrabee and other product roadmaps, they probably felt they needed to ship a much more general-purpose GPU this generation or be caught with their pants down by Intel next year.
 
Someone please fix the title of this topic. It makes this board look like a hive of illiterate failures every time it's in the last post column.
 
Since when has a top of the line card been targeted towards the "average gamer?" I find it funny when someone with a TRI-SLI setup is complaining (or in your own words, criticizing) another card for having too much power. :rolleyes:

Ok, so because I run a very powerful rig with tri-SLI etc., that automatically dictates what I can and cannot comment on? OK, ya, make this about me now; all good, I see where this is going. I won't be a part of that nonsense.
 
Ok, so because I run a very powerful rig with tri-SLI etc., that automatically dictates what I can and cannot comment on? OK, ya, make this about me now; all good, I see where this is going. I won't be a part of that nonsense.

You either don't see or are failing to acknowledge how hypocritical your own comments are. You have a more powerful setup than the thing you are calling too powerful. That's like me telling someone they own too many guns, only for them to find out I own twice as many. It's an absurd and baseless statement. Is it too powerful for your average gamer? Sure, it's probably overkill, but how many average gamers do you know who are buying 5870s? I don't know any.
 
well, I don't think the question was phrased as well as it might have been, but it is an interesting topic. Nvidia has put its money where its mouth is. it has really changed what the chip is meant to do and could very likely not compete on the top-end gaming cards. a better question would be: has Nvidia abandoned the current high-end strategy for a product that is only mid-range in the gaming sense but has a much broader market?

I don't see this as a moron-level topic.

The "conclusions" you reach are truly amazing...:rolleyes:

How is GF100 a "mid-range in the gaming sense" product? You don't even know how it performs yet, and you're already assuming what it does in games. So no, your question is as bad as the OP's.

If anything, a "simple" gamer, one that only cares about a graphics card to play games, will ask about the direction that NVIDIA's taking, given the fact that no graphics performance was shown at their GPU Technology Conference. And the answer to that is simple: they are widening their business again. After getting into the handheld business with Tegra, they are now even more focused on the HPC market (they were already going for the HPC market with GT200 and the Tesla platform), which is a very profitable one. But does this mean that they are not focused on the consumer graphics card market? Not at all. It's still their core business, and Fermi was surely engineered with everything in mind. And even though NVIDIA is "late", this position surely gives them a clear sight of what they need to beat, since they already know how the HD 5800 series performs.
 
The "conclusions" you reach are truly amazing...:rolleyes:

How is GF100 a "mid-range in the gaming sense" product? You don't even know how it performs yet, and you're already assuming what it does in games. So no, your question is as bad as the OP's.

If anything, a "simple" gamer, one that only cares about a graphics card to play games, will ask about the direction that NVIDIA's taking, given the fact that no graphics performance was shown at their GPU Technology Conference. And the answer to that is simple: they are widening their business again. After getting into the handheld business with Tegra, they are now even more focused on the HPC market (they were already going for the HPC market with GT200 and the Tesla platform), which is a very profitable one. But does this mean that they are not focused on the consumer graphics card market? Not at all. It's still their core business, and Fermi was surely engineered with everything in mind. And even though NVIDIA is "late", this position surely gives them a clear sight of what they need to beat, since they already know how the HD 5800 series performs.

valset always claims the high road, claiming impartiality because he owns a gtx280, though he conspicuously calls it a g4saurus. his comments for the past year have pretty much been swipes at nvidia. if he feels so hurt and betrayed by nvidia, i'll gladly let him have one of my 4890's for free.
 
after receiving my 5870 order, i'm looking to unload my 4870x2's this weekend for $200 apiece if anyone's willing to pick them up from me in california. i was going to save my 4890's for my htpcs, but i'll gladly lose one just to shut him up.
 
You either don't see or are failing to acknowledge how hypocritical your own comments are. You have a more powerful setup than the thing you are calling too powerful. That's like me telling someone they own too many guns, only for them to find out I own twice as many. It's an absurd and baseless statement. Is it too powerful for your average gamer? Sure, it's probably overkill, but how many average gamers do you know who are buying 5870s? I don't know any.

The average gamer that comes to [H] would quite possibly buy a 5870. I see a lot of people here in their sigs running a single top-of-the-line GPU in their rigs. That's more of what I meant. I know Joe Budweiser who shops at Best Buy isn't going to spend 400-500 on a single video card. I assumed that much was obvious.

As far as my rig is concerned, my own rig has no bearing on my commenting on another piece of hardware. What, if I were running a single GTS 250, then it would be cool to say what I said? Please. I already said I know my system is pretty much overkill as well. So what now? Wanna talk about me and my system more? Or about the topic at hand? You decide, my man.
 
Nope, they lost money.


Yes but if you recall your poll was stupid as it left out a lot of cards.


Wrong again.
http://www.bit-tech.net/news/hardware/2009/04/30/nvidia-increases-market-share/1

Ahhh I remember you now. Not even wasting my time.

Actually, if you look at steam, the 8800 series is still the most popular card, and for good reason. But more people bought the 4800 series than the GTX/GTS series. And sorry, the GTS is the same as the 9800.

And after clicking that fucking stupid link and reading "nVidia now says it's beating ATI into submission", I decided that source, which was written almost 6 months ago, is also a waste of my time. Especially since ATI took the lead in laptop chips since then. I don't particularly care about anything less than enthusiast, where ATI did in fact outsell Nvidia, but money's gotta be made somewhere.
 
Ahhh I remember you now. Not even wasting my time.
Then your post should have ended there.:rolleyes:
Actually, if you look at steam, the 8800 series is still the most popular card, and for good reason. But more people bought the 4800 series than the GTX/GTS series. And sorry, the GTS is the same as the 9800.
As long as you ignore that the slower 4800 cards like the 4850, 4830, etc. actually competed with the 9xxx series. Leaving those cards out of the poll was stupid. Like having a poll: "Which CPU did you buy? AMD or Celeron?"

And after clicking that fucking stupid link and reading "nVidia now says it's beating ATI into submission", I decided that source, which was written almost 6 months ago, is also a waste of my time.
The source was Mercury Research. Also are you saying that six months ago the 4800 series was not out? Truth hurts, eh?

Especially since ATI took the lead in laptop chips since then. I don't particularly care about anything less than enthusiast, where ATI did in fact outsell Nvidia, but money's gotta be made somewhere.
Enthusiast = laptop chips only? :D

Actually, it's laptop cards, not chips; NVIDIA beat them in the mobile market as well.
 
Just seems like it, since they haven't responded as quickly as they usually would. Usually both these big card companies will release their new generation of cards not too long after the competitor releases their new cards.

This is why I don't think Nvidia expected the 5800 series to be quite this good, which is why they don't have a set release date for their gt300 series of cards.

Do I like this? No, I don't want ATI to dominate the market like they are because I don't want to pay high prices. We are seeing this trend already with ATI cards. 5850 is $259, as opposed to the 4850 which launched at $199. 5870 launched at $379, as opposed to the 4870 which launched at $299.

This is not looking good for consumers. Hopefully Nvidia won't continue the refresh game and will have a suitable response in a timely fashion (e.g., 2 weeks after the 5800 series launch date), or they might as well call it quits.

Where I live the 5870 costs $409.99 while the GTX 285 costs $379.99. That's pretty good value in my opinion.
 
Take heart.
Right now, ATI has introduced some really exciting technology and has definitely got Nvidia's attention.

So, ATI brought out a new card...........and it's really fast and has EyeFinity and DX11......and they are kicking nvidia around.

If you recall, nvidia had the 8800 series and the GTX 200 series which kicked ATI around for quite a while........

Tit for tat. Advantage ATI for now. Simple high tide, low tide. The tide will turn in a few months, no doubt, or not.:eek:

Still.............there is nothing out that even makes my GTX 285s sweat.........so why should I jump on the 5870????
I did it the last time with a couple of 4870 X2s and eventually went back to the 285s.

I'm on the fence about Eyefinity, though. Once it is supported in CrossfireX, I might jump on it. All I need to do is get another 24" from Dell and I'm set................my wife will kick my ass though.:eek::eek:


uhm, no, the GTX 200 series was not kicking ATI around, as ATI was definitely the price/performance leader....and now that continues with the 58xx series
 
The "conclusions" you reach are truly amazing...:rolleyes:

How is GF100 a "mid-range in the gaming sense" product? You don't even know how it performs yet, and you're already assuming what it does in games. So no, your question is as bad as the OP's.

If anything, a "simple" gamer, one that only cares about a graphics card to play games, will ask about the direction that NVIDIA's taking, given the fact that no graphics performance was shown at their GPU Technology Conference. And the answer to that is simple: they are widening their business again. After getting into the handheld business with Tegra, they are now even more focused on the HPC market (they were already going for the HPC market with GT200 and the Tesla platform), which is a very profitable one. But does this mean that they are not focused on the consumer graphics card market? Not at all. It's still their core business, and Fermi was surely engineered with everything in mind. And even though NVIDIA is "late", this position surely gives them a clear sight of what they need to beat, since they already know how the HD 5800 series performs.

I see you have been busy with the flame wars as usual. :D Sorry, but it looks like Nvidia is trying to kick ass somewhere else this round, or haven't you noticed? Anyway, I have given the troll his breakfast, and you still have a lot of threads to make your rounds to. :cool:
 
valset always claims the high road, claiming impartiality because he owns a gtx280, though he conspicuously calls it a g4saurus. his comments for the past year have pretty much been swipes at nvidia. if he feels so hurt and betrayed by nvidia, i'll gladly let him have one of my 4890's for free.

actually happy, believe it or not, I am actually impressed with them right now; like I said above, they have put their money where their mouth is. and yes, I am a proud g4saurus owner. :D some of us do not take ourselves so seriously that we carry our pride online. sorry, it was funny. hell, this is even better: http://www.collegehumor.com/video:1922186 XFX rocks.

anyways, sorry I stepped on your toes here. I just don't care for a lot of Nvidia's ethics of late.
 
The average gamer that comes to [H] would quite possibly buy a 5870. I see a lot of people here in their sigs running a single top-of-the-line GPU in their rigs. That's more of what I meant. I know Joe Budweiser who shops at Best Buy isn't going to spend 400-500 on a single video card. I assumed that much was obvious.

As far as my rig is concerned, my own rig has no bearing on my commenting on another piece of hardware. What, if I were running a single GTS 250, then it would be cool to say what I said? Please. I already said I know my system is pretty much overkill as well. So what now? Wanna talk about me and my system more? Or about the topic at hand? You decide, my man.

The average gamer on here is not your average gamer. You overgeneralized, and that's where the problem is. You made a blanket statement about a card and users' needs. If you're talking about people still running their games on a 19" monitor, I'd fully agree with you. I couldn't care less about your system; heck, I like seeing nice setups. But when someone with a powerful system says something else is "too powerful," that's going to raise red flags. Why is it acceptable for you to have a system that's overkill but not others?
 
And I won't even talk about being free of SLI issues: no microstuttering, no 95C while playing Crysis in summer with the fan at 85%, a lot more stable fps, and on top of that it's $120 cheaper. It's FAR from expensive, get real!

Your sig seems to suggest you are getting 5870 crossfire?
 
actually happy, believe it or not, I am actually impressed with them right now; like I said above, they have put their money where their mouth is. and yes, I am a proud g4saurus owner. :D some of us do not take ourselves so seriously that we carry our pride online. sorry, it was funny. hell, this is even better: http://www.collegehumor.com/video:1922186 XFX rocks.

anyways, sorry I stepped on your toes here. I just don't care for a lot of Nvidia's ethics of late.

a-ok. there are no toes to step on online, right? but we all know you're not an entirely impartial commenter, and i figured i'd do you a favor and give you a little ATI happiness. offer stands.
 
Renny, I don't understand why Mr. Valset would want a 4890, seeing that he has a gtx280, which is, from my understanding, faster than a 4890.
 
Then your post should have ended there.:rolleyes:

As long as you ignore that the slower 4800 cards like the 4850, 4830, etc. actually competed with the 9xxx series. Leaving those cards out of the poll was stupid. Like having a poll: "Which CPU did you buy? AMD or Celeron?"


The source was Mercury Research. Also are you saying that six months ago the 4800 series was not out? Truth hurts, eh?


Enthusiast = laptop chips only? :D

Actually, it's laptop cards, not chips; NVIDIA beat them in the mobile market as well.

How exactly was the 9000 series set to compete with the HD4000 series :confused::confused: It was released well prior to it :confused::confused::confused: :rolleyes:

(February 2008 for the 9800 and September 2008 for the 4000 series) Why would nVidia release something 7 months prior to the release of a new card?? :confused: I just don't understand.

Now, if you click the Steam hardware survey, you'll notice that yes, nVidia has the majority of the market share. This is not exactly a big win for them, because it doesn't look like very many people are adopting the GTX 280/285/260/295; they're just keeping their 8800 series. Bad news for both camps. But it does look more like people are adopting the 4800 series as an upgrade for whatever they had.


Regardless of what you say, this round was GTS250, GTX 260, GTX 260 216, GTX 280, GTX 285, and GTX 295 VS the HD4000 series. I would guess the overwhelming majority of the forum would state this is fact. Rebranding your previous generation's high-mid range cards is by no means a way to create a mainstream card.

This article states that "AMD Takes Notebook Discrete Graphics Market Share Lead", and is a little more recent than anything you've posted this far.

So if ATi has 53 percent of the market share in notebooks, how exactly can nVidia have more than 47 percent? I'm sure intel owns some of the remainder, too. Is nVidia such a great company that when their products are included, they can actually exceed 100 percent? I'm unsure... :confused::confused:

According to Jon Peddie's Market Watch report published by JPR ("First Quarter, 2009 - Graphics Semiconductor shipments and market activity"), data shows Q2 unit growth of 87.27% over Q1, a 36.5% market share gain in Q2 over Q1, and a Q2 market share of 53%.
 
PRIME1, can you show me a link dated recently that states AMD/ATI lost money on the 4800's?
 