Nvidia Wants To Own The Graphics Card Market

Vista was nearly a decade ago. It's ancient history. In recent years nVidia's driver support has been excellent. AMD/ATI's driver team has mostly been laid off, and they barely support their cards anymore. You could even make a compelling argument that Intel HD graphics are better supported at this point.


Yes, but the Nvidia Uberdriver narrative was very dominant at the time (mostly because ATI drivers were much worse before AMD took them over). Nvidia didn't just have a few rough edges; they blew a hole through the entire Vista landscape. I haven't seen any other hardware maker mess the bed on that scale since.


NVIDIA drivers responsible for nearly 30% of Vista crashes in 2007
http://www.engadget.com/2008/03/27/nvidia-drivers-responsible-for-nearly-30-of-vista-crashes-in-20/
 
Really, I haven't had any issues with it editing-wise. What were your issues with photos?


You can't (in Windows 7 onward) move photos and sub-folders around freely within a main folder. After a day's shooting, I can have a card full of photos with different material: landscapes scattered among macros, portraits, etc. If I try to move and group photos together by subject inside a folder, I can't. I can make a new sub-folder to group things, but that sub-folder will stay at the top or bottom of the main folder. If I want to move (say) a sunset folder down to the middle of a main folder where there are a lot of sunset shots, I can't do it. In Vista you could move anything around any which way you wanted. This has been a common complaint for a while. I'm not sure there's been a workaround yet, but I'll be looking.
 
You can't (in Windows 7 onward) move photos and sub-folders around freely within a main folder. After a day's shooting, I can have a card full of photos with different material: landscapes scattered among macros, portraits, etc. If I try to move and group photos together by subject inside a folder, I can't. I can make a new sub-folder to group things, but that sub-folder will stay at the top or bottom of the main folder. If I want to move (say) a sunset folder down to the middle of a main folder where there are a lot of sunset shots, I can't do it. In Vista you could move anything around any which way you wanted. This has been a common complaint for a while. I'm not sure there's been a workaround yet, but I'll be looking.

Ah, I handle everything in Lightroom, no issues here. Lightroom doesn't have that problem despite using the same organizational backbone.

But maybe we're using different naming conventions. I use a simple date system.

Example

2015-11-08 - Hardocp Hardware Images

And then I use keywords to label the files for search.

It's scary how convenient it is, but I also produce 240k images a year, so I had to get it down pat; otherwise it was a nightmare.
 
AMD (ATI) in graphics is needed to keep Nvidia hopping, and vice versa. There will be another if AMD falls, but it will take a while. In the meantime you will see Nvidia squeeze every dollar out of its customers (as AMD would if it were in that position) for the same old same old. Whether you are a fanboy or not, AMD or Nvidia dying, GFX- or CPU-wise, is a bad thing. Everyone will lose if that happens.

I have newer ATI and Nvidia graphics cards, 970s and 390s, and the driver issues have been greater with the Nvidia Experience stuff, but not enough to matter. Other than that, both cards perform well. You guys think drivers are bad now? It was much worse in the past...

BTW, we are already seeing the stagnation with Intel in CPUs...

As for Vista and Nvidia... between Nvidia drivers and Intel 8XX-series chipsets, they killed Vista. I was in software QA at the time... man, did that OS try me early on. After SP1, all was good.
 
I myself refuse to support Nvidia and their industry-breaking BS. They constantly shank the folks who, time and time again, have put their "faith" and hard-earned money into supporting their practices. Why anyone thinks what they routinely do is good for anyone, even those who own their products, is beyond me. They have shown time and time again that they will do anything, including outright destructive acts against even their own shareholders.

Fool me once, shame on them; fool me twice, not happening. Screw them. They do not deserve the $$$ that folks have thrown at them for years to destroy any possible competition, with their fanbase lapping up more or less any action they take no matter how corrupt it has proven to be.
 
AMD (ATI) in graphics is needed to keep Nvidia hopping, and vice versa. There will be another if AMD falls, but it will take a while. In the meantime you will see Nvidia squeeze every dollar out of its customers (as AMD would if it were in that position) for the same old same old. Whether you are a fanboy or not, AMD or Nvidia dying, GFX- or CPU-wise, is a bad thing. Everyone will lose if that happens.

I have newer ATI and Nvidia graphics cards, 970s and 390s, and the driver issues have been greater with the Nvidia Experience stuff, but not enough to matter. Other than that, both cards perform well. You guys think drivers are bad now? It was much worse in the past...

BTW, we are already seeing the stagnation with Intel in CPUs...

As for Vista and Nvidia... between Nvidia drivers and Intel 8XX-series chipsets, they killed Vista. I was in software QA at the time... man, did that OS try me early on. After SP1, all was good.

What stagnation are we seeing from Intel? They seem to be aligned with the demands of the market right now (more power-efficient chips for mobile users) ... Intel chips can be overclocked into the high 4s or low 5s (depending on the cooling) when users want that ... anyone who thinks that CPUs should be running at 6 or 7 GHz doesn't really understand the market ... the only area where Intel is still really weak is the super-mobile market dominated by ARM ... they need to focus more effort there, as that is the growth market for the near future.
 
I myself refuse to support Nvidia and their industry-breaking BS. They constantly shank the folks who, time and time again, have put their "faith" and hard-earned money into supporting their practices. Why anyone thinks what they routinely do is good for anyone, even those who own their products, is beyond me. They have shown time and time again that they will do anything, including outright destructive acts against even their own shareholders.

Fool me once, shame on them; fool me twice, not happening. Screw them. They do not deserve the $$$ that folks have thrown at them for years to destroy any possible competition, with their fanbase lapping up more or less any action they take no matter how corrupt it has proven to be.

I just buy the better product, regardless of the company producing it. nVidia has been producing better cards ever since the Geforce 8800.
 
I hate this shit, when companies try to leverage their market position in one area to force themselves into others.


When I am buying a video card, I am buying one thing only. A video card. I don't want to buy into a fucking ecosystem.

Nvidia does this shit over and over again, and all it does is piss people off. First it was PhysX, then Gsync. And there's fucking GameWorks too...

WE DO NOT WANT PROPRIETARY STANDARDS.

All standards must be open and fully cross compatible with all hardware, or die a horrible fucking death.
 
I just buy the better product, regardless of the company producing it. nVidia has been producing better cards ever since the Geforce 8800.

I mostly agree with this sentiment, but to be honest, I feel the 7970 and 290x were both better than anything Nvidia had on the market when they launched. The lead didn't last for long, but it was there.
 
Zarathustra[H];1041959593 said:
I mostly agree with this sentiment, but to be honest, I feel the 7970 and 290x were both better than anything Nvidia had on the market when they launched. The lead didn't last for long, but it was there.

Too bad miners drove the prices up and swallowed them up, which had the effect of shoving gamers out of the way, and over to Nvidia. I know that was my thing - 290x looked great but I wasn't going to pay over MSRP because of the mining craze.
 
Too bad miners drove the prices up and swallowed them up, which had the effect of shoving gamers out of the way, and over to Nvidia. I know that was my thing - 290x looked great but I wasn't going to pay over MSRP because of the mining craze.

I couldn't believe my eyes when I sold my HD 5970 two years ago on eBay. I got a whopping $450 for it. That's a card I bought in March of 2010. I had no idea why the guy who bought it off of me overpaid by at least $250. I asked around on another forum as to why this happened, and they told me it was due to the mining craze.
 
What stagnation are we seeing from Intel? They seem to be aligned with the demands of the market right now (more power-efficient chips for mobile users) ... Intel chips can be overclocked into the high 4s or low 5s (depending on the cooling) when users want that ... anyone who thinks that CPUs should be running at 6 or 7 GHz doesn't really understand the market ... the only area where Intel is still really weak is the super-mobile market dominated by ARM ... they need to focus more effort there, as that is the growth market for the near future.

I agree with this. How much more can Intel do, anyway? Despite many of us being enthusiasts, the average CPU on Steam is actually just a paltry 2.3-2.69 GHz dual core with a fairly large market share.

How long can you keep releasing these incredibly insane machines when a large portion of your clients really don't care about them?

Yeah, a lot of us want massive performance for cheap, but that really doesn't help the business either.

I think Intel's work on their chipsets and performance is actually extremely well thought out and doesn't lack innovation at all.
 
If AMD goes down, someone else will take its place.

The market is too attractive for just one player.

Yes, this perfectly explains why there is no other competition currently, considering AMD/ATI has less than 25% of the market share.

:rolleyes:
 
regardless of who you like, competition is important to keep things moving forward...
 
Yes, this perfectly explains why there is no other competition currently, considering AMD/ATI has less than 25% of the market share.

:rolleyes:

Well, Intel could always be split up by big government and then converge again 17 years later, a la the Bells.
 
Well, Intel could always be split up by big government and then converge again 17 years later, a la the Bells.

Or not get split up at all, a la Microsoft. Once you have enough market share and are pulling in that much money, you can do pretty much whatever you want to suck every $ out of your customers for lower and lower quality products. Witness cable companies where they have a virtual monopoly in areas with poor over-the-air reception: prices go up, and they have a 'screw you' attitude toward their customers.
 
If AMD goes down, someone else will take its place.

The market is too attractive for just one player.
GPUs aren't the same as selling oranges. They represent some of the most sophisticated technology on the planet. There's not necessarily going to be a viable way for a competitor to catch up if Nvidia ends up owning the GPU market.
 
I just buy the better product, regardless of the company producing it. nVidia has been producing better cards ever since the Geforce 8800.

In your opinion they have; in mine, they have not. In many ways, since the 6000 series they have produced many poorly made products. No matter the speed they run at, the quality has been utter crap in many, many cases: putting below-spec parts on their cards, using bad manufacturing practices, WAY overpricing many products, and engaging in extremely terrible and ultimately destructive practices for the industry as a whole.

It was not that long ago that Nvidia cards were far more expensive, as well as far noisier and more power-hungry than competing ATi/AMD cards; keep that in mind. I refuse to support them, as that would mean supporting the bullshit they pull on everyone who gives them any money, including game developers looking for an "edge" who squander performance away from AMD in the process...

To the person above the above statement :p (sorry, don't know multi-quote): Intel has stagnated, as the ONLY reason they have produced the chips they have is that AMD HAS been there constantly to keep them producing faster, more efficient, better processors. So yes, it appears that they have delivered very power-efficient chips etc., but also look at the past generations: performance-wise the bumps have been VERY small. So IMO they are not pushing boundaries; they are doing "just" enough to keep shareholders happy while making billions in the process, nothing more, as ever since Core 2 they really have not made massive leaps, comparatively speaking.
 
Gotta say AMD's drivers in Windows 10 are pretty damn good, and from what I have seen nVidia is not in a great position right now; they are bringing out drivers almost daily trying to fix all their issues. It's ironic that AMD has better Windows 10 drivers than nVidia, an advantage nVidia has enjoyed for years. Personally I have never had an issue with either nVidia's or AMD's drivers; both have been good. The only real issue is that AMD's hardware has the power but has been held back by poor multithreaded drivers, though DX12 seems to be heaven for AMD. nVidia has a great tool to help them optimise their drivers and get the most out of their GPUs, but at present it's not doing much for them in DX12, maybe because they had good multi-threaded performance to begin with while AMD's drivers seemed to be running on a single thread the whole time. The 290X is the longest-lasting card ever: it still plays everything out there and is getting pretty ancient.
 
The lock-in they're talking about is forcing people to have an Nvidia card for things like using the Shield controller with your PC.

There's no reason why they couldn't release a controller driver outside of the Nvidia graphics drivers, but they won't.

It's kind of a d*ck move if you ask me.
 
What stagnation are we seeing from Intel? They seem to be aligned with the demands of the market right now (more power-efficient chips for mobile users) ... Intel chips can be overclocked into the high 4s or low 5s (depending on the cooling) when users want that ... anyone who thinks that CPUs should be running at 6 or 7 GHz doesn't really understand the market ... the only area where Intel is still really weak is the super-mobile market dominated by ARM ... they need to focus more effort there, as that is the growth market for the near future.

The stagnation is that I do not really have a compelling reason to upgrade a socket 1366 Nehalem CPU.
 
The lock-in they're talking about is forcing people to have an Nvidia card for things like using the Shield controller with your PC.

There's no reason why they couldn't release a controller driver outside of the Nvidia graphics drivers, but they won't.

It's kind of a d*ck move if you ask me.

Yeah, that's the reason I left nVidia in the first place and went with the 290 over two years ago; haven't regretted it one bit. I don't like the way nVidia tries to force people to stay in their ecosystem: GameWorks, with stupid amounts of tessellation being done to try to cripple the competition; Shield, which has no real reason to be GPU-vendor-locked; and PhysX. It pisses me off that I have an AMD card and an nVidia 470, yet you can't use the 470 as a PhysX card alongside an AMD card. WTF is that about? You have to hack the drivers and the game to get around it, but every driver update breaks it, and it's now almost impossible to do with newer games.
 
So Nvidia hasn't learned anything from Origin, Raptr, and others.

We don't need another "community experience," especially one that forces us to be a part of it.
 
I hope that's true. I don't think I'll buy it, mainly because I'm usually more of a middle-of-the-road guy, so I usually buy late, but mid-high, and it works well for me. I bought a gfx card a year or so ago, a 280, and it works. Not that I've used my desktop much; there aren't many games I really want to play. Did use it to play the shit out of Witcher 3, though. And a good bit of Dragon Age, but man did that get boring after I wiped out the dragons.

Oh, I only buy sweet-spot price/performance cards also. I hate overspending on a halo top card (or anything) that is priced out of proportion to what it should be just because of marketing. I'm running 2x 280X OC'd cards and really can't complain, especially with the performance of these cards increasing with almost every big driver release! :)
 
GPUs aren't the same as selling oranges. They represent some of the most sophisticated technology on the planet. There's not necessarily going to be a viable way for a competitor to catch up if Nvidia ends up owning the GPU market.

The same could be said about CPUs and just take a look at ARM.
 
Intel doesn't make graphics cards.

Sooo, if AMD goes under, Intel should pick up the GPU portion and Nvidia should pick up the CPU portion.

That show would be worth getting a jumbo size popcorn.
 
We are already boned.

Example: the GTX 970 has not come down in price even though it came out over a year ago.

That is the "future" (present) that we have.
 
The same could be said about CPUs and just take a look at ARM.
That's apples to oranges. ARM isn't competing on the desktop; x86 processors still dominate in terms of maximum performance, and ARM processors have been in development since the '80s. They gained so much market share because of smartphones. Even then, the fact that it's taking Intel so long to catch up to them kind of proves the point: they had to develop technology for a different market, and it took them years to close the gap, and that's coming from a high-tech company with deep pockets. With high-end graphics, the difference is arguably even more extreme. Nvidia and AMD are neck and neck in different ways, Intel is a distant third, and that's mostly it.
 
Go for it, NVIDIA. I have no allegiance. I go for the best. If they can be the best, I'll pick up one of their cards. Of course, I've been on AMD lately, but my next upgrade is on the Green team.

Just don't make it all gimmicks and proprietary BS.
 
If AMD goes down, someone else will take its place.

The market is too attractive for just one player.

The only outfits I can think of that are still producing discrete graphics cards are Matrox and S3 (not even really sure about the latter). Anyway, I'm thinking neither has much to offer the majority of people inhabiting the [H] Forums.

The list of companies involved in developing and/or manufacturing discrete graphics for either the PC or Apple who are now either no longer doing so, if not out of business entirely, is a mile long.

Just off the top of my head I can recall Cirrus Logic, 3dfx, Weitek, Tseng Labs, Rendition, SGI, Trident, Oak Technology, 3Dlabs, SIS, No. 9, Chromatic and I'll bet at least a dozen others I can't remember. I'm showing my age here.

I could see Intel electing to jump into the arena, but only in the absence of a legitimate Nvidia competitor. Especially if Intel were to absorb what remains of ATi/AMD's personnel, if not their IP. Seems to me that AMD's concept of CPU/GPU "Fusion" is inevitable, and somebody having an existing x86/x64 capability would have a distinct edge on Nvidia, *if* they chose to leverage that advantage.
 
Intel doesn't make graphics cards.

Perhaps, but they are the largest and most profitable graphics supplier because of chipsets ... discrete GPUs are only used in a small percentage of the market (which is why the GPU companies have come up with ways to sell those users 2-4 cards for the same computer) ... NVidia and AMD are about $5 billion companies each, while Intel is closer to $50 billion ...

every computer/server manufactured has to have at least one CPU in it (most coming from Intel) ... every computer/server manufactured also has to have a chipset with graphics (again, mostly from Intel) ... finally, some smaller percentage of computers will add a separate GPU (mostly from NVidia) ... because Intel sells so many graphics chipsets, and because most computers use only the chipset for graphics, they are the clear graphics champ in the non-GPU market
 
Just don't make it all gimmicks and proprietary BS.

But that's exactly what we are talking about here :p

- Proprietary PhysX
- Proprietary GameWorks Effects exclusives
- Proprietary G sync standard

Personally I don't give a rats ass about the shield garbage, but there's that too.

It's one thing if there were something special about Nvidia hardware making it alone capable of computing PhysX calculations, GameWorks effects, or working with G-Sync monitors, but that most certainly isn't the case. These things use standard GPU capabilities, and only don't work because of Nvidia's shitty tendency to try to lock others out.

Meanwhile AMD has taken the high road with everything they've done, the most recent examples have been FreeSync which is an open standard, and HBM for which they did a huge chunk of the development, but will get no royalties and are allowing others (including Nvidia) to use it.

Can you imagine if AMD did the same thing Nvidia does?

You'd either have to pick and choose which games you like to get the AMD or Nvidia exclusives in game or buy one of each to enjoy all games with all exclusives. It's insane!

AMD has taken the high road thus far, and honestly it has hurt them, and Nvidia probably will get away with this type of behavior.

Here's how GPUs should work:

Each competing on how well they perform in the latest DirectX and OpenGL standards, with everything else being done in OpenCL. Outside of performance to these standards (and other related aspects, such as noise levels, power use, etc.) there should be no further brand differentiation based on features, and never under any circumstance an exclusive of any kind.

What Nvidia is doing is really shitty, and it hurts their customers more than anyone else. You would hope behavior like this would draw the attention of FTC regulators, but they sadly tread WAY too lightly these days.
 
The lock-in they're talking about is forcing people to have an Nvidia card for things like using the Shield controller with your PC.

There's no reason why they couldn't release a controller driver outside of the Nvidia graphics drivers, but they won't.

It's kind of a d*ck move if you ask me.

Extra support costs money, and companies aren't generally in the habit of spending additional money to support the competition's products.
 
Zarathustra[H];1041960896 said:
Meanwhile AMD has taken the high road with everything they've done, the most recent examples have been FreeSync which is an open standard, and HBM for which they did a huge chunk of the development, but will get no royalties and are allowing others (including Nvidia) to use it.

Can you imagine if AMD did the same thing Nvidia does?

They can't, due to their weak market-share position. So they only take "the high road" because the industry ignores them if they try proprietary. See: Mantle.
 
Zarathustra[H];1041960896 said:
- Proprietary Mantle
- Proprietary Freesync
- Proprietary True Audio standard

AMD has taken the same road as NVIDIA. The only difference is AMD is riding in a donkey cart while NVIDIA is cruising along in a Corvette.
 
That's apples to oranges. ARM isn't competing on the desktop; x86 processors still dominate in terms of maximum performance, and ARM processors have been in development since the '80s. They gained so much market share because of smartphones. Even then, the fact that it's taking Intel so long to catch up to them kind of proves the point: they had to develop technology for a different market, and it took them years to close the gap, and that's coming from a high-tech company with deep pockets. With high-end graphics, the difference is arguably even more extreme. Nvidia and AMD are neck and neck in different ways, Intel is a distant third, and that's mostly it.

I'd keep my eye on the PowerVR folks. They have tons of R&D money and it looks like they are inching upwards in terms of GPU class, especially with the GT7900. They also have past experience with discrete graphics.
 