NVIDIA - Anti-Competitive, Anti-Consumer, Anti-Technology

A lot of this argument comes from how they try to manipulate the market and lock in consumers.

Aside from GPP (which wasn't 'vendor lock-in'), what you're listing as lock-in are also technologies that they got to market first. G-Sync, which keeps getting repeated here, is a great example: they had working hardware when AMD had a hacked-together solution in response, and AMD is still cleaning up the shitshow that is FreeSync such that it's actually competitive with G-Sync across the board.

Gameworks runs on AMD hardware, and sometimes better- it's done in DirectX!

CUDA is another example- but before CUDA, there was nothing. Apple did OpenCL because fruit; AMD picked it up because they had nothing else. Also, CUDA is fast.
 
This is where I straddle the line. I'll buy the NV GPU because it does what I want. But I'm not buying the monitor. I'll just adjust my games to play synced without dropping frames. It's worked so far. Then again, I only need 60 or 120 to be happy.

Oh, me too, I just got sick of screen tearing and input lag. I was planning on getting a Gsync display until I saw the prices. So I ended up switching to an AMD card with a freesync display for cheaper than what the Gsync display would have cost me.

What's even more bullshit is that you KNOW Nvidia could release a driver update that would work with freesync displays, but they won't because they want to charge a premium for Gsync. Anti-Consumer indeed.
 
Aside from GPP (which wasn't 'vendor lock-in'), what you're listing as lock-in are also technologies that they got to market first. G-Sync, which keeps getting repeated here, is a great example: they had working hardware when AMD had a hacked-together solution in response, and AMD is still cleaning up the shitshow that is FreeSync such that it's actually competitive with G-Sync across the board.

Gameworks runs on AMD hardware, and sometimes better- it's done in DirectX!

CUDA is another example- but before CUDA, there was nothing. Apple did OpenCL because fruit; AMD picked it up because they had nothing else. Also, CUDA is fast.

That's the thing. Whether it was their own innovation, or one based on someone they acquired, they typically put these things out there first. Some of them were even offered to AMD, who refused, and went the "open" (or slightly more open) route in response to look benevolent. If AMD put something together first and put it out there with a fairly wide early adoption, do you think they'd make it open from the start? Probably not.
 
That's the thing. Whether it was their own innovation, or one based on someone they acquired, they typically put these things out there first. Some of them were even offered to AMD, who refused, and went the "open" (or slightly more open) route in response to look benevolent. If AMD put something together first and put it out there with a fairly wide early adoption, do you think they'd make it open from the start? Probably not.

They'd be pulling the same crap. Instead, they're trying to bankroll on sympathy- but I buy technology, not feels :D
 
Oh, me too, I just got sick of screen tearing and input lag. I was planning on getting a Gsync display until I saw the prices. So I ended up switching to an AMD card with a freesync display for cheaper than what the Gsync display would have cost me.

What's even more bullshit is that you KNOW Nvidia could release a driver update that would work with freesync displays, but they won't because they want to charge a premium for Gsync. Anti-Consumer indeed.

Yeah, the latter is kind of shitty. Both technologies are interesting to me, but I just haven't felt like buying into them. I typically play at what's now considered low-res :D 1080p on a 47" TV that's mounted on the wall above my gaming computer / desk, so it just doesn't currently interest me enough to buy into one or the other. I might change my tune when I actually feel like going and buying a new monitor at some point, but for now, it's just not that high on my list. VSync and high/ultra settings at "low" res do the trick for me, at least for now. :D
 
Reasonable prices on video cards, eh, sure. But their Gsync technology? Fat chance! It's total bullshit when you can get the same technology from AMD at half the price.
My solution is to just use a GTX 1080 Ti card so I'm pegged at 60fps in all games at 4k resolution. No need for adaptive vsync. And IMO gaming on a large 4k TV is way better than any PC monitor, 120hz or not.
 
What's even more bullshit is that you KNOW Nvidia could release a driver update that would work with freesync displays, but they won't because they want to charge a premium for Gsync. Anti-Consumer indeed.

I don't know that it could be done in a driver update, though I'd expect it to be possible.

What I do know is that Freesync (1) needs to die. At least with Freesync 2 you more or less know what you're going to get, and the issue I have is that it took AMD far too long to even approach parity of spec.

[and G-Sync remains technically superior as the range always starts at 30Hz, regardless of the top refresh rate...]
 
That's the thing. Whether it was their own innovation, or one based on someone they acquired, they typically put these things out there first. Some of them were even offered to AMD, who refused, and went the "open" (or slightly more open) route in response to look benevolent. If AMD put something together first and put it out there with a fairly wide early adoption, do you think they'd make it open from the start? Probably not.

Mantle became Vulkan?
 
And IMO gaming on a large 4k TV is way better than any PC monitor, 120hz or not.

I have no intention of ever going back to sub-120Hz for gaming. Ever.

I'd prefer 240Hz if that were feasible; also, not really a fan of large-screen gaming for the kinds of gaming I do. It would just mean that it has to be further away.
 
Mantle became Vulkan?

And it also became part of DirectX 12. Hats off to AMD on that one. I still think there were other motivations behind it than just being open, though. If Mantle had taken off immediately, I'm not sure we would have seen that migration. That's total what-if territory, though.
 
And it also became part of DirectX 12. Hats off to AMD on that one. I still think there were other motivations behind it than just being open, though. If Mantle had taken off immediately, I'm not sure we would have seen that migration. That's total what-if territory, though.

It was not AMD who initiated it; they took the initiative when developers asked for it.
They had the consoles and they wanted a longer life for those consoles, hilariously weak as they were compared to the mighty Xbox 360, which surpassed the best builds out there when you first got it (though it didn't take long for Nvidia's 8800 series to surpass it).
The Xbox One and PS4 were rather low-end systems from the get-go, and the refreshed Xbox is roughly a dual-core Intel Pentium paired with an RX 580, so you need powerful APIs to work with that.

Custom design work is where AMD will win over the devs, preparing them for a rematch another day in desktop graphics.
 
Gotta remember that DX12 had been in development for quite a while; it's not like it was a new concept that was being applied for Microsoft or for AMD.

Now, I like Vulkan, and Mantle was cool, but neither were really innovative. The main issue is that OpenGL was going nowhere, and AMD seized an opportunity.
 
the mighty Xbox 360, which surpassed the best builds out there

Sorry, this one made me chuckle- neither console of that generation was particularly fast relative to PC hardware. Really, a console hasn't been fast since the original Xbox that ran lower-end desktop hardware, and yet it was slow for its time :D.
 
I have no intention of ever going back to sub-120Hz for gaming. Ever.

I'd prefer 240Hz if that were feasible; also, not really a fan of large-screen gaming for the kinds of gaming I do. It would just mean that it has to be further away.
Eh, would 120hz 4k gaming be nice? Absolutely. But it's going to require *massive* GPU power. I don't even know if the top Volta card will be able to do it. I prefer the added detail of a 4k image over 1440p, personally, not to mention HDR. And gaming from my couch is really nice. I don't think I could go back to an old school desk.
 
Eh, would 120hz 4k gaming be nice? Absolutely. But it's going to require *massive* GPU power. I don't even know if the top Volta card will be able to do it. I prefer the added detail of a 4k image over 1440p, personally, not to mention HDR. And gaming from my couch is really nice. I don't think I could go back to an old school desk.

Nice thing is that you can drop settings if FPS is more important than IQ- it's the balance we find every time a new game comes out!

And while I agree that the couch itself is nice, it's generally unergonomic- but that was just my IMO in contrast to your IMO, not a right/wrong thing ;)
 
Sorry, this one made me chuckle- neither console of that generation was particularly fast relative to PC hardware. Really, a console hasn't been fast since the original Xbox that ran lower-end desktop hardware, and yet it was slow for its time :D.

It had faster cache than any GPU, it had a triple-core CPU which was actually as fast as anything out there, and it had a GPU near the top end of the market with a feature set ahead of any desktop GPU.
You had to shell out 75% of the cost of the Xbox just for a GPU better than the one in it.

What you could have had was an X850 XT, which was about the same, or an X1900, which was quite a lot faster but $150 more than the entire Xbox 360; no other GPU from ATI was comparable at the time of launch.

The Xbox 360, when it initially came out and for roughly 12 months after, was mighty even for computer guys.

Edit: the X1800/X1900 actually came AFTER the 360, so at launch you could only get rather similar performance.
 
Gotta remember that DX12 had been in development for quite a while; it's not like it was a new concept that was being applied for Microsoft or for AMD.

Now, I like Vulkan, and Mantle was cool, but neither were really innovative. The main issue is that OpenGL was going nowhere, and AMD seized an opportunity.

Right. Because if we really want to talk about giving coders more access to the hardware, we have merely to go back to the DOS / Amiga / Console eras of the 80s-90s. It's really the NT-based OSes that started abstracting things. For good reasons too really, it's just that you're not exactly going to squeeze the most performance and trickery out of those methods.
 
It had faster cache than any GPU, it had a triple-core CPU which was actually as fast as anything out there, and it had a GPU near the top end of the market with a feature set ahead of any desktop GPU.
You had to shell out 75% of the cost of the Xbox just for a GPU better than the one in it.

There's those moving goalposts!

Yes, consoles are typically a good value if the performance is good enough and you don't mind it being fixed (and attached to a TV and limited input devices and a level of vendor lock-in to make Nvidia jealous and...).

But the three-core CPU was low-IPC trash- that thing made the Pentium 4's IPC look beefy!

[and yes, it had a decent DX9.5 GPU, and yes, it had some cache, and yes, it had so little RAM that it just couldn't handle detail or large maps...]
 

Agreed. Honestly, the first thing I think of when someone comes out with a counterargument of "the other side does it too, and the side I am defending at least is ethical about it!" or some such nonsense, is that the person doing that is for sure 100% a paid shill.

No one talks like that other than third graders and paid shills...
 
The best thing to do is have as many screens as possible in your house before your wife tells you enough is enough, and then have as many consoles, PCs, old computers, and other hardware attached to those screens as you can pack into a piece of furniture. Then you have to branch out to your kids' rooms and add more there, trickle down your old consoles and PCs into their rooms, spare rooms, basement furnace rooms, etc. until which console, which brand of GPU, which streaming box, etc. ceases to matter in the least.

That was my approach. I have so many devices in my house that there's plenty of room for experimentation, buying the "other" brand, trying a console, sitting at a desk, sitting on a couch, sitting on a bar stool, laying in bed, sitting on the floor, etc. :D

At the point where I'm at now, none of these arguments really matters. I could be considered to be on any or all sides of all of these arguments based on where you're standing in my house. Luckily I'm good at concealing the mess. :p
 
Agreed. Honestly, the first thing I think of when someone comes out with a counterargument of "the other side does it too, and the side I am defending at least is ethical about it!" or some such nonsense, is that the person doing that is for sure 100% a paid shill.

No one talks like that other than third graders and paid shills...

Except ALL corporations do it. I'm not using that in defense of any particular one, or trying to push one over the other, but it's true. ALL corporations use their fair share of tactics to get themselves ahead. Some are shady, some aren't, but all of them have a healthy mix of both.
 

[image: 5UDvmBp.png]
 
Funny enough, I made this because you couldn't have a discussion on AMD cards in the AMD forums even when I ran them exclusively for the past 10 years, but I have been enlightened to the green eye for the past 6 months or so and shall give no quarter now. :pompous:

[image: NsZ0tsc.jpg]


Good video so far though, definitely a good history write-up and some stuff I've long forgotten about. I can feel my heart blackening and the taste for blood growing! My timing on bandwagons is inversely impeccable though lol, I should start shorting my PC purchases.
 
Say what you will about nVidia but they release kick ass GPUs at reasonable prices. Look at what AMD charged for Vega, it's a joke. If you guys are at all serious about PC gaming I encourage you to go with an nVidia GPU. 4k gaming just isn't possible with AMD.

I own a 980ti and a Gsync display and find all four of your statements to be utter nonsense.
 
Aside from GPP (which wasn't 'vendor lock-in'), what you're listing as lock-in are also technologies that they got to market first. G-Sync, which keeps getting repeated here, is a great example: they had working hardware when AMD had a hacked-together solution in response, and AMD is still cleaning up the shitshow that is FreeSync such that it's actually competitive with G-Sync across the board.

Gameworks runs on AMD hardware, and sometimes better- it's done in DirectX!

CUDA is another example- but before CUDA, there was nothing. Apple did OpenCL because fruit; AMD picked it up because they had nothing else. Also, CUDA is fast.


Gameworks used to be Nvidia exclusive as I recall.

I never mentioned CUDA, but it doesn't bother me. OpenCL exists.

There was also PhysX, where they bought out the company and then proceeded to block it from being used in the drivers whenever an AMD GPU was installed.

As far as G-Sync is concerned, yes, they were first and they did do it better, but the fact that they didn't make it an open standard speaks volumes against them.

EVERYTHING should use open, royalty free standards.
 
"Jimmy "the Scottish Hammer" McJimmerson"

Why can't I ever think of kickass names like that, fuck!
 
From the green eye to the brown eye we know the path thru nvidia to its customers.
 
Did Hardocp ever confirm one of its members to be a paid Nvidia shill? Just curious if that was ever official. I know several other sites came out with it.

::17 minutes in, looks at timer on bottom:: "holy sht this is an hour long"
Lol I did the same thing at minute 35

It takes a lot of research and prep work to make an hour-long video on this topic. Kudos to him for putting in the time
 
That's the thing. Whether it was their own innovation, or one based on someone they acquired, they typically put these things out there first. Some of them were even offered to AMD, who refused, and went the "open" (or slightly more open) route in response to look benevolent. If AMD put something together first and put it out there with a fairly wide early adoption, do you think they'd make it open from the start? Probably not.
You say "these things", but it kind of varies what you're talking about. AMD had hardware tessellation 10 years before it was official in DirectX 11. Also, as someone who cares about antialiasing, they were the first to have MSAA, first to have mixed mode MSAA + SSAA for alpha textures, first to have officially supported SSAA, first to have DirectX 11 SSAA, first to have temporal AA, and first to have shader-based antiliasing. Nvidia came up with their own solution after-the-fact in all those areas. In some of those cases I would argue the Nvidia version was better, but AMD has pioneered plenty of stuff also.
 
ok, I got it ... nVidia is a soulless company that has been corrupted by wealth and greed. Nothing new under the sun, folks. Didn't Bill steal Steve's GUI? What was that movie called again? Oh yeah, Pirates of Silicon Valley. But rest assured, if not before, then when this ride we call 'life' is over, justice will be served ... you can count on it
 
You say "these things", but it kind of varies what you're talking about. AMD had hardware tessellation 10 years before it was official in DirectX 11. Also, as someone who cares about antialiasing, they were the first to have MSAA, first to have mixed mode MSAA + SSAA for alpha textures, first to have officially supported SSAA, first to have DirectX 11 SSAA, first to have temporal AA, and first to have shader-based antiliasing. Nvidia came up with their own solution after-the-fact in all those areas. In some of those cases I would argue the Nvidia version was better, but AMD has pioneered plenty of stuff also.

Yep! My point isn’t to say which one is better or who came up with what first. Both companies, and ones that came before them have done cool things. Corporate competition, tactics, goof-ballery, antics, shenanigans, etc. are just low on my list of deciding factors. I just buy what I like at a particular moment, and call it good.

Never hurts to call out people/companies on their BS though. Good to keep things in check. These types of things will continue to occur in all businesses. (Well most.)
 
Say what you will about nVidia but they release kick ass GPUs at reasonable prices. Look at what AMD charged for Vega, it's a joke. If you guys are at all serious about PC gaming I encourage you to go with an nVidia GPU. 4k gaming just isn't possible with AMD.

Not sure what you are complaining about. AMD released Vega at $499.99, and it was available at that price for a bit until the mining craze took over. I had a Vega 56 for a bit. It ran damn well undervolted and overclocked. Yeah, they used more power, but performance-wise I think they were priced pretty competitively at launch.

I currently have a 1080ti. I love the card, but I honestly don't love Nvidia. I know they are shady, and they tried to pull this GPP crap off when they don't even have any competition. It took serious backlash from the public, and Kyle being vocal about it while most popular sites were scared to even criticize Nvidia directly, before Nvidia shut it down, acting like they were the ones being treated unfairly lol.

The day AMD has an equivalent card, or heck, if Intel comes out with a kick-ass gaming GPU, I will likely never buy Nvidia again. They get my money since they are the fastest, but I will happily sell my card at a discount if there was anything else faster. But I am not going to sit here and say Nvidia doesn't have some serious ethical cleansing to do! I am sure it starts from the top.
 
ok, I got it ... nVidia is a soulless company that has been corrupted by wealth and greed. Nothing new under the sun, folks. Didn't Bill steal Steve's GUI? What was that movie called again? Oh yeah, Pirates of Silicon Valley. But rest assured, if not before, then when this ride we call 'life' is over, justice will be served ... you can count on it

I always wonder how it becomes okay just because everyone does it lol! This is a debate worth having; no one should just get a pass. Hey, they all do it, so it's okay? Not sure if that is exactly what you are saying.
 
A lot of this argument comes from how they try to manipulate the market and lock in consumers.

Examples:

Gameworks: An attempt to get game developers to create games with features prevented from being used on the competition's hardware.

G-Sync: Knowing that people tend to buy monitors a lot less often than they do GPUs, they created a tech that only works with their GPUs, so that even if the competition were to catch up, consumers are locked in to Nvidia GPUs or their fancy G-Sync monitor no longer syncs.

GPP: A blatant way to try to force board partners to drop the competition from their gaming brands or face losing support from Nvidia.


I think it is pretty clear they are anti-competition. Sure, they are in the lead right now, and they deserve that spot. They have done a better job than anyone else at putting out high-end GPUs. The tactics above are pretty clear attempts to hurt competition and make sure no one else ever catches back up again, though, and that is not cool at all and possibly even illegal.

I agree with everything you said. I have a 1080ti because it's fast and efficient, and I have a Gsync monitor. If AMD has another GPU that competes with or whoops on Nvidia like it did with the 6870 and so on, I will dump both in a second lol. The problem with AMD is that they had to concentrate on their core business. They have their CPUs nailed down right now and are executing perfectly; I think they will eventually dump the profits into pumping out a damn good GPU again! GCN lived way past its age, and I am surprised it is still competitive to this day. That just shows it was a damn good architecture, but it was used for too long.
 
Everyone used to get on this guy, but he even called out AMD for Vega. I mean, you sit there and wonder; he must spend a shitload of time on all the info he digs up lol. I mean, damn!

Nvidia got away with selling the GTX 970 and figuring you would never know its true specs. They think they can get away with anything, and I am not surprised they tried to pull the GPP shit off and then backed out.
 