AMD Intentionally Held Back from Competing with RTX 4090

Fair. Considering nVidia's pricing structure and the fact that their GPU sales are down, it seems like a high-range but not insane-range card was the right move to make. Well, I'm sure some people who spend $1600 on graphics cards would disagree, but whatever.
It's a VERY small group that spends that much. I, for instance, just kept my 6900xt and instead saved my money. That was not hard to do. Now I have money and a still wicked-quick GPU.

I think the days of $1600-plus GPUs are going to end, for two reasons: 1. People don't have as much money under this insano inflation, and 2. Eggs are like $8 to $12 a dozen and increasing. It's like, feed my family or play CS:GO at 799 FPS? Once you're starving, games lose their appeal.
 
There is some correlation here: desktop sales are down to their lowest numbers ever, so it stands to reason that discrete GPU sales would be down as well.
I wonder if anybody tracks mobile numbers; I would be curious to see what GPU sales for gaming laptops look like, or mobile in general for that matter.
 
I hope you are right. Or at the very least, sure, have your halo product, but put everything under it back to sane prices. $1200 4080s and $800 4070 Tis aren't good for anyone.
I also expect an uptick in 1080p screen sales, which then increases the need for 3060-class performance at a reasonable price point.
 
If you are into game engine development much of it is very relevant.
Sorta... most of this stuff is related to data science and ML. Some of that can be helpful to game development... but quickly going through it also seems to suggest all these things would wall you into having to use nVidia hardware? I'll keep reading; kinda cool anyway. Thanks for the share.

*edit* starting to see em. Could use a better filter, but (y)
 
Sorta... most of this stuff is related to data science and ML. Some of that can be helpful to game development... but quickly going through it also seems to suggest all these things would wall you into having to use nVidia hardware? I'll keep reading; kinda cool anyway. Thanks for the share.
Their work there on the C++ libraries and the matrix transformations does a lot to improve performance when you change camera angles or rotate objects on the screen, core stuff like that. But yes, it does use NVidia-exclusive features, which are different from proprietary ones: there is nothing stopping others from inventing similar features, they just haven't.
This is where Nvidia gets a lot of their lock-in: they make, and help make, open tools and frameworks that get them in the door first at the lowest levels of your project. Adding more of their exclusive or proprietary features after that just makes sense, because they are right there and you are already using a crapload of Nvidia puzzle pieces.
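For anyone unfamiliar with what "matrix transformations for camera angles" means in practice, here is a rough, pure-Python sketch of rotating a point with a 4x4 matrix, the basic operation those libraries accelerate. The function names are mine for illustration, not from any NVidia library:

```python
import math

def rotation_y(angle_rad):
    """4x4 rotation matrix about the Y axis (column-vector convention)."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return [
        [  c, 0.0,   s, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [ -s, 0.0,   c, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ]

def transform(m, v):
    """Multiply a 4x4 matrix by a 4-component homogeneous point (x, y, z, w)."""
    return [sum(m[r][k] * v[k] for k in range(4)) for r in range(4)]

# Rotate the point (1, 0, 0) by 90 degrees around Y; it lands at roughly (0, 0, -1).
p = transform(rotation_y(math.pi / 2), [1.0, 0.0, 0.0, 1.0])
```

A GPU library does the same arithmetic, just batched over millions of vertices per frame, which is why hardware-tuned versions of these routines matter so much.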
 
Their work there on the C++ libraries and the matrix transformations does a lot to improve performance when you change camera angles or rotate objects on the screen, core stuff like that. But yes, it does use NVidia-exclusive features, which are different from proprietary ones: there is nothing stopping others from inventing similar features, they just haven't.
This is where Nvidia gets a lot of their lock-in: they make, and help make, open tools and frameworks that get them in the door first at the lowest levels of your project. Adding more of their exclusive or proprietary features after that just makes sense, because they are right there and you are already using a crapload of Nvidia puzzle pieces.
Which is why they're going all-in on the machine learning angle since it's still a hotly competitive market with no established standards.
 
Which is why they're going all-in on the machine learning angle since it's still a hotly competitive market with no established standards.
There is also now a sort of standard: all the calculations need low precision, 8- and 4-bit. That is something the current AMD lineup doesn't do at all; it approximates those calculations using 16-bit. But it makes the Nvidia hardware between 30 and 100x faster at those calculations than the relevant AMD hardware.
The libraries are pretty open; AMD just literally does not have any hardware capable of performing the calculations right now.
This will change of course, and AMD will get the hardware out, but Nvidia is 100% taking advantage of it, because changing from AMD to Nvidia or Nvidia to AMD requires significant overhauling of both code and hardware, so once a project has picked a direction, they are pretty ride-or-die.
 
Those who are first are often rewarded for it; that's not new or unusual or unreasonable.
 
Those who are first are often rewarded for it; that's not new or unusual or unreasonable.

That isn't very often true in tech, though. Tech history is littered with examples of tech that failed at first, only to be refined by someone else to great success.
 
That isn't very often true in tech, though. Tech history is littered with examples of tech that failed at first, only to be refined by someone else to great success.

That requires the previously mentioned competence, though.
 
Gameworks is not an API. It is a middleware that uses standard DirectX calls to generate graphical effects that take advantage of NVIDIA hardware.
Though I misspoke about what Gameworks "is", the ultimate conclusion on the effect is the same. nVidia themselves also refer to it as an SDK.
If you have a citation that NVIDIA deliberately made Gameworks effects perform worse on competitor hardware rather than just be optimized to work on their own hardware, I'd love to see it.
There are tons of articles out there about this subject. It was probably one of the most debated things in the graphics space in 2014-2015. In fairness to your point there is some level of he-said she-said. All things in the tech sector are opaque. Said another way: Intel suppressed AMD for 10 years and even that took a long protracted time in a court of law to reveal conclusively.

ExtremeTech I think has a good amount: https://www.extremetech.com/search?s=gameworks amd
"The Way It's Meant to be Played" is no different from ATi "Get in the Game"/AMD Gaming Evolved/AMD Gaming. You can't ignore how AMD Gaming titles like The Callisto Protocol and Far Cry 6 run worse on NVIDIA hardware while calling NVIDIA "dicks" at the same time.
Sure I can. Especially considering all of the open source options that AMD is using are things nVidia can actually optimize for, whereas AMD cannot do the same in reverse. If devs use "AMD technology" (which is all open source), they are free to share any and all optimization information with nVidia. If devs use nVidia Gameworks, they can share no optimization information with AMD. That's explicitly stated in that quote.

While I expressly agree that it's impossible to equally optimize for multiple vendors' graphics cards, AMD doesn't use any SDK's or middleware that create vendor lock-in, prevent optimization of code on competitor hardware, or any code that isn't open. All of those are tangible.

As for the "dicks" part, you'd have to go back and read my post about the numerous things nVidia has done and not just this one. Even if you disagree specifically about Gameworks, GPP is something incontrovertible. As is their treatment of reviewers and EVGA; groups and people that have helped to build nVidia's brand.
NVIDIA is one of the biggest promoter members of the Khronos Group.
Because they have to be. nVidia has a vested interest in governing standards for graphics, only the Promoter level allows them to have a member on the board of Khronos group and gives voting rights. Considering that the dues to do this are "only" $75k per year, which nVidia can likely find as change in their couch cushions, this isn't surprising.

"biggest promoter" I would love to see some level of citation about though.


Fair. How many of those SDK's/libraries/API's are specifically for gaming? I'm not great at counting, but I'm going with 4? Maybe? (The ones listed under the Design, Visualization, and Simulation tab) In comparison to AMD which includes all of their SDK's/libraries/API's and all of the middleware?

Feel free to correct me here, but in the context of who is giving more to game devs in an open source way, I'm pretty sure there is still no comparison, by a wide margin.
 
Fair. How many of those SDK's/libraries/API's are specifically for gaming? I'm not great at counting, but I'm going with 4? Maybe? (The ones listed under the Design, Visualization, and Simulation tab) In comparison to AMD which includes all of their SDK's/libraries/API's and all of the middleware?

Feel free to correct me here, but in the context of who is giving more to game devs in an open source way, I'm pretty sure there is still no comparison, by a wide margin.
If you are building a game engine (Unreal, Unity, etc.), much of what is listed is very relevant. If you are developing a game based on one of those engines, it isn't something you worry about, because it's not something you are actively working with; that is where you move over to Nvidia GameWorks, which is open source but uses Nvidia repositories of various source types.
 
If you are building a game engine (Unreal, Unity, etc.), much of what is listed is very relevant. If you are developing a game based on one of those engines, it isn't something you worry about, because it's not something you are actively working with; that is where you move over to Nvidia GameWorks, which is open source but uses Nvidia repositories of various source types.
It's "partially" open, but also requires a proprietary/commercial license. And they didn't even get to partial open source until well after AMD more or less forced their hand by creating GPUopen.

Again, all of this is for dev lock-in. GPUopen doesn't require any of this.
 
Do people think AMD wouldn't love dev-lock in if they could achieve it?

Do people think AMD relies on open source so much because of kindness and charity? Or because it's easier to offload development/fixes to the community when you don't have the time/resources/talent to create an enticing enough closed source API?
 
List the nVidia API's that are open source, royalty free for any game dev and opposing graphics manufacturers to use. I'll wait.

Would you like to compare that to AMD's list? We can start with looking at GPUOpen, a massive SDK and middleware suite.

Even if you're a non-dev, you can go through all of AMD's open SDKs: https://gpuopen.com/
Updated 09/29/2021 09:58 AM

How much does CUDA cost?


The CUDA software, including the toolkit, SDK, etc are free and can be downloaded from CUDA Zone:
http://www.nvidia.com/cuda
https://gpuopen.com/
Guessing you didn't read the article and weren't there.

According to Nvidia, developers can, under certain licensing circumstances, gain access to (and optimize) the GameWorks code, but cannot share that code with AMD for optimization purposes.

Not remotely false at all. nVidia intentionally put a wall around AMD's, or any third party's, ability to optimize for Gameworks code, and that's from nVidia's own mouth. Guess you disagree with nVidia about nVidia? Feel free to actually show a counter.
->
Article said:
"
NVIDIA was very blunt in stating that AMD’s claim that GameWorks was a “black box” is incorrect by telling me that developers can absolutely get a license for the libraries that includes source code. With that, they are free to make any change to improve performance or stability on NVIDIA or other hardware platforms. NVIDIA offers this option for exactly these reasons, though not all licensors choose to get the “with source code” option. The other difference between the binary-only and source code options of GameWorks is cost, though NVIDIA declined to say by how much.

I even asked NVIDIA for a specific example of a time when a licensor of GameWorks had used the source code to modify a feature to the benefit of a non-NVIDIA hardware. Unequivocally I was told, yes."
You act like this is something that needs contesting. It doesn't. But also nVidia could make all their API's open source, and don't. I can tell you which is helping more people.

You're more than welcome to talk about whatever you want to talk about. It's not my job to post about everything all companies have done. You call my statement a diatribe, then say it's not long enough because certain things aren't included? Try at least being consistent.

Not remotely unproven. I just did, and it has been shown repeatedly. Feel free to actually have a counter with examples.
Read what you just wrote. You just "proved" something by... what? Saying it? That proves nothing. Read the quote from the article you linked; it disproves your point. (lol)
Which they could release as tools for free, and don't. Meaning other people (AMD) have to.

It's not good for game devs and not good for gamers. All it does is divide the market at best and at worst force people to buy nVidia hardware. Which again is the real point of these API's: to push nVidia hardware. A point that is apparently lost on you.

If nVidia were acting benevolently toward game devs, then they would be happy to have AMD also have access to the technology, so that game devs could sell titles that perform equally well on any vendor's hardware, increasing the size of their sales. Funny that that's not the case, no? I thought they were trying to help game devs sell games?
So you are still going on about Gameworks?? See the quote above as it applies to this too.
Divide the market? Hardly. G-Sync only worked on nVidia GPU's; that's the only thing that might align with that statement.

This boils down to not giving your IP to your competitor, something all companies do including AMD.
They wanted to take ROG branding and make it their own. Which it wasn't. Did nVidia develop ROG? Do they have copyright over ROG? Do they have ownership over any ASUS property? Last time I checked the answer is no. So why should nVidia get any right to dictate what ASUS does with their intellectual and/or branding property?

You have to have a very twisted version of right and wrong to say that nVidia should be allowed to enter someone's house and say what they can do with their property. And a lot of gall on top of that to then say that's beneficial to the consumer.

I would love to see you have a conversation with Kyle about it.

As Kyle stated in his findings:
While investigating the GPP, Kyle Bennett from HardOCP spoke with seven companies, none of which wanted to go on the record. However they did speak anonymously about the GPP, and, according to Bennett, all the people he spoke to had similar opinions about the program.
Bennett summarizes the opinions as follows:
  • The terms of the GPP agreement are potentially illegal
  • The GPP will hurt consumer choices
  • The GPP will hurt a partner's ability to do business with other companies like AMD and Intel
These opinions stem from a key component of the GPP agreement document, which Bennett read but decided not to publish. This component states that GPP partners must have their "gaming brand aligned exclusively with GeForce". In other words, if a company like Asus wanted to join the GPP, they would not be allowed to sell AMD graphics cards as Republic of Gamers products. No more ROG-branded Radeon video cards, no more ROG laptops with AMD graphics inside.
and​
The GPP requires participants to align their gaming brands exclusively with GeForce, and if they don't sign up to the program, their direct competitors that are part of the GPP will get special treatment from Nvidia. So there is a pretty strong incentive for OEMs and AIBs to sign up otherwise they'll be left in the dust by the dominant player in the graphics market.
And it allegedly goes beyond the terms outlined in the GPP document. Some AIBs expressed concerns that if they do not sign up to the GPP, Nvidia would restrict GPU allocations and give preference to GPP members instead. This isn't in the GPP agreement itself, but is allegedly happening through under-the-table agreements.
The biggest issue that stems from this kind of arrangement is that OEMs and AIBs are essentially forced into signing up to remain competitive among Nvidia partners. If they don't join the GPP, they won't get benefits like marketing development funds or launch partner status. And if a competitor does join, they will receive a genuine advantage, which puts anyone that decides not to join the GPP in a disadvantageous position.
But by all means, I would love for you to explicitly state that Kyle is a liar. The GPP was by definition anti-competitive and likely illegal. The whole point of it was to force AIB's to go exclusively nVidia or be shadow-banned.
What a huge pile of bullshit you have here.
It comes apart with one simple statement regarding ROG: ROG could have decided to become the AMD GPU brand, and ASUS could have created something totally new to go with nVidia GPU's. Nowhere was it ever shown or stated that ROG HAD to ditch AMD. They could just as easily have ditched nVidia GPU's and achieved the brand separation that nVidia wanted.
-brand separation-
this was the goal. How the companies achieved it was up to them. If there was something that said YOU MUST MAKE ROG THE NVIDIA BRAND, it was never disclosed, and everything I have ever read was very overblown. GPP was also optional; no one was forcing any AIB to make Nvidia-powered graphics cards. Was it 'disadvantageous' not to participate? Probably. As a gamer, why do I fucking care? Someone will make the card I want, and they will get my business.

Kyle and I have disagreed on this the whole time. The forum rules don't require us to "always agree".
Indeed. Glad we agree.
lol.
Well, your first link doesn't have margin information from EVGA, just a Jon Peddie Research graph whose final point is "2022 Est", which means estimated. Good try tho.

"bunch of additional information as to why they left" -> "For starters, EVGA relied on third-party manufacturers in order to create the circuit boards and the coolers for its GPUs, while EVGA itself only took care of the engineering. Having to split its profits with these third parties made it so that EVGA’s own profit margin was smaller than that of its competition."
Lol, this is your argument??? So EVGA's business model wasn't profitable, this is Nvidia's fault??? Wow.
I already have. Creating closed APIs is bad for consumers and literally bad for the rest of the market. Bad enough that your examples below like GSync forced VESA to make open standards. It's those open standards that even allow adaptive sync on TVs. GSync is all but dead and adaptive sync won out.

I'll tell you that nVidia would have been more than happy to force every display manufacturer to have to buy a proprietary chip and muscle AMD out.
Probably.
G-Sync still benefitted the gaming industry, and we likely wouldn't have adaptive sync at all without it. The FreeSync VESA standard was crap, and it came out AFTER G-Sync. FreeSync 2 is finally decent enough that Nvidia supports it, but would that even exist without Nvidia?
We can agree to disagree on closed or open API's. Gamers do not typically give a shit about any of that, I want my games to work, with the best experience possible and the newest graphic features.
If Nvidia makes a profit while doing that, good on them. I see it the same as buying all of the games I play and refusing to pirate them. I want those game dev's making good games to keep on making them. That means I pay for them. In the same way to support and encourage graphics innovations, I buy those GPU's which have the superior features. I freely admit that Nvidia makes a profit off of me every couple of years, and in the past so has AMD.
nVidia would've happily forced every TV to cost $100 more to get a G-Sync chip, making every consumer pay more money just to get adaptive sync.
Now you're reaching.
G-Sync by its very nature is designed to be anti-consumer so that nVidia can pocket more money. The great irony is that now nVidia is riding on the back of open adaptive sync standards. They're still jerks that won't let other people use G-Sync without paying an absurd licensing cost.
G-Sync Ultimate monitors are pretty amazing... I played Cyberpunk on a G-Sync monitor with the proprietary chip, and it was smooth at 35 fps. I know that without that it wouldn't have played as well.

Free Market is nice in that you can choose to never buy G-Sync or Nvidia GPU's, and that is totally fine too.
There isn't anything that needs to be explained. The vast majority of users that aren't buying any of these cards already understands. Only staunch nVidia defenders like yourself don't.

We get it, you prefer nVidia being closed source and jerks. As well as using underhanded tactics, abusing their AiB's, and threatening reviewers over samples when they say anything they don't like.

Yeap.

Perspective.
Agree to disagree, fine with me. You hate Nvidia, we get it.
Again, all of which could've been open source and helped out the entire marketplace.

Honestly, much of this has just hurt nVidia in terms of R&D dollars. Generally what happens is: nVidia makes something closed source and tries to ram it down everyone's throats.
The only thing closed source was G-Sync. RTX is open. Cuda is open. AMD could probably design their GPU's to work with DLSS, but they don't need to, they have FSR. What other technologies are left? Nothing. Simple shit like shaders and tessellation they can both do without any help from anyone.

Ram down their throats?? lol. LCD manufacturers were never forced to implement G-Sync, and no customer was ever forced to buy it. But we did buy it, because it was awesome.
AMD counters by making an open source variation, and then everyone uses the open source version because: 1) it's open, 2) it requires no licensing, 3) it's vendor agnostic.
AMD does this because it is in their best interest. Not everything AMD makes is "open", but anything that comes to market after a competitor has established something similar had better be free, or it has little chance of gaining significant uptake. See my previous post where I said they would die on the vine.
Game devs have no dog in the fight. They want everyone to play their games regardless of AMD or nVidia hardware.
Yes.
Therefore it's actually worse for them to use nVidia technologies because then they are giving a worse experience to a significant portion of the market.
Again, this isn't true. Gameworks sucked on AMD for one generation. AMD's raytracing sucking is not Nvidia's fault, and it is incredibly naïve to think a company should give its work away. Raytracing works through DirectX APIs that anyone can program for, game devs and GPU drivers from anywhere. Vulkan too. It's OPEN. It still sucks on AMD, which is AMD's own fault; they are always 2 years late and a dollar short. And they really have no excuse anymore: their CPU's are doing great, so where is the R&D investment? Maybe they just don't give enough of a shit... "We've had to give all of this open source shit away... why are we even doing this again? Oh yeah, we can build GPU's into our CPU's... fuck!"
So they either use the open source option or have to program twice.
No, they don't. Maybe 25 years ago, when all of the hardware was different and you had to program directly to the hardware. This is why we have DirectX and Vulkan and OpenGL. The game dev programs to the standard APIs, some of which have been standard for more than a couple of decades. DirectX was already at v9 in 2002.
 
RTX is open like AMD's Mantle was open. It's great that it's "open", except you need specific hardware for it.
 
Updated 09/29/2021 09:58 AM

How much does CUDA cost?


The CUDA software, including the toolkit, SDK, etc are free and can be downloaded from CUDA Zone:
http://www.nvidia.com/cuda
https://gpuopen.com/

->
CUDA is proprietary and requires vendor lock-in. It's not platform agnostic. AMD aside, I can't run this on Intel hardware either.
Read what you just wrote. You just "proved" something by... what? Saying it? That proves nothing. Read the quote from the article you linked; it disproves your point. (lol)
https://www.extremetech.com/gaming/...surps-power-from-developers-end-users-and-amd

This kind of maneuver ultimately hurts developers in the guise of helping them. Even if the developers at Ubisoft or WB Montreal wanted to help AMD improve its performance, they can't share the code. If Nvidia decides to stop supporting older GPUs in a future release, game developers won't be able to implement their own solutions without starting from scratch and building a new version of the library from the ground up. And while we acknowledge that current Gameworks titles implement no overt AMD penalties, developers who rely on that fact in the future may discover that their games run unexplainably poorly on AMD hardware with no insight into why.
So you are still going on about Gameworks?? See the quote above as it applies to this too.
Divide the market? Hardly. G-Sync only worked on nVidia GPU's; that's the only thing that might align with that statement.

This boils down to not giving your IP to your competitor, something all companies do including AMD.
What Middleware, API, or SDK does AMD use that is closed source?
What a huge pile of bullshit you have here.
It comes apart with one simple statement regarding ROG: ROG could have decided to become the AMD GPU brand, and ASUS could have created something totally new to go with nVidia GPU's. Nowhere was it ever shown or stated that ROG HAD to ditch AMD. They could just as easily have ditched nVidia GPU's and achieved the brand separation that nVidia wanted.
-brand separation-
this was the goal. How the companies achieved it was up to them. If there was something that said YOU MUST MAKE ROG THE NVIDIA BRAND, it was never disclosed, and everything I have ever read was very overblown. GPP was also optional; no one was forcing any AIB to make Nvidia-powered graphics cards. Was it 'disadvantageous' not to participate? Probably. As a gamer, why do I fucking care? Someone will make the card I want, and they will get my business.

Kyle and I have disagreed on this the whole time. The forum rules don't require us to "always agree".
FrgMstr this guy says you're a liar.

But I'll trust someone who actually read the GPP docs (Kyle), the industry insiders who have been quoted (even if off the record), Hardware Nexus, and many others that explicitly stated what GPP was.

If it was benign and nVidia was completely in the right, why drop the program?
lol.

Well, your first link doesn't have margin information from EVGA, just a Jon Peddie Research graph whose final point is "2022 Est", which means estimated. Good try tho.

"bunch of additional information as to why they left" -> "For starters, EVGA relied on third-party manufacturers in order to create the circuit boards and the coolers for its GPUs, while EVGA itself only took care of the engineering. Having to split its profits with these third parties made it so that EVGA’s own profit margin was smaller than that of its competition."
Lol, this is your argument??? So EVGA's business model wasn't profitable, and this is Nvidia's fault??? Wow.
Indeed. If you don't understand nVidia's pricing structure, then you can make that ignorant statement. nVidia placed a cap on how high any vendor can price a GPU. Meaning, if you can't build a GPU for what nVidia "wants it to cost", then you lose money.

EVGA wanted to create premium boards, with custom electronics and custom cooling solutions, that were capable of the highest overclocks. The Kingpin line of cards was consistently ahead of all other cards due to the R&D and optimizations they did. Guess what? Those things cost money. Not being able to charge an extra $100-$200 for them in order to meet nVidia's pricing guidelines: yes, that made them unprofitable; yes, they cost more than the competition; and yes, it is nVidia's fault for imposing a price cap.
Probably.
G-Sync still benefitted the gaming industry, and we likely wouldn't have adaptive sync at all without it. The VESA FreeSync standard was crap, and it came out AFTER G-Sync. FreeSync 2 is finally decent enough that Nvidia supports it, but would that even exist without Nvidia?
We can agree to disagree on closed or open APIs. Gamers do not typically give a shit about any of that; I want my games to work, with the best experience possible and the newest graphics features.
They don't until they do. Or are you still not aware that Gameworks was a huge controversy in the gaming community? It was talked about on every board, including this one, extensively.
If Nvidia makes a profit while doing that, good on them. I see it the same as buying all of the games I play and refusing to pirate them. I want those game devs making good games to keep on making them. That means I pay for them. In the same way, to support and encourage graphics innovations, I buy those GPUs which have the superior features. I freely admit that Nvidia makes a profit off of me every couple of years, and in the past so has AMD.
I don't think anyone cares when nVidia makes a graphics advancement - in the sense of "complaining about it". That is obviously not the issue. The issue is what that does in the gaming space and to the other half of the market.

If nVidia did all of its winning from having a superior product, there would be no complaints. But they feel it's necessary to do a bunch of other things that are bad for the industry. I've said this more than once.
Now you're reaching.
Not at all. It's been expressed that G-Sync modules add $100 to the cost of monitors, considerably more if you want HDR. If the free open standard didn't exist and we wanted to have some form of adaptive sync, what do you think would happen?
G-Sync Ultimate monitors are pretty amazing... I played Cyberpunk on a G-Sync monitor with the proprietary chip, and it was smooth at 35 fps. I know that without that it wouldn't have played as well.

Free Market is nice in that you can choose to never buy G-Sync or Nvidia GPU's, and that is totally fine too.
Which, inside nVidia's walls, gives an inferior experience. If nVidia has their way, anyway.
Agree to disagree, fine with me. You hate Nvidia, we get it.
I used to love nVidia when they were the underdog in the late 90's and early 00's. It's not their being on top I object to. It's their anti-consumer behavior. But you're their lap dog, I get it.
The only thing closed source was G-Sync. RTX is open. CUDA is open. AMD could probably design their GPUs to work with DLSS, but they don't need to; they have FSR. What other technologies are left? Nothing. Simple shit like shaders and tessellation they can both do without any help from anyone.
CUDA isn't open. If it were, it would work on Intel and AMD cards, and it doesn't. It has a proprietary license and only works on "supported GPUs" (nVidia's). AMD has to use OpenCL and the rest of their GPUOpen stack to compete.
Ram down their throats?? lol. LCD manufacturers were never forced to implement G-Sync, and no customer was ever forced to buy it. But we did buy it, because it was awesome.
If you wanted adaptive sync you were. If Freesync didn't exist, you would be forced to buy it in order to have that feature. You can't have it both ways.
AMD does this because it is in their best interest. Not everything AMD is "open", but those that come to market after something else has been established by a competitor, better be free or they have little chance of gaining any significant uptake. See previous post where I said they would die on the vine.
Second time: what SDK, API, or middleware does AMD use that is closed source? Still waiting.
Again, this isn't true. Gameworks sucked on AMD for 1 generation. AMD's raytracing sucking, that's not Nvidia's fault, and it is incredibly naïve to think a company should give it away. Raytracing works with DirectX APIs that anyone can program for, game devs and GPU drivers from anywhere. Vulkan too. It's OPEN. Still sucks on AMD, which is AMD's own fault. They are always 2 years late and a dollar short. And they really have no excuse anymore; their CPUs are doing great, so where is the R&D investment? Maybe they just don't give enough of a shit... "We've had to give all of this open source shit away... why are we even doing this again? oh yeah, we can build GPUs into our CPUs... fuck!"
So you have a reading comprehension issue. Because my first post on this directly addressed RT as an exception. And the major reason it's the exception is because it was adopted into the DX12 spec. That is in stark contrast to literally every other nVidia technology we're talking about.
No they don't. Maybe 25 years ago, when all of the hardware was different and you had to program directly to the hardware. This is why we have DirectX and Vulkan and OpenGL. The game dev programs to the standard APIs, some of which have been standard for more than a couple of decades. DirectX was already at v9 in 2002.
You've read none of the Gameworks articles. The devs explicitly state that in the design of an engine they have to choose their dev path. If you use nVidia's middleware stack and optimize for it, there isn't really a way to insert GPUOpen middleware without, guess what, doing it twice.

If a game dev wants to support FSR and DLSS, it requires twice the programming time because both have to be implemented. Being DirectX, Vulkan, or OpenGL has nothing to do with anything we're talking about. Being DirectX doesn't magically enable both DLSS and FSR. Or Gameworks and GPUOpen.

Maybe that's the problem here? You just have a fundamental misunderstanding of the tools and vendor lock in.
 
Last edited:
I find it funny that people who recommend and praise Apple for some reason hate Nvidia.
This is OT.

But if you're referring to me: Apple doesn't screw over the entire PC industry or any of the other industries they're a part of like cellphones.

Apple is <10% of the PC market. And they are <15% of the cellphone market. Nothing they do changes the broader picture.

A majority of the software I use on Mac is also available on other platforms such as: DaVinci Resolve, PS, and LR. The "magic" comes from producing better hardware and having better software integration. I'm also not on Windows because Microsoft is in my opinion continuing down a path with their OS that I don't appreciate. Microsoft also being dicks is an entirely different conversation.

Again, no one has a problem with nVidia being performance-wise the best. The issue with nVidia is them being dicks. If they would stop doing that then I wouldn't have a problem with them.

In case it isn't obvious, it's not like I'm the only person with this position, regardless of platform. There are plenty of primarily PC users that feel the same. Or is quoting a bunch of articles and looking at threads here not enough to verify that?
 
Last edited:
People only ever hate Nvidia because they're dicks, not because people like to hate the popular kid at all. Also Apple aren't dicks.

 
This is OT.

But if you're referring to me: Apple doesn't screw over the entire PC industry or any of the other industries they're a part of like cellphones.

Apple is <10% of the PC market. And they are <15% of the cellphone market. Nothing they do changes the broader picture.

A majority of the software I use on Mac is also available on other platforms such as: DaVinci Resolve, PS, and LR. The "magic" comes from producing better hardware and having better software integration.

Again, no one has a problem with nVidia being performance-wise the best. The issue with nVidia is them being dicks. If they would stop doing that then I wouldn't have a problem with them.

In case it isn't obvious, it's not like I'm the only person with this position, regardless of platform. There are plenty of primarily PC users that feel the same. Or is quoting a bunch of articles and looking at threads here not enough to verify that?
Sadly, the only way to make Nvidia stop being dicks is to offer something competitive against them, and there is very little out there that is. There is no open alternative to CUDA, and the 3 or 4 projects that do exist, which combined encompass what CUDA can do, are 6-10 years behind in terms of performance and development libraries.
I have more reason than most to complain about Nvidia; I actually pay them for Grid licensing and access to developer resources. But I have to, because there is literally nothing out there that does even remotely as well. I would welcome either an AMD or an Intel option.
 
Plus, I like Nvidia, and I don't even deny they're dicks at times. I phrase it as them being sharks, but tomato/tomahto.

Point being, just shows me they want it more than AMD. And in the end, Nvidia still delivers, for all the 'shit they stir'.
 
Plus, I like Nvidia, and I don't even deny they're dicks at times. I phrase it as them being sharks, but tomato/tomahto.

Point being, just shows me they want it more than AMD. And in the end, Nvidia still delivers, for all the 'shit they stir'.
Well said.
 
Sour grapes. AMD makes 1 good product and 5 trash ones on average, so I didn't expect anything more from them.
On graphics they are so far behind it isn't even funny anymore.

AMD isn't that far behind. Their biggest shortcoming, of course, is ray tracing. But let's be honest: almost no one can afford to run ray tracing on Nvidia. Unless you're willing to use DLSS, which works okay but comes with downsides; and still Nvidia's GPUs struggle to get an acceptable frame rate.
 
AMD isn't that far behind. Their biggest shortcoming, of course, is ray tracing. But let's be honest: almost no one can afford to run ray tracing on Nvidia. Unless you're willing to use DLSS, which works okay but comes with downsides; and still Nvidia's GPUs struggle to get an acceptable frame rate.
Yes, they are, on features, as has been discussed repeatedly. As for Nvidia and RT+DLSS, it is still a huge net image quality gain, and DLSS is in 2.5x+ the number of titles FSR is in (and FSR can also be used on Nvidia). Framerates vary by game, but unless you think 60+ is unplayable, you'll be thrilled.
 
Plus, I like Nvidia, and I don't even deny they're dicks at times. I phrase it as them being sharks, but tomato/tomahto.

Point being, just shows me they want it more than AMD. And in the end, Nvidia still delivers, for all the 'shit they stir'.
Yep, they deliver.
 
Yep, they deliver.
Exactly. I don't care about them being "dicks" or whatever Linux users seem to call them. I care about the awesome cards they deliver. I don't need their software to all be FOSS to be good tech. I speak as a gamer and indie game dev.

Also, GoodBoy I agree with you on gpp. Your posts are great as far as debunking fud, too.
 
Exactly. I don't care about them being "dicks" or whatever Linux users seem to call them. I care about the awesome cards they deliver. I don't need their software to all be FOSS to be good tech. I speak as a gamer and indie game dev.

Also, GoodBoy I agree with you on gpp. Your posts are great as far as debunking fud, too.
Did you speak with any of the card manufacturers or read the GPP documents? What is your source? Would you care to provide us with links?
 
Did you speak with any of the card manufacturers or read the GPP documents? What is your source? Would you care to provide us with links?
? Enough. It was overblown like so many other tech "scandals" in my opinion. Nice bait, do you have those yourself? You're the one who claims it was horribly unfair.

The fud I was talking about when @'ing GoodBoy was all the "Nvidia is a bully" stuff you had posted.
 
Last edited:
? Enough. It was overblown like so many other tech "scandals" in my opinion. You can't prove it wasn't, I can't prove it was. Nice bait.

The fud I was talking about when @'ing GoodBoy was all the "Nvidia is a bully" stuff you had posted.
Nvidia is a corporation that will flex its power to expand, improve, and dominate the environment it is in. Every corporation does the same. I admire Jensen and the empire he has built. NV is a well-run company and innovates like few others. The strong survive and the weak expire. Hopefully, Intel and AMD won't back down to the challenge. Consumers need competition. Unfortunately, there isn't much at the moment.
 
And competition requires actually competing, not just existing or trying, or not trying, as AMD claims they did (didn't?) here....

When they make a GPU that makes me go

[meme image]

Then they'll get deserved credit. As they did with their CPUs, and got me to go back to AMD again after ~16/17 years of Intel CPUs.

Edit: And switching back to AMD from Intel was also due to not just AMD's achievement there, but Intel resting on its laurels. Which Nvidia does not do IMO, and especially compared to how much so Intel did.
 
Last edited:
AMD isn't that far behind. Their biggest shortcoming, of course, is ray tracing. But let's be honest: almost no one can afford to run ray tracing on Nvidia. Unless you're willing to use DLSS, which works okay but comes with downsides; and still Nvidia's GPUs struggle to get an acceptable frame rate.
I am playing most of my games at 120-144 fps at 4K using the features, including RT. It's not only RT; it is also DLSS, frame gen, etc. The 4090 is the first proper 4K card. Not sure I can say that about the 7900 XTX.
 
Yes, they are, on features, as has been discussed repeatedly. As for Nvidia and RT+DLSS, it is still a huge net image quality gain, and DLSS is in 2.5x+ the number of titles FSR is in (and FSR can also be used on Nvidia). Framerates vary by game, but unless you think 60+ is unplayable, you'll be thrilled.

I consider 60 to be the minimum. Unless you have a 4090 and want to use DLSS (image quality hit, varies per game), you're probably going to get unacceptable frame rates. With the ever-increasing costs, it is becoming more of a boutique option that few actual gamers can use. For half the games that have DLSS/ray tracing, the image quality and frame rate hit isn't worth what ray tracing brings. Ray tracing often looks artificially reflective, and when you add in the DLSS issues (depends on the game), it just isn't worth using half the time.

When Nvidia can have little frame rate hit and have it running on a $250 GPU, then I would say it is a big advantage. Right now Nvidia is in a better position but in no way absolutely blowing AMD out of the water.
 