
AMD is a Software Company

The alternative is to slash prices and lose money, and Nvidia is not very good at losing money. When you fire-sale tons of inventory at a loss, you bring down the perceived value of your product (what customers expect to spend on what they want). Fire-selling 3060s at $50 makes the 4060 look far less compelling; then you have to fire-sell those too, people get used to spending sub-$100 on a 4060, and when the 5060 comes out with pretty much the same performance at $799, people won't want to spend that kind of money.

So Nvidia waits.
Waits for what? The next pandemic? Bitcoin hitting $400K and another mining craze?

The alternatives to a fire sale? Recycling them? Somehow reusing the GPU chips, but for what? Donating them to Global South gamers?
 
Supposedly Nvidia knew it was overcharging for the 4060 series and set aside a portion of each of those sales in an account to pay for the discounts and rebates it would need when AMD released its cards and forced prices down.
The 4060 launched at high prices to clear out the 3060 stock, but AMD never released a product Nvidia felt it had to cut prices to combat, so it didn't.
What Nvidia chooses to use that fund for now is anybody's guess, assuming of course that it existed to begin with.
 
Well, I dunno. Give Jensen a billion-dollar bonus? Declare a dividend for all the shareholders? Buy a country in South America? :rolleyes: https://www.investopedia.com/articles/insights/120816/top-3-shareholders-nvidia-corporation-nvda.asp That net income of $42 billion is amazing, an astonishing profit margin for a hardware company.
 
Nvidia is a software company that makes hardware tailored for its software.
Nvidia spends more on software development than AMD spends in total; I think their software budget is up to $8B and change.

AMD and Intel make hardware and then provide basic drivers and firmware so others can make software.
Nvidia flipped that: they made software, sold their software, and provided hardware to accelerate that software.

So while others keep their hardware generic so anybody can develop software for it, Nvidia says, "FU-NO! This is the software you will use, and it will run on these cards better than anything else anybody can provide you. We will support it, we will support you in using it, and you will quickly come to love it."

So AMD and Intel can show all their pretty bar charts of themselves dominating the big Nvidia A100s and A200s in OpenCL and OpenML, and Nvidia can turn around and say, "Great, but none of our clients use those language sets; they use CUDA. And if we compare CUDA performance to the OpenCL and OpenML numbers, you'll see we are 8x faster at those jobs than they are, so have a nice day!"
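
For the curious, here is a rough, hypothetical sketch of what such a cross-stack comparison looks like in practice: the same elementwise job run once through the CUDA stack (via the CuPy library) and once through OpenCL (via PyOpenCL). The libraries and workload here are assumptions for illustration, not a reproduction of anyone's benchmark; whether the gap is anything like 8x depends entirely on the hardware and the job.

```python
# Hypothetical illustration: time the same elementwise job on the
# CUDA stack (CuPy) and the OpenCL stack (PyOpenCL). Assumes both
# libraries are installed and a suitable GPU is present.
import time
import numpy as np

N = 10_000_000
host = np.random.rand(N).astype(np.float32)

# --- CUDA path (CuPy) ---
import cupy as cp
dev = cp.asarray(host)                  # copy the data to the GPU
cp.cuda.Stream.null.synchronize()
t0 = time.perf_counter()
out_cuda = cp.sqrt(dev) * 2.0 + 1.0     # elementwise GPU work
cp.cuda.Stream.null.synchronize()       # wait for the GPU to finish
print(f"CUDA (CuPy):       {time.perf_counter() - t0:.4f} s")

# --- OpenCL path (PyOpenCL) ---
import pyopencl as cl
import pyopencl.array as cla
import pyopencl.clmath as clmath
ctx = cl.create_some_context()          # may prompt to pick a device
queue = cl.CommandQueue(ctx)
dev2 = cla.to_device(queue, host)
queue.finish()
t0 = time.perf_counter()
out_cl = clmath.sqrt(dev2) * 2.0 + 1.0  # same elementwise GPU work
queue.finish()                          # wait for the GPU to finish
print(f"OpenCL (PyOpenCL): {time.perf_counter() - t0:.4f} s")
```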

Honestly, AMD's best approach would be to swallow their pride and join Intel in developing and pushing the oneAPI environment, because they need to combine efforts if they want to combat the monster Nvidia has released into the world.
 
I feel this is a bit of a semantic debate; the hardware Nvidia and AMD make is not yet a simple commodity...

Sure, when you are fabless you can play with words, but designing and selling hardware is still a giant part of what they do. The software stack is incredibly important and hard to get right (Intel's entry into the GPU market leaves little doubt about that, and Intel had sold the most GPUs since 2010 by a giant margin, so it was not starting from zero at all).
 

Calling Nvidia a software company is a creative reimagining of its history. They are first and foremost a hardware company, despite recent marketing saying otherwise. They make and sell you hardware that works with a proprietary software ecosystem they developed. They were fortunate that many vendors incorporated their proprietary CUDA instructions into the software that performed the tasks companies bought Nvidia hardware for. Great for Nvidia, but they sell hardware: you're buying an A100 because the software from another vendor requires the CUDA instructions. They devoted resources to working with these other companies to ensure adoption of their hardware and instructions, which has turned out well for them, but they sell hardware to you, not CUDA.
 
You haven't been able to do anything with Nvidia hardware without CUDA since 2008.
This is not fortune; this was design.
Incorporating CUDA was not a choice, it was mandatory if you wanted to support Nvidia hardware in any meaningful capacity, and Nvidia sent out software engineers en masse to ensure it was being used.


https://www.youtube.com/watch?v=zcfK25wiXog&ab_channel=EconomicArchive

https://www.youtube.com/watch?v=8FPQTxOTW10&ab_channel=NVIDIA

Nvidia has been hyping up CUDA and all the things it will do for over a decade now, and has been incredibly aggressive about getting it used and relied upon. This was not dumb luck or blind chance.

Adjacently Related:
Does this man not age?

But look at their 2012 GTC topics list: Cloud Computing, Cloud Gaming, AI, Raytracing, Autonomous Image Recognition...
 

They wait until people buy them. What are you going to do, buy the competition? Apparently not! AMD could wrap their products in $100 bills and give them away for free, and people still wouldn't buy them. People are buying cards: not in mass swarms like before, but the flow is steady. Every chip Nvidia makes sells at a profit.
 
I don't think Valve should need to be contributing at all. They contributed because they were going to use AMD for their Steam Deck. Valve were the ones who helped make RADV, and they made ACO for AMD as well. I say this as though Intel isn't also guilty of the same thing.

That is the case sometimes, but not always. AMD did make AMDVLK, and is the primary contributor to it, but hardly anyone uses it over RADV. They're also the main contributors to FSR and RadeonSI. ROCm wasn't even open source until a few months ago, and AMD isn't giving up on it. The issue I see is that AMD treats these as side projects and doesn't give them the full attention they need.
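
As a concrete aside on the RADV-vs-AMDVLK split: on Linux the Vulkan loader picks a driver from ICD manifest files, so you can pin one per process with an environment variable. A minimal sketch follows; the manifest paths are common distro defaults and may differ on your system, so treat this as an illustration rather than AMD's documented workflow.

```python
# Sketch: force a specific Vulkan ICD (RADV vs AMDVLK) per process.
# The manifest paths below are typical defaults, not guaranteed;
# check /usr/share/vulkan/icd.d/ on your own distro.
import os
import subprocess

ICDS = {
    "radv":   "/usr/share/vulkan/icd.d/radeon_icd.x86_64.json",  # Mesa RADV
    "amdvlk": "/usr/share/vulkan/icd.d/amd_icd64.json",          # AMDVLK
}

def run_with_icd(driver: str, cmd: list[str]) -> None:
    """Launch cmd with the Vulkan loader pinned to one ICD manifest."""
    env = dict(os.environ)
    env["VK_ICD_FILENAMES"] = ICDS[driver]  # honored by older loaders
    env["VK_DRIVER_FILES"] = ICDS[driver]   # newer loader spelling
    subprocess.run(cmd, env=env, check=True)

# e.g. confirm which driver a tool or game would see:
run_with_icd("radv", ["vulkaninfo", "--summary"])
```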

I've never seen AMD gloat about anything, but then again, who pays attention if they do?

There are a few cases I know of where Nvidia fails on Linux. One is Wayland, though I hear it's better now. Two is driver installation, as it's never easy; I have screwed up Nvidia driver installs so often on Linux that sometimes I just don't bother installing them. Third is when a new game comes out and requires changes to the drivers, which of course you can't make yourself because Nvidia's driver is closed source. (It's open source now, but realistically nobody is doing anything with it.) Borderlands 3 was a nightmare for a lot of people to get running on Linux because something new had to be added to the drivers. Valve just quickly adds that sort of thing to RADV and they're done, but Nvidia owners had to wait for Nvidia to do it, which took a while.
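
For anyone stuck in that driver-install loop, one hedged sanity check: nvidia-smi only succeeds when the userspace driver and the loaded kernel module actually agree, so a plain exit-code probe tells you whether the install took. A sketch, assuming the proprietary driver package is installed:

```python
# Sketch: did the Nvidia driver install actually take? nvidia-smi
# exits non-zero on the classic kernel-module/library version
# mismatch, or when the module isn't loaded at all.
import shutil
import subprocess

def nvidia_driver_ok() -> bool:
    if shutil.which("nvidia-smi") is None:
        return False  # userspace tools aren't even installed
    result = subprocess.run(["nvidia-smi"], capture_output=True)
    return result.returncode == 0

print("Nvidia driver functional:", nvidia_driver_ok())
```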

At this point, even if RDNA4 were better than the RTX 50 series in every way, it would still lose. Nvidia is cooking up something that will be exclusive to their hardware and make their cards better. Look at ray tracing: when Nvidia first introduced it, Tom's Hardware said you should get these cards for ray tracing, even though you needed something like a 2070 or 2080 to get a playable frame rate. Eventually Nvidia created DLSS to solve this problem, and it's still solving ray-tracing performance for them to this day. It's now a reason to go buy an Nvidia GPU just to have this feature, with or without ray tracing.

That's their problem to deal with. They have money now and should invest it into these things. These are the same people who released the RX 7900 XTX for $1k but also sold the 7900 XT for only $100 less, as if people were stupid enough to be tricked into that. This was such a massive screw-up that they ended up making a 7900 GRE that they only sold in China, because they were hoping to milk America and Europe for their money. They eventually released the 7900 GRE in the West, and still nobody wants to buy it. AMD is all out of goodwill from its consumers. They either make a better product or Intel will. Who knows, maybe Jensen Huang's heart will grow three sizes and Nvidia's GPUs will become affordable.
AMD GPUs are good in Linux for using a browser and perhaps some gaming, but for anything else in the productivity realm, any software you actually need, they are subpar, and some people even claim unusable.
A poster before this already discussed ROCm. The only reason some companies buy AMD GPUs for AI or anything productivity-related is that they received discounts from AMD or bought in bulk, which might be cheaper than buying Nvidia's workstation cards. I don't think there's any other reason.
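
For anyone who wants to evaluate that claim themselves: ROCm builds of PyTorch deliberately reuse the torch.cuda API, so checking whether an AMD GPU is usable for AI work is a few lines. A minimal sketch, assuming a ROCm (or CUDA) build of torch is installed:

```python
# Sketch: does this PyTorch install actually see a GPU? ROCm builds
# expose AMD GPUs through the torch.cuda namespace; torch.version.hip
# distinguishes the backends (it is None on CUDA builds).
import torch

if torch.cuda.is_available():
    backend = "ROCm/HIP" if torch.version.hip else "CUDA"
    print(f"GPU usable via {backend}: {torch.cuda.get_device_name(0)}")
else:
    print("No GPU backend available; compute falls back to the CPU.")
```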
 
I wouldn't say it's the only reason, but on Linux you need to pay attention to your distro and your software; there are some combos that work well one way or the other. But there are more cases where the AMD devices will leave you high and dry than the Nvidia ones do, again for corporate/enterprise loads.
AMD tends to handle the consumer side a little better than Nvidia does for gaming and media entertainment.
 
For most people who don't need compute, it's fine. Obviously AMD sucks when it comes to productivity, and this is especially true on Linux. I don't use ROCm because it's hard to install and not worth it for me, but this is mostly because I want to keep RADV, which can be done but is again hard to do. If you install AMD's drivers with AMDVLK and all, then it'll work, but I don't want AMDVLK. I run a Jellyfin server and I don't use any hardware acceleration, because AMD's AMF again requires bits of their proprietary drivers, and VAAPI doesn't seem to work on some H.265 videos. AMD is trying to fix this so people like me can still use RADV, but it's still a terrible situation.
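
On the VAAPI/H.265 point, a hedged way to see what the stack actually advertises before blaming Jellyfin: vainfo (from libva-utils) lists the decode profiles the driver exposes. A sketch, assuming vainfo is installed; note it only checks the 8-bit Main profile, not Main10:

```python
# Sketch: ask VAAPI whether HEVC (H.265) decode is advertised.
# vainfo prints lines like "VAProfileHEVCMain : VAEntrypointVLD",
# where VLD is the decode entrypoint.
import subprocess

def vaapi_supports_hevc_decode() -> bool:
    out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout
    return any(
        "VAProfileHEVCMain" in line and "VAEntrypointVLD" in line
        for line in out.splitlines()
    )

print("HEVC decode via VAAPI:", vaapi_supports_hevc_decode())
```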
 
How can I tell if Adobe still supports CUDA in preference to AMD? If yes, then as a Photoshop and Lightroom user I am doomed to buy Nvidia. My heart wants me to buy AMD, to pair with my AMD Ryzen 9 CPU.
Joining Intel on oneAPI would be wise. I hope the corporate vanity at both of these companies still allows them to do that. It's not like the Department of Justice would go after them for "monopolistic practices."

Do their corporate muckety-mucks read this forum?
 
Adobe uses CUDA for Premiere, but Photoshop and Lightroom are DX12.
 
As for whether their corporate muckety-mucks read this forum: maybe not anymore in 2024, or for something as mainstream as GPUs. But when Sony documentation leaked, it included reports made by interns who read home theater forums for the AV divisions, gathering feedback on issues, most-wanted features, and the most annoying problems according to forum users. For niche enough products, where the people reading the message boards are quite the experts, I imagine they did.
 
Honestly, I get why people are saying AMD is becoming a software company—it’s not just about the chips anymore. Stuff like Adrenalin drivers, FSR, and ROCm are a big part of what makes their hardware actually shine. They’ve really stepped up on the software side, especially compared to a few years ago. I wouldn’t go as far as calling them a software company, but it’s definitely a huge part of their game now. You can’t compete with Nvidia or Intel without solid software to back the hardware these days.
 
They have about the same gross margins as in 2010-2011... and not those of a software company.

You can be both (Apple, Nvidia, Tesla), and it would be a mistake to think otherwise. No amount of software work changes that: if you are involved in a competitive, non-trivial hardware business, you need to get deeply involved with the hardware. You cannot rely purely on TSMC to make your thing work; you need to power and cool your chips (or at least design them with some notion of those challenges).

It is not like the whole supply chain of making what they do, and the ability to have products on shelves on time and at price, has gotten easier in the last 15 years, such that they could just focus on the software side of things (or the physical networking, powering, and cooling of their systems, or how much of those parts the competitor handles and thus what clients expect AMD to be involved in).

Put it this way: if AMD became a software company (per an article from 2024), why would they have finalized the multi-billion-dollar acquisition of ZT Systems in 2025?
https://www.amd.com/en/newsroom/pre...-amd-completes-acquisition-of-zt-systems.html

ZT Systems is a company specialized in creating and installing physical-world compute systems... because that part did not really get simpler over time, and Nvidia and AMD are becoming more and more physical-systems experts. Why else would they seem so affected by tariffs, which only hit physical objects going through customs?

Because they are not software companies at the end of the day. And with Microsoft, Google, and Apple getting into power plants (from gas and solar to fusion) and materials science, while Disney- and Ford-type companies run quite complicated software stacks, that distinction seems more and more useless to make for big companies.
 
This announcement last year was just them recognizing that they hadn't taken software as seriously as their competitors.

Intel for years has had its own Linux distro, Clear Linux, which it has tweaked heavily for Intel hardware. Of course no one uses it for production server things; that was never the intention. RHEL, SLES, and the other big server distro developers have been able to pull Intel performance tweaks from Intel's own testing distro. It helped Intel limp through the onslaught of AMD server chips for a lot longer than anyone would have expected. Intel may have had slower server chips, but the OS was heavily tuned, and it for sure got them through the first year or two of the Epyc assault on their core business.

Nvidia, I don't even have to go into that. CUDA has been eating everyone's lunch. Even in gaming, Nvidia's drivers were seen as the gold standard.

I think over the last year AMD has been proving that they take the demand for quality AMD software to drive their hardware seriously. ROCm might not be CUDA yet, but the improvements are noticeable. On the server side, same thing. I would love to see AMD do like Intel and build their own AMD-tuned development distro. Intel was smart to make Clear Linux usable but in no way an everyday-driver distro; that kept it viable for what it was intended to do, without any big call for Intel to compete with MS or anything dumb. AMD could follow suit: not a Windows replacement, just a good test bed for Linux developers to help them tune even further. (AMD has been lucky Valve chose AMD hardware; some of those tweaks have even found their way into server systems.)

I think the biggest difference people are 100% noticing now, though, is in gaming. Nvidia shitting the bed and releasing driver after driver full of issues is and was obviously completely out of AMD's control. But as essentially everyone has noticed, the 9070 launch is likely the first time ever that AMD hasn't had any issues at all. New card, new architecture, new features... and nothing is broken. It all just works. That's something their competition has been selling hardware on for a long time now.

Hopefully AMD keeps it up and goes harder. The thing with software is that it's always changing; you can't stop testing and tweaking. Take the foot off for even a second and you end up with a cascade of F-ups like Nvidia seems to be experiencing.
 
Calling AMD a software company is like calling GM a software company because they had to write software for their engine control module. You can't not make software when your business is primarily making hardware. I don't know if AMD's software has gotten better, because I'm on Linux, where I rely less on AMD to keep their drivers up to date. I haven't used ROCm because installing it on Linux is about as desirable as installing Nvidia drivers on Linux. FSR4 is an improvement... for RDNA4 owners only. I criticized Nvidia for pulling this, and they did a 180 and brought DLSS4 to all RTX owners; now AMD goes the other way and makes FSR4 exclusive to RDNA4. At the very least, they should be able to get it working on RDNA3. They said they would, but this is something that should have been figured out beforehand.
 