
Radeon 9000 series on sale...in March

I think there's a bit of a contradiction here, or a tension if you will: if Nvidia would much prefer to be standalone without partners, why aren't they by now?

As long as really competent AIBs with global supply chains and distribution networks (that also innovate a little bit from time to time) are willing to work on very small margins, I imagine Nvidia is perfectly fine with that. Becoming an Apple is really hard (so maybe the answer is that they would much prefer it, it's just really hard).

And (big and) if you do not want to make your own laptops, you still need to keep a good relationship with the Asus and MSI of the world that put your GPU in their laptops, a lever EVGA did not have.

Considering the card supply issues they have right now despite the giant help from really good AIBs, any plan to ever get rid of them is probably far away.

We are also talking thin margins for them versus the capital to put in. With people talking about Nvidia not doing some 270mm gaming die so it can concentrate more on the booming AI business, imagine spending time and money on RMAs and a global supply chain for assembling gaming graphics cards instead... Do that for the new products instead, the Digits-like workstations and gaming PCs, and keep the partner relationship alive for the laptop market (and avoid having to do all the dGPU and after-sales support yourself).
The distribution is all they are getting at this point. That also means a partial sharing of risk.
Nvidia has clamped down on anything the partners could really innovate on at this point. The cooler was more or less the last thing they could do of any real value.
Any issues with supply the last few years have mainly been down to silicon. It's not like there was a run on plastic shrouds, GPU fans, and 12-pin connectors.
I don't disagree with you that Nvidia should keep its partners... it just feels like the partners have little power to tell Nvidia to knock it off. This launch, the Founders supply constraints (if the rumors are anything close to true, and who knows) could be Nvidia, at least a little like you say, realizing they have a lot of irons in the fire. Maybe Nvidia has limited the number of Founders cards they will produce. If true, it's going to annoy customers who were looking forward to getting that cool 2-slot 5090 solution, and it's really going to annoy them when they are forced to spend $2,200 instead of $2,000 for a 3-fan, 3-slot MSI or ASUS solution.
 
it's really going to annoy them when they are forced to spend $2,200 instead of $2,000 for a 3-fan, 3-slot MSI or ASUS solution.
Sure, but what are they going to do about it? If I can't get a 5090 for $2,000, I'll grumble, then pay $2,200 for a 5090. (Pending performance numbers when we get them - might not be worth upgrading from my 4090.)

Only way I see Nvidia keeping AIB partners is for lower-end parts with lower margins. They have zero competition at the high end, and if they can establish vertical integration, that keeps their margins fat.
 
Nvidia would much prefer to be standalone without partners; it's probably hard for partners currently to make a card that's even close to the FE card for a similar price. Wasn't that one of the reasons why EVGA left the scene? It was getting harder to make a profit because of the Founders cards.
That’s what they said but really it was their warranty hanging as a noose around their necks.
EVGA designed cards, but they outsourced manufacturing, which left them a tiny margin.
As the base specs of the cards went up and the competition got better, there were fewer and fewer ways for EVGA's cards to stand out. Then Nvidia placed caps on selling prices, so EVGA found itself unable to charge what it needed to keep its designs profitable. EVGA was circling already, but then the crypto boom happened and EVGA sold a shit-ton of cards direct to miners. They made more money in 2 years than they made in the 10 prior, but that's a lot of cards being pushed hard, with warranties that would strangle them.
So they exited the market and gracefully cancelled those warranties.
 
Yes, and I'd be surprised if Nvidia had any AIB partners at all for the 7000-series launch in 2029 or whenever.
They will have Foxconn… as that’s who just finished their largest facility yet and is focused on building NVidia’s cards.

Bigger question: who's getting the contract for the proposed $500B in AI infrastructure for the “Stargate” program…
 
Wonder if it is so the new FSR 4 can be ready at launch, instead of FSR 3.0, which at the previous card launch was announced as coming "one day."

They were still using "research project" type wording when showing it to people at CES.
I think that was likely the issue.

Nvidia came out and confidently announced DLSS 4, and then said that DLSS 4 makes a 5070 perform as well as a 4090.

If AMD doesn't have FSR4 ready to go, and implemented in at least a few games, on launch day: They just got eaten by Nvidia's marketing.

I do also think that Nvidia's pricing on the 5070 and 5070 Ti may have been a little lower than expected. But I think the true reason is that AMD didn't have FSR4 ready.
 
They will have Foxconn… as that’s who just finished their largest facility yet and is focused on building NVidia’s cards.

Bigger question: who's getting the contract for the proposed $500B in AI infrastructure for the “Stargate” program…
Foxconn, just like the plant they built in Wisconsin last time 😆
 
Bigger question: who's getting the contract for the proposed $500B in AI infrastructure for the “Stargate” program…
When Microsoft announced that in 2024:
https://www.forbes.com/sites/cindyg...n-stargate-a-100b-us-data-center/?form=MG0AV3

Not many details, but there was talk of Musk and Ellison begging Jensen for GPUs recently, and Nvidia's stock seems to have liked the announcement that it could be moving fast:
https://www.benzinga.com/25/01/4311...targate-ai-infrastructure-venture?form=MG0AV3

https://x.com/OpenAI/status/1881830103858172059
Arm, Microsoft, NVIDIA, Oracle, and OpenAI are the key initial technology partners
As part of Stargate, Oracle, NVIDIA, and OpenAI will closely collaborate to build and operate this computing system. This builds on a deep collaboration between OpenAI and NVIDIA going back to 2016 and a newer partnership between OpenAI and Oracle.
 
AMD probably decided that having a good first impression is better than beating NVIDIA out the door with broken products and fixing later.
 
AMD probably decided that having a good first impression is better than beating NVIDIA out the door with broken products and fixing later.
That's a bit too optimistic. Chances are high that AMD was blindsided by more than just one thing: a combination of lower-than-expected pricing, DLSS 4, and the improved 2-slot size of the 5000 series.

That's my best guess taking into account how squirrely AMD has been with this release.
 
That's a bit too optimistic. Chances are high that AMD was blindsided by more than just one thing: a combination of lower-than-expected pricing, DLSS 4, and the improved 2-slot size of the 5000 series.

That's my best guess taking into account how squirrely AMD has been with this release.
I don't think the FE design has anything to do with it. AMD's reference designs for the last two generations were very good.
 
The 2-slot design was talked about quite a bit in the "leaks" for a long time, and it is on a product tier AMD does not participate in. If they had an Nvidia card-size joke in their CES presentation, it was easy to cut (and at the very least easy to still announce a week or two later...).

Hard to think DLSS 4 would have been blindsiding; the difference versus DLSS 3/FSR 3 does not seem that big a deal for how AMD would present things. The 1→3 fake frames is the most obvious guess for the least Nvidia would do for DLSS 4, and neural texture/material compression has had papers out for a while and could take a while to matter.

If they got taken by surprise, it would be by pricing (or, in a twist, all the talk is 180 degrees wrong: they got surprised by how little Nvidia's performance went up, and the pricing talk is about how much higher to push theirs, not the other direction). Or by a new transformer DLSS running in FP4 and what it means for the future (if that is what is meant by DLSS 4.0); not sure what AMD can do about that between now and March...
 
FSR 3? They called it Fluid Motion Frames, at least back in the day.
They can (and have been) doing similar slides for a while:
View attachment 705371

The Lovelace launch was full of those; that in no way could have taken AMD by surprise. An Nvidia presentation not mostly filled with fake-frame benchmarks would have been the big surprise in 2025.

AMD's Fluid Motion Frames is not the same as FSR frame generation.

AFMF is driver level, which means AFMF's analysis occurs after the game's frame rendering is complete. It lacks integration with the game's internal data (e.g. motion vectors), and the UI is baked into its analysis, which can cause quite noticeable artifacts with UI elements sizzling/jittering. Its greatest strength and weakness is that it's almost entirely application agnostic (must be DX11 or higher), but it's more of a cudgel than a scalpel as far as accuracy goes.

FSR frame generation requires application-level (ad hoc) support, where motion vectors and the UI are accounted for in its analysis and predictions. Supposedly the superior approach as far as image quality is concerned, but without widespread support nobody cares about FSR frame gen.
 
If AMD doesn't have FSR4 ready to go, and implemented in at least a few games, on launch day: They just got eaten by Nvidia's marketing.

I do also think that Nvidia's pricing on the 5070 and 5070 Ti may have been a little lower than expected. But I think the true reason is that AMD didn't have FSR4 ready.
I don't know if you caught it but it was already shown to be functioning in Ratchet & Clank:RA. HUB had a vid of it at CES.
 
Remember when the next generation of video cards was considered late if it took more than a year and a quarter to come out? Then timelines increasingly slipped until even more than 1.5 years wasn't called late. At least RDNA 2 had the excuse of release delays due to Covid, and RDNA 3 of being designed during Covid (which still wouldn't explain its many shortcomings); now, with the further-renamed 9070 series, it's up to 2.25+ years.

AMD waiting for Nvidia to reveal the whole stack of RTX 5000 cards while distributors and retailers sit on stock reeks of cowardice, indecision, and rapid backtracking from an aborted attempt to follow Nvidia's general trend, from the GTX 1000 series onwards, of ever more price gouging.

For those who remember their European history classes, Cato the Elder used to end his speeches with "Carthago delenda est," Carthage must be destroyed... well, in my crass American take, AMD-o Marketing-o delenda est. Nothing will change until seemingly every last person from executive to intern is banished and blacklisted from AMD's marketing department.
Ahem... Ceterum autem censeo venalicium esse delendum.

Imagine being AMD's homo venalicius right now. This is the sort of stuff he'll need to explain in job interviews for the rest of his career.
 
Wonder if it is so the new FSR 4 can be ready at launch, instead of FSR 3.0, which at the previous card launch was announced as coming "one day."
FSR4 is a driver update on top of FSR 3.1, so any titles that already have FSR 3.1 will work with the FSR4 algorithm. That was already confirmed, so AMD at least managed to get that right.
 
I don't know if you caught it but it was already shown to be functioning in Ratchet & Clank:RA. HUB had a vid of it at CES.
Yeah, I saw both Hardware Unboxed's and Digital Foundry's impressions of AMD's "research" project, which was probably FSR4. Impressions were good. Even better than Sony's PSSR for the PS5 Pro.

However, they weren't calling it FSR4. And the demo was in one game. Of a character standing still. Near some moving elements in the foreground and background.

Doesn't say "THIS IS READY TO BE LAUNCHED", to me.

AMD has really fumbled Radeon, since RDNA 2 (6000 series) was surprisingly good and competitive.

Too bad, because Ryzen is awesome. And their stock could go up even more, if they had a full catalog of great GPUs again.
 
I swear, 20 years down the road economics classes are going to be talking about AMD's GPU division and using it as an example of how NOT to run your business. Rumors are popping up that the 9070XT is going to be $599 at launch, but by the time it launches people will have either upgraded, or hit up ebay for a used 40-series GPU. C'mon AMD, stop it, take some of the cash flow from your CPU division and funnel it into RTG. You claim you care about gamers, now show you care about "gamers," and make a good product stack with features that're ready to go. *sigh* Hell, gaming GPUs aren't even Nvidia's primary business, they're like the bottom of the totem pole in terms of importance, and they still release products that're good to go day 1.
 
This launch cycle is gonna be a shit show by Nvidia and AMD. Sooo glad I don't need a GPU cuz it's all fucked up, as we will all soon see. I'm already laughing at the clown show.
 
This launch cycle is gonna be a shit show by Nvidia and AMD. Sooo glad I don't need a GPU cuz it's all fucked up, as we will all soon see. I'm already laughing at the clown show.

https://x.com/PowerGPU/status/1881758554836877675

Warning you all now.
The launch of the RTX 5090 will be the worst when it comes to availability. Already being told to expect it to be that way for the first 3 months.
 
Remember when the next generation of video cards was considered late if it took more than a year and a quarter to come out? Then timelines increasingly slipped until even more than 1.5 years wasn't called late. At least RDNA 2 had the excuse of release delays due to Covid, and RDNA 3 of being designed during Covid (which still wouldn't explain its many shortcomings); now, with the further-renamed 9070 series, it's up to 2.25+ years.

AMD waiting for Nvidia to reveal the whole stack of RTX 5000 cards while distributors and retailers sit on stock reeks of cowardice, indecision, and rapid backtracking from an aborted attempt to follow Nvidia's general trend, from the GTX 1000 series onwards, of ever more price gouging.

For those who remember their European history classes, Cato the Elder used to end his speeches with "Carthago delenda est," Carthage must be destroyed... well, in my crass American take, AMD-o Marketing-o delenda est. Nothing will change until seemingly every last person from executive to intern is banished and blacklisted from AMD's marketing department.
Had AMD launched at whatever price, nVidia would have countered just like it did before CES. nVidia made two significant changes once it got a whiff of AMD's performance: it increased the performance of the 5070 Ti and dropped the price of the 5070 below what the 4070 launched at. Had AMD gone ahead and launched, every card would have been DOA. Then what would it do? Drop the price afterwards, which would have made everyone complain at that too. AMD is trying to make sure the cards sell, not stroke its ego.
 
I think there's a bit of a contradiction here, or a tension if you will: if Nvidia would much prefer to be standalone without partners, why aren't they by now?

As long as really competent AIBs with global supply chains and distribution networks (that also innovate a little bit from time to time) are willing to work on very small margins, I imagine Nvidia is perfectly fine with that. Becoming an Apple is really hard (so maybe the answer is that they would much prefer it, it's just really hard).

And (big and) if you do not want to make your own laptops, you still need to keep a good relationship with the Asus and MSI of the world that put your GPU in their laptops, a lever EVGA did not have.

Considering the card supply issues they have right now despite the giant help from really good AIBs, any plan to ever get rid of them is probably far away.

We are also talking thin margins for them versus the capital to put in. With lots of people talking about Nvidia not doing some 270mm gaming die so it can concentrate more on the booming AI business, imagine spending time and money on RMAs and a global supply chain for assembling gaming graphics cards instead... Do that for the new products instead, the Digits-like workstations and gaming PCs, and keep the partner relationship alive for the laptop market (and avoid having to do all the dGPU and after-sales support yourself).

It is a bit contradictory, but I think at this point the partners WANT to work with Nvidia, and Nvidia's happy to let it happen. The partners get to build GPUs to serve the vast majority of the market and keep getting their brands out there. Companies like Asus have a following willing to pay up for "ROG" branding, for example, and this rounds out the product stack. They can also sell their premier brands at premium prices, and the customer, so far, has been happy to pay. Good relations with Nvidia might also help when they want to build complete systems or laptops. For Nvidia, they get to expand their reach beyond what they can do on their own, and they don't have to worry as much about handling their own inventory and things like that, which reduces liability and cost on their end. They'll profit well on the FE cards that they do sell, and they'll likely build enough FE cards to sell but not so many that they risk getting stuck with excess inventory when the next series launches.
 
I swear, 20 years down the road economics classes are going to be talking about AMD's GPU division and using it as an example of how NOT to run your business. Rumors are popping up that the 9070XT is going to be $599 at launch, but by the time it launches people will have either upgraded, or hit up ebay for a used 40-series GPU. C'mon AMD, stop it, take some of the cash flow from your CPU division and funnel it into RTG. You claim you care about gamers, now show you care about "gamers," and make a good product stack with features that're ready to go. *sigh* Hell, gaming GPUs aren't even Nvidia's primary business, they're like the bottom of the totem pole in terms of importance, and they still release products that're good to go day 1.

The problem with this is that no one is actually waiting for this launch. The people that would have already upgraded by the time this card launches weren't buying Radeon anyway, and the people that are looking to buy 40 series cards at this point are not in a rush and will have to wait for used inventory to make its way on to the market. We also don't know what availability is going to look like on day one for the 50 series. We don't know how many scalpers there are ready to snatch up what's available on day one, and how many people are willing to pay a premium above MSRP to upgrade to the 50 series from their perfectly functional 40 series cards to solve problems for them that largely don't currently exist.

Radeon fans might be waiting for this, the few that there are, but I don't think the delay is something that's going to negatively impact AMD the way a lot of people think it is. A terrible launch will have a much more negative impact than a delayed one.

Let's be real. Most people are upset about this delay because they want AMD to put pricing pressure on Nvidia to create an impetus for Nvidia to lower the price of the 5070, which they will then buy instead. You know it, I know it, and AMD knows it. They're not in a rush.
 
I don't know if you caught it but it was already shown to be functioning in Ratchet & Clank:RA. HUB had a vid of it at CES.
FSR4 is a driver update ontop of FSR3.1 so any titles that already have FSR3.1 will work with the FSR4 algorithm. That was already confirmed so AMD at least managed to get that right.
Yes, that helps and makes it possible for it to work at launch, but in that HUB vid they still called it AMD's AI Technology Demo and a research project.

The fact that it was shown functioning, and that games don't have to add new inputs to it, means it could be product-ready in March; the fact that it was a single, very limited demo called a research project means it might not be ready yet. Both of those facts point toward it being a possible reason.

Wanting to know for sure what the 5070 does... or driver issues in general, not just for FSR, are the only other reasons that come to mind.

but I think at this point, the partners WANT to work with Nvidia, and Nvidia's happy to let it happen.

In their datacenter line, for the newly announced GB300, Nvidia will do significantly less and let partners do more of the work as they wish:
https://semianalysis.com/2024/12/25...asoning-inference-amazon-memory-supply-chain/

Those clients know more and have more needs and wishes than desktop consumers; now with GB300, hyperscalers are able to customize the main board, cooling, and much more. And with all the noise about power and cooling issues in the Blackwell deployment, I am not sure Nvidia minds. They also threw AIBs a bone (or removed one thing they'd have to do themselves) by having no 5070 Ti Founders Edition, the SKU that could be the sweet spot for consumers but also for what AIBs can do with it, having a lot of OC room over the base specs.
 
Yes, that helps and makes it possible for it to work at launch, but in that HUB vid they still called it AMD's AI Technology Demo and a research project.

The fact that it was shown functioning, and that games don't have to add new inputs to it, means it could be product-ready in March; the fact that it was a single, very limited demo called a research project means it might not be ready yet. Both of those facts point toward it being a possible reason.

Wanting to know for sure what the 5070 does... or driver issues in general, not just for FSR, are the only other reasons that come to mind.



In their datacenter line, for the newly announced GB300, Nvidia will do significantly less and let partners do more of the work as they wish:
https://semianalysis.com/2024/12/25...asoning-inference-amazon-memory-supply-chain/

Those clients know more and have more needs and wishes than desktop consumers; now with GB300, hyperscalers are able to customize the main board, cooling, and much more. And with all the noise about power and cooling issues in the Blackwell deployment, I am not sure Nvidia minds. They also threw AIBs a bone (or removed one thing they'd have to do themselves) by having no 5070 Ti Founders Edition, the SKU that could be the sweet spot for consumers but also for what AIBs can do with it, having a lot of OC room over the base specs.
The HUB video is literally called "Hands-On With AMD FSR 4 - It Looks... Great?"

View: https://youtu.be/xt_opWoL89w?si=RyMvP2dD43CDYB30
 
The HUB video is literally called "Hands-On With AMD FSR 4 - It Looks... Great?"
Yes, that's HUB's choice of words. AMD has said FSR 4 will be there for RDNA 4, so it's a perfectly fine guess that that's what they were looking at.

But look at the AMD tablet in the middle of the screen: it reads "AMD Graphics Research AI-based upscaling." Pretty sure that's exclusively what they called the demo at CES.
 
Had AMD launched at whatever price, nVidia would have countered just like it did before CES. nVidia made two significant changes once it got a whiff of AMD's performance: it increased the performance of the 5070 Ti and dropped the price of the 5070 below what the 4070 launched at. Had AMD gone ahead and launched, every card would have been DOA. Then what would it do? Drop the price afterwards, which would have made everyone complain at that too. AMD is trying to make sure the cards sell, not stroke its ego.
AMD should have launched already with an MSRP of no more than $499 for the 9070XT (while keeping the prior naming scheme of 8700XT), and scaled down from there for the 9070. AMD needs to eat low margins for multiple generations, while having better products than Nvidia in equivalent or better performance tiers, or it will get totally squeezed out of the market. Even before the latest delays and the multiple earlier delays, RDNA 4 already had two strikes against it. The first was the initially very overpriced and underperforming RDNA 3 upper range, which consumers expect AMD to make up for with the next generation lest it prove that AMD is flailing and failing; the second strike was the cancelling of RDNA 4's upper range.
 
Talk about what the MSRP should or shouldn't be sounds empty without third-party reviews of the 9070xt, 5070, and 5070 Ti.

What if the 9070xt is overall (despite the lower VRAM) a preferable product to buy than the 7900xtx? Would $599, a $100 jump over the 7800xt for that kind of generational jump (or a $270-300 rebate from the 7900xtx's lowest current price), not do well enough to sell all the perfect 64-of-64-working-CU dies they think they can make on a 390mm die?

And let the 9070 (if it is a better product to buy than a 7900xt) sit in its right place: 30% faster than the 7800xt in raster, a bit more in RT, with better FSR at the same price point, quite a good generational upgrade.
 
AMD should have launched already with an MSRP of no more than $499 for the 9070XT (while keeping the prior naming scheme of 8700XT), and scaled down from there for the 9070. AMD needs to eat low margins for multiple generations, while having better products than Nvidia in equivalent or better performance tiers, or it will get totally squeezed out of the market. Even before the latest delays and the multiple earlier delays, RDNA 4 already had two strikes against it. The first was the initially very overpriced and underperforming RDNA 3 upper range, which consumers expect AMD to make up for with the next generation lest it prove that AMD is flailing and failing; the second strike was the cancelling of RDNA 4's upper range.
Again, Nvidia could easily match that, either by changing the specs on the card or by going super low on the price, just as it already did.

RDNA 3 is currently priced correctly.

As for RDNA4, why make a high-end part that no one is going to buy? There's a reason why they didn't make one. In order to compete at the high end AMD would have to go chiplet, which isn't easy and takes a lot of effort and money. Until they unify the architectures it makes sense to stay away from trying to compete with the huge dies that Nvidia is putting out.

AMD is a business not a dinner date that won't buy the steak tartare.
 
Yes, that's HUB's choice of words. AMD has said FSR 4 will be there for RDNA 4, so it's a perfectly fine guess that that's what they were looking at.

But look at the AMD tablet in the middle of the screen: it reads "AMD Graphics Research AI-based upscaling." Pretty sure that's exclusively what they called the demo at CES.
So your entire argument is semantics.
 
So your entire argument is semantics.
Well, yes, saying that AMD using "AMD Graphics Research AI" instead of "FSR 4" at CES indicates that maybe it is not finished (the word "research" makes it sound like it's still in development) is 100% semantics.

Do you disagree with that possibility, and that this choice of words makes it sound more likely than if they had called it FSR 4?
 
Again, Nvidia could easily match that, either by changing the specs on the card or by going super low on the price, just as it already did.

RDNA 3 is currently priced correctly.

As for RDNA4, why make a high-end part that no one is going to buy? There's a reason why they didn't make one. In order to compete at the high end AMD would have to go chiplet, which isn't easy and takes a lot of effort and money. Until they unify the architectures it makes sense to stay away from trying to compete with the huge dies that Nvidia is putting out.

AMD is a business not a dinner date that won't buy the steak tartare.
"Currently," and even that is debatable. Launching at $1,000 for the XTX and, worse, $900 for the XT were laughably bad screw-ups. Then, looking down the product stack, initial review recommendations tended to be to buy whichever prior-generation RDNA 2 card delivered the same FPS as a given RDNA 3 card, at far better value, compared to most RDNA 3 launches.

As for the high end, there used to be an expression: "win on Sunday, sell on Monday." Not having a performance-winning halo product is bad (or even having one that just barely comes in second, yet with such a large price differential as to make the paper winner look terrible by comparison). Worse is having it publicly known that twice in a row you were trying for a halo: after fumbling your first shot, your second failed so badly that even after delaying your product line for months to keep working on it, you ended up completely abandoning that portion of the market by preemptively conceding defeat. These are not signals that broadcast confidence and quality to the consumer. Moreover, because dev kits and software design at the large companies start from working with the most powerful hardware, not being able to compete further incentivizes non-adoption of your product, leading to a lack of software optimization and placing further handicaps on future hardware releases.
 
That's a bit too optimistic. Chances are high that AMD was blindsided by more than just one thing, but a combination of lower than expected pricing, DLSS 4 and improved 2 slot size of the 5000 series.

That's my best guess taking into account how squirrely AMD has been with this release.
Let's say for the sake of the argument that the Radeon RX 9070/9070 XT is in a ready-to-launch state (which it isn't), why would AMD delay the launch to March?

AMD can get ahead of NVIDIA and cut the prices later if necessary.
 
"Currently," and even that is debatable. Launching at $1,000 for the XTX and, worse, $900 for the XT were laughably bad screw-ups. Then, looking down the product stack, initial review recommendations tended to be to buy whichever prior-generation RDNA 2 card delivered the same FPS as a given RDNA 3 card, at far better value, compared to most RDNA 3 launches.

As for the high end, there used to be an expression: "win on Sunday, sell on Monday." Not having a performance-winning halo product is bad (or even having one that just barely comes in second, yet with such a large price differential as to make the paper winner look terrible by comparison). Worse is having it publicly known that twice in a row you were trying for a halo: after fumbling your first shot, your second failed so badly that even after delaying your product line for months to keep working on it, you ended up completely abandoning that portion of the market by preemptively conceding defeat. These are not signals that broadcast confidence and quality to the consumer. Moreover, because dev kits and software design at the large companies start from working with the most powerful hardware, not being able to compete further incentivizes non-adoption of your product, leading to a lack of software optimization and placing further handicaps on future hardware releases.
Yeah, and the 4080 was a fantastic deal too, right?
 
Let's say for the sake of the argument that the Radeon RX 9070/9070 XT is in a ready-to-launch state (which it isn't), why would AMD delay the launch to March?

AMD can get ahead of NVIDIA and cut the prices later if necessary.
Every single time they have gone first and adjusted prices afterwards, they have gotten lambasted. WTF would they want to do that again if they can help it?
Why March? Because the competing Nvidia card doesn't launch till March either (sorry, "late" Feb).

I think AMD may be wise to let Nvidia ship their 5070, and more importantly let their partners ship their non-Ti 5070 cards that will be selling for more like $600-700, then drop their own card frankly a week later. Late Feb to March, we aren't talking about a crazy gap here.
 
This is what I'm saying. Price it competitively with what's on the market now. Then when the 5070/Ti come out, drop the price to make it a better deal.
That's exactly my point: many people are (incorrectly) assuming that AMD is delaying the launch because it fears NVIDIA.

If anything, that would force AMD to move up the launch.
 
This is what I'm saying. Price it competitively with what's on the market now. Then when the 5070/Ti come out, drop the price to make it a better deal.

That would make too much sense :ROFLMAO:

That could be dangerous for AMD and not overly sustainable. Jensen is VERY competitive, and gaming is his baby; he's nostalgic for it, and it is not a grip he will relinquish easily.
In a race to the bottom Nvidia can and absolutely would crush AMD on pricing. Shareholders aside, picture this:

If AMD were to sell the cards at cost, or at a loss, it would hit a significant portion of their annual revenue. Nvidia, on the other hand, could give the 5000 series away for free and have it work out to less than 5% of their yearly revenue, given what their workstation and datacenter orders are already looking like for the year.

Any price that AMD goes to sell their cards at, Nvidia can meet or beat if Jensen decides that AMD is gaining more ground than he is happy with.

There are posts on Reddit from supposed Microcenter employees saying that AMD had the MSRP for the 9070XT at something like $799 USD, which even current market prices would not sustain; to lower prices, retailers would be selling those GPUs at a loss, as they already paid AMD for them on the assumption they'd sell at that $799.

AMD fudged up hard and priced too high by a lot, and now they are completely scrambling. Retailers will not let go of brand-new inventory at a loss; they will return it to their suppliers before they do that.
 