Join us on November 3rd as we unveil AMD RDNA™ 3 to the world!

I see you have mentioned this before as well. Sure, you can source ALL the other components for a computer for $600, but it will fall victim to the same thing the 4090 did, and that's CPU bottlenecks. You aren't building a system that can push either card (if we are to believe the 1.7x) for $600.
Not if you get a 13600k or 5800x3d. You can easily build a good gaming PC for $600 that won't be CPU bottlenecked.

Now stay away from anything else imo.
 
Edit: I get it. You guys are FROTHING at the mouth for an AMD card and just hoping and praying it's as fast as, or faster than, Nvidia. You want it so bad! Unfortunately, I don't see it happening, and AMD intentionally not even showing its hand before reviewers isn't a "smooth move, because who cares about FPS anyway"; it's a silly move that stinks of low performance numbers.
Not really. I am all for new technology, which this is. No other company has an MCM design for a GPU. It's a great step forward.

But if I were in the market for a GPU, I would want to get a 4090.... problem is, I would need to upgrade my platform if I want to get the most performance out of it. Also, with the economy the way it is, I cannot just buy a 4090 right now.
 
Not if you get a 13600k or 5800x3d. You can easily build a good gaming PC for $600 that won't be CPU bottlenecked.

Now stay away from anything else imo.
5800X3D is $350 alone. What do you propose for the other $250 left? It's not enough to build a system that can push a 4090 or this as-always-lacking AMD card.
 
Not really. I am all for new technology, which this is. No other company has an MCM design for a GPU. It's a great step forward.

But if I were in the market for a GPU, I would want to get a 4090.... problem is, I would need to upgrade my platform if I want to get the most performance out of it. Also, with the economy the way it is, I cannot just buy a 4090 right now.
I'm right there with you wondering what the hell they are thinking with a $1500 video card... But then I remember last gen when the 3090 was also $1500 and sold like hot cakes.

You and I just aren't the ones they are for, I guess.
 
5800X3D is $350 alone. What do you propose for the other $250 left? It's not enough to build a system that can push a 4090 or this as-always-lacking AMD card.
From what I understand AMD dropped the price to $329? So, if it's $329, that gives you $270. Mobo, $80ish? 16GB of 3200 memory, $55-60? 750W PSU, $60-80? $70 for a case, and there ya go?

Of course, if you are in the market for a 7900xtx you will probably buy top-of-the-line parts. But it is possible to do it.
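A quick sanity check of that list, using the low end of the prices quoted above (the specific picks, like a B550 board, are assumptions for illustration, not retailer quotes):

```python
# Sum the ballpark component prices from the post above (low end of each range).
parts = {
    "Ryzen 7 5800X3D": 329,
    "B550 motherboard": 80,   # assumed board; "mobo 80ish"
    "16 GB DDR4-3200": 55,
    "750 W PSU": 60,
    "Case": 70,
}
total = sum(parts.values())
print(f"Total: ${total} of the $600 budget")   # -> Total: $594 of the $600 budget
```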
 
I'm right there with you wondering what the hell they are thinking with a $1500 video card... But then I remember last gen when the 3090 was also $1500 and sold like hot cakes.

You and I just aren't the ones they are for, I guess.
The 4090 RTX has a $1600 MSRP.
 
I see you have mentioned this before as well. Sure, you can source ALL the other components for a computer for $600, but it will fall victim to the same thing the 4090 did, and that's CPU bottlenecks. You aren't building a system that can push either card (if we are to believe the 1.7x) for $600.

Edit: I get it. You guys are FROTHING at the mouth for an AMD card and just hoping and praying it's as fast as, or faster than, Nvidia. You want it so bad! Unfortunately, I don't see it happening, and AMD intentionally not even showing its hand before reviewers isn't a "smooth move, because who cares about FPS anyway"; it's a silly move that stinks of low performance numbers.
I love how conversations on here always break down and people lose it a bit. Yeah, 600 dollars isn't enough money to buy a PC suitable for a 7900 XTX in all likelihood. I don't care which company I get my GPU from, I'm just into new tech and I'd prefer Nvidia have some real competition. I'm not sure they will at the high end, seems unlikely with how coy AMD is being with their numbers. But they don't really have to, I don't see Nvidia lowering the price on the 4080 even if the 7900 XTX beats it in pure raster, they'll just tout their software and superior RT performance. For a lot of people, it might be enough, hell, it might sway me a bit if the RT performance is that bad. I really like RT, it's transformative in Control and Cyberpunk IMO.
 
I'm right there with you wondering what the hell they are thinking with a $1500 video card... But then I remember last gen when the 3090 was also $1500 and sold like hot cakes.

You and I just aren't the ones they are for, I guess.
The 3090 at $1500 was a stroke of genius. Mining was taking off and everyone was stuck at home due to the COVID mess. In any other market conditions, that GPU would have sold a quarter of what it actually did.

The 4090 at $1600 works because Nvidia was successful with the 3090 at $1500.

Funny how gullible we are...
 
I love how conversations on here always break down and people lose it a bit. Yeah, 600 dollars isn't enough money to buy a PC suitable for a 7900 XTX in all likelihood. I don't care which company I get my GPU from, I'm just into new tech and I'd prefer Nvidia have some real competition. I'm not sure they will at the high end, seems unlikely with how coy AMD is being with their numbers. But they don't really have to, I don't see Nvidia lowering the price on the 4080 even if the 7900 XTX beats it in pure raster, they'll just tout their software and superior RT performance. For a lot of people, it might be enough, hell, it might sway me a bit if the RT performance is that bad. I really like RT, it's transformative in Control and Cyberpunk IMO.
When it comes to Nvidia, that's how it's usually been. If you want the best, you have to pay for the best. It's the reason I got a 3090 on release day. Same goes for the 4090 RTX. It is the fastest card and will still be once the 7900xtx is released, imo.
 
I'm right there with you wondering what the hell they are thinking with a $1500 video card... But then I remember last gen when the 3090 was also $1500 and sold like hot cakes.

You and I just aren't the ones they are for, I guess.

These statements always lack any real numbers. Exactly how many did they sell? The reality is there are going to be very few full chips that are functional enough to sell on each wafer; most are going to be cut-down chips in the lower brackets. I don't know a single person among my friends that has a 3090, as they all said it was way too expensive, and most were shocked that I bought a 6900XT. Mining seriously distorted what the actual demand was last generation, and I feel on this new node Nvidia is going to have trouble supplying a decent number of 4090 cards, hence the high price. AMD might be able to produce in serious volume with this new design, which I think will be a bigger threat to Nvidia.
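As a minimal sketch of the full-chip-yield point, the classic Poisson yield model gives a feel for it; the defect density and die areas below are assumed round numbers, not foundry or vendor data:

```python
import math

# Simple Poisson yield model: Y = exp(-D * A) is the chance a die has zero
# defects, for defect density D (defects/cm^2) and die area A (cm^2).
def poisson_yield(defects_per_cm2: float, area_cm2: float) -> float:
    return math.exp(-defects_per_cm2 * area_cm2)

D = 0.10  # assumed defects per cm^2
print(f"Big monolithic die (~6 cm^2): {poisson_yield(D, 6.0):.0%} defect-free")
print(f"Small chiplet (~3 cm^2):      {poisson_yield(D, 3.0):.0%} defect-free")
# -> roughly 55% vs 74%; defective big dies mostly get fused down into
#    cut-down SKUs, while small chiplets waste far less silicon per defect.
```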
 
It's $500 less with no stats besides "1.7x faster". It means nothing without reviews. Nothing.

Reviews are king. What's cool about this is how much performance AMD's underlying tech is projecting.

Let's be honest, neither company wants to be making any of these cards.

I said it before, this round is about who loses the least, not necessarily who wins.
 
The way I look at something like a 4090 is this....
It's $1600... and realistically I'm not going to find one at that price or anywhere close, so it's more like $1800 for a decent, not-the-best AIB model.

Let's assume I was still a kid and gamed 4 hours a day.... 5 days a week (I do not, because I'm an adult). That is roughly 960 hours a year.
If history has taught me anything... 3 years. That is how long I can expect a 4090 to feel top end, and perhaps I could get another year or two out of it. So 2,880 hours of game play (again assuming I'm a 4-hour-a-day child), which works out to roughly 70 cents an hour for my GPU once you remember that where I live you get to pay taxes when you spend money, so 1800 bucks is really about 2k.
Ya, no, fuck that.
I mean Nvidia managed to sell what they had pretty fast... I don't get it. Either a lot of kids have a lot more disposable income than I thought (or Visa and Mastercard are doing happy dances when Nvidia launches a new halo product), or a lot of older gamers like myself that could, if we chose, plunk down 2k on a GPU... are really really bad at math.

Man, I'm turning into a cranky old man. $1k though... I don't like it, but assuming MSRP is obtainable, that's about 35 cents an hour... still painful, but not as crazy.
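The napkin math above, spelled out as a quick sketch (all inputs are the assumptions from the post, nothing measured or official):

```python
# Cost per gaming hour under the post's assumptions.
all_in_price   = 2000                 # ~$1,800 street price plus local tax
hours_per_year = 4 * 5 * 48           # 4 h/day, 5 days/week -> ~960 h/year
total_hours    = hours_per_year * 3   # ~3 years of feeling "top end"

print(f"4090 at ~${all_in_price}: ${all_in_price / total_hours:.2f} per gaming hour")  # ~$0.69
print(f"$1,000 card at MSRP: ${1000 / total_hours:.2f} per gaming hour")               # ~$0.35
```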
 
These statements always lack any real numbers. Exactly how many did they sell? The reality is there are going to be very few full chips that are functional enough to sell on each wafer; most are going to be cut-down chips in the lower brackets. I don't know a single person among my friends that has a 3090, as they all said it was way too expensive, and most were shocked that I bought a 6900XT. Mining seriously distorted what the actual demand was last generation, and I feel on this new node Nvidia is going to have trouble supplying a decent number of 4090 cards, hence the high price. AMD might be able to produce in serious volume with this new design, which I think will be a bigger threat to Nvidia.
I live in buttfuck nowhere Canada and I know more than a dozen people running 3090s. Every store near me was sold out over and over when they came in. Even now, I can buy a 3080 local, but not a 3090.
 
It's $500 less with no stats besides "1.7x faster". It means nothing without reviews. Nothing.
While I don't disagree in principle, you also then can't make the claim that it was worth it to buy a 4090 at launch either. Can't have it both ways. I realize you're not the one that said that, but there is some hypocrisy there if you're going to call me out and not say the same thing to imsirovic5.
Only AMD knows for sure, but their pricing is obviously targeting their performance and placement in the market, and it is also very likely to be more aggressive in terms of price to performance. I think there is zero shot a 7900xtx beats a 4090. But I would gamble a decent sum of money that it wins in terms of dollar to performance vs the 4090.
And that essentially was what my comment was to begin with, and also why I said if you can buy a 4090 to begin with, you probably don't care about dollar to performance, only absolute performance at any cost. It's subtext and implication.

As much as I want this to be true, I'd say it's wishful thinking. AMD would've simply said that the 7900xtx competes with the 4090 for $600 less. There's no reason for them not to.
 
Am I a horrible person for noticing this?

[attached screenshot]
 
Reviews are king. What's cool about this is how much performance AMD's underlying tech is projecting.

Let's be honest, neither company wants to be making any of these cards.

I said it before, this round is about who loses the least, not necessarily who wins.
This is a beta test for AMD's chiplet tech imo.
I believe it's going to be a great product, don't get me wrong. I just think RDNA 3 is akin to Zen 1. It's groundbreaking... but the real "oh shit" moment will come in the second and even third gen, when they really refine it and can basically release cards that hit any market point they want.... high end? Add a couple chips or up the cache. Cheaper versions could even use MCDs on perhaps even older processes in later gens. I mean, why not build a mid-range card with a mix of brand-new second-gen chips and older controllers and cache chips? (I might be remembering wrong, but didn't AMD do that with Zen, reusing some controller chips from one gen to the next?) OK, perhaps that is a crazy pipe dream, maybe. I think Nvidia should be worried about just how agile this makes AMD design-wise. Heck, they could do crazy shit like mix and match actual chiplets between gens... imagine a mid-range 8700 card with 2 chiplets from a 7900 and 2 chiplets from an 8900. It will be interesting to see how Nvidia does moving to chiplets next gen.
 
AMD didn't show performance compared to Nvidia products.
AMD didn't show performance compared to their own products.

Performance is THE. ONE. THING. that matters. They didn't even give us a hint.

This comes across as AMD being INCREDIBLY embarrassed about what they have. They priced it low, but didn't give us any reason to believe it's a good deal. They made it small, but didn't let us know how much we're giving up for that size. They talked about features, but not what those features are offering.


I was honestly NOT expecting AMD to be so embarrassed. This is giving me VEGA launch vibes. This is giving me Raja vibes.

None of those things are good.
I'm watching it right now, what are you talking about? Right before that he said Assassin's Creed running 8K at 96 fps.
[attached comparison screenshot]
 

"Whats up gamers" reminds me of that Steve Buchetti meme.

Why is resolution terminology so hard for the industry. First you have "2k gaming" which should really be more like 2.5k gaming as it refers to 1440p and not 1080p.

Now we have "8k ultrawide" which sounds like even more pixels than 8k but is really just 2 4k screens side by side. "8k ultrawide" has a pixel count closer to 5k in reality.
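The arithmetic behind those labels, assuming "8K ultrawide" means two 4K panels side by side as described above:

```python
# Pixel counts behind the marketing names discussed above.
resolutions = {
    "1080p":                      (1920, 1080),
    "1440p (marketed as '2K')":   (2560, 1440),
    "4K UHD":                     (3840, 2160),
    "5K":                         (5120, 2880),
    "'8K ultrawide' (2 x 4K)":    (7680, 2160),
    "true 8K UHD":                (7680, 4320),
}
for name, (w, h) in resolutions.items():
    print(f"{name:28} {w * h / 1e6:5.1f} megapixels")
# "8K ultrawide" lands at ~16.6 Mpixels, much closer to 5K (~14.7) than 8K (~33.2).
```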
 
While I don't disagree in principle, you also then can't make the claim that it was worth it to buy a 4090 at launch either. Can't have it both ways. I realize you're not the one that said that, but there is some hypocrisy there if you're going to call me out and not say the same thing to imsirovic5.
Only AMD knows for sure, but their pricing is obviously targeting their performance and placement in the market, and it is also very likely to be more aggressive in terms of price to performance. I think there is zero shot a 7900xtx beats a 4090. But I would gamble a decent sum of money that it wins in terms of dollar to performance vs the 4090.
And that essentially was what my comment was to begin with, and also why I said if you can buy a 4090 to begin with, you probably don't care about dollar to performance, only absolute performance at any cost. It's subtext and implication.


As much as I want this to be true, I'd say it's wishful thinking. AMD would've simply said that the 7900xtx competes with the 4090 for $600 less. There's no reason for them not to.
I can't quote everyone lol, I see random posts and reply to them.
 
I live in buttfuck nowhere Canada and I know more than a dozen people running 3090s. Every store near me was sold out over and over when they came in. Even now, I can buy a 3080 local, but not a 3090.

I'm in California and the local stores never really had either card in stock, even AMD. Most forum people I knew that bought one were using it for mining or to scalp, not games, and they bought multiples. None of my actual gaming friends was willing to pay for one. Perhaps we're just all too old to give a crap anymore about having the most FPS. But your town has the same name as what most of my friends figured Nvidia wanted to do to them with the 3090.
 
Pretty sure Nvidia just broke out the champagne glasses at HQ; they won.

What did they win exactly? I'm assuming you are referring to Nvidia execs when you say "Nvidia". Wouldn't they pop champagne bottles when they make the high profit margins? Not sure if having the fastest gpu/cpu/car/boat/whatever in the industry guarantees that.

Why do gamers think that executives have the same mindset as their customers?
 
The way I look at something like a 4090 is this....
It's $1600... and realistically I'm not going to find one at that price or anywhere close, so it's more like $1800 for a decent, not-the-best AIB model.

Let's assume I was still a kid and gamed 4 hours a day.... 5 days a week (I do not, because I'm an adult). That is roughly 960 hours a year.
If history has taught me anything... 3 years. That is how long I can expect a 4090 to feel top end, and perhaps I could get another year or two out of it. So 2,880 hours of game play (again assuming I'm a 4-hour-a-day child), which works out to roughly 70 cents an hour for my GPU once you remember that where I live you get to pay taxes when you spend money, so 1800 bucks is really about 2k.
Ya, no, fuck that.
I mean Nvidia managed to sell what they had pretty fast... I don't get it. Either a lot of kids have a lot more disposable income than I thought (or Visa and Mastercard are doing happy dances when Nvidia launches a new halo product), or a lot of older gamers like myself that could, if we chose, plunk down 2k on a GPU... are really really bad at math.

Man, I'm turning into a cranky old man. $1k though... I don't like it, but assuming MSRP is obtainable, that's about 35 cents an hour... still painful, but not as crazy.
Don't forget the power it consumes over those 3,000 ish hours too - please add the cost of powering the damn thing and you've got yourself a pricey little bugger.
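As a rough sketch of that power cost, assuming the stock 450 W board power, the same ~2,880 gaming hours, and a generic $0.15/kWh rate (actual rates vary a lot by region):

```python
# Ballpark electricity cost over the card's assumed lifetime of gaming hours.
board_power_w = 450          # 4090 stock power target
hours         = 2880         # ~3 years at ~960 gaming hours/year
price_per_kwh = 0.15         # USD; cheap hydro regions are well under this

energy_kwh = board_power_w / 1000 * hours
cost = energy_kwh * price_per_kwh
print(f"{energy_kwh:.0f} kWh -> about ${cost:.0f}, or {cost / hours * 100:.0f} cents per hour")
# -> ~1296 kWh, roughly $190, on the order of 7 cents per gaming hour.
```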
 
When it comes to Nvidia, that's how it's usually been. If you want the best, you have to pay for the best. It's the reason I got a 3090 on release day. Same goes for the 4090 RTX. It is the fastest card and will still be once the 7900xtx is released, imo.
I don't care about having the best, the 3060 ti was a great value proposition and I got one at launch, they have other great products in their stack. Honestly if the 7900 XT or whatever is significantly faster than the 3090 ti that's probably enough for me TBH. They're still going for around 900 or so new / used. I'm not willing to spend 1600 bucks, it's a great card but that's out of my league.
 
FSR isn't in a position to woo people away from DLSS
DLSS 3.0 looks like trash with its interpolation. FSR's not bad; it basically does the same thing, and they announced FSR 3.0 will be dropping in 2023, which they're saying can give up to 2x FPS at 4K, and then they've got this HYPR-RX thing coming too.

Just Nvidia thinking $1600 is a great price to charge for a graphics card is enough to "woo" me over to AMD.
 
What did they win exactly? I'm assuming you are referring to Nvidia execs when you say "Nvidia". Wouldn't they pop champagne bottles when they make the high profit margins? Not sure if having the fastest gpu/cpu/car/boat/whatever in the industry guarantees that.

Why do gamers think that executives have the same mindset as their customers?
An interesting take on what I said. Here's another point of view.

When you have the fastest GPU on the market, and continue to do so year after year after year, you gain something called "mindshare" in the consumers that purchase your products. This mindshare means that when new products are released, consumers will think about them more than your competitor's. If a consumer is thinking about your product more than your competitor's, they are more likely to ONLY consider your products and forget about your competitor's lineup. When this happens, you sell more products. Thus, your revenue increases.

If you play the market right, mindshare = profit.

Nvidia still has the fastest product on the market AND the mindshare of the consumers. Therefore, they win.

Did I get that one right, boss?
 
I don't care about having the best, the 3060 ti was a great value proposition and I got one at launch, they have other great products in their stack. Honestly if the 7900 XT or whatever is significantly faster than the 3090 ti that's probably enough for me TBH. They're still going for around 900 or so new / used. I'm not willing to spend 1600 bucks, it's a great card but that's out of my league.
Plus, with a 4090 you've got to look at possibly buying a new power supply and worrying about connector melt, and whether you even have room for that thing. Heck, I still run a sound card and like to have a slot open for a NIC, because I've had lightning take them out before.
 
I am not so certain about RDNA 3's better efficiency at all; a 350 W (roughly 80% power target) 4090 does this:

[attached performance chart]

Virtually what a 450 W 4090 does. And when talking about laptops, that is what the 4090 will be doing, so the performance when locked at 50-80% power gives some idea.

People keep showing stuff like this so you have to ask yourself, why would Nvidia crank it to 450w when it can get 90-95% of the performance at 70% of the power consumption.

Please don't suggest that Nvidia didn't know about this.

It suggests to me that the AMD flagship was much too close for comfort for Nvidia, so they cranked everything to 11.
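Rough math on that trade-off, using the 350 W figure from the chart above and assuming ~93% performance as the midpoint of the quoted 90-95% (not a benchmark result):

```python
# Perf-per-watt comparison for a power-limited vs stock 4090 under the thread's figures.
stock_power_w   = 450
limited_power_w = 350
perf_fraction   = 0.93   # assumed midpoint of the "90-95%" quoted above

power_fraction = limited_power_w / stock_power_w   # ~0.78
ppw_gain = perf_fraction / power_fraction          # ~1.20
print(f"At {limited_power_w} W: {perf_fraction:.0%} of the performance "
      f"for {power_fraction:.0%} of the power (~{ppw_gain:.2f}x perf/W)")
```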
 
I heard her saying something about AM4 living on at the start, with it outselling everything; even the blind can see it!
 
imagine a mid-range 8700 card with 2 chiplets from a 7900 and 2 chiplets from an 8900. It will be interesting to see how Nvidia does moving to chiplets next gen.

I wonder if they could spin off RT onto its own die.

X0#*

X -- series

0 -- tier

# -- raster

* -- RT

X, XT, and XTX could denote other features. Base model, accelerators, clock speed/power envelope.

DLSS 3.0 looks like trash with its interpolation. FSR's not bad; it basically does the same thing, and they announced FSR 3.0 will be dropping in 2023, which they're saying can give up to 2x FPS at 4K, and then they've got this HYPR-RX thing coming too.

They gotta get this shit taken care of at the driver level like they suggested earlier this year, or late last year, I forget.

If it's just a switch in a game profile they beat DLSS, it's that simple.

Obviously, simple is not "that simple," but you know what I mean.
 
Don't forget the power it consumes over those 3,000 ish hours too - please add the cost of powering the damn thing and you've got yourself a pricey little bugger.
I am a little lucky in that where I am we have (relatively) cheap hydro power. However, 5% goes to the federal gov, 7% to the provincial gov.... and I'm pretty sure since the last GPU I bought they are now also collecting an EHF (environmental handling fee), which is only a few bucks. :/

The real added cost... is having my wife yelling at me about the noise if I run my case without a side panel, as a 4090 would have about 1" of air circulation (or I would have to break down and buy a new case). Also, the PSU would probably be fine? More than likely, if I were crazy enough to drop 4090 money, I would also be looking at expenses there (or at minimum a decent power cable to avoid an adapter with my old PSU).
 
I wonder if they could spin off RT onto its own die.

X0#*

X -- series

0 -- tier

# -- raster

* -- RT

X, XT, and XTX could denote other features. Base model, accelerators, clock speed/power envelope.



They gotta get this shit taken care of at the driver level like they suggested earlier this year, or late last year, I forget.

If it's just a switch in a game profile they beat DLSS, it's that simple.

Obviously, simple is not "that simple," but you know what I mean.

I think Nvidia would maybe have an easier time splitting the two... as they use their tensor cores for RT (no matter what BS they spin about RT cores... they are tensor cores). In fact, I expect Nvidia is perhaps better suited to chiplets in that regard. The issue they will have is simply zero experience with such designs. I think it would suit their tech though... tensors on one chiplet, raster on another. Want a data center package? Add 5 tensor chips and 1 general compute. Want a gaming chip? Add 5 raster and 1 tensor.

AMD and Intel both do the RT calculations in the same raster pipe, as I understand it. Looking at these very general marketing materials for the 7000 series, AMD's second-gen RT looks a lot more like Intel's first-gen RT setup to me. I think it would be hard to split it off. However, AMD could do something like an 8000 series that they would still call RDNA 3... they could create a new chiplet that was super AI-heavy (too heavy for a pure gaming card), take a few chiplets from the 7900, mix them with the new RT-heavy chiplets, and perhaps add a new controller die with enhanced cache/memory controller bits, still on the same 6nm process, which would cost even less to fab 1-2 years on (the stuff on the 6nm chips now would see no uplift from a die shrink anyway). Chiplets open up some interesting in-between arch-change products and refresh options.

For that matter, AMD is probably looking at new data center stuff to try and compete there. They could probably design a chiplet for that market... and the 8000 or 7950/7850-type refresh parts could simply add in 1-2 of the chiplets from that project. Instant gaming refresh with damn near zero gaming-focused R&D.
 
People keep showing stuff like this so you have to ask yourself, why would Nvidia crank it to 450w when it can get 90-95% of the performance at 70% of the power consumption.

It suggests to me that the AMD flagship was much too close for comfort for Nvidia, so they cranked everything to 11.
Not sure about 11; they seem to have left a lot of room (some OC Strix editions reach around 570 W, so the 600 W talk seems to have been part of the plan at some point, but as the chart shows, it is already deep into diminishing returns).

Yes, as we saw, at least in some titles AMD does come really close, so those last 5-6% for absurd power pay off for Nvidia (and if they built the cards for giant power draw anyway, before knowing how efficient their own design would be or what AMD's plans were, they may as well use it). The general point was more that Lovelace does seem potentially quite efficient, so it is not yet certain, despite the talk around RDNA 3, that AMD will have an edge there, at least not one that changes things. If AMD merely matches it on a worse node and with the overhead added by the chiplet interconnect, that would already be quite impressive, even with the RAM difference.
 