Confirmed: AMD's Big Navi launch to disrupt 4K gaming

This would be way too late, and easily countered with a 3070 Super/Ti.

Seriously, besides brand fanatics, who's gonna wait for this?
People who have something that works fine and don't need to upgrade? People who are on a waiting list for an out-of-stock item and have no choice? People who prefer to see the competition before making a large purchase? I don't know, obviously it's just brand fanatics like you said, and not anyone sensible who wants to see what the market has to offer before jumping on the first thing. I will wait, I will see if it is worth the money or not, and then I'll make a decision based on #'s and $$. My decision very well could be Nvidia, but that doesn't mean I'm going to jump on something untested and unknown just because it was announced first.
 

I hope there are enough ppl that are willing to wait to make it work for AMD.
Not exactly how you take back market share tho.
 

This sort of makes sense (not sure if it is true)

AMD could position
6800 against 3070 ($550-$650 msrp 8gb or 16gb options)
6900 against 3080/3080ti ($800 to $1200 12gb, 16gb, or 24gb options)

The problem is that those looking for a cheaper option than the 3070 would have to wait until March.

And if you are looking for a cheaper option than the 3060, it might be a refresh of the 5700 as the 6500 in June next year!
 
That wait is not a good thing for AMD.
 
I wouldn't worry about it. All of this is rumormill as far as I'm concerned until actual people have actual hardware in their hands.
Even everything nVidia has announced is vaporware until again, real people have real hardware in their hands.
Although there is precedent for AMD being unable to launch a full product stack at once, we're still just taking third-party information and trying to figure out whether any of it is real. It's not even first-hand information from AMD themselves.

We also still have zero performance numbers from either AMD or nVidia as far as I'm concerned, because there has been zero third-party testing. So there is nothing to do here but wait.
 
Poor white trash like me is waiting for the 3060. Meanwhile I have high end GPU releases to entertain me, in lieu of playing games.
...and that's a valid purchase for valid reasons if you're currently running sub-par "gaming" equipment and have already been waiting for "what's next". I blew my wad a couple of "what's next"s ago, so I can play 2K at reasonable framerates now and don't mind waiting until sometime next year. Maybe by then I'll be able to afford some upgrades and I'll have more options before me.
 
Digital Foundry's 3080 looked pretty real.
And we have launch dates.
You still think it's all smoke and mirrors?
 

Well, we all used to call paper launches paper launches. If AMD announces in two weeks with same-day availability... isn't that just as real a launch date?
If AMD does manage to ship a card that slightly edges out the 3080, I have no doubt the Nvidia boosters are going to talk about the 3090 and how AMD still loses, even though you can't buy a 3090.
 
Digital Foundry's 3080 looked pretty real.
And we have launch dates.
You still think it's all smoke and mirrors?
I wouldn't go that far. nVidia generally as a company is pretty consistent. But I will say that for both AMD and nVidia they have done paper launches in the past 10 years and had plenty of vaporware as well. Then there is also the issue of quantity. Even if all these products launch "on time" that doesn't mean there will be significant quantity either. Meaning every review website might get them but there could be next to zero availability in all the relevant channels.

My point is generally that people get carried away with the hype machine (whether good or bad) and it's rare to see people actually keep a level head through all of this. Which is of course what these companies want. Jensen didn't get up on a stage with a massive presentation for nothing, after feeding the rumor mill for the past year. And we, again, have zero information from AMD themselves about what they're even doing, yet people are drawing a lot of "conclusions" based on sources of at least questionable reliability.

I'm just saying, take a step back. Breathe. Wait and see. Maybe we'll get the full product stack all at once. Maybe we'll only get a part of it, piece by piece over the next 6 months. But either way there is no way to predict this other than consulting your crystal ball, so why do it at all?
 
Whether or not it's smoke and mirrors isn't relevant. I still can't buy one. Also, until they come out and we see what kind of availability there will be, launch dates and promises mean nothing. Heck, I still consider the 3300X from AMD vaporware even though a few reviewers managed to get their hands on them, and scalpers are offering them for more money than you would pay for a 3600 (or even a 3600X in some cases!). Once they are out and available in quantity, with hands-on reviews from legit reviewers (not just a single NVIDIA-friendly review site*), and I don't see 3080s selling for over $1k due to limited stock, then I'll take it more seriously. Same with AMD stuff: if they put out a 6600 XT and nobody can get their hands on it, it's useless. I feel this may happen due to Zen 3, Xbox + PS5... I just feel limited quantities are going to be a thing for them for a while as well, but until it happens I don't know for sure.

*This may or may not be the case, but they happened to be the single review site that looked into DLSS and "missed" the quality issues it had (which weren't major, but still a glaring omission, intentional or otherwise). It's hard to believe they would be the only ones not to notice, and if they were, then they are incompetent, which isn't much better. This isn't meant to start an argument; they do reviews, as do others, but I'm not going to blindly believe a single source with minimal testing. It was an exclusive hands-on, meaning nvidia let them test it on purpose with what I assume wasn't free rein (i.e., these are the benchmarks and settings you can run). I'm sure it's indicative of actual performance, but probably in a better light than a normal review would be. I won't know or trust it until I read through multiple reviews. Since GPP I've been a bit less trusting of nvidia and their "assistance" with companies showing their products (and I'm already wary of companies).
 
Consistent? Yes, they release cards. Will there be quantities? I know the 2080 Ti took some time from "release" to being available at normal quantities and prices. I'm hoping this cycle is better. They have been consistent in not having enough quantity at launch, as has AMD.

I'm not drawing conclusions based on questionable sources, I'm speculating based on unknown facts ;). That said, I use common sense as well with known quantities. The 5700 XT has 40 CUs, and the 5700 XT gives 2070 performance. IF they come out with 72 CUs, it should be ~80% faster than the 5700 XT even without any increase in clock speeds or architectural improvements. So, worst case, it's just below 3080 speeds. IF it has 80 CUs, or a clock speed increase, or architectural improvements (I would be surprised if at least 2 of the 3 aren't the case), then it should match or beat the 3080. It's not conjecture and he-said-she-said; it's just looking at reasonable assumptions.

Seeing as the PS5 is hitting 2.23 GHz, I would hope to at least see some frequency increases from RDNA2. Architectural improvements will hopefully add a few % as well, which is typical (whether it's AMD, Nvidia, or Intel). Until we get specifics and benchmarks, the best we can do is speculate, but we can be reasonably assured it'll be at or around 3080 levels depending on how many boundaries end up getting pushed. We have no real indication of RT performance, so I won't even speculate on that.
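The CU-scaling argument above can be put into a few lines. To be clear, every input here is the thread's speculation (CU counts, the 5% architectural gain), not a confirmed spec:

```python
# Back-of-envelope scaling: assume performance scales linearly with CU
# count at equal clocks. All inputs are forum speculation, not specs.

BASELINE_CUS = 40  # 5700 XT (roughly RTX 2070 performance)

def projected_speedup(cus, clock_scale=1.0, arch_scale=1.0):
    """Projected performance relative to the 5700 XT baseline."""
    return (cus / BASELINE_CUS) * clock_scale * arch_scale

# 72 CUs, no clock or architecture gains: ~80% faster than a 5700 XT,
# i.e. just below the rumored ~2x-5700XT level of the 3080.
print(projected_speedup(72))                   # 1.8
# 80 CUs plus a hypothetical 5% architectural gain: at or past that level.
print(projected_speedup(80, arch_scale=1.05))  # ~2.1
```

Linear CU scaling is itself an optimistic assumption; real scaling drops off if memory bandwidth or the front end can't keep up.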
 

The 36 CU PS5 hitting 2.2 GHz is a far cry from 72+ CUs hitting it. The 52 CU XBSX and 36 CU PS5 each have a ~300 W PSU, AFAICT. Sony, overclocking their smaller PS5 part, is likely using just as much power while delivering less performance than the underclocked 52 CUs in the XBSX.

The good news is this does tell us there's a lot of tunable range in the process/architecture, but power usage will almost certainly be too high to run 72-80 CUs at 2.2 GHz.

My guess is that AMD has seen NVidia's performance and is tweaking their cards (clock speed, fan profile, power usage) to come as close to the 3080 as they can without power going through the roof.

I expect AMD will deliver a 16GB HBM card that essentially matches 3080 performance (because they have decent tuning headroom) at similar prices.

Perhaps $650. Certainly not the BS $550 made up by Coreteks.

$650 for 3080 performance and 16GB of VRAM would be compelling for very many.

Power I expect will be 300W+.
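The reason a small chip's clocks don't carry over to a big one can be sketched with a crude dynamic-power model. All the numbers below (the 1.8 GHz baseline, the voltage-tracks-frequency assumption) are illustrative, not measurements:

```python
# Why 36 CUs at 2.2 GHz doesn't imply 72-80 CUs can do the same:
# dynamic power goes roughly as N_CU * f * V^2, and voltage has to rise
# with frequency near the top of the curve, so power grows roughly with
# N_CU * f^3. Purely illustrative; none of these numbers are measured.

def relative_power(cus, clock_ghz, base_cus=36, base_clock_ghz=1.8):
    """Power relative to a hypothetical 36 CU part at 1.8 GHz."""
    return (cus / base_cus) * (clock_ghz / base_clock_ghz) ** 3

# Pushing 36 CUs from 1.8 to 2.2 GHz (PS5-style): ~1.8x the power.
print(round(relative_power(36, 2.2), 2))   # ~1.83
# 80 CUs at the same 2.2 GHz: ~4x - which is why big chips clock lower.
print(round(relative_power(80, 2.2), 2))   # ~4.06
```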
 

I agree, that's why I said hopefully there will be a small uptick in frequency; I don't expect it to hit as high as the PS5. I am not convinced AMD will go the HBM route on this one... I guess it's not a big deal either way, as long as they match the bandwidth to the performance. 512-bit GDDR6 would do just as well as HBM. I also agree it will be near or above the 300 W range; AMD aren't afraid to build high-wattage cards (Vega 64, VII). Yeah, if it's 3080 performance, AMD won't price it more than around $50 below; if it's slightly faster, they'll probably price it exactly the same. The 16GB may be make-or-break for some. I do some Blender work and some other misc development, and having the extra VRAM can come in handy. The fact that used VIIs are still selling for $700-$800 tells you something about there being a market for a 16GB card.
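The 512-bit GDDR6 vs. HBM point is just a bandwidth calculation. A quick sanity check (the 512-bit/16 Gbps configuration is hypothetical; the 3080 figures are its announced specs):

```python
# Memory bandwidth in GB/s = bus width (bits) * per-pin rate (Gbit/s) / 8.
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    return bus_width_bits * gbps_per_pin / 8

# Hypothetical 512-bit GDDR6 at 16 Gbps - comfortably HBM-class bandwidth.
print(bandwidth_gbs(512, 16))  # 1024.0
# For comparison, the RTX 3080's 320-bit GDDR6X at 19 Gbps.
print(bandwidth_gbs(320, 19))  # 760.0
```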
 
If I want to buy an Nvidia card but need a new PSU to use it, versus switching to AMD with the same performance while keeping my old PSU - which way is better? :)
Nvidia offers RTX I/O, but wait, didn't they see it from the competition first? Which one will work better?
DLSS is a game changer - are you sure? The RDNA2 train is coming; yes, it's coming slowly, but it's coming.

Ten months ago I saw that AMD was playing a puzzle game, and I told my friends: save your money, because this year we will see game-changing technologies.
I am not an AMD fan, but they are growing very strong.
 
The 36 CU PS5 hitting 2.2 GHz is a far cry from 72+ CUs hitting it.

A theoretical 80 CU Big Navi at 2.2 GHz delivers 23 TFLOPS. The 3080 is boosting to 1.95 GHz for 33 TFLOPS. That's quite the difference. Of course, FLOPS alone don't mean much, but I wonder whether we're underestimating AMD's numbers.
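Those TFLOPS figures can be reproduced from shader count and clock. Note the 80 CU part and the 1.95 GHz boost figure are taken from the posts here, not official specs:

```python
# FP32 TFLOPS = shader count * 2 ops/cycle (one FMA) * clock (GHz) / 1000.
def fp32_tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

# Hypothetical 80 CU Big Navi: 80 CUs * 64 shaders/CU at 2.2 GHz.
print(fp32_tflops(80 * 64, 2.2))  # ~22.5
# RTX 3080 with 8704 shaders at the 1.95 GHz boost cited above.
print(fp32_tflops(8704, 1.95))    # ~33.9
```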
 

You really have to ignore FLOPS for gaming now with the new NVidia SMs. They only hit those numbers on pure FP32 loads; it hasn't translated fully into gaming, not even close really.

Instead, just look at gaming performance.

The 3080 is up to 80% faster than the 2080 in raster games. That puts it somewhere around double a 5700 XT.

Double the 5700 XT's CUs to 80, with normal clocks, and you are looking at something right around 3080 performance.

So it's easy to see Big Navi equaling the 3080, at least until we get to power. That is where the balancing act comes in.

Even at stock clocks, 80 CUs would be 450 watts, which seems like a non-starter, though sure, there will be some efficiency gains here.

But my guess: lower clocks and the process were going to claw back some/most of that. TSMC says 15% for 7nm+. That gets you to maybe 380 W.

HBM saves what, 40 watts (total guess)? 340 W. We are looking good; a few more tweaks, back off on clocks, and 295 watts... golden. But that was probably the plan before the 3080 reveal...

Tweak clocks up a bit to better match the 3080, and come in just under 3080 power at 315 W. :D

Bottom line, I expect close on performance and power.
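The power-budget chain in that post can be stepped through explicitly. Every figure below is the poster's guess (the ~225 W 5700 XT doubling, the 15% node saving, the 40 W HBM saving), not a specification:

```python
# Step through the power-budget guesses from the post above.
# All figures are rough forum guesses, not specs.

power_w = 2 * 225    # double a ~225 W 5700 XT for 80 CUs: 450 W
power_w *= 1 - 0.15  # TSMC's quoted ~15% saving for 7nm+: ~382 W
power_w -= 40        # guessed HBM-vs-GDDR6 saving: ~342 W
# From here: back clocks off toward ~295 W, or trade power back for
# clocks to chase the 3080 and land around ~315 W.
print(f"{power_w:.0f} W")  # ~342 W
```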
 
Do we know that they've doubled everything? I've only heard double CUs so far.

That's kind of splitting hairs. CUs contain most of the functionality important to GPU gaming performance; double those and, assuming you don't choke the memory bandwidth, you could reasonably expect up to double the gaming performance.

They are analogous to NVidia's SMs: if NVidia had doubled SMs, that would have been more likely to double gaming performance than only doubling the FP32 ALUs, which helps only situationally.
 

Both SMs and CUs only contain ALUs and TMUs. Everything else important to games (rasterizers, ROPs, tessellation, geometry setup) is outside the CU.
 
If RDNA 2 on clocks and CUs alone is in 3080 territory in theory, then add in lots of other optimizations and the card sits between the 3080 and 3090, which is a winner if the price is good.
 
In before biggest Navi lands between the 3070 and 3080 while using more power than a 3080, priced at $549.99, just to have NV release a 3070 Ti that sits just above biggest Navi while sipping power in comparison, for $599.99.

Just my prediction, I could be totally wrong; but just how many times does history have to backhand-slap people before we learn the lesson?
 
AMD Teases Radeon 6000 'Big Navi' Graphics Cards With Fortnite Easter Egg
https://www.pcmag.com/news/amd-teases-radeon-6000-big-navi-graphics-cards-with-fortnite-easter-egg
 
Nonsense... AMD always optimizes their old cards/architectures years after release ;).
My Vega 56 does not agree with you. With old drivers I get the same or better FPS than with the latest drivers (depending on the game/benchmark).
 
So you're saying AMD will be competitive with NV 30 series in 3 years? Yikes!
You put the word competitive in, that was not in my sentence. Please re-read.

My Vega 56 does not agree with you. With old drivers I get the same or better FPS than with the latest drivers (depending on the game/benchmark).
If you'd like to sell it for cheap, PM me ;). Anyways, it was meant in jest, hence the wink emoji. They do tend to release crap drivers at launch and then get them up to where they should have been when it released.
 

The famous "Fine Wine" xD
 
My Vega 56 does not agree with you. With old drivers I get the same or better FPS than with the latest drivers (depending on the game/benchmark).
All computer hardware performance degrades to some degree with time, independent of driver quality. Your Vega 56 is likely 3 years old, and on its best day in September 2020 it will perform a smidgen worse than it did on its best day 3 years ago.
 
They do tend to release crap drivers at launch and then get them up to where they should have been when it released.
Yes, AMD does it often.
I just want to mention that "AMD fine wine" is maybe true sometimes, but not always :)
And that is not the same as launching their products with beta drivers :D

All computer hardware performance degrades to some degree with time, independent of driver quality. Your Vega 56 is likely 3 years old, and on its best day in September 2020 it will perform a smidgen worse than it did on its best day 3 years ago.
I am not sure that I understand you, but you are probably right.
 