You know.... I thought I wanted a 3080. But. If AMD comes through with stock...

FYI - Frg says he has information that AMD will have quite a bit of stock coming out through Q4... so launch day should not be as 'end of the world' as Nvidia's launch was. E-tailers should restock periodically, based on his post in the 'AMD doesn't want to make the same mistake' thread.

AMD already made the same mistake.
 
To be fair - Frg said 'there's no way AMD could have predicted Nvidia's complete botch' so launch day will be 'tight'. The takeaway from his post is that AMD backloaded most of the cards to AIBs, and they're supposed to be steadily pushing cards through Q4.

If that plays out - they'll have won the 'supply to consumer' fight for the quarter.
 
I am hoping so; hopefully the stock curbs the miners from screwing everyone else again. I had already accepted that I likely won't be getting a new card in my hands until maybe Jan 2021 if I am lucky. This old RX 580 is just starting to really show its age now.
 
 
So AMD's initial launch stock seems to be even worse than the 30XX cards, though Kyle has cross-posted around that there should be real stock soon. Still, the 6800XT's performance being a step behind the 3080s in so many games for so many reviewers is a minus for me unless a whole ton of XTs come on the market fast.

Also means that the 6900XT is almost certainly well overpriced and the 3080 Ti (if released soon with the 3090's core count and 20GB of GDDR6X) will be the card to buy for this generation.
 
I'm seeing that it's on par in rasterization, but (predictably) falling behind in RT. Are you seeing something different?
 
Techpowerup, Hexus, J2cents, and I'm going through others. They show the 6800XT's average framerates tied in some games, ahead in a couple, and seemingly behind in more, though I would need to go and manually count. The 3080 looks to have more of an edge in 1% / minimum framerates.

Reviewers (2 cents excluded) are showing SAM mode is helping a bit, but Nvidia will supposedly have a competing feature soon.

I'm ignoring raytracing in general for this generation because of the still present extreme performance penalty.
 
The way I see it:
I think RT is Nvidia's only saving grace right now. With the 6900XT coming in, AMD is right behind Nvidia's butt. Think about it this way: if the competition weren't this close, we probably would not have seen the 3080 Ti priced at around the $1K mark. Yay for competition, but damn them all for sucky availability. I literally have to make a daily routine of checking stock/restocks to finish my builds.
 
wut stock?
Grin

My snarky self thought that Nvidia would dump the current iteration of 3080s (knowing they'd have Tis out) just to steal AMD's thunder.

It must be hard to chisel these cards out of marble and homespun golden fleece.
 
I was going to wait for the 6900XT before opening my 3090 (and thus losing the ability to return it without penalty), but holy shit am I over these pathetic launches. I highly doubt the 6900XT launch will be any different than today's 6800XT launch.

I'm about to just finally put the 3090 in my system, enjoy some games, and call it a day for computer upgrades. I no longer have it in me to fight bots, scalpers, and just about everyone else that wants to upgrade over the next few weeks/months.
 
Can't imagine some people are still F5ing so long after release. If your time has any value, the card you MIGHT be able to score will have cost a fortune.
 
I am hoping so; hopefully the stock curbs the miners from screwing everyone else again. I had already accepted that I likely won't be getting a new card in my hands until maybe Jan 2021 if I am lucky. This old RX 580 is just starting to really show its age now.
Check the Tech Yes City review on crypto. Mining performance is nerfed on these 6800 cards. As I mentioned, that rumor was BS.
Grin

My snarky self thought that Nvidia would dump the current iteration of 3080s (knowing they'd have Tis out) just to steal AMD's thunder.

It must be hard to chisel these cards out of marble and homespun golden fleece.
Hard to ship what never existed.
 
I noticed on Newegg and BB that all the cards listed with AIB names look the same; they're all the reference design, then, correct? We haven't seen the AIB custom cards yet.
 
AMD's presentation showed the 6800XT to be closer to the 3080 than what reviewers have shown us.
The 6800XT is slower than the 3080 at 4K, in some games by quite a lot. I guess VRAM isn't as much of a holy grail as people here claim it to be.
 
It was never a real issue in current games.
 
All AIB cards are MBA (Made By AMD) reference cards currently for the 6800/XT. AIB custom cards will start showing up on 11/25 or so.
 
VRAM is sometimes quickly an issue, but it is usually more of a problem for people who buy a high-end or nearly high-end card and then hold on to it for 3+ years. The 10GB in a stock 3080 could easily become an issue by the time next winter's games are released, and not everyone wants to drop more than the price of a new console yearly to keep up with VRAM.
 
People often equate RAM with speed. When it comes to graphics cards, memory limitations usually only come into play when the card is two or three years old. This wasn't hard to figure out, as the RTX 3090 has more CUDA cores and more than twice as much VRAM as the 3080 and yet only manages to be 10-15% faster.
 
With card prices being what they are - leeching every bit of life from the card seems smart. See my 970 thread - it is actually better than functional at 1080p, and it is ooold. So for people gaming at 1440p or below, this gen's cards should last 'a lifetime'. At 4K I'd say you get a few years as well - presuming you don't require RT.

Because panel tech is the issue - 4K is not yet widely adopted in the PC space, but that will be changing starting this gen. And if you look - there are lots of titles pushing 4K past 100 fps. Folks trying to push 4K to 240 - they'll need next-gen cards.

But since RT enabled is the new shiny (and I argue it is still new, regardless of the 2080 Ti) - folks who MUST HAVE RT are the market for the next two gens. B/c some of the benches I've seen show current RT-enabled cards still only getting 30 frames - and that's a problem that needs a new card next gen.
 
I think we all need a good article to show exactly how VRAM is utilized in terms of loading and unloading.
If an RTX 3080 with 10GB of GDDR6X can load, unload, and refresh textures during gameplay in a shorter timespan than a 6800XT with 16GB of GDDR6, this might all be irrelevant (a rough logging sketch follows below).
I run racing simulations...GTR2, GT Legends, Rfactor2, Assetto Corsa, Assetto Corsa Competizione, Race Room, Automobilista 1/2, Race07, etc...
VRAM amount, more so than VRAM speed, is most important for successfully running with 30+ cars on track.
For this reason alone, I'm looking at the 6800XT.
Anybody here tried the RTX3080 with any racing sims?
What was the experience?
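
To be concrete, here's the kind of logging a good article would need - a minimal sketch, assuming an Nvidia card and the pynvml Python bindings (AMD users would need the amdgpu sysfs counters or a tool like radeontop instead). Keep in mind it only reports VRAM that is allocated, not what the game actually touches each frame:

import time
import pynvml  # assumes the pynvml bindings are installed (pip install pynvml)

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    # Sample allocated VRAM once per second; run while loading a 30+ car grid and during the race.
    # Note: this is total VRAM allocated across all processes, not the game's true per-frame working set.
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"{time.strftime('%H:%M:%S')}  "
              f"{mem.used / 1024**3:5.2f} GB used / {mem.total / 1024**3:5.2f} GB total")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()

Log that alongside framerates while stepping up the car count and you would at least see whether the 10GB card fills up and starts shuffling textures before the 16GB one does.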
 
Good article or not, people will still argue one way or the other. It is such a hot-button issue.
 
Or you just waste your money on a 3090 and not worry about either VRAM capacity or speed. :D
 
While it is not normally a concern, I do remember hearing that the RTX 3080/3090 fans are very, very expensive.
The AMD RX 6000 series also uses a shrouded blade design.
Both solutions seem to work very well, but....
Is anybody else concerned as to whether replacements will be available in the event of a failure?
Are both companies using dual ball-bearing fans?
I would hope so...given the cost of the cards.
 
I think the expectation is that you would simply move on to something like a water block; I mean how often do people actually repair ... I was going to say cards, but really, ANYTHING... these days?
Frg told us this; AMD sent cards to AIBs that agreed to meet certain goals through Q4... the other ones? They'll get theirs later.
 
I have not had a fan fail on me in years - 10, maybe? More, I think. My old MSI 980 is in my wife's computer and in use daily, and the fans seem fine. That just seems like such a non-issue. My rad fans have been running 24/7 since 2016.
 
Oof...after looking at the reviews I'm going to take a pass on an AMD GPU for now. While it has excellent rasterization performance, everything else is a step down or still to be developed.
AMD wins in rasterization performance for games developed with AMD hardware in mind.

Next gen consoles will be on AMD hardware. AMD will be on top.
 
Who cares. I mean really lol. Being able to buy one of these cards right now is like winning the lottery. I am a bit bitter that June of '21 is probably when the scalpers will let up. Till then, it's vaporware to me.

Why am I so bitter!? First world problems bruh.
 
I just acted like a lunatic for 30 minutes because Walmart had the Series X in stock... and I had to pick up the kids from school, wait for a late school bus, toddle along behind a slow bicyclist, and run inside. I got on the app - got one in the cart - and the app froze during checkout... OOS.

Couldn't try on the phone b/c my whole neighborhood is in the 'shadow' of three cell towers, and web performance is worse than 1994 dialup.

Yeah - we're all FWP on this site.
 
AMD wins in rasterization performance for games developed with AMD hardware in mind.

Next gen consoles will be on AMD hardware. AMD will be on top.

I hope they go toe-to-toe, but right now they have a long, long way to go. As far as being on top, time will tell.
 
They don't actually care about being 'on top' given the vast amount they'll make via market share. It's nice that they're pushing the envelope again and proving they can make capable cards - but being the tip of the spear isn't really their focus.
 
Nvidia wins. Every new game coming out will have raytracing just cuz it's the new hotness. So why would you spend this amount of money and turn it off? And DLSS is the saving grace that will take you from a 45 FPS average at 4K with raytracing on to above 60 FPS. Seen it in tons of titles now...Control, Tomb Raider, Metro Exodus. If you want to play at 60 FPS with raytracing on at 4K, Nvidia is your only option.
......and I suspect Cyberpunk will be this way too.
 
Then again AMD wins even in raytracing if the game is developed with AMD in mind.

And we know what hardware next gen consoles are running on.
 
Minecraft is owned by Microsoft. Go look at how Minecraft RTX performs on the 6800 XT.
 