Radeon 6000 series speculation

They look good (always have been a fan of their design). Not liking the 850w recommendation (I've currently got a 750w). Hoping the Red Devil will work with a 750w.
Man, there needs to be like a sticky for power supplies and the reality of them. I can almost guarantee that the 6800 XT would work with a good quality 500W power supply. Seasonic has an awesome power supply tool. The thing is, the differences in power supply quality can be HUGE. A cheap power supply at 500W might only provide like 30 amps of 12V, making it effectively a 360W power supply, while a Seasonic would be able to provide like 98% of the total rating on the 12V rail. So they say 850W so that no matter whether you buy a Seasonic Prime Titanium or a $25 Walmart special 850W power supply, it will work.
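The rail math above can be sketched in a few lines. The 30A and 98% figures are the post's illustrative examples, not any specific unit's datasheet:

```python
# Usable capacity for a GPU-heavy build is set by the 12V rail rating,
# not the number on the box. Example figures from the post above.

def effective_12v_watts(rail_amps, volts=12.0):
    """Wattage actually deliverable on the 12V rail."""
    return rail_amps * volts

# A cheap "500W" unit that can only push 30A on 12V:
cheap = effective_12v_watts(30)      # 360W actually usable
# A quality 500W unit delivering ~98% of its label on 12V:
quality = 0.98 * 500                 # ~490W usable

print(cheap, quality)
```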
 
stop thinking you know better than the people that make the parts. buy the minimum recommended for the gpu at minimum, add more if you oc.
 
Let's review. Search for Power Supply problems and continue to underbuild in that area because "you're smart".
 
It's a waste of money and completely unnecessary.
 
They will over recommend every time because of people using cheap PSUs. You have to factor in the lowest common denominator.

Look at all the reports of the 3080 working fine with a 650w PSU.

That said it will give you more room with a power supply that is acting up and losing capacity.
 
I think PSUs are a touchy subject with people, kinda like RAM; they want to just make sure system X will never go above PSU Y. They buy something far higher than could ever be achieved using 1 videocard, and always have ample headroom. I've always been in the camp that you want to use most of your PSU's headroom for efficiency purposes, but I've also bought for SLI/CFX setups since 2004. You'd be surprised how hard some PSUs can push their limit, within reason. I've only ever set one computer on fire, and that was because a 'friend' of mine sent me a junk no-name Chinese PSU that was rated for 700W but literally caught on fire the first time under full load (case fans are very efficient at pouring black smoke into the room, apparently). If anything, buy a PSU from a trusted source.

Unless you have a higher core count chip (10+) and are overclocking it, you'd be quite surprised what your typical sustained loads are. Run some Prime95 or a long-form Cinebench to really test the heat and stability. Furmark never deceives either.

If you had a 3600/5600X class CPU, I bet you could run a 3080 stock on a 500W PSU, no issue.
 

I know a lot of people won't like that, but I agree. I think the concern is more: if it blows out, what does it take with it? But I've never had a problem in the last 10 years with any power supply I've used.

And I agree with the RAM also. People are way too concerned with CAS 14 vs 16 etc. and pay out the nose for essentially minimal gains.
 

If you're running stock, I completely agree. As soon as you overclock, however, the power curve gets increasingly steep. It's not unheard of to see a 15% power increase on a component for a 5% overclock. I think a moderate overclock is what most chip makers target for their recommendations.
 
The recommended PSU obviously includes the manufacturer overshooting to account for tons of accessories and/or a shit/aged power supply.

The card is rated for ~300 watts, which is ~25 amps on the 12V line. A Seasonic 650 can do 54 amps on the 12V line, so this leaves plenty for the rest of the system, assuming it's pulling around ~20 amps.

A good quality 500-650 will run it okay, though you're likely going to be very close to the limit with a 500. Would not advise.

This is basic math/common sense, guys. Nothing to really argue about.
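That "basic math" can be written out as a quick sanity check. The 54A rating is the post's example for a Seasonic 650; the ~20A rest-of-system draw is the post's estimate, not a measurement:

```python
# Headroom math for a ~300W card on the post's example 650W unit.

CARD_WATTS = 300          # ~rated board power of the GPU
RAIL_VOLTS = 12.0
RAIL_AMPS = 54            # 12V rating of the example 650W unit

card_amps = CARD_WATTS / RAIL_VOLTS          # 25A for the card
rest_of_system_amps = 20                     # post's assumption
total = card_amps + rest_of_system_amps      # 45A combined
headroom = RAIL_AMPS - total                 # 9A (~108W) spare

print(card_amps, total, headroom)
```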
 
i had 4x rx480s on a quality 500w in a mining config. way outside of recommended, never had a power issue. if you can do your own math, do what you want. if you build based on recommended, that is fine too. no one is telling you you HAVE to undershoot a PSU. just that if you are on a budget, some extra math may save you a buck.
 
That's the key: what your expected load will be if you are going to overclock. We're at the point where manufacturers are factory overclocking their products near the limit, and any extra performance comes at the expense of massive power and heat overhead. Even a 10% overclock can push you up 100+ watts for extremely limited gains. Intel CPUs especially incur a steep power-consumption penalty at present.
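A rough way to see why a small overclock costs disproportionate power: dynamic power scales roughly with frequency times voltage squared. The scaling numbers below are made up purely for illustration, not measured from any part:

```python
# Dynamic power approximation: P ~ f * V^2. A clock bump that needs
# extra voltage therefore costs much more than the clock gain alone.

def relative_power(freq_scale, volt_scale):
    """Power multiplier relative to stock for given clock/voltage scaling."""
    return freq_scale * volt_scale ** 2

# Hypothetical: a 10% overclock that needs ~7% more voltage.
scale = relative_power(1.10, 1.07)   # ~1.26x power for 1.10x clocks
extra = (scale - 1) * 300            # on a 300W card: roughly 78W more

print(round(scale, 2), round(extra))
```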
 
so an overclocked 6800 XT outperforms the 3090??...seriously?...would getting a 6800 XT make any sense if I have a G-Sync monitor?...would the brute force power and FPS increase make up for the lack of the variable refresh rate?...I currently have a GTX 1070
 
I am going to get the 6800 XT when it is available. I have an EVGA 750. I think I will be fine; I'll watch wattage on my APC backup. I don't have the PC in my sig anymore, lol. I run a 3900X now with a GTX 1080. It runs surprisingly low wattage even when Folding@home is running the GPU and the CPU set to use 22 out of the 24 threads. The only other things I have are a second HD for games and 32GB RAM. I don't see this new card stressing my 750... Edit: btw, I will at some point get a new power supply, and I agree anyone should follow the power recommendations they say to use.
 
stop thinking you know better than the people that make the parts. buy the minimum recommended for the gpu at minimum, add more if you OC
Man, there needs to be like a sticky for power supplies and the reality of them. I can almost guarantee that the 6800 XT would work with a good quality 500W power supply. Seasonic has an awesome power supply tool. The thing is, the differences in power supply quality can be HUGE. A cheap power supply at 500W might only provide like 30 amps of 12V, making it effectively a 360W power supply, while a Seasonic would be able to provide like 98% of the total rating on the 12V rail. So they say 850W so that no matter whether you buy a Seasonic Prime Titanium or a $25 Walmart special 850W power supply, it will work.

johnnysd gave the correct answer here, while pendragon1 completely misses the boat.

AMD requires such a high output power supply because they have to account for the fact that there is extreme variance in power supply quality.
A high quality brand would cut it at 500W, a random cheapass one might need 850.
AMD has to cover their ass for this possibility.

It has nothing to do with AMD being "wrong" or people "knowing better than AMD".

Either you understand the extreme variance in power supply quality or you don't.

This isn't an esoteric concept. A few seconds of testing will show you in a way you cannot deny or forget.
 
yeah yeah think you know better than the people that design them. yes there is a variance between crap and good psus, but ill listen to the guys that make the gpu and there is proof in this forum about why. plenty of people trying to minimize wattage and having all sorts of problems.
 

It's ok, just say you don't understand.
 

If you don't want to go through the hassle of calculating wattage needs, then yes, adhere to manufacturer guidelines.
 
has AMD been conservative in their Big Navi and Zen 3 pre-release performance numbers?...seems like both are way better than the initial early rumors...previously Big Navi was rumored to maybe be even with or slightly better than a 2080 Ti...now a 6800XT is rumored to be close to a 3090!...WTH?
 

Dunno where you heard that. Maybe if the 6800XT is overclocked. Apparently Big Navi overclocks really well, getting to 2.4GHz or maybe even 2.5GHz if you push it.

But the numbers people compare are based on STOCK configurations. So at stock speeds the 6800XT trades blows with the 3080, and the 6900XT trades blows with the 3090, albeit with "rage mode" and "SAM" activated.

Personally, I feel like the 6800XT looks like the best value. And if you overclock it, of course, you can squeeze out more performance. Because the difference between the two "top tier" parts is very close, ~10%, I'm not surprised that an overclock could get a 6800XT close to a STOCK 3090.

I'll be hunting on launch day; whichever card I can get into my hands, the 6800XT or the 3080, I will take. The performance is close enough, and I honestly don't feel like the RAM difference matters cos I'm not gaming at 4K anyways. It's a nice bonus, but it's not going to make a difference to me with my 1440p ultrawide monitor.
 

early rumors had the best Big Navi card on par with or slightly better than a 2080 Ti...now we have AMD pretty much tied with or surpassing them with both the 6800XT (versus 3080) and the 6900XT (3090)...if true AMD has made a huge leap forward in both the CPU and GPU market...the CPU side was expected but the GPU performance is much better
 
I've been reading the rumors on various websites from multiple sources for the past few months. After weeding out the ones that didn't pan out and including the more plausible ones:

I believe the original +15% performance over the 2080 Ti rumors were for the 6800XT, as the rumors also suggested that the "biggest" Navi (6900XT) was AMD-exclusive and kept a tight secret. That could also explain why the 6900XT is releasing 3 weeks after the 6800/6800XT: AIBs didn't get their hands on them as soon as they did the 6800/6800XT. I'm actually still not entirely sure if AIBs are releasing 6900XTs at launch at this point.

AMD apparently was sending AIBs intentionally gimped drivers (slower by about 10-15%) so that the true performance of the 6800XT wasn't known outside of AMD. 2080 Ti +15% rumor, plus the 10-15% gimped-driver rumor, equals roughly RTX 3080 performance.
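Compounding those two rumored percentages multiplicatively (both figures are rumors from the post, not measurements) lands in roughly the claimed territory:

```python
# Rumor arithmetic: (+15% over 2080 Ti) * (drivers gimped by ~10-15%).
# All inputs are the post's rumored figures, not benchmarks.

baseline = 1.00                 # 2080 Ti = 1.0x
rumor = baseline * 1.15         # "+15% over 2080 Ti"
ungimped = rumor * 1.125        # midpoint of the 10-15% gimped-driver rumor

print(round(ungimped, 3))       # ~1.29x a 2080 Ti, roughly 3080-class
```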
 
AMD has to bin the 6900XT, hence the delayed release. It's essentially the same card as the 6800XT/XL with simply more cores enabled, so it doesn't have anything to do with AIBs.
 
Slight tangent; are AIBs releasing their cards at launch? I don't recall seeing anything about that.

I still tend to believe the rumors that AMD didn't provide 6900XTs to AIBs until a later date, and that the early rumors were indeed based off of the 6800XT. Everything seems to add up that way, and will especially seem to hold true if AIBs aren't releasing their versions of the 6900XT at launch.
 
Yes, I believe that the AIBs will get 6900XT chips, but not until later. So AMD will be the only one selling them on launch day. The question is how the 6900XT will hold up to the challenge of the 3080 Ti. Right now it looks like the 6900XT is a WAY better value than the 3090, being priced $500 less but trading blows with Nvidia's top card.
 
talk is that many of the high end models will be delayed a week or more, similar to RTX 3080 cards
 
so an overclocked 6800 XT outperforms the 3090??...seriously?...would getting a 6800 XT make any sense if I have a G-Sync monitor?...would the brute force power and FPS increase make up for the lack of the variable refresh rate?...I currently have a GTX 1070
They could possibly be compatible; check forums to see if anyone has tried it out.
 

my monitor is 100% not FreeSync compatible...I was just wondering (coming from a 1070) if the brute force fps increase would make up for the lack of a variable refresh rate...
 
I use one of those G-Sync/VRR LG OLEDs. Best thing about it: even though FreeSync is not officially supported, you can just use CRU to make a FreeSync range. Looks like the RTX 3080 has to go.
 