NVIDIA reportedly wants to cut TSMC orders for next-gen RTX 40 GPU 5nm wafers amid lower demand

As the joke goes, when developing the space program the US spent hundreds of thousands of dollars (government kickbacks to a favored company, I'm sure) developing a pen that writes in zero gravity... the Soviet Union simply used a pencil...

Of course, that could be because that was the apex of their technology.
Which is great until a pencil snaps and you have a small piece of graphite floating around that gets into some equipment, shorts it out, and causes a fire.

But the story is a myth; in reality, NASA got in trouble for wasting taxpayers' money buying mechanical pencils.

https://www.scientificamerican.com/article/fact-or-fiction-nasa-spen/
 
Which is great until a pencil snaps and you have a small piece of graphite floating around that gets into some equipment and shorts it out, nearly causing a fire and forcing the crew to scrub the mission.
It's a good point, of course... we have multiple art-supply-type options that could have worked. Not all pencils use graphite; I believe a hard charcoal would work just fine. I'm actually pretty sure the Russian solution was 100% safe... I think it was hard charcoal they used. So although I agree with you, I think the Russian anecdote is still valid. $100,000 anti-gravity pen... Russians... 25-cent art supply. Both valid solutions.
 
Ok, replying to myself because I was actually curious.

Turns out the entire story is a bit of BS. The truth is the Americans used pencils until someone leaked that they were paying $1000 per mechanical pencil. The Fisher Pen Company spent a million dollars of their own money developing the pressurized (rather than gravity-fed) pen, then sold them to NASA in bulk for something like 3 bucks each. The Russians also ordered them and used them as well. :) lol
 
Non-wooden, non-flaking, and non-flammable pencils and pencil leads are expensive.

The Apollo 1 fire was a big deal at the time, and NASA took that to heart.
 
Yep, it's all logical in retrospect. The public didn't think paying a grand per pencil, plus almost as much for refills, was a good deal. Fisher saved the day because they were working on something anyway. Reality is never as exciting. :)
 
Ingram Micro's margin, as an example, is 1.5%. They take care of all the warehousing and just-in-time delivery for the majority of the North American retail electronics market. I would assume that from the GPU card manufacturers they are probably getting product to move at around a 10% margin. With margins like that, dead stock is a killer.

My only point is, yeah, if companies like them are sitting on thousands of 3000-series cards, the pressure on Nvidia to drag their heels on the 4000 series is going to be HUGE. Frankly, the distribution business is small enough that if three or four of the biggest players tell Nvidia flat out, "if you launch the 4000 series we will purposely order next to zero," Nvidia really has no choice but to delay, or take shit from consumers for paper launching. Nvidia can't handle the demand of being the only one shipping retail cards, and card manufacturers don't have the infrastructure to fill just-in-time orders for companies like Best Buy. Versions of exactly that have happened on smaller scales before; there have been a few launches where companies were called paper launchers even though they sold a handful of cards themselves. If the distribution chain says "no, we are moving this stuff first," that is what will happen one way or another. Even if Nvidia launches the 4000 series, it's possible supply will appear "very very bad" for months as distribution tries to unload 3000s. (That is assuming Nvidia doesn't just agree to offer steep rebates, or even take product back, to make the distributors happy.)
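To put the margin point above in rough numbers, here is a quick illustrative sketch. Only the 1.5% distribution margin comes from the post; every other figure (unit cost, volumes, writedown) is a made-up assumption, not anything from Ingram Micro or Nvidia:

```python
# Illustrative sketch of why dead stock is a killer on thin distribution margins.
# All numbers are hypothetical; only the 1.5% margin figure comes from the post above.

def distributor_profit(units_bought, unit_cost, margin, dead_fraction, writedown=0.25):
    """Net result for a distributor when some inventory never sells at full price.

    margin        -- markup earned on units that do sell (0.015 = 1.5%)
    dead_fraction -- fraction of purchased units stuck as dead stock
    writedown     -- fraction of cost lost on each dead unit (0.25 = moved at a 25% loss)
    """
    sold = units_bought * (1 - dead_fraction)
    earned = sold * unit_cost * margin
    lost = units_bought * dead_fraction * unit_cost * writedown
    return earned - lost

# 100,000 cards at a $500 wholesale cost, 1.5% margin:
for dead in (0.00, 0.03, 0.06):
    print(f"{dead:.0%} dead stock -> ${distributor_profit(100_000, 500, 0.015, dead):,.0f}")
# 0% dead stock -> $750,000
# 3% dead stock -> $352,500
# 6% dead stock -> $-45,000  (the entire margin is gone)
```

Under those assumptions, having even a few percent of inventory stuck unsold wipes out the whole year's margin on the line, which is why distributors push back so hard against a launch that would strand their 3000-series stock.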

Oh well, sounds like their problem, not mine.

At the end of the day they need my money more than I need their luxury item. I have a list of interests longer than my finances allow, so I just cross off PC gaming and do something else, no skin off my back.

I still say this whole thing is entirely on the backs of AMD and NVidia. They left their core customers out to dry. The GPU market has primarily been supported by PC gamers for decades, and they did nothing to help their long-time, reliable customers obtain their products. AMD and NVidia could have gone to their distributors, like Ingram Micro, and told them that if they continued to ship airplane loads of cards out to miners at the expense of availability to the reliable customer base that has supported them for 20+ years, they would not be allowed to distribute future products; the allocation would go to a competing distributor instead, and that competitor could take the Best Buy contract. All these cards would still have been sold either way; it's not like there was a lack of demand for GPUs. In that case it's generally advisable to take care of your reliable customer base. Extra income is great, but not at the expense of alienating a long-time customer base.

What they did is roughly the equivalent of a lawn service that has been cutting your grass for 15 years ghosting you during a lawn care provider shortage so they could take on a big, lucrative mega property that is just a short-term flash in the pan, and then, after that two-year contract is up, coming back and saying they are available again. Well, sorry Jack, you ditched me for your big shiny contract; I don't need you any more. Maybe you should go ask the owner of that big shiny property to hire you? Oh, they don't need you any more? Well, I don't need you either - that sounds like a you problem, so shove off.


Personally I won't be buying a 3000 series card; they can eat them all, or at least the one I would have bought. By buying one, all the purchaser does is encourage this type of behavior from them. Yes, I am just one person, however it's not just me that feels this way. See: https://www.techspot.com/news/95637...-card-prices-even-further.html#commentsOffset They crapped on their long-time customers to make a quick buck. Now they need to deal with the repercussions of their choices. One person being upset doesn't really matter; however, when a sizable portion of your customer base is so mad they won't buy your product, that is a big problem.

 

I agree. Screw 'em. lol

I was just trying to point out a reality of GPU distribution most people aren't aware of, and how it affects a company like Nvidia. Nvidia's customers are actually OEMs like ASUS, Gigabyte, etc. Those companies have customers as well: companies like Ingram Micro. Ingram Micro also doesn't deal with you or me; they sell product to companies like Micro Center, Best Buy, etc. Most people just assume that when they walk into their local computer parts shop they are buying their cards from Asus and the like... they are not. The shop gets them from a wholesaler.

There are three levels between Nvidia and you and me. During the crypto craze, those three levels all got greedy and tried their hand at selling direct to farms, so the OEMs and the wholesalers got shafted with big orders. I agree, screw them. However, that is not a line Nvidia can take. When a company like Ingram Micro says, "Nvidia, don't release a new card today, and if these 3000s don't move (or you give us a massive rebate to adjust our costs), we are not going to touch your new cards," they sort of have no choice but to listen. There is a very high chance Nvidia is going to be delaying the 4000 cards for some time... unless AMD forces their hand.
 
Lol. Everyone loves to hate nVidia.

I wouldn't say they did nothing. They made mining-specific cards, which was a way to use up chips that would otherwise have been scrap. Made financial sense, less waste. But I think the yields were pretty good, so the point that perfectly good chips were used for mining cards is, I believe, true. Still, I wouldn't say they did nothing.

They did what they could to maximize profits, which is their primary responsibility, not clamoring to play up to the desires of whiny, bitchy forum posters and tweeters. Sucks, but it's true.

The reality is that PC gaming wouldn't be where it is without nVidia. We do need them, probably more than they need us (at this point), as they are in more markets than just PC gaming now. Facts be inconvenient these days, huh.

And when that new shiny comes out, you know you will want it. Which makes this thread all the more hilarious.
 
I think you have that understanding wrong. Everyone loves to hate corporations that take advantage of the principal consumer base that made them what they are today.

That now includes AMD as well as nVidia.

There is something to be said for actually providing product in quantity to the people you built your business on. Alas, there is no loyalty to consumers any more. None of the companies care, including AMD. One of these days, if they have their way, AMD will be worse than Intel in regard to anti-competitive practices.

Took me two years to get a graphics card at MSRP. I'm done for a while. 16 GB of graphics memory should be fine for a while, and the card I have should push my 4K display well until either it dies or the card shits the bed. I was just getting 176-194 FPS in Wasteland 3 maxed out at 4K, all settings turned on and cranked to 11 (on a 10-point scale). For as much as I game these days, I think I'm good for a couple of generations of cards.
 

You can buy them all; I am sitting out the next generation of cards no matter how supposedly good they are. Unless prices come down to something less stupid, they can just rot on the shelves, and I couldn't care less who makes them; I've owned Matrox, Nvidia, ATI, AMD and Promise cards. I have hardly ever felt the need to upgrade my card every single time a new generation comes out anyway. Also, PC gaming wouldn't be anywhere without standards like Glide, Vulkan and DirectX. Who made the cards mattered far less.
 
Yeah, until prices start coming back down to planet Earth, I'm done too. I paid about the same for the 6900XT as I did for the 2080Ti Black Edition: a grand each. I don't really want to do that again; anything over that is total madness. I did it twice now, don't need to anymore. I can game at 4K at stable framerates, and that's good enough for me. As prices escalate, they've lost a customer in me. I remember when you could buy a graphics card that was good enough to play games at damn near maximum resolution, or at max if you bought two, for like 100-150 bucks each. Gone are the days of value in the graphics market.
 

The kicker is, now you can buy a brand new 6900XT for $750, and used for even less. Timing is not always where I would like it to be, since I paid more for my 6800XT than I could have paid for the 6900XT three months later. :) How do we know whether the prices for the next-gen AMD / Nvidia cards are going to be higher, unless that has been hinted at with numbers?

As for the attempted TSMC order cut, that will not happen; Nvidia can afford it and they will work it out. Instead, we as consumers should be pleased that the back-and-forth competition is bringing us increased performance in rapid fashion, overall.
 
I hadn't seen the 6900XTs for $750... I have gotten some good use out of my card, but damn, that's like 250 bucks less than I paid for it!

Nvidia can do whatever they want at this point. As far as I'm concerned, if they ever had any desire to adhere to efficiency, it's gone now. The amount of power required to run these monsters is bordering on stupid. It's funny, because the 2080Ti I have generates a stupid amount of heat; I have to open any chassis I put it into or it will eventually impact my thermals to the point that my CPU begins throttling. I don't have that issue with the 6900XT at all. I can game for hours and the heat generated is easily kicked out of my case.

We see these new cards (and the upcoming 4000 series) drawing enough power that they had to design their own special power connectors, and the draw is running unchecked... I guess what I'm saying is either Nvidia is running into process limitations or their architectures are getting crappier. Insane waste heat has been a thing with Nvidia since the 1080Ti (for me); my paired 970s didn't even come close to that one card. But this is getting stupid. I hope AMD doesn't go the same route, but it kinda looks like efficiency is out the window for everyone moving forward.
 

What? If the airflow in your case can't handle a 2080Ti, it won't handle a 6900XT. The 6900XT will put more heat into the case than the 2080Ti; it uses more power.

Are you sure it's your CPU throttling and not your 2080Ti? The EVGA Black Edition suffers from GPU throttling if you don't use a pretty aggressive fan curve. It was the cheapest 2080Ti for a reason: the fans they used were poor.
 
The only way the 2080Ti/6900XT comparison makes sense in my head is if the 2080Ti was a dual/triple-fan design dumping its heat back into the case and the 6900XT was the hybrid blower/AIO style.
 
What? If the airflow in your case can't handle a 2080Ti, it won't handle a 6900XT. The 6900XT will put more heat into the case than the 2080Ti; it uses more power.

Are you sure it's your CPU throttling and not your 2080Ti? The EVGA Black Edition suffers from GPU throttling if you don't use a pretty aggressive fan curve. It was the cheapest 2080Ti for a reason: the fans they used were poor.
This is clearly not the case at all. The 6900XT I have doesn't "light on fire" like my 2080Ti (or even my 1080Ti) does. It may be pulling the juice but the way it dissipates the heat is vastly superior to my 2080Ti. Hell, I can touch the 6900XT while under load in a game and I don't lose the skin on my fingertips. I have made the mistake of touching an Nvidia card under load before... Won't do that again.

The 6900XT will pull up to 300 watts per spec; the 2080Ti pulls 250 watts per spec.

In real world applications, the 6900XT I own is quiet, cool and beats the living hell out of my old 2080Ti.

I can tell when my CPU is throttling versus my graphics card. I don't recall complaining about performance from the 2080Ti here (and I was not). I was simply stating the actual fact of waste heat being dumped into every single case I have had the card in. I suspect everything in my case was being throttled by the 2080Ti...

To put things into perspective, I was running a 5600X with the 2080Ti. I put the 6900XT into that case and upgraded the CPU to the 5900X, and everything is running cool and fast. So I doubled my CPU cores, which pushed the CPU's rated draw from 65 to 105 watts (much higher under load), and put in a graphics card that draws 50 watts more than my old one, and the entire rig is cool and fast. I see no thermal issues at all.
 
The only way the 2080Ti/6900XT comparison makes sense in my head is if the 2080Ti was a dual/triple-fan design dumping its heat back into the case and the 6900XT was the hybrid blower/AIO style.
I think you can look at raw TDP, but the node-efficiency advantage that the 6900XT (or any 6000 series card, for that matter) has over Nvidia is a HUGE one. IIRC, the 2080Ti's "12nm" node is really just a refined version of TSMC's older 16nm process, while AMD is sitting on a mature 7nm node with the 6000 series. It's the same kind of parallel we can make with AMD desktop processors vs. Intel's 14nm+++++++ and even their new 10nm++ designs: the Intel chips are power hogs and generate shitloads of heat, while the AMD chips are efficient in general and cooler in just about every respect.
 
zhrooms on overclock.net OC'd the crap out of a 2080 Ti (535W vs. 250W) and got a 30% improvement. What kind of performance improvement would we get with a similar percentage power increase on the 6900XT?
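For what it's worth, quick arithmetic on those numbers. The 535W/250W/+30% figures are from the post above; the 6900XT line is a purely hypothetical "same relative increase," not a measured result:

```python
# Perf-per-watt arithmetic for the overclock numbers quoted above.
# The 2080 Ti figures (250 W -> 535 W for +30% performance) are from the post;
# the 6900XT line is a hypothetical "same relative power increase", not real data.

def oc_scaling(stock_watts, oc_watts, perf_gain):
    power_gain = oc_watts / stock_watts - 1.0                     # fractional power increase
    perf_per_watt_ratio = (1.0 + perf_gain) / (1.0 + power_gain)  # OC efficiency vs. stock
    return power_gain, perf_per_watt_ratio

gain, eff = oc_scaling(250, 535, 0.30)
print(f"2080 Ti: +{gain:.0%} power for +30% perf -> {eff:.2f}x the stock perf/W")
# -> +114% power for +30% perf -> 0.61x the stock perf/W (efficiency falls off a cliff)

# A 300 W 6900XT pushed by the same +114% would sit around 640 W; whatever extra
# performance it found, it would need +114% just to hold its stock perf/W.
print(f"Hypothetical 6900XT at the same ratio: ~{300 * 535 / 250:.0f} W")
```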
 
I wouldn't say they did nothing. They made mining-specific cards, which was a way to use up chips that would otherwise have been scrap.
Bullshit.
Made financial sense, less waste. But I think the yields were pretty good, so the point that perfectly good chips were used for mining cards is, I believe, true. Still, I wouldn't say they did nothing.
Where did Nvidia say that these mining cards were made from faulty chips?
They did what they could to maximize profits, which is their primary responsibility, not clamoring to play up to the desires of whiny, bitchy forum posters and tweeters. Sucks, but it's true.
It's also not the responsibility of the whiny, bitchy forum posters to buy the new RTX 4000 series cards when the RTX 3000 series is going to be so much cheaper, thanks to the used market. Of course I'll be buying an AMD card, once the price is right.
The reality is that PC gaming wouldn't be where it is without nVidia.
You mean 3dfx and Sony, as Nvidia didn't play a role until afterwards.
We do need them, probably more than they need us (at this point), as they are in more markets than just PC gaming now. Facts be inconvenient these days, huh.
So how much money did you lose on Nvidia's stonks?
And when that new shiny comes out, you know you will want it. Which makes this thread all the more hilarious.
Everyone else is going to buy the RTX 3000 cards for a fraction of the price once they hit the used market. The RTX 3060 is currently $350 on eBay, and the RX 6600XT is $270. Getting there, but it needs more desperation.
 
This is clearly not the case at all. The 6900XT I have doesn't "light on fire" like my 2080Ti (or even my 1080Ti) does. It may be pulling the juice but the way it dissipates the heat is vastly superior to my 2080Ti. Hell, I can touch the 6900XT while under load in a game and I don't lose the skin on my fingertips. I have made the mistake of touching an Nvidia card under load before... Won't do that again.

The 6900XT will pull up to 300 watts per spec; the 2080Ti pulls 250 watts per spec.

Could very well depend on what cooler each card has. I am assuming your 6900XT has a superior cooler design compared with your particular Nvidia GPUs.
 
This is clearly not the case at all. The 6900XT I have doesn't "light on fire" like my 2080Ti (or even my 1080Ti) does. It may be pulling the juice but the way it dissipates the heat is vastly superior to my 2080Ti. Hell, I can touch the 6900XT while under load in a game and I don't lose the skin on my fingertips. I have made the mistake of touching an Nvidia card under load before... Won't do that again.

The 6900XT will pull up to 300 watts per spec; the 2080Ti pulls 250 watts per spec.

In real world applications, the 6900XT I own is quiet, cool and beats the living hell out of my old 2080Ti.

I can tell when my CPU is throttling versus my graphics card. I don't recall complaining about performance from the 2080Ti here (and I was not). I was simply stating the actual fact of waste heat being dumped into every single case I have had the card in. I suspect everything in my case was being throttled by the 2080Ti...

To put things into perspective, I was running a 5600X with the 2080Ti. I put the 6900XT into that case and upgraded the CPU to the 5900X, and everything is running cool and fast. So I doubled my CPU cores, which pushed the CPU's rated draw from 65 to 105 watts (much higher under load), and put in a graphics card that draws 50 watts more than my old one, and the entire rig is cool and fast. I see no thermal issues at all.
Just one question before I reply to this. Do you have a reference 6900XT or an AIB one, like the Gaming X trio?
I think you can look at raw TDP, but the node-efficiency advantage that the 6900XT (or any 6000 series card, for that matter) has over Nvidia is a HUGE one. IIRC, the 2080Ti's "12nm" node is really just a refined version of TSMC's older 16nm process, while AMD is sitting on a mature 7nm node with the 6000 series. It's the same kind of parallel we can make with AMD desktop processors vs. Intel's 14nm+++++++ and even their new 10nm++ designs: the Intel chips are power hogs and generate shitloads of heat, while the AMD chips are efficient in general and cooler in just about every respect.

You are very mixed up. The 2080Ti uses less power than the 6900XT. The one that produces the most heat is the 6900XT; there is no getting away from thermodynamics. The 7nm process that AMD is using for RDNA 2 is more efficient than the 12nm process used in Turing, but that doesn't change the power consumption of either card.
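A back-of-the-envelope sketch of that thermodynamics point: essentially all of a card's board power ends up as heat, and if it is dumped into the case, the temperature rise of the case air scales directly with watts. The 50 CFM airflow figure below is an assumption for illustration, not anyone's measured setup:

```python
# Rough estimate of how much a GPU warms the air flowing through a case.
# Assumes essentially all board power becomes heat and all of it is dumped into
# the case air (open/axial cooler; a blower or AIO that exhausts outside the case
# changes where the heat goes, not how much there is). 50 CFM is an assumed airflow.

RHO_AIR = 1.2            # kg/m^3, air density at roughly room conditions
CP_AIR = 1005.0          # J/(kg*K), specific heat of air
CFM_TO_M3_PER_S = 0.000471947

def case_air_temp_rise(gpu_watts, case_cfm):
    mass_flow = case_cfm * CFM_TO_M3_PER_S * RHO_AIR   # kg of air per second through the case
    return gpu_watts / (mass_flow * CP_AIR)            # steady-state temperature rise in K (= C)

for watts in (250, 300):  # 2080Ti and 6900XT board-power specs cited in the posts above
    print(f"{watts} W into 50 CFM of airflow -> about +{case_air_temp_rise(watts, 50):.1f} C")
# 250 W -> about +8.8 C, 300 W -> about +10.5 C: more watts is more heat, whatever the cooler.
```

The cooler design only decides whether that heat soaks the case interior or gets pushed straight out the back, which is where the two posters' experiences can both be true at once.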
 
What? If the airflow in your case can't handle a 2080Ti, it won't handle a 6900XT. The 6900XT will put more heat into the case than the 2080Ti; it uses more power.

Are you sure it's your CPU throttling and not your 2080Ti? The EVGA Black Edition suffers from GPU throttling if you don't use a pretty aggressive fan curve. It was the cheapest 2080Ti for a reason: the fans they used were poor.
Curious: for the same settings, and if you cap the FPS the same, shouldn't the 6900XT use less power? (Not saying this is what Legendary Gamer was necessarily doing, just curious.)
 

Techpowerup has done something like this since 2020 in their GPU reviews. Check out any recent graphics card review and go to the power consumption page; I believe they use the same games and settings across all the cards.

The 6900XT will use less power if you cap the frame rate, but so will the 2080Ti.
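A minimal sketch of why a frame-rate cap cuts power on both cards, just by different amounts. The frames-per-watt figures below are placeholders for illustration, not TechPowerUp measurements:

```python
# Toy model of power draw under an FPS cap: a card only has to produce the capped
# frame rate, so its power is roughly capped_fps / (frames it can render per watt),
# clamped to its board power limit. The fps-per-watt values are placeholders,
# NOT measured TechPowerUp numbers.

def capped_power(fps_cap, fps_per_watt, board_power_limit):
    # Power needed to hold the cap, never exceeding the board limit.
    return min(fps_cap / fps_per_watt, board_power_limit)

cards = {
    "hypothetical card A (0.40 fps/W, 250 W limit)": (0.40, 250),
    "hypothetical card B (0.48 fps/W, 300 W limit)": (0.48, 300),
}
for name, (fpw, limit) in cards.items():
    print(f"{name}: ~{capped_power(90, fpw, limit):.0f} W at a 90 fps cap")
# Card A: ~225 W, card B: ~188 W. The more efficient card draws less at the same cap,
# even though its uncapped board power limit is higher.
```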
 
Just one question before I reply to this. Do you have a reference 6900XT or an AIB one, like the Gaming X trio?


You are very mixed up. The 2080Ti uses less power than the 6900XT. The one that produces the most heat is the 6900XT; there is no getting away from thermodynamics. The 7nm process that AMD is using for RDNA 2 is more efficient than the 12nm process used in Turing, but that doesn't change the power consumption of either card.

It can be a matter of how the video card's heatsink dumps the heat: some designs dump the heat into the case, which can cause issues with overheating other components, versus a heatsink that vents the heat outside the case.
 
Techpowerup has done something like this since 2020 in their GPU reviews. Check out any recent graphics card review and go to the power consumption page; I believe they use the same games and settings across all the cards.

The 6900XT will use less power if you cap the frame rate, but so will the 2080Ti.
Thanks. Couldn't find an FPS-cap comparison, but I do see the power efficiency page (perf/watt). Kinda similar:
https://www.techpowerup.com/review/amd-radeon-rx-6900-xt/36.html
 
Just one question before I reply to this. Do you have a reference 6900XT or an AIB one, like the Gaming X trio?


You are very mixed up. The 2080Ti uses less power than the 6900XT. The one that produces the most heat is the 6900XT; there is no getting away from thermodynamics. The 7nm process that AMD is using for RDNA 2 is more efficient than the 12nm process used in Turing, but that doesn't change the power consumption of either card.
I have an MSI Gaming X card.

Well, you guys can all disagree with me. However, I know what my system's thermals are like, and everything is better with the 6900XT. I throw the 2080Ti into my case and all the fans crank up, the room is actually hotter, and I have to pull the side panel off the case to get my system to cool down. So, in my real-world experience, my system is running cooler and faster than with the 2080Ti.

Whether I am undervolting the card or not, my system runs cooler and faster. Nothing in my case has to work very hard to run fast without the 2080Ti. That's the actual reality of my situation.

I have never disputed the power consumption of the cards. Not once. I believe I even cited that the 6900XT draws at least 50 more watts.

Perhaps, in my use case, I am just not stressing the card as much as I did the 2080Ti.
 
Well, it would probably have to be, regardless.

I wonder if part of the equation is some quirk of the exhaust airflow of one of the coolers interacting with the CPU HSF in some unforeseen way.

Sounds like it could take some extensive testing to be sure. Look, I'm willing to give it a shot. I'll PM you my receiving address shortly. You're welcome. Be sure to pack both cards with care.
 