Thunderdolt
Don't forget the 150-300w heat output of the bag of meat between keyboard and chair while gaming. More if also drinking Redbull.
> Don't forget the 150-300w heat output of the bag of meat between keyboard and chair while gaming. More if also drinking Redbull.

Generally more like 75-150w; but yeah, it's non-zero.
> I think we've about hit the peak of what humanity is capable of when it comes to high end silicon based computer hardware. CPU performance has stagnated for quite some time now - it's actually kind of sad how little its progressed since Sandy Bridge, and now we're only going to see a 25% performance improvement after 2 years with GPUs, but not only that - with also 25% more power consumed and 25% larger coolers. And that's coming after the jump between the 2080Ti and 3090 wasn't that big either (I think it was around ~30% if I recall?). At least 4K (per eye) VR at 90fps is within reach with where we are now.

2080 Ti to 3090 was 50%.
> Small room, large room, same heat.

I'm not an idiot. Yes, it's the same heat. I think everyone, including you, knew what I was getting at, whether or not the post was phrased the best way. And yes, I picked an arbitrary point for power draw based on what USED to be considered a high-wattage system setup. The constant draw possible with a 3080 Ti and an i9-12900K is frankly huge. I also based my number on what I consider a high electric bill in the US to run that system, at around $0.15 per kilowatt-hour for 8 hours a day. I was actually using 450W as a total-system constant average power draw. Many gaming systems can easily hit that number today when playing a AAA game. Mine can.
How much AC? Lol wtf are you talking about? That's not how things work.
Nifty thing about CPUs and GPUs: if it pulls 450W, then it dissipates 450W.
Lifespan is not calculable by any review or influencer. This would literally have to be taken at the manufacturer's word.
Bake expensive hardware?
At any power draw above or below 300w, there is a cost to run. Why did you decide to set 300w as this arbitrary point? Do you get free electricity if your GPU draws ≤300w?
Your final statement shows your lack of understanding of how these things work.
Should probably just delete your whole post.
> I think we've about hit the peak of what humanity is capable of when it comes to high end silicon based computer hardware. CPU performance has stagnated for quite some time now - it's actually kind of sad how little its progressed since Sandy Bridge, and now we're only going to see a 25% performance improvement after 2 years with GPUs, but not only that - with also 25% more power consumed and 25% larger coolers. And that's coming after the jump between the 2080Ti and 3090 wasn't that big either (I think it was around ~30% if I recall?). At least 4K (per eye) VR at 90fps is within reach with where we are now.

Yeah, I definitely agree here.
> Generally more like 75-150w; but yeah, it's non-zero.

For a 150lb 25-35yo male, it's 80w while asleep. Even casual conversation can bring that up to 120w. I assume any game where fps matters would involve a higher state of engagement than that. Not a ton of heat, but at least as much as an AMD card for every 150lbs of meat in the room (don't forget the 75lb dog). Heat load just from the metabolism of room occupants is an actual input to HVAC system design.

Don't forget the 60-100W from each screen, 150W from the rest of the system, and 20W from room lighting. We're at almost 500W before we add in the GPU. With his 300W baseline, we're now at 800W of heat in the room. At this point, what difference does an extra 150W actually make? The room is going to heat up regardless (unless you use a fan to exchange air between the room and the rest of the home, and then this all becomes moot).
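To make that running total concrete, here is a quick back-of-the-envelope tally. It is only a sketch: the wattages are the ballpark figures from the post above, and the assumption of two screens at the upper ~100 W figure is mine:

```python
# Rough heat budget for a small gaming room (all values in watts).
# Figures are the ballpark numbers from the post above, not measurements.
sources = {
    "occupant (engaged gamer)": 120,
    "two screens (~100 W each, assumed)": 200,
    "rest of the system (CPU, board, drives, PSU losses)": 150,
    "room lighting": 20,
}

base_load = sum(sources.values())   # ~490 W before the GPU
gpu_baseline = 300                  # the 300 W GPU baseline being argued about
disputed_extra = 150                # the extra 150 W in question

print(f"Without GPU:      {base_load} W")
print(f"With a 300 W GPU: {base_load + gpu_baseline} W")
print(f"With a 450 W GPU: {base_load + gpu_baseline + disputed_extra} W")
# Essentially every watt the PC draws ends up as heat in the room.
```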
> I also based my number on what I consider a high electric bill in the US to run that system, at around $0.15 per kilowatt-hour for 8 hours a day.

If you're having trouble paying the $0.54/day in electricity used by a 450w GPU while gaming for 8 hours every day, you could also try getting a job.
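For reference, the $0.54 figure follows directly from the assumptions quoted above (450 W, 8 hours a day, $0.15/kWh):

```python
# Daily and monthly electricity cost at the rates assumed in the thread.
watts = 450
hours_per_day = 8
price_per_kwh = 0.15   # USD, the "high" US rate used above

kwh_per_day = watts / 1000 * hours_per_day    # 3.6 kWh
daily_cost = kwh_per_day * price_per_kwh      # ~$0.54
monthly_cost = daily_cost * 30                # ~$16.20

print(f"{kwh_per_day:.1f} kWh/day -> ${daily_cost:.2f}/day, about ${monthly_cost:.2f}/month")
```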
> How many of these new GPUs and CPUs are possibly "too much hardware" for many of the people being marketed to?

I've gotta say, this is one of the biggest reaches for an anti-progress argument that I've seen in a long time. People have cats and cats clog up filters and clogged filters increase case temps and increased case temps lead to increased component temps and increased component temps make everything die instantly and nobody anywhere over the last 30 years has thought to maybe have components self-throttle when temps cross certain thresholds, so therefore technological progress should be halted!
Maybe there won't be an issue... but I'm sure going to be watching to see how many RTX 3080 and above systems are dying at the 4-year mark instead of the 7-10 year mark. If the hardware is going to be this expensive, people are going to EXPECT it to last, whether or not that expectation is realistic.

The average power draw of these cards is going up each generation, not down. You don't buy a 3080 Ti to lock the FPS at 60 and play at 1080p to try to save power when rendering. You're buying one to run 1440p or 4K as fast as you can with as much eye candy as you can.
Back to the room heat issue.
In a smaller room it takes less time to kick the temp up 10 or even 15 degrees. For me a whole lot less than one gaming session. It's not unusual for me to look up during a game and suddenly realize my office is at 85F. Hence the need for window AC frequently or constantly. And not all houses are perfectly efficient. This room leaks just enough that I can get away with it longer in winter as long as the central vent is closed and the room is never heated by central air. But I have to have a window AC available to supplement even when central cooling is active because the central air thermostat is obviously basing off the rest of the house, not just this room. The central air system turns on and off based on the temp and needs of the rest of the house. The office keeps getting heat dumped into it from the PC for however long a game is running.
If your PC happens to be in a fortunate place in your house for room size, airflow, and position relative to your central air intake then you may just have no noticeable problem.
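For anyone curious why a small office climbs so fast, here is a crude no-loss estimate. The room size (roughly 3 m x 4 m x 2.5 m) and the ~800 W heat load are assumptions for illustration; real rooms shed heat into walls, furniture, and leakage, so this is only an upper bound on the rate:

```python
# Upper-bound estimate: how fast ~800 W warms the air in a small, sealed office.
room_volume_m3 = 3.0 * 4.0 * 2.5      # assumed ~30 m^3 of air
air_density = 1.2                     # kg/m^3
air_heat_capacity = 1005.0            # J/(kg*K)
heat_load_w = 800.0                   # PC + occupant + screens + lights (assumed)

air_mass_kg = room_volume_m3 * air_density                      # ~36 kg
deg_c_per_min = heat_load_w / (air_mass_kg * air_heat_capacity) * 60

print(f"~{deg_c_per_min:.1f} C (~{deg_c_per_min * 1.8:.1f} F) per minute with zero losses")
```

In practice the walls and air exchange absorb most of that, which is why the room settles 10-15 degrees above the rest of the house instead of turning into an oven.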
> Of course, one could ask how you were able to buy that $3k gaming setup to begin with.

Is it too late to say "pandemic recovery check!" still?
Seems to be around 65% for the same size of die, which has about 2.7 times more transistors per mm²:
https://wccftech.com/nvidia-geforce...k-2077-dlss-3-cuts-gpu-wattage-by-25-percent/
We'll see with the final results, but the 7950X seems to be a massive jump over the previous 5950X:
View attachment 513200
The 3090's average FPS was about 44% higher than a 2080 Ti's, with a significantly smaller die (628mm² vs 754mm²):
https://www.techspot.com/review/2105-geforce-rtx-3090/
I am not sure how much stronger the GPU itself needs to be to gain 44% (since there are elements other than purely the GPU involved, maybe 50-55% more powerful? But the giant memory bandwidth boost could be a major part of that performance gain).
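One way to reconcile a 44% average-FPS gain with a GPU that is "50-55% more powerful": if some fixed slice of each frame (CPU, driver, engine work) doesn't scale with the GPU, the GPU portion has to speed up by more than the observed FPS. A toy model, where the 10% fixed share is purely an assumption:

```python
# Toy model: frame time = fixed (non-GPU) part + GPU-bound part.
# Assume 10% of the old frame time does not scale with the GPU.
fixed_fraction = 0.10          # assumption for illustration
fps_ratio = 1.44               # observed 3090 vs 2080 Ti average FPS ratio

old_frame = 1.0
fixed = old_frame * fixed_fraction
gpu_old = old_frame - fixed

new_frame = old_frame / fps_ratio
gpu_new = new_frame - fixed

print(f"GPU-only speedup needed: {gpu_old / gpu_new:.2f}x")   # ~1.51x with these numbers
```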
> I want you to reply to FrgMstr's reply to you. You speak with an authoritative voice, but aren't one. The one you attempt to rebut, however, is. So what's up? What do you know that a 30 year+ industry veteran doesn't? What's your source?

Forgot about this one. Now that the results are out and they match my estimates, I'm here to lol.
> Forgot about this one. Now that the results are out and they match my estimates, I'm here to lol.

You didn't forget about anything. You chose not to answer.
Looks like I was 100% correct regarding 600W as well. The stock BIOS allows you to go +33% on TDP. Can anyone here tell me what 133% of 450W is? I keep coming up with 598.5W, but everybody seems to be telling me I'm wrong.
It's almost like I've actually designed electronics before. Go figure.
> You didn't forget about anything. You chose not to answer.

You are 100% correct. I said that it was never designed to run at 600W at stock settings. That's simply not how design specs work. When you intend to load a bolt to 4,000 lbs, you don't spec in a bolt with a strength of only 4,000 lbs. You spec one that can hold 8,000 lbs or more.
And you were disputing FrgMstr's knowledge of 600W. You said no way would that be happening. So what the hell are you on about?

When your design spec is exactly your intended use case, your design will fail. Guaranteed.
> This is not true. And I speak on this as an expert in an industry that has a long list of design specifications, rules, and laws, and engineering.

Short of achieving perfect 0% tolerance in every element, I feel that's almost trivially true, no? As the smallest error in the wrong direction will make you fail.
> Forgot about this one. Now that the results are out and they match my estimates, I'm here to lol. Looks like I was 100% correct regarding 600W as well.

What's your explanation for the increase in cooler size, and the resulting comically gigantic models, for a card that uses about the same or less power than a 3090 Ti?
> You are 100% correct. I said that it was never designed to run at 600W at stock settings. That's simply not how design specs work. When you intend to load a bolt to 4,000 lbs, you don't spec in a bolt with a strength of only 4,000 lbs. You spec one that can hold 8,000 lbs or more.

LOL! Yeah. "600W is your design spec." "OK, it will break when it hits 601W." "We good."
Power delivery on a board works exactly the same way. When you need 450W of continuous power delivery, you have to spec a maximum capacity that is well above that 450W. This is because you have to account for component tolerances, momentary spikes, running inside of a case with no air flow, the impact of 10 years of aging and abuse, component thermals, minimizing coil whine, cooler assembly tolerances, allowing enthusiasts some overclocking headroom, etc etc.
When your design spec is exactly your intended use case, your design will fail. Guaranteed.
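A rough sketch of how that margin stacking plays out for a 450 W continuous load; the individual derating factors below are invented for illustration, not anyone's actual design numbers:

```python
# Illustrative margin stacking for a power-delivery design target.
# All factors are made-up examples; real designs use measured data.
continuous_load_w = 450

margins = {
    "transient spikes": 1.20,
    "hot case / poor airflow": 1.10,
    "component tolerance": 1.05,
    "aging over ~10 years": 1.05,
    "overclocking headroom": 1.10,
}

capacity_w = continuous_load_w
for reason, factor in margins.items():
    capacity_w *= factor

print(f"Design capacity target: ~{capacity_w:.0f} W for {continuous_load_w} W continuous")
```

The exact factors vary, but the point stands: the deliverable maximum has to sit well above the advertised continuous rating.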
> What's your explanation for the increase in cooler size, and the resulting comically gigantic models, for a card that uses about the same or less power than a 3090 Ti?

Ding! Ding! Ding!
I fail to see the concern with power on enthusiast hardware. 600w+ is nothing compared to systems past with SLI or CF, with 2, 3 or 4 cards. Pushing over 1000w with multiple GPUs was easily done; some systems used multiple power supplies.

If you need more A/C, get it, or buy the lower-power options; maybe a console is in order for you.

I pushed more watts in the past; today's hardware is much faster and lower power than what I had before.
> This isn't new. Not sure why all the whining. How soon we forget the 295X2. 500w beast. Not to mention $1500 MSRP. Halo gonna halo.

Even the 290X was pushing 300+ watts at a time when 250W was still common. That was 9 years ago.
> This isn't new. Not sure why all the whining. How soon we forget the 295X2. 500w beast. Not to mention $1500 MSRP. Halo gonna halo.

Crossfire/SLI solutions like the 295X2 had latency and consistency issues, and many of the halo cards (Titan, 3090) had an insignificant enough performance gain over the "reasonable" ultra-high-priced regular gaming card that it was easy not to care about them; it was a bit of a luxury tax on the people that want to have the best, with very little gained per dollar.
> 600 watts for 1 piece of hardware is a new issue. Also, very few did SLI, and only the tiniest fraction ever had 3 cards or more. You also have to deal with all this heat and electrical demand, and I don't feel like redoing my house because I upgraded my computer. The only reason you used more power back in the day was that you were stuffing your PC with more hardware; now just 1 piece is sucking down the power. To me it just shows how much of a wall they are starting to hit to increase performance now.

If the required power is an issue, the card isn't for you. This generation uses much more power than 3 or 4 generations ago, and it's 3 or 4 times as fast, soooooooo.
> I fail to see the concern with power on enthusiast hardware. 600w+ is nothing compared to systems past with SLI or CF, with 2, 3 or 4 cards. Pushing over 1000w with multiple GPUs was easily done; some systems used multiple power supplies.

What a great comparison.
> If the required power is an issue, the card isn't for you. This generation uses much more power than 3 or 4 generations ago, and it's 3 or 4 times as fast, soooooooo. Often people bitching about the power requirement are the ones who will never buy one anyway, so what?

I take the stance that if their house is scared of a high-wattage gaming PC, they probably don't run a vacuum much either, lol.
> What a great comparison.

Insert a sarcasm tag, I'm sure.
> If the required power is an issue, the card isn't for you. This generation uses much more power than 3 or 4 generations ago, and it's 3 or 4 times as fast, soooooooo. Often people bitching about the power requirement are the ones who will never buy one anyway, so what?

All this talk about 600W when even AIB 4090 cards are using less power than the FE 3090 Ti.
> All this talk about 600W when even AIB 4090 cards are using less power than the FE 3090 Ti.

Unless the game is using RT and other stuff, the 4090 is using less power than the 3080!
> All this talk about 600W when even AIB 4090 cards are using less power than the FE 3090 Ti.

No kidding. When the 4090 reviews leaked and the 4090 was walking all over the 3090 Ti AND using similar or less power, I was impressed.

And then out of the woodwork come the usual "OMMGGGG POWER USE QQ".

New gen, significantly faster, same or less power. I don't see the issue.
> Insert a sarcasm tag, I'm sure.

I must have missed single GPU examples that pushed anywhere near 600 W. Oh, but my dual socket quad SLI setup pulled some l337 watts, man, for real, yo!
If the comparison is bad for you, move down one or two posts and you'll see other halo cards that pulled significant power at their release nearly a decade ago.
> No crime was committed by pointing out it's a power hog,

Well, only if someone overclocks it for very little gain; whether it's a power hog or not is quite subjective here. A bit like the new Ryzen, it is a power hog in a way, but it also seems to be one of the most efficient-per-watt GPUs ever.
If the talk of an 850 watt PSU being needed for many models is true, I do wonder how much of the market for a $1600+ USD video card does not already have an 850 watt or higher PSU, or wouldn't mind finally replacing their 600-750 watt model.

That said, the actual heat would still be an issue even if lower power peaks took care of the PSU side of the problem.
A 300 watt 4090 FE doubles the performance of a 6900 XT, a 320 to 330 watt card.

I fully get that paying that much for a card and then losing some performance (any amount) by limiting its power would not be realistic for almost everyone; you will want all of it. But going to 600 watts with an overclock seems like a self-imposed issue for people who actively want it; it is already borderline whether you need to push it to 450-460 instead of 300-350 or 400 max.
4K average FPS at different power targets:
https://allinfo.space/2022/10/11/nv...y-fast-even-with-less-tdp-and-without-dlss-3/
The jump from 50% more power (300 to 450 watts) seems to be of limited use (we see a good jump for the first 50 W and not much else for the next 100 W); I would like to see what the even smaller 33% jump from 450 to 600 W, in a far less efficient power band, would do.
- GeForce RTX 4090 FE @ 450 Watt: 99.9
- GeForce RTX 4090 FE @ 350 Watt: 96.8
- GeForce RTX 4090 FE @ 300 Watt: 90.6
- GeForce RTX 3090 Ti @ 450 Watt: 58.9
- GeForce RTX 3090 Ti @ 350 Watt: 56.1
- Asus GeForce RTX 3090 Strix OC: 55.9
- Nvidia GeForce RTX 3080 Ti FE: 52.6
- GeForce RTX 3090 Ti @ 300 Watt: 49.8
- Nvidia GeForce RTX 3080 FE: 47.0
- XFX Radeon RX 6900 XT Black: 45.5
- AMD Radeon RX 6900 XT: 43.1
- AMD Radeon RX 6800 XT: 39.1
- AMD Radeon RX 6800: 33.0
- Nvidia GeForce RTX 3070 FE: 28.5
- AMD Radeon RX 6700 XT: 25.7
- Nvidia GeForce RTX 3060 Ti FE: 25.2
- Asus GeForce RTX 3090 Ti TUF OC: 0.0
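Taking just the 4090 FE rows from that list, the scaling is easy to quantify (a quick sketch using the FPS values quoted above):

```python
# Performance scaling of the 4090 FE across power targets,
# using the 4K average-FPS figures listed above.
results = {300: 90.6, 350: 96.8, 450: 99.9}   # watts -> average FPS

base_w, base_fps = 300, results[300]
for watts, fps in results.items():
    extra_power = (watts / base_w - 1) * 100
    extra_perf = (fps / base_fps - 1) * 100
    print(f"{watts} W: {fps:5.1f} fps | +{extra_power:3.0f}% power -> "
          f"+{extra_perf:4.1f}% perf | {fps / watts:.3f} fps/W")
```

By that measure the last 100 W buys roughly three percent more performance, which is why extrapolating the same curve out to 600 W looks so unappealing.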
It is obviously reasonable to worry about the power envelope, but sometimes people overdo it, like with CPUs under light workloads: when you are playing a video game, even the 12900K did fine without using that much power, and it's not really an issue for the new Ryzen either (at least for a message board user with Hard in the name).