Performance per watt/value proposition, GPU upgrade question.

jdempsey

So we've had some abstract convos about this in the past, but I'm finally ready for a GPU upgrade (from a 1080 Ti), and I'm hoping for either affirmation of my conclusion, or some input on an alternative if I'm missing something.

Before I get to the conclusion I've come to, some backstory for those that don't know, to explain my criteria/caveats: I live completely off-grid in a remote part of the Appalachian mountains, 40 or so minutes from the nearest gas station, let alone a grocery store. I don't have electric service; I produce my own power, either from solar or from a generator when solar isn't sufficient (this is also one of the highest-precipitation areas of the country), so optimizing electrical consumption is definitely way up there in terms of my priorities. Pure technical FPS per input watt isn't necessarily king here so much as balancing bang-for-buck and minimizing unnecessary power consumption, but that's specifically why I'm seeking a second opinion, as I know this can be more complex than looking at the raw TDP at a given target FPS. A final caveat worth mentioning is that I'm trying to stay below 320W, as that's the largest DC-powered Pico-style ATX PSU I can find that can be directly powered by my 48V battery bank. I'll be using a single dedicated 320W-max PSU for only the GPU, with another powering my MB/CPU/etc. I'm mentioning this only so there isn't any confusion, not asking for input on whether this works or is a good idea (it does, and is, for me). However, I don't really trust these PSUs to their "rated" wattage, at least not continuously, so leaving some overhead here feels prudent.
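
For reference, the rough numbers I'm working from look something like this; a minimal sketch, where the DC-DC efficiency is a ballpark assumption (not something I've measured) and the 48V is nominal, since the pack actually floats higher:

# Rough draw math for the GPU-only 320W pico PSU off a 48V nominal bank.
# The DC-DC efficiency below is an assumption, not a measurement.
GPU_POWER_W = 220        # 4070 Super class board power, per spec
PSU_LIMIT_W = 320        # rated limit of the pico-style DC ATX supply
PACK_VOLTAGE_V = 48      # nominal; the real pack voltage sits higher
DCDC_EFFICIENCY = 0.92   # assumed

input_w = GPU_POWER_W / DCDC_EFFICIENCY
pack_current_a = input_w / PACK_VOLTAGE_V
headroom_w = PSU_LIMIT_W - GPU_POWER_W

print(f"~{input_w:.0f} W from the bank, ~{pack_current_a:.1f} A at 48 V nominal")
print(f"{headroom_w} W of headroom below the PSU's rated 320 W")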

I recently acquired an LG OLED (B2) for a screaming good deal as well, which has an impact on the decision, since previously I wouldn't have necessarily been fixated on 4K performance. I still likely won't game entirely/primarily on this TV, since it's a big wattage consumer, but if I am going to upgrade the GPU, I'd like to be able to utilize it reasonably well.

I was previously considering a 7800XT, but with the new TV, the recently released models, and the raw numbers, the 4070 Super looks unbeatable in terms of TDP (220W) for FPS at 4K. Am I missing something? I could stretch the budget up to higher SKUs (but definitely not over $1k, and realistically anything over $800 would be very difficult to rationalize), but I'm not seeing the payoff in the raw numbers. Having more FPS is nice, having more headroom is great, and I know there's a scenario where a higher-end card could in theory actually use less power for the same FPS, which is part of why I'm asking, though I'm not sure how easily that can be objectively quantified.
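
(The closest I've come to quantifying it is just dividing a reviewer's average 4K FPS by their measured gaming power draw for each card, something like the toy helper below; the numbers in it are placeholders I typed in for illustration, not real benchmark results.)

# Toy frames-per-watt comparison; plug in real averages from a review you trust.
# The numbers below are placeholders, NOT measured results.
def fps_per_watt(avg_fps, avg_gaming_watts):
    return avg_fps / avg_gaming_watts

candidates = {
    "RTX 4070 Super": (60.0, 220.0),   # placeholder 4K avg FPS / board power
    "RX 7900 XTX":    (75.0, 355.0),   # placeholder values
}
for name, (fps, watts) in candidates.items():
    print(f"{name}: {fps_per_watt(fps, watts):.3f} fps/W")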

I will say that, I give absolutely zero shits about any of the nvidia vs radeon features atm, more power to you if you do, but I've lived without RT and DLSS just fine for many years. I'm sure if I go with Nvidia this time, I'll try those features, and maybe enjoy them, but my primary metric is getting performance for as little watt hours as possible, while landing in comfortable territory to utilize the new 4k Oled, or push 120 Hz 1440p Ultrawide at high settings, and I definitely will not/do not give a shit about non-gaming features, other than perhaps CAD performance (but I'm not going out and buying a quadro or whatever for that alone, I do most of my design work on a T14 (amd) laptop in freecad lol).

So 220W, able to roughly handle 4K ultra at 60fps (at least on paper), for approx $600? Seems like a no-brainer, but admittedly that's going to be pushing the whole 220W. Can any other card under $1k offer me better power consumption in that scenario? Obviously the "gamer" part of my brain wants to find a way to justify a 7900XTX, and I could probably swing it price-wise, especially for a used one, but I'm not sure that's actually a smart move. A 4070 Ti or 4070 Ti Super could be an option perhaps.

Anyway, I appreciate any insight, legitimate criticism, or just affirmation anyone may be willing to share, thanks!
 
My two cents on this comes from my own experience tuning systems for image consistency and low noise, but the steps are the same for reducing power consumption (which isn't a factor for me other than reducing noise). Take what you want from that and see what you can use for your project. :)

I have two gaming systems at home:

The first I built just as an HTPC system (the components were chosen to try ChimeraOS/SteamOS for fun as a homemade console). It has an AMD 5800X3D CPU (undervolted) and an AMD 6950XT I bought cheap on sale (also undervolted). Recently I installed Windows 11 on it again to try out features for fun. This system is connected to a 4K TV.

The second system is connected to a 49" 5120x1440 ultrawide display. It has an old Nvidia 2080 Ti (Asus ROG Strix) and an AMD 7800X3D CPU. The plan was to upgrade the GPU on that, but I game mostly on the TV when it comes to graphics-intensive games that support controllers, and more strategy, RPG and similar games on the second system, so I haven't bothered upgrading yet (it takes a bit of tinkering because of the size of the GPU, and I want to keep the cabinet, since I love that cabinet). I also use this system to stream VR to an Oculus 3 headset.

Both systems are tuned for whatever game I play. I want a target framerate that suits the type of game, and I don't want much deviation from that target. I also tune for noise with fan profiles and choose components accordingly, but I won't include that here.

Things you might want to take from my experience tuning both systems:

- Forget the thought of 4K Ultra 60FPS in all games without compromise (upscalers, reduced settings, turning off features etc.). Instead focus on smooth gameplay at decent enough eye candy. Then call it a day and enjoy the games.
- Undervolting is your friend. Sometimes you even get higher performance for less wattage.
- Upscalers (DLSS, FSR, XeSS) are your friends. In many cases they are decent enough, especially at 4K where the base resolution is already higher. They can give you a higher framerate for little sacrifice, or in return, less power usage at the same framerate.
- A frame limiter is your friend (MSI Afterburner with RTSS is my favorite for Nvidia, and on AMD the control panel is fantastic for tweaking). Once you have found your target framerate, cap it where you get the least amount of dips. In return you get less power usage and smooth, console-like frametimes. Make sure the framerate cap is within the G-Sync/FreeSync range of your LG B2 for optimal results (a tiny sketch of the cap math follows this list).
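
The cap "math" is about as simple as it gets; a minimal sketch, where the 3 fps margin is just the usual rule of thumb for keeping VRR engaged, not anything official:

# Tiny helper for picking a framerate cap: stay a few fps under the display's
# max refresh so VRR stays engaged, and don't cap above what the GPU can hold.
# The 3 fps margin is a common rule of thumb, not a hard requirement.
def pick_cap(display_max_hz, sustainable_fps, vrr_margin=3):
    return min(sustainable_fps, display_max_hz - vrr_margin)

print(pick_cap(120, 80))   # LG B2 at 120 Hz, game holds ~80 fps -> cap at 80
print(pick_cap(120, 200))  # very light game -> cap at 117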

System specific tweaks:

System one, the HTPC system connected to the TV:

The AMD system is the most comfortable to tweak. The control panel makes it easy to apply tweaks on a per-game basis. It creates a profile for the game once you start it, and you can hit Alt+R any time to adjust tweaks in real time. Undervolting is done within the control panel, and you can also have different undervolting and overclocking profiles for each game (I only use a global profile). If you want to upscale a game that does not have temporal upscaling (FSR 2, 3 or XeSS) in-game, the control panel's FSR (spatial upscaling) is better than your TV's upscaler. Radeon Chill gives you a decent framerate limiter. FreeSync will make sure you get a smooth, tear-free experience. Your TV supports FreeSync Premium and will give you LFC when the framerate drops at times. But still cap with a bit of overhead, so you get as little variation as possible for the smoothest gameplay.

Use AMD's own metrics overlay and make sure you also activate the graph for frametimes; it makes it easier to visualize frame consistency. You can also check power consumption for both GPU and CPU if you have an AMD CPU (perhaps also on an Intel CPU, but I haven't tested that).

System two, connected to 49" ultrawide:

MSI Afterburner can be used for undervolting the GPU. It's a bit more hassle, but once done, you can just play the game.
Use MSI Afterburner with RTSS. You need to set up profiles manually in RTSS for each game by navigating to its .exe file. In RTSS, make sure you have the graph selected for framerate/frametimes in the OSD. You can also activate power consumption there, so you can see how much power the GPU and CPU use. Run the game and see what you might need to adjust in the in-game settings to hit your target framerate. Then alt-tab out of the game and set that as the framerate cap for the game profile. Check if it's stable (as smooth a line as possible on the graph). Adjust accordingly. Enjoy the game.

If you are not going to use more than 120fps, you can globally set a framerate cap for all games in the Nvidia control panel at 117FPS, if you don't want to tweak every game. That might not be the best choice if you are going to use frame generation to get to 120fps, though. But at least set it to something, unless you play competitive shooters and aim for 400-500+ FPS for the least amount of input lag. This way you save power in some games.

With Nvidia, you will have access to DLSS in many games, which is a good upscaler. In some cases you can use frame generation if you hit 60fps and just want to game at 120fps. For games that have no upscaler, it's better to use Nvidia's NIS in their control panel than your monitor's upscaler.

As for which card to choose, pick your poison. With AMD you might get more performance for less money and an easier life tweaking games. With Nvidia, you get access to DLSS, and this generation Nvidia's cards give better performance per watt, so you might be able to squeeze more FPS out of your power budget. Hopefully you can use some of what I do to tweak games in your performance-per-watt quest. :)
 
The 4070 and 4070 Super both have incredible performance per watt. Use DLSS Quality at 4K/60 when you can and you'll be super efficient. Even when you can't, the performance these two cards get for 220 watts max is unmatched.

4070 ti is worth considering, if you can get it on a closeout deal or open box deal, which makes the price closer to a 4070 super.

All 3 cards should be ok for a decent undervolt, as well. That's YMMV. But, it can work out for you.

The 4070 Ti Super ended up being a bit of a dud, as it is only a little bit better than a 4070 Ti, plus an extra 4GB of RAM. I don't think it's worth considering, really.

A 4080 on closeout or open box would be the next rung. It's about 100 watts more power, though. Only worth it if some of your main games have no DLSS. *And I will be the first to tell you, DLSS is not perfect. It varies from game to game; some games can still show distracting issues. But overall, I think it's good enough nowadays to be an important feature.



As for CPU: the Ryzen 7800X3D is the best gaming CPU you can buy, and it uses basically the same power as a Ryzen 7600.
https://www.techpowerup.com/review/amd-ryzen-7-7800x3d/24.html

And if you need to pinch some pennies, the 7600 is no slouch and great performance for the money. It scratches the heels of the 13600K for about $80 - $100 less, and uses a fair chunk less power in gaming.
 
Just to pile on to the great suggestions so far. IMO if building new, the Ryzen 7800X3D with a 4070 is probably the most power-efficient and performant combo going per watt at this time. I have a 7900 XT, which is thirsty if you let it run, but frame limiting keeps it in the 220-250W range easily, so anything less power-hungry should be doable with the suggestions on frame capping and tuning made above.
 
Just to pile on to the great suggestions so far. IMO if building new, the Ryzen 7800X3D with a 4070 is probably the most power-efficient and performant combo going per watt at this time. I have a 7900 XT, which is thirsty if you let it run, but frame limiting keeps it in the 220-250W range easily, so anything less power-hungry should be doable with the suggestions on frame capping and tuning made above.
You still running @ 1080p?
 
Funny story that... My 7 year old 1440p 144hz monitor died and I replaced it this past weekend with a cheap 165hz LG 1080p while at Walmart for other crap. So for now 1080p is back baby! I'm planning on a house move later this year and not looking to dump hundreds into another monitor before that. Shitty timing for my other screen to crap out. I've been drooling over OLEDs for a while now.
 
Appreciate all the replies guys, lots of food for thought.

Funnily enough I had just about convinced myself to spend the money on the 4070 Ti Super after looking at the average wattage consumption vs FPS in the current Tom's GPU recommendation article. But since I'm not that concerned with RT, ignoring their aggregated metrics that assume RT and looking at the actual benchmarks in games, I started considering 7900 XTXs again, since "street"/used prices seem to be falling drastically for them. It's tough to drop $800 on a 4070 Ti Super when there's a 7900 XTX for the same money *right now* in the For Sale section of the forum.

I find myself wondering whether I'd want the headroom occasionally, for when I'm willing to run the system off the inverter, while undervolting/power limiting the rest of the time...

*sigh* 1st world problems right? o_O
 
BTW, I don't have any delusions about 4K Ultra, I'm just using it as a loose metric (4K ultra at 60fps fwiw) to shoot for, since most of the GPU hierarchy charts out there use aggregate averages to differentiate models anyway. That said, I've been rocking a 1080 Ti since release (well, I'm on my second one), and frankly I'm still perfectly happy with it for 1080p/1440p at 100-120Hz, since it can still run most recent games at high to ultra settings with just a few things like shadows turned down, and bullshit like blur turned off.

I even had quite good luck with it back before my previous 4K TV (fuck you Samsung) started developing massive growing blobs of dead pixels; with the 1080 Ti I could usually manage near 60fps with quite high settings in most games after ditching all the AA overhead (admittedly both of the 1080 Tis I had were silicon-lottery wins and OC'd extremely well), though the aforementioned "GPU hierarchies" would make that sound impossible. Going by those, however, if they're remotely accurate, it seems like I can basically double my performance from the 1080 Ti at 4K with the 4070 Super or a higher-end GPU, with similar TDP and average consumption? If that's the case, I'll be happy.
 
Just to pile on to the great suggestions so far. IMO if building new, the Ryzen 7800X3D with a 4070 is probably the most power-efficient and performant combo going per watt at this time. I have a 7900 XT, which is thirsty if you let it run, but frame limiting keeps it in the 220-250W range easily, so anything less power-hungry should be doable with the suggestions on frame capping and tuning made above.
I second that. It sips power compared to other CPUs with similar performance. In addition, once you start capping games, reducing settings, using upscalers, and playing older games where the GPU is more than enough, it has given me a bigger boost in smoothness than many GPU upgrades I've had, even with the old RTX 2080 Ti (going from a 5600X to a 7800X3D with the same 2080 Ti).

VR games (high resolution, and VR games are built for and need high framerates), sim games (sim racing, sim building, space sims etc.), strategy games (even old RTS games like StarCraft), online multiplayer games (from FPS shooters like Tarkov to World of Warcraft) and some genres of RPG (like Baldur's Gate 3) benefit greatly from the CPU power and 3D cache of the 7800X3D. When you are GPU bound it doesn't matter, but the moment you start moving out of GPU-bound territory, you start seeing the return on your investment.
 
Funnily enough I had just about convinced myself to spend the money on the 4070 Ti Super after looking at the average wattage consumption vs FPS in the current Tom's GPU recommendation article...
I would still consider the 4070 Super. Your power supply is only 320W, and GPUs in classes above that have spikes that can trip your PSU. Even though the 7900 XTX actually gives you more frames per watt, that GPU also has 20ms spikes up to 391W. The 4070 Super has 239W, and on this chart the 4070 Ti has 332W. I didn't look at the 4070 Ti Super. Aftermarket models might have more or less; this is based on the Founders Edition:
https://www.techpowerup.com/review/nvidia-geforce-rtx-4070-super-founders-edition/41.html

It would suck to have an unstable system because transient spikes and GPU overhead weren't taken into account, so even though you asked in the OP not to consider the setup, I find it at least worth keeping in mind.

Many RTX 3090 and Radeon 6900 owners have had unstable systems, and it turned out that transient spikes and the PSU were the issue.
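
If you want to put numbers on it, the check is just comparing each card's measured 20ms spike against your supply's rating. Rough sketch below using the TechPowerUp Founders Edition figures I mentioned; the 10% derating is my own assumption, since you said you don't trust these PSUs at their full rating:

# Compare measured 20 ms transient spikes (TechPowerUp FE numbers above)
# against a 320 W supply, with an assumed 10% derating for continuous trust.
PSU_RATED_W = 320
DERATING = 0.10                      # assumption: don't trust the last 10%
usable_w = PSU_RATED_W * (1 - DERATING)

spikes_20ms_w = {
    "RTX 4070 Super": 239,
    "RTX 4070 Ti":    332,
    "RX 7900 XTX":    391,
}
for card, spike in spikes_20ms_w.items():
    verdict = "fits" if spike <= usable_w else "exceeds"
    print(f"{card}: {spike} W spike {verdict} {usable_w:.0f} W usable")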
 
..but considering the new TV, and with the recent new models released, and looking at raw numbers, the 4070 Super looks unbeatable in terms of TDP (220W) for FPS at 4K...
I think you are right. Any of the 4070s will be good.
I will say that, I give absolutely zero shits about any of the nvidia vs radeon features atm, more power to you if you do, but I've lived without RT and DLSS just fine for many years...
If you have a target framerate you want to reach, DLSS and Frame Gen can be hugely helpful. Let's say your new 4K TV maxes out at 120Hz. In the Nvidia control panel you can set a max FPS (it's best to set it about 3 fps below the actual max), and the card can potentially produce fewer frames, which should result in power savings. This will depend on the game being played and the base resolution. DLSS can render a game at 1080p or 1440p then upscale, and it still looks very good and uses less power.

It's an interesting question that I'm not sure I've seen anyone test for before. It'll be interesting to see what your power draw is on the GPU once you are all set up. You can use GPU-Z to see the power usage of your card in watts: Sensors tab, set the watts sensors to record the max observed value, then go play something.
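
If you'd rather have a log file than watch GPU-Z's sensor tab, something like this should also work (untested on your setup, so treat it as a sketch; it just polls nvidia-smi once a second until you hit Ctrl+C):

# Poll GPU board power once a second via nvidia-smi and append to a CSV.
# Assumes the standard nvidia-smi query interface is on the PATH.
import subprocess, time, csv

with open("gpu_power_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "power_w"])
    while True:   # Ctrl+C to stop
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True)
        writer.writerow([time.time(), out.stdout.strip()])
        f.flush()
        time.sleep(1)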
 
Of course I have a different opinion to everyone :)

  • A good 550-650W PSU won't be bettered by a PicoPSU at 250-300W, no matter how you slice it. PicoPSUs are great for up to 100W or where you need a small-wattage machine in a small space, but their regulation is poor, and once you creep up in wattage you're not going to get the efficiency gains. Something like the Seasonic Prime TX-650 Titanium or a Corsair HX650 would be my recommendation; the latter doesn't switch its fan on at low wattages and runs passive.
  • I'd lean Intel, not AMD, because they idle at lower power: the 14600K or 14700K, and I'd limit the peak wattage. The machine will likely be idling more than it will be running flat out.
  • Ironically, RAM (if run really fast/high wattage) can chew 30W, so be aware of this.
  • I'd stick with the 4070 Super or Ti, but I'd look for a "non overclocked" model.
 
I would still consider the 4070 Super. Your power supply is only 320W, and GPUs in classes above that have spikes that can trip your PSU...

No, the spike issue is a very valid concern that I hadn't (but should have) considered; I just hadn't seen any mention of it or numbers out there. Thank you for mentioning it!
 
A good 550-650W PSU won't be bettered by a PicoPSU at 250-300W, no matter how you slice it...
I have a good traditional PSU, 1000W platinum rated, *but* the issue is that I'd have to run it off the inverter, and I'm determined not to. The traditional switch-mode PSUs we use have maximum efficiency at a very specific wattage that isn't their peak wattage, nor is efficiency consistent across the potential wattage range; add the minimum overhead of the inverter to that, and I'd be suffering significant power losses versus direct DC-to-DC conversion. I will be using 2x 320W-rated "pico" PSUs that operate from up to 60V DC, which works perfectly for my primary battery bank. These PSUs do work and will handle 320W; the efficiency is quite high because it's doing simple DC-to-DC conversion. I don't trust them to run maxed out, but they have been tested to that. Even if they don't work out, I'll build a DC-to-DC supply from scratch; I'm fairly competent with electronics design and have built all of my setup here from scratch already. I am determined to have all of my daily-driver equipment, appliances, etc., running directly off DC, and have built my house/life around that already. This is basically the last box to tick. Other than AC power tools and my washing machine, literally everything else runs directly off DC and/or propane (fridge/freezer). Even if it wasn't actually a net gain in power efficiency, I'd still be determined to do it this way.
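
For anyone curious why I'm this stubborn about skipping the inverter, the rough comparison I'm working from looks like this; all three efficiency figures are ballpark assumptions from datasheets, not measurements of my own gear, and this ignores the inverter's standby overhead entirely:

# Rough end-to-end efficiency comparison, battery -> GPU load.
# All three efficiency figures are ballpark assumptions, not measurements,
# and the inverter's idle/standby overhead is ignored here.
INVERTER_EFF = 0.90      # 48 VDC -> 120 VAC
ATX_PSU_EFF = 0.92       # 120 VAC -> DC rails (good 80 Plus unit at partial load)
DCDC_EFF = 0.94          # 48 VDC -> 12 VDC direct conversion

load_w = 220             # GPU at full tilt
via_inverter_w = load_w / (INVERTER_EFF * ATX_PSU_EFF)
via_dcdc_w = load_w / DCDC_EFF

print(f"Inverter + ATX path: ~{via_inverter_w:.0f} W drawn from the bank")
print(f"Direct DC-DC path:   ~{via_dcdc_w:.0f} W drawn from the bank")
print(f"Difference:          ~{via_inverter_w - via_dcdc_w:.0f} W saved")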

This may seem strange in the "PC" realm, but keep in mind that the overwhelming majority of telco/communications equipment, and a huge swath of industrial equipment, is already directly DC powered; it's very common, and the way we do it currently in the consumer space is pretty convoluted. Commercial equipment (servers etc.) has long since adopted a 12VDC PSU standard, and personally I'm beyond ready for Intel's ATX12VO standard to become mainstream. When it does (and I need to look into it again; last time I looked, there weren't any consumer/enthusiast-style MBs rocking it), my situation will become very easy. It's nothing for me to provide CV +12VDC to any equipment; it's what all my networking gear runs off of already (I just rip out the AC/DC PSU, solder on an XT60H connector, and plug it into the 12VDC bank I have set up specifically for gear that runs natively off 12V, which is most electronics). Even if I don't use the aforementioned 320W pico PSU, I will power the GPU this way.
 
I have no doubt as to your electronics prowess. The lack of information I had was not knowing you had a DC line to the batteries, so without that, naturally my opinion would be to go with the simplest solution for the average consumer.

That said, it doesn’t invalidate my other points re intel, ram or 4070.

I would also be thinking very carefully about which motherboard you decide on. Fewer power stages, less ancillary gear. Ironically I recommend looking at matx and itx boards for that reason
 
Are you sure you don't want a...laptop? They run off of DC, are generally designed to have super low idle power, are built out of wide GPUs clocked low and binned for good low-power performance, etc. If you were building a new rig you could get a 4080-based laptop for under $2K, which would slide the *whole machine* into the thermal envelope of a 4070S with guaranteed sub-25W idle.

I think DLSS is more important than you give it credit for, especially as adoption continues to go up. Going forward there's no way a 4070S will be able to handle 4K natively; it's barely scraping by at 4K medium/high on launch day, which means in a couple of years it's going to be a tweaking shitshow to get games to run smoothly. But DLSS lets you render internally at 75% resolution then upscale, which is a very legitimate technique: the upscaled image isn't blindly faked from the lower-res image, it's also conditioned on normals, hits, motion vectors, etc., which are not visible in the final image but provide extra information to improve the upscaler's performance. Plus, if I understand correctly, DLSS trains the upscaler on individual games to improve performance even more.

It's all sort of moot anyway because Nvidia still has better efficiency than AMD, so within your original design constraints DLSS is a given.

You'll also get better performance by getting a wider GPU and power limiting it. The consumer flagships have really high power operating points at stock to try to one-up each other in benchmarks, but I bet a 4080S limited to 220W would give you a good time. You would probably have to underclock the VRAM as well; at 220W there's going to be a lot of power overhead from GDDR6X, which is not exactly a shining beacon of efficiency.
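
Setting the limit itself is trivial if you don't want Afterburner running all the time; nvidia-smi exposes the board power limit directly (needs admin rights, and the allowed range depends on the card's vBIOS, so the 220 below is just an example target):

# Clamp the board power limit to 220 W via nvidia-smi (run as admin/root).
# The card's vBIOS sets the allowed min/max; 220 is just the example target.
import subprocess

subprocess.run(["nvidia-smi", "-pl", "220"], check=True)
# Verify what actually got applied:
subprocess.run(["nvidia-smi", "--query-gpu=power.limit",
                "--format=csv,noheader"], check=True)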
 
Are you sure you don't want a...laptop?
This is the way to go if you are looking for max efficiency and also easy DC power. The laptop will want 20VDC input, so that is the only DC-DC supply you need; it handles the rest internally. They also have power targets right in your range: the Alienware laptops, which are my personal choice, max out at 360-watt PSUs, and they have models with more sedate components that are 330-watt, 240-watt (what I use), and even 130-watt for their smallest. They also often have tools to tune power draw vs performance. And of course they have a screen built in, usually tuned for low power consumption, that you can use.

Now, laptop or desktop, people are right that if 4K is something you want then DLSS is a must. Heck, I find plenty of games where I need DLSS for 4K on a 4090 because they hit so hard. On anything less, you'll need it. At this time, DLSS is better than FSR for quality. Hardware Unboxed did a great comparison over 26 games, and the TLDW of it is that in no case did FSR look better: at best it was a tie, usually DLSS looked better, often quite a bit better, and the results went more in DLSS's favor the lower the resolution/performance mode was. Games are just demanding more and more GPU power if you want the shiny visuals, and the only way to realistically achieve that at high resolutions is to use upscaling. I know a lot of people like to hate on it as some kind of "hack", but everything we do in graphics is a "hack"; we don't actually solve the rendering equation properly and render everything 100% correctly, because it is impossibly slow to do so.

Plus, something you'll discover is that in modern games you usually want temporal anti-aliasing. They are designed with it in mind, and it is the only way (other than SSAA, which is way too intensive) to eliminate artifacts from all shaders. Guess what DLSS (and FSR) is? TAA, just done from a lower input resolution.
 
This is the way to go if you are looking for max efficiency and also easy DC power...
I have a laptop (I have 5 or 6), and it's what I use for non-gaming junk like replying here atm, but afaik, with the exception of the few "laptops" that put full-power GPUs in them, they don't offer the gaming performance I want. I have looked at SFFs with eGPUs, but that's a whole can of worms: many of them use DC bricks but have logic in the PSUs for sensing different wattage/voltage capacities like the Dell laptops, and are actually quite a pain in the ass to power from anything but their proprietary bricks. Plus there's the whole issue of the eGPU caveats, which still wouldn't change what I'm asking even if they weren't an issue.

I'm not building a whole new machine here fwiw; I have a relatively capable desktop that I already game on with a 1080 Ti, and I'm just looking to upgrade the GPU. I'm not looking for the absolute minimum power consumption possible either; I'm looking for the minimum setup that can do 4K@60-120 with high/ultra settings, or 1440p ultrawide@120ish with the same, for the minimum amount of power possible. Don't get me wrong, I would *love* to find a fairly high-spec, low-TDP Ryzen SFF that would run off some nominal DC voltage with no BS and had a full-size PCIe slot with full bandwidth, since I don't really need/want much other BS, but afaik that doesn't exist.

Personally, I'm rarely a fan of gaming laptops either; none of the ones that are comparably performant to a mid-to-high-tier desktop are anything I'd consider "portable", and there always seem to be major compromises with power and cooling. To each their own of course. Portability isn't a concern here for me, and neither is dealing with the PSU situation really; I've got that handled for the most part.

FWIW, in case anybody stumbles across this and starts thinking about off-grid laptops, let me *highly* recommend you go with a USB-C PD powered one, as most of the other major manufacturers are using oddball proprietary PSUs with varied voltage and amperage that use some silicon to communicate with the laptop's power circuit, and they are not easy to power from a DC grid anymore. Older laptops almost always used 19VDC power and are indeed easy to power, but no longer; most of the newer ones will "function" if you supply them with the same nominal voltage as the power brick, but will significantly throttle performance without that logic from the power brick telling them it's OK to draw higher amperage. There is, however, a very good, cheap USB-C PD module available for less than $10 from the usual suspects that can handle fairly high wattage if powered with 24VDC, and medium wattage from lower voltage. I'm typing this from a Ryzen ThinkPad T14 that is continuously powered from one of my battery banks using that same module (and they've been durable, other than when I've fried them doing something dumb; unfortunately I haven't run across one that'll work with a 48V nominal pack, which needs to be capable of up to 60VDC for mine, so don't do what I did and forget).
 
I'm not building a whole new machine here fwiw; I have a relatively capable desktop that I already game on with a 1080 Ti, and I'm just looking to upgrade the GPU...
In that case, the 4070 is the way to go. Good performance per watt, and if you give DLSS a chance I really don't think you'll regret it at 4K. One of the things you tend to find with all TAA is that it does better at higher resolutions and framerates; that goes double for the upsampling ones. DLSS doesn't look that great at 1080p, because even in Quality mode it is only getting 1280x720 pixels to work with. However, it looks really good in Quality mode at 4K, because it is playing with 1440p as input, and sometimes it even looks better than native in some ways.
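
If you want to see what the card would actually be rendering in each mode, the input resolutions fall straight out of the commonly cited scale factors (individual games can override these, so take the output as approximate):

# Internal render resolution per DLSS mode, using the commonly cited scale
# factors (games can override these, so treat the output as approximate).
DLSS_SCALE = {"Quality": 2/3, "Balanced": 0.58,
              "Performance": 0.50, "Ultra Performance": 1/3}

def render_res(width, height, mode):
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

for mode in DLSS_SCALE:
    print(mode, render_res(3840, 2160, mode))   # 4K output
print("Quality at 1080p:", render_res(1920, 1080, "Quality"))  # -> 1280x720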
 
I would also be thinking very carefully about which motherboard you decide on. Fewer power stages, less ancillary gear. Ironically I recommend looking at matx and itx boards for that reason...
Yeah, I'm definitely looking at mATX/ITX down the road. At the moment I've got a little money coming, so I'm trying to knock out the big expense, i.e. the GPU upgrade, first and use it with the current older but still reasonably capable MB/CPU. I expect to be CPU-bound with the newer card for a bit, however.

MSI (fuck, I hate them though) apparently has an mATX ATX12VO socket AM5 motherboard "pending" release, but I have my doubts I'll be able to find one. Apparently ASRock released a Z490 ATX12VO board a couple of years ago, but I see little evidence that anybody but reviewers ever got their hands on one.
 
Thinking more about it: let's assume I don't have the 320W peak power limitation, still want the best performance per watt, and still have a roughly $800 budget, but am fine with buying second hand. Would you guys still choose the 4070 Super?

I've decided there's no reason for me to even bother with the second pico PSU for the GPU; since the aux power for the GPU is a single voltage, I'll just build a CV power circuit to handle it separately, so draw or spikes won't be a concern.
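
The sizing math for that circuit is simple enough; worst case I'll size for the whole board power on the aux connectors, even though the slot can carry up to ~75W of it (just Ohm's-law estimates, not measurements):

# Worst-case current sizing for a dedicated 12 V supply feeding the GPU's
# aux (PCIe power) connectors. Assumes the full board power on the aux rail,
# even though up to ~75 W can come through the slot itself.
BOARD_POWER_W = 220      # 4070 Super class card
AUX_VOLTAGE_V = 12.0

worst_case_a = BOARD_POWER_W / AUX_VOLTAGE_V
with_slot_a = (BOARD_POWER_W - 75) / AUX_VOLTAGE_V

print(f"All through aux: ~{worst_case_a:.1f} A at 12 V")
print(f"If slot carries 75 W: ~{with_slot_a:.1f} A at 12 V")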
 
In that case, the 4070 is the way to go. Good performance per watt, and if you give DLSS a chance I really don't think you'll regret it at 4K...
I'm sure you're right about DLSS, and I probably should have been a little more specific, it's mostly RT that I don't give a shit about. I know it's going to continue to become more prevalent, and I know it gives some people a grand "tingle in their dingle" :dead: but while I find it technically interesting, I also find it generally quite distracting, and it rarely seems remotely worth the overhead/compromise even now, in what, the third generation of it? I'm sure part of it is still just developers learning to utilize it intelligently or tastefully, and the teething phase will eventually be over, but I can't help being reminded of the early years of mainstream autotune use in "music". These days the shit is utilized nearly universally and, for the most part, imperceptibly, unless implemented poorly. Not sure we're really any better off for it though? Anyway, yes, I expect to use DLSS/FSR, but I am one of those guys that leans "purist", and will likely favor native resolution with some tuning where possible. I'm one of those people that will fixate on "perceived" differences with upscaling if I'm not careful. I'm already having to be conscious of and determined to not look for problems with the new OLED tv, as I'm sure I would find them.
 
I'm sure you're right about DLSS, and I probably should have been a little more specific, it's mostly RT that I don't give a shit about,
RT, yeah, it kind of depends on what you like, and also the game; they use it in vastly different ways. That aside, it isn't a huge concern at that price level, as you aren't talking about enough power to run the really heavy-hitting things like full path tracing.

I'm one of those people that will fixate on "perceived" differences with upscaling if I'm not careful. I'm already having to be conscious of and determined to not look for problems with the new OLED tv, as I'm sure I would find them.
Well that would be something to talk to your therapist about :D. I'll refrain from telling you the issues with OLED :D.

For things like DLSS vs native it really is more of a "pick your poison" kind of thing, because there's no perfect solution. The only "perfect" solution is SSAA, and that is a total non-starter as it literally takes multiples of the power to run; 8x SSAA is 8x the rez and needs 8x the power of native. Totally unusable for modern titles in real time.

Outside of that, there's no "perfect". You can choose not to use AA (if the title allows it), but then you get a lot of jaggies, shimmer, shader pop-in and so on. All the math we throw at graphics these days means there's aliasing of all kinds unless we do something about it. You can use a post-processing AA like FXAA if the game offers it, but that can't actually help with a lot of the aliasing that happens, like shader aliasing, so it only helps with jaggies on polygons, and then only a little. Or you can go with TAA, which is basically supersampling but in the time domain. It's what works. Thing is, there are always issues with things like blur, disocclusion, smearing, and so on, since you are literally integrating the image over time. That happens whether it is a native-res TAA or DLSS/FSR.

As such, the real answer is to go with the TAA that gets the best results, and that is often DLSS. Often it's a tradeoff with native TAA, where it causes some issues but fixes others. Like in Jedi Survivor, DLSS was notably better at resolving fine details, like distant antennas, that would otherwise be lost/artifacted compared to the native TAA. Again, those Aussies over at Hardware Unboxed did a pretty good test and it is back and forth between DLSS and native for image quality: at 4K Quality, 4 games were a tie, 10 games looked better with DLSS, and 10 looked better with native TAA. When you then add in that DLSS is going to run faster, it is something you really want.

Particularly for 4K with a lower-spec card. Yeah, a 4090 can usually just unga-bunga its way to 60+ fps at 4K (though not always; Alan Wake 2 path traced is too intense for it to get to 60 without DLSS), but with a 4070 Ti you are just going to find too many games that won't handle it without too much visual compromise. DLSS makes that way easier, as you are essentially now running at 1440p.

I wish FSR was as good, but it just isn't right now. It is one where you likely legit would notice more issues and find yourself with the annoying tradeoff of either dealing with it, going native and taking the FPS hit or turning details way down.
 
Personally I would be looking at the 4080/4080s, I don’t think anything lower is a 4k card
 
If your priority is the most efficient GPU for power/performance, then there's only one choice: a GPU from the Nvidia GeForce 40 series. Just pick whatever makes the most sense; I would start at a 4070 Ti minimum and get whatever you can afford. You can also undervolt the GPU, which sounds like something you'd probably want to do, and it doesn't sacrifice much performance; I'd look into that.
 
Haha, still 4070 super or ti super looks better for me, but it's an interesting release, makes me wonder if we're not going to see another iteration of the XTX as well, either way, I think I'll wait a few weeks to see what shakes up with pricing.
 
Haha, still 4070 super or ti super looks better for me, but it's an interesting release, makes me wonder if we're not going to see another iteration of the XTX as well, either way, I think I'll wait a few weeks to see what shakes up with pricing.
I think the card to buy at the moment is the 4080, as they will be trying to clear inventory for the 4080 supers
 
I think the card to buy at the moment is the 4080, as they will be trying to clear inventory for the 4080 supers
Until I see them for $800 or less, I can't really justify it, mostly on principle; I still think that's the price they should have been to begin with. The reason I've waited this long to upgrade (and the reason I bought a second 1080 Ti, even after the 30-series release) is the pricing insanity. I don't blame Nvidia for being as cutthroat as they can be, cashing in as hard as possible when there's one thing after another enabling them to do so, but I don't have to like it or participate. I'm a capitalist purist ideologically, so I'm not crying the blues over it; I just refused to shell out my hard-earned cash, especially since I consider the strategy short-sighted and negatively impactful for their segment of the market in general. Obviously many will feel differently, to each their own. I'd take any excuse to give AMD my money again (I haven't had an AMD card since the Radeon HD 7870), though to be fair, I think AMD really fucked up with pricing this round as well, equally short-sighted, and significantly overpromised on power efficiency. So regardless, I'm excited to see competition heat up where it really matters, i.e. pricing.
 
Different strokes for different folks. AMD cards idle at higher power and have higher power consumption generally. Cheaper overall, but the cost/benefit isn't there imo.

The 4080, considering existing monitor tech, is placed well price-wise for the market. It's a 4K card which can consistently deliver around 120FPS at 4K for the most part. This places it in the higher echelons of performance, but a key thing here is what's needed to support it:

  • You wouldn't generally put it in a system with an i5/7700X or below, so you're in mid-to-high-end processor territory here
  • You wouldn't generally put it in a system without a 4K monitor, and I'd argue a 4K/100Hz monitor (or above)
If you can afford those two, then it sits nicely where it should be at the price point.
 
So 220W, able to roughly handle 4K ultra at 60fps (at least on paper), for approx $600? Seems like a no-brainer...

If it's performance per watt that you're after and you need to do about 120 FPS at 4K for around $600... then yes the 4070 Super is hard to beat.

The only other alternative is probably the 7900 GRE which will soon be out for sale for $549.

It still uses more power than the 4070 Super so if you're keen to save on electricity then Nvidia is still probably the best option.

My only caveat would be that the 4070 Super still only comes with 12GB of VRAM. So it's probably not the best choice if you want to 'future proof'. A lot of evidence shows that 16GB is the optimum amount of VRAM for 4k resolution games.

Article source: https://www.tomshardware.com/pc-components/gpus/amd-radeon-rx-7900-gre-review/7
 
I think the card to buy at the moment is the 4080, as they will be trying to clear inventory for the 4080 supers
Dunno where you are but in this section of planet earth - it's just scalpers and more scalpers - private scalpers, corporate scalpers....scalpers everywhere. We're talking about the 4080, right? If it's the 4080 Super - same thing.... whoa.
 
I have a laptop (I have 5 or 6), and it's what I use for non-gaming junk like replying here atm...
A laptop 4090, in terms of overall performance, is like a 16GB 4070 Super, and it only uses ~170 watts. Pair it with a mobile AMD CPU and that would be a pretty sweet setup, especially if you got their mobile X3D CPU. However, noise can definitely be an issue for gaming laptops.
Thinking more about it: let's assume I don't have the 320W peak power limitation, still want the best performance per watt, and still have a roughly $800 budget, but am fine with buying second hand. Would you guys still choose the 4070 Super?
The 4070 and 4070 Super are the most efficient. I think the 4070 Super is technically slightly better performance per watt, but it's close enough that it mostly comes down to what you want to pay. A good deal for a new 4070 is around $520 (it can be less if Newegg has those cashback promos through zoom pay, etc.) and a good deal for a new 4070 Super is around $590.

I could eat it on this, but I don't think 12GB is much of a problem on a 4070, because situations where 12GB becomes a problem are probably situations where the 4070 wouldn't be able to do 60fps anyway. So you'd have to use DLSS, which lowers VRAM usage. However, frame generation increases VRAM usage, so you do have to be mindful, and that's where a 12GB card could choke a bit: when it's near the VRAM limit and then you try to sprinkle framegen on top of it.
 