Anyone seriously considering the 4070 TI

Also add VR to this. AMD has worse frame timing so VR will feel worse with an AMD GPU than Nvidia at identical FPS.

But otherwise the 7900XTX easily. It matches/exceeds the 4080 in raster performance and is $200 cheaper.
Yup. It’s getting better but last I checked definitely Nvidia for VR
 
I don't even consider it frivolously.

The last card I considered buying was the 3080, but it was never available at MSRP. Then the 3090 Ti, when I could've gotten an open-box one for $1,000 about 3 months ago. I didn't go for that one either, because I didn't like the brand.
 
You can do the math yourself by looking at the CUDA cores and clock speeds.

The example that puts Ada Lovelace in the best light is the 4080 at 4K, since it is not CPU-limited and won't have the scaling issues of the 4090; then compare the 4080 to the 3080.

The 4080 has 9728 CUDA cores and in TechPowerUp's real-world testing averages 2737 MHz. https://www.techpowerup.com/review/nvidia-geforce-rtx-4080-founders-edition/40.html

The 3080 has 8704 CUDA cores and in TechPowerUp's real-world testing averages 1931 MHz. https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/32.html

9728 x 2737 = 26,625,536

8704 x 1931 = 16,807,424

26,625,536 / 16,807,424 ≈ 1.58, i.e. about 58% more theoretical shader throughput

That means that with unrealistically perfect scaling and no other limitations, the 4080 could be 58% faster than the 3080 if they had the same IPC. The overall difference was 49% at 4K, where the 3080 was also likely running into VRAM constraints in a couple of games. https://www.techpowerup.com/review/nvidia-geforce-rtx-4080-founders-edition/32.html
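
As a quick sanity check, here's a minimal sketch of that back-of-the-envelope math in Python (the core counts and average clocks are the TechPowerUp figures quoted above, and the 49% measured uplift is from their 4K chart; this compares theoretical shader throughput, not IPC):
Code:
# Rough theoretical shader-throughput comparison (NOT IPC) between the
# RTX 4080 and RTX 3080: CUDA core count x average observed clock.
cards = {
    "RTX 4080": {"cuda_cores": 9728, "avg_clock_mhz": 2737},
    "RTX 3080": {"cuda_cores": 8704, "avg_clock_mhz": 1931},
}

def throughput(card):
    # cores * MHz; the unit is arbitrary, only the ratio matters
    return card["cuda_cores"] * card["avg_clock_mhz"]

ratio = throughput(cards["RTX 4080"]) / throughput(cards["RTX 3080"])
print(f"Theoretical uplift with perfect scaling: {ratio - 1:.0%}")  # ~58%

measured_uplift = 0.49  # TechPowerUp 4K average, 4080 vs 3080
print(f"Measured 4K uplift: {measured_uplift:.0%}")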

This is NOT scientific, of course, but it does give you an idea that IPC is likely the same at best. If you used the 4090 it would look like a huge regression in IPC, but big GPUs don't scale well and many games run into other limitations, so the huge core count doesn't come close to being fully utilized.

So the POINT earlier was that if the 3070 and 4070 end up with the same core count, it is clearly going to come down to the 4070's large clock speed advantage to make the difference.
Whatever it is you're comparing here it is decidedly not IPC. IPC means instructions-per-cycle yet not a single term in your maths here involves instruction counts.
Performance at 4K from a benchmark is not a useful metric for computing IPC at all.

You'd really need to run an optimized CUDA compute kernel which reasonably maxes out shader core utilization (and avoids scheduling/pipeline stalls) for a set amount of time at a given frequency and count instructions using a profiler tool. Then you can compute:
Code:
IPC = num_insns / (freq_in_hz * runtime_in_seconds)
Do this on both architectures and you might get a reasonable IPC comparison for this specific and spectacularly contrived testcase - because that's the only thing IPC is useful for. How useful it is for anyone who is not specifically working on optimizing a particular workload for a particular architecture is even more questionable.
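
To make the formula concrete, here's a minimal sketch with entirely made-up numbers (the instruction count is the kind of figure you would pull from a profiler; nothing below is measured data, and the SM count is just an assumed example):
Code:
# Hypothetical worked example of the IPC formula above -- all inputs invented.
num_insns = 3.2e12          # executed instructions, as reported by a profiler
freq_in_hz = 2.5e9          # sustained shader clock during the kernel (2.5 GHz)
runtime_in_seconds = 0.85   # kernel wall-clock time

cycles = freq_in_hz * runtime_in_seconds
ipc = num_insns / cycles    # aggregate across the whole GPU as counted here
print(f"Whole-GPU IPC for this kernel: {ipc:.1f}")

num_sms = 76                # assumed SM count for the chip under test
print(f"Per-SM IPC: {ipc / num_sms:.2f}")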
 
Different amount of VRAM, different bandwidth; it is NOT apples to apples. FULL STOP

Edit: I will also be ignoring this bullshit comparison from here on out as it is already so far off topic that this thread should already get scrubbed.
I get it, you don't understand a bit. Even with all the advantages (except for a slight bandwidth disadvantage), the 4080's performance over the 3080 is almost exactly in line with the clock speed increase. This is only at 4K, where the 3080 is definitely at a disadvantage due to likely hitting VRAM limits on the card, which is yet another big advantage for the 4080. This is every indication that the IPC of the 4000 series is no better than the 3000 series. If the IPC was noticeably better, the performance of the 4080 should have been quite a bit better. Higher IPC along with much higher clock speed act as a multiplier for performance, and we're not seeing that.
Whatever it is you're comparing here it is decidedly not IPC. IPC means instructions-per-cycle yet not a single term in your maths here involves instruction counts.
Performance at 4K from a benchmark is not a useful metric for computing IPC at all.

You'd really need to run an optimized CUDA compute kernel which reasonably maxes out shader core utilization (and avoids scheduling/pipeline stalls) for a set amount of time at a given frequency and count instructions using a profiler tool. Then you can compute:
Code:
IPC = num_insns / (freq_in_hz * runtime_in_seconds)
Do this on both architectures and you might get a reasonable IPC comparison for this specific and spectacularly contrived testcase - because that's the only thing IPC is useful for. How useful it is for anyone who is not specifically working on optimizing a particular workload for a particular architecture is even more questionable.
It's a rough test to determine whether there is an IPC increase, not to determine exactly what that increase is. Every indicator shows the IPC is at best equal between the architectures. When roughly taking into account the differences such as clock speed and CUDA cores, the performance in gaming is about equal between the two cards, even though the 4080 has even more advantages. Again, a strong indicator that IPC hasn't increased.

There's no need for your effectively synthetic test to determine IPC unless you're determining IPC for that specific workload. Your test is only useful for that specific use case, which in this particular discussion is moot because we're talking about gaming, and the figures used for the gaming comparison are an average across many different games. It's one of the few times I would even look at an averaged figure like that, because it tends to have little use otherwise.
 
I get it, you don't understand a bit. Even with all the advantages (except for a slight bandwidth disadvantage), the 4080's performance over the 3080 is almost exactly in line with the clock speed increase. This is only at 4K, where the 3080 is definitely at a disadvantage due to likely hitting VRAM limits on the card, which is yet another big advantage for the 4080. This is every indication that the IPC of the 4000 series is no better than the 3000 series. If the IPC was noticeably better, the performance of the 4080 should have been quite a bit better. Higher IPC along with much higher clock speed act as a multiplier for performance, and we're not seeing that.

It's a rough test to determine whether there is an IPC increase, not to determine exactly what that increase is. Every indicator shows the IPC is at best equal between the architectures. When roughly taking into account the differences such as clock speed and CUDA cores, the performance in gaming is about equal between the two cards, even though the 4080 has even more advantages. Again, a strong indicator that IPC hasn't increased.

There's no need for your effectively synthetic test to determine IPC unless you're determining IPC for that specific workload. Your test is only useful for that specific use case, which in this particular discussion is moot because we're talking about gaming, and the figures used for the gaming comparison are an average across many different games. It's one of the few times I would even look at an averaged figure like that, because it tends to have little use otherwise.
That's my point - IPC is a completely useless metric when talking about gaming or even just general generation to generation improvements unless it's in the context of specific workloads' IPC/CPI.
 
Also add VR to this. AMD has worse frame timing so VR will feel worse with an AMD GPU than Nvidia at identical FPS.

But otherwise the 7900XTX easily. It matches/exceeds the 4080 in raster performance and is $200 cheaper.
Higher frame times = lower FPS, however - they wouldn't be identical FPS with worse/higher frame times. You may be thinking of frame pacing - the consistency of frame times and delivery.
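
For reference, a minimal sketch of that relationship (the frame-time samples below are invented purely to illustrate the pacing point):
Code:
# Average frame time maps directly to average FPS...
frame_times_ms = [11.1, 11.0, 11.2, 22.3, 11.1, 11.0]  # invented samples; one hitch

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
print(f"Average FPS: {1000 / avg_ms:.1f}")

# ...but pacing is about the consistency of frame times, not the average.
worst_ms = max(frame_times_ms)
print(f"Worst frame: {worst_ms:.1f} ms ({1000 / worst_ms:.1f} FPS equivalent)")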

With that said, having owned a 7900 XTX and traded it for a 4080, the difference is staggering - 7900 XTX only does about 82-84 FPS average, 57.75 FPS 0.3% low in OpenVR Benchmark, while the 4080 does 107 FPS average and 86.39 FPS 0.3% low - the 4080's lowest framerates exceeded the 7900 XTX's average. Ouch, imagine paying $1,000 just for 3080 Ti/3090 levels of performance.

Actual game performance bears this out; I used to hit motion smoothing reprojection hard in DCS and No Man's Sky, and the 4080 has practically eliminated that. The few times I noticed a frame time spike in DCS, it was actually my 12700K not keeping up, rather than the 4080's fault.

I don't feel any losses in pancake gaming, even if 3DMark does measure worse for the most part, so trading was worth it for me, but if VR and RT were not a concern at all, the 7900 XTX is a formidable card. Just make sure you get a non-reference AIB model so it doesn't overheat and throttle itself down to XT levels of performance under load when mounted horizontally, as 95% of PC setups have their expansion cards oriented.
 
Also add VR to this. AMD has worse frame timing so VR will feel worse with an AMD GPU than Nvidia at identical FPS.

But otherwise the 7900XTX easily. It matches/exceeds the 4080 in raster performance and is $200 cheaper.
Is there a modern in-depth review of this? The 7000 series is obviously having issues at this time, and I've seen reports of many 6000 series owners who had upgraded to 7000 going back to their 6000 for vastly better performance.
 
Is there a modern in-depth review of this? The 7000 series is obviously having issues at this time, and I've seen reports of many 6000 series owners who had upgraded to 7000 going back to their 6000 for vastly better performance.
Babel Tech Reviews posts frametime graphs, as well as synth/dropped frame graphs for a Valve Index at 100% SteamVR resolution (2016x2240 for a 1440x1600 per eye HMD), 90 Hz.
https://babeltechreviews.com/hellhound-rx-7900-xtx-vs-rtx-4080-50-games-vr/5/ (7900 XTX)
https://babeltechreviews.com/the-1199-rtx-4080-vr-performance-review/2/ (4080, but has 6900 XT results)
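
As a side note on that render resolution, a minimal sketch of where it comes from (the ~1.4x per-axis factor is SteamVR's usual 100% target for the Index, meant to compensate for lens distortion; treat the factor as an assumption inferred from the numbers quoted above):
Code:
# SteamVR 100% render target vs. native panel resolution (Valve Index)
panel_w, panel_h = 1440, 1600   # per-eye panel resolution
scale = 1.4                     # per-axis factor at 100% SteamVR resolution (assumed)

render_w, render_h = round(panel_w * scale), round(panel_h * scale)
print(f"Render target per eye: {render_w}x{render_h}")  # 2016x2240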

It's immediately clear that the 6900 XT has tight frametime plotting, if overall a bit higher than NVIDIA's offerings, but the 7900 XTX had rather fat/wide lines on its frametime graphs like some kind of microstutter.

I didn't really perceive microstutter with the 7900 XTX, though, just high frame times/low framerates and lots of reprojection for failing to keep frame times low enough - not something I want to see in a $1,000 GPU, even if it is still a significant improvement over my old GTX 980.
 
I didn't really perceive microstutter with the 7900 XTX, though, just high frame times/low framerates and lots of reprojection for failing to keep frame times low enough - not something I want to see in a $1,000 GPU, even if it is still a significant improvement over my old GTX 980.
Thanks for the links. It looks like yet another example of AMD rushing these cards out without having time to get the drivers where they needed to be for launch this gen. Hopefully they get things tightened up soon in regards to VR as the 7900 series should be well ahead at 4K/VR resolutions compared to the older gen.
 
Thanks for the links. It looks like yet another example of AMD rushing these cards out without having time to get the drivers where they needed to be for launch this gen. Hopefully they get things tightened up soon in regards to VR as the 7900 series should be well ahead at 4K/VR resolutions compared to the older gen.
Is it possible this is added latency due to the MCM design?
 
Best Buy really isn't dropping any 4090s lately. One model here or there, limited to a regional drop. Newegg seems to be dropping a shitload lmao. Wondering what has changed there.
 
Best Buy really isn't dropping any 4090s lately. One model here or there, limited to a regional drop. Newegg seems to be dropping a shitload lmao. Wondering what has changed there.
Well, when Newegg gets them, the scalper bots from the third-party sellers snatch them up, so then you see them all, for much more than MSRP, when you uncheck the Sold/Shipped by Newegg option.
 
Well, when Newegg gets them, the scalper bots from the third-party sellers snatch them up, so then you see them all, for much more than MSRP, when you uncheck the Sold/Shipped by Newegg option.
NA Newegg has some models in stock for hours. Like right now they have the Gigabyte Gaming OC model. Scalpers are nitpicking certain models now, usually cheaper ones or the Asus Strix. You can grab one easily on Newegg on a weekly basis now if you wanted to.
 
NA Newegg has some models in stock for hours. Like right now they have the Gigabyte Gaming OC model. Scalpers are nitpicking certain models now, usually cheaper ones or the Asus Strix. You can grab one easily on Newegg on a weekly basis now if you wanted to.
100%. That Gigabyte Gaming OC model has been in stock almost non-stop for the last week. A few other models hang around for hours and hours. Like NKD said, not sure what's up with Best Buy though; they have not seemed to have any significant drops in a long time.
 
NA Newegg has some models in stock for hours. Like right now they have the Gigabyte Gaming OC model. Scalpers are nitpicking certain models now, usually cheaper ones or the Asus Strix. You can grab one easily on Newegg on a weekly basis now if you wanted to.
Oh, that's good to know. But I'm just going to wait for the 2024/2025 next gen at this point.
 
100%. That Gigabyte Gaming OC model has been in stock almost non-stop for the last week. A few other models hang around for hours and hours. Like NKD said, not sure what's up with Best Buy though; they have not seemed to have any significant drops in a long time.
Maybe BB bought a shitload of 4080s and 4070 Tis they wanna offload before trying to sell 4090s again lmao. Stock seems to be flowing through given how often Newegg drops them, and I think scalpers are holding on to them and backing off a bit.
 
NA Newegg has some models in stock for hours. Like right now they have the Gigabyte Gaming OC model. Scalpers are nitpicking certain models now, usually cheaper ones or the Asus Strix. You can grab one easily on Newegg on a weekly basis now if you wanted to.
4090 buyers are looking for MSRP while scalpers are looking for exclusivity. A 4090 with a basic cooler going for $1700 is in no man's land.
 
I don't think I'm guaranteed a long life, so I just enjoy what I can. I tell my wife, I save for the kids and family; when it comes to me enjoying my hobby, don't say I spend too much every few years lmao.
Oh yeah for sure, I agree. I am more coming from the standpoint that I am not dissatisfied with the performance of my current 3080 Ti, so by the time I am, I am sure we'll have newer and better stuff out there.

Hobby at this point is too expensive to be chasing upgrades every gen like I did from the mid-2000s till 2014.
 
Oh yeah for sure, I agree. I am more coming from the standpoint that I am not dissatisfied with the performance of my current 3080 Ti, so by the time I am, I am sure we'll have newer and better stuff out there.

Hobby at this point is too expensive to be chasing upgrades every gen like I did from the mid-2000s till 2014.
That’s the other nice part. I remember upgrading motherboards and CPU to get minor boosts. Like every year. DDR2 and slightly faster? Sold!

Now? A 5-year-old i7 is still enough to game on high with the right GPU. Folks are still running 1080 Tis and happy as clams. Longevity is much higher now than it used to be.
 
That’s the other nice part. I remember upgrading motherboards and CPU to get minor boosts. Like every year. DDR2 and slightly faster? Sold!

Now? A 5-year-old i7 is still enough to game on high with the right GPU. Folks are still running 1080 Tis and happy as clams. Longevity is much higher now than it used to be.

Makes sense if you are doing low res, but not if you have a 4K 165 Hz monitor. Then the new cards make sense.
 
Makes sense if you are doing low res, but not if you have a 4K 165 Hz monitor. Then the new cards make sense.
If you're running 4K 165 Hz you're on the bleeding edge and spent a lot on a monitor very recently. Hence you're likely to spend a lot on a card right now, or to have done so very recently too. But totally agreed.

I still haven't jumped because, outside of the absurdly expensive ones, there still aren't good "everything" 4K screens. Either they're workstation - or they're gaming - but not both. I'm waiting for one that hits both at around $1,500 to jump. I stay pretty bleeding edge, but not every generation.
 
If you're running 4K 165 Hz you're on the bleeding edge and spent a lot on a monitor very recently. Hence you're likely to spend a lot on a card right now, or to have done so very recently too. But totally agreed.

I still haven't jumped because, outside of the absurdly expensive ones, there still aren't good "everything" 4K screens. Either they're workstation - or they're gaming - but not both. I'm waiting for one that hits both at around $1,500 to jump. I stay pretty bleeding edge, but not every generation.
Makes sense.
 
"Anyone seriously considering the 4070Ti"

-most of the discussion is about the 4090, 4080, 7900, 4070, everything except the 4070Ti

That would be a resounding "No"!

I wonder how the perception of the 4070Ti is going to be in a couple/few years when its street price is much lower? Because unlike some controversial/disliked GPUs, there's really nothing inherently wrong with it other than the price. It's basically a 3080Ti that uses much less power, that's great! Too bad it also costs the same as a 3080Ti.
 
I wonder how the perception of the 4070Ti is going to be in a couple/few years when its street price is much lower? Because unlike some controversial/disliked GPUs, there's really nothing inherently wrong with it other than the price. It's basically a 3080Ti that uses much less power, that's great! Too bad it also costs the same as a 3080Ti.
Right. Literally zero generation on generation performance per dollar uplift.
 
Right. Literally zero generation on generation performance per dollar uplift.
It is 45% faster than the 3070 Ti and costs 33% more, so technically it does offer a small perf-per-dollar increase over the card it officially replaces. That said, the 3070 Ti was already a poor value over the plain 3070.
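
To put rough numbers on that (a quick sketch; the $599 and $799 MSRPs are assumed launch prices, and the 45% figure is the uplift quoted above):
Code:
# Perf-per-dollar, 4070 Ti vs 3070 Ti, using assumed launch MSRPs.
cards = {
    "RTX 3070 Ti": {"rel_perf": 1.00, "msrp_usd": 599},
    "RTX 4070 Ti": {"rel_perf": 1.45, "msrp_usd": 799},
}

def perf_per_dollar(card):
    return card["rel_perf"] / card["msrp_usd"]

gain = perf_per_dollar(cards["RTX 4070 Ti"]) / perf_per_dollar(cards["RTX 3070 Ti"]) - 1
print(f"Perf-per-dollar change: {gain:+.0%}")  # roughly +9%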
 
"Anyone seriously considering the 4070Ti"

-most of the discussion is about the 4090, 4080, 7900, 4070, everything except the 4070Ti

That would be a resounding "No"!

I wonder how the perception of the 4070Ti is going to be in a couple/few years when its street price is much lower? Because unlike some controversial/disliked GPUs, there's really nothing inherently wrong with it other than the price. It's basically a 3080Ti that uses much less power, that's great! Too bad it also costs the same as a 3080Ti.
The problem there is that for my $800, I'd rather have a used 3090 (Ti), because I don't care about DLSS 3 and would probably benefit more from literally twice the VRAM and bus width, and then the RX 7900 XTX seemed like so much more card for $200 more that I might as well take the chance. (Which I did, and soon regretted due to the vapor chamber flaw on top of the subpar VR performance, but it wound up saving me at least $100 on an RTX 4080 in a roundabout way.)

The 4070 Ti should've been a 4060 - that's the tier when we start expecting memory buses smaller than 256 bits wide - and priced to match, to boot ($350 and under). The 1060 6 GB and 3060 12 GB seemed like good cards for their day, bang-for-the-buck champs for people who either don't need all-out GPU performance (likely due to gaming at 1080p still) or would be horribly CPU-bottlenecked otherwise (like a 3060 in anything prior to Zen 3 or Alder Lake).

The 4080 at least has the convenient excuse of beating the 3090 Ti decisively, even with a bit less VRAM and bus width compromising memory throughput somewhat - it's an objective improvement over the past generation, if less so than the 4090. It's just, as you stated, overpriced.
 
Wrong. 3080ti was $1200. This card outperforms it and costs $800. Pretty big price drop!
While true, I feel the problem people are having is that the third card down from the top of the stack is still $800. Seeing a 70-level card for that kind of money is hard to accept for a lot of people. To me it feels more like a more efficient 3080 Ti, with 12 GB of VRAM, for a similar price. My opinion is with the ‘good card, bad price’ crowd on this one. The 3070 Ti AIB cards are in stock all day around here for $649, which is absurd for an 8 GB card in 2023. The 4070 Ti needs to get to at least $700 IMO, and that may happen if the 3070 Ti stock gets depleted.
 
Our last price was absolutely insane, so this seemingly less insane price is such a bargain! Totally not pulling wool over the eyes of my loyal consumers! You'd be foolish not to buy it!
This has been the lie ever since Ampere came out. Compare the price to the previous terrible value (the 2080 Ti) to make the prices seem like a great value. Completely disregard the linear increase in price/perf.
 
The 3080 Ti hasn't been $1200 for a very long time.

If that was true they would be flying off the shelves. It's disingenuous to compare prices during the COVID craziness. The ever-negative Tech Jesus has a good video on why the new cards are not a great value.
 
The problem there is that for my $800, I'd rather have a used 3090 (Ti), because I don't care about DLSS 3 and would probably benefit more from literally twice the VRAM and bus width, and then the RX 7900 XTX seemed like so much more card for $200 more that I might as well take the chance. (Which I did, and soon regretted due to the vapor chamber flaw on top of the subpar VR performance, but it wound up saving me at least $100 on an RTX 4080 in a roundabout way.)

The 4070 Ti should've been a 4060 - that's the tier when we start expecting memory buses smaller than 256 bits wide - and priced to match, to boot ($350 and under). The 1060 6 GB and 3060 12 GB seemed like good cards for their day, bang-for-the-buck champs for people who either don't need all-out GPU performance (likely due to gaming at 1080p still) or would be horribly CPU-bottlenecked otherwise (like a 3060 in anything prior to Zen 3 or Alder Lake).

The 4080 at least has the convenient excuse of beating the 3090 Ti decisively, even with a bit less VRAM and bus width compromising memory throughput somewhat - it's an objective improvement over the past generation, if less so than the 4090. It's just, as you stated, overpriced.
I think NV did a great disservice to its own products as well as customers by pushing the "4070Ti = 3090/Ti" thing so hard. Like I get it, it makes the 4070Ti sound better while technically being true, kind of, in some situations. But like you said, 3090/Ti is uniquely equipped with the extra VRAM, a niche feature with specific uses that cannot simply be replaced with "higher clocks and more cache". I went with a 3090Ti because I'm somewhat of an edge case with texture mods and maxxing resolutions at low FPS- no 12GB card would work for me, not thru guile nor brute force. If that wasn't the case I may have considered a 4070Ti but NV is just being misleading saying a 12GB card is the same as a 24GB card when there is a 12GB version of the 24GB card right there for direct comparison.

Re: the bus width on AD104... I know I'm probably in the minority here but I honestly don't have a problem with that part. Yeah, historically xx104 cards have had 256-bit GDDR bus. But Lovelace architecture has that major L2 cache which changes things. The cache boost allows 4090 to kick the shit out of 3090Ti with the exact same memory bus, allows 4080 to exceed 3090Ti despite having similar GDDR b/w to an OC 3070Ti, and AMDs similar (in spirit) L3 implementation on Navi2 allowed them to knock down bus widths by a tier across the stack (saving power + cost + size) while mostly maintaining performance. Would 4070Ti be "better" with 256-bit bus? Ofc, but I don't know that a card with that shader performance and TMU/ROP count really needs much more B/W and mem capacity. It just needs to be cheaper.
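
For context on the raw numbers behind that, here's a quick sketch of GDDR bandwidth from bus width and data rate (the data rates are the commonly cited spec values, so treat them as assumptions):
Code:
# GDDR bandwidth (GB/s) = bus width in bits / 8 * effective data rate in Gbps
cards = {
    "RTX 3070 Ti": (256, 19.0),   # bus width (bits), data rate (Gbps)
    "RTX 3090 Ti": (384, 21.0),
    "RTX 4070 Ti": (192, 21.0),
    "RTX 4080":    (256, 22.4),
    "RTX 4090":    (384, 21.0),
}

for name, (bus_bits, gbps) in cards.items():
    bandwidth_gbs = bus_bits / 8 * gbps
    print(f"{name}: {bandwidth_gbs:.0f} GB/s")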

The 3080 Ti hasn't been $1200 for a very long time.
Arguably, the 3080Ti was almost never a $1200 card. It was like an $1800 card in 2021/early 2022, then quickly plummeted to $1000 and below in mid 2022. I think that's a big part of the "pricing problem" with RTX 4000- it's not just that prices were very high the past couple years, it's that they were inconsistent. All the high-end cards went from way above MSRP to way below MSRP in a flash (even months and months before Ada was announced). There was no consistent value baseline coming from last-gen, just whatever prices ended up being in any given month. That unfortunately provides an opening for RTX 4000 to artificially follow the same pattern, being released high and then getting lots of price cuts later in the cycle to boost sales once value has been "established" thru high MSRP.
 
If that was true they would be flying off the shelves. It's disingenuous to compare prices during the COVID craziness. The ever-negative Tech Jesus has a good video on why the new cards are not a great value.
3080 Tis are off the shelves. You seeing much restock of anything above a 3070 Ti in the 30-series? Yeah, me neither.
 