Should I just get a 4070Ti?

I'd buy a 4070ti.... if it was $600 or less. Not $850-1000. I just can not wrap my head around the insane GPU prices. Meanwhile my old 1080ti does fine, my monitor is capped at 60hz anyway and I am not a competitive CS:GO gamer so who cares.

Also, I find myself playing PS5 more the last couple of years... as I get older, I find that when I work on computers all day long, the last thing I want is to sit in front of another one to game.

THAT being said.... Diablo 4 on a console? Eh I dunno.
 

Diablo 3 on PS5 is great! You should give it a try if you haven't :)
I've been debating PS5 or PC for Diablo 4. Ended up getting it for PC first, but we'll see how it feels during the beta. Blizzard has no issue with refunds.
 
Yup. Some people lose the distinction between not being able to afford something, and not buying something out of principle.
Not unlike “budget” being used to mean “how much I'm willing to spend” (setting a budget), versus “how much I can afford to spend” (my budget is X because that's all I have), versus “budget” meaning cheap, or as cheap as possible to get there.

On topic: $800+ GPUs are not budget, and even though I'm a working adult with hobby money, I won't budget north of $800 towards a 12 GB, 192-bit GPU in 2023. The 4070 Ti is a terrific performer that is simply priced too high. I don't care that it hangs with last generation's 80 Ti and 90 cards. Good product, bad price. Save for the 4080/4090 or XTX if 4K high refresh is your goal.

And yeah those cost too much too! 😆
 
Yeah, I see the 4070 Ti as okay for now, but it won't be a good card for 4K two years from now; it'll be exclusively a 1440p card, if that. Developers are going to be pushing over 12 GB soon (they already are in some cases).
 
Edit:
Ended up buying 7900 XTX.

I'm a casual gamer, but I game at 4K/120 Hz. I found that my PS5 gets more gaming use than my PC, but there are games I prefer to play on PC, like the upcoming Diablo 4 and many MMORPGs. And it just happens that those games are usually not very demanding.

I could probably try and find a 3090 for about the same price, but I want a white card, and that automatically makes it a lot rarer second-hand (I'm from the EU).
I'm not anti-DLSS (3), and I think ray tracing performance is important, so that is the reason I'm considering a 4070 Ti instead of a 3090/Ti or AMD. I think AMD will just fall behind in performance as more and more games start implementing RT. AMD really messed up this gen.
While the price of the 4070 Ti is more appealing than a 4080 or 4090, the 4070 Ti is better as a 1440p card that can do 4K in some current games and in many older games with lower VRAM requirements. If you want 4K, then a 4080/4090 will serve you better. The 12 GB of VRAM will be a limiting factor in future games and is already limiting in some of the newest games at 4K, especially once you add RT to the mix.
 
The 4070 Ti doesn't have a castrated bus; it is in line with the resolution a 4070 Ti should target, 1440p. It has a larger cache (compared to Ampere), much like AMD's Infinity Cache. A larger cache works well until you overrun it, so 1440p and below is where the 4070 Ti shines; the 4080/4090 are the 4K cards for modern video games. AMD lowered the main VRAM bus width because of Infinity Cache too: it works great at lower resolutions, sometimes not as well at 4K. Just because it is different from what you are used to doesn't mean it is castrated or worse. Are you a GPU engineer? Clearly Nvidia didn't think it was a problem. Too many want to have their cake and eat it too on a midrange card...
 
"Mid-range card" $800...never stops being funny lol.
I can't buy an F-150 Eddie Bauer edition pickup for 1990s prices anymore either. You are getting a potent GPU for $800, even if that price seems high. TSMC 4nm costs extra for bleeding edge.

You want to talk about what isn't fair? The prices enterprise equipment sells for! ~$30K for an Ada Lovelace A100.
 
Mmm yes nvidia, please step on me more! I'll do whatever you say! This shit keeps up I'm getting a console lol. 6800 XT might be my last video card for gaming.
 
They always have to compare bad value cards to other even worse values to justify things. Just ignore the linear increase in price/performance that nvidia has caused. No, now we need to compare it to limited edition trucks and enterprise hardware solutions.

How is that a win?
 
Yeah, I see the 4070 Ti as okay for now, but it won't be a good card for 4K two years from now; it'll be exclusively a 1440p card, if that. Developers are going to be pushing over 12 GB soon (they already are in some cases).
Is there a case of a mainstream game pushing over 12 GB of VRAM at settings where a 16 GB 4070 Ti would comfortably stay above 60 fps?

You are getting a potent GPU for $800, even if that price seems high. TSMC 4nm costs extra for bleeding edge.
TSMC's 5nm special Nvidia edition does cost a lot, but the under-300 mm² die size (close to the 276 mm² of a 3060, and roughly 75% the size of a 3070's) could take care of a lot of the price increase.
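For a rough sense of those area numbers, here's a quick back-of-the-envelope sketch in Python; the die sizes are the commonly cited public figures, so treat them as approximate:

```python
# Back-of-the-envelope die-area comparison (mm^2), using commonly cited public figures.
die_area_mm2 = {
    "GA106 (RTX 3060)": 276,
    "GA104 (RTX 3070)": 392,
    "AD104 (RTX 4070 Ti)": 295,  # "under 300 mm^2"
}

ad104 = die_area_mm2["AD104 (RTX 4070 Ti)"]
ga104 = die_area_mm2["GA104 (RTX 3070)"]
ga106 = die_area_mm2["GA106 (RTX 3060)"]

print(f"AD104 vs GA104: {ad104 / ga104:.0%} of the area")  # ~75%, as noted above
print(f"AD104 vs GA106: {ad104 / ga106:.0%} of the area")  # ~107%, i.e. close to a 3060-class die
```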
 
Not sure, but there are plenty of games using over 10 GB now, and 3080 users aren't happy. When those cards came out, maybe even a year after, I'm sure you could've made the same argument.
 
I would take the small hit to performance at 4k over the insane power consumption difference between the 4070 ti and 3090 ti.

This. Power, efficiency, and silence matter to me. And the 4070 Ti is the card that best meets those criteria: it runs cool and is very efficient while being a powerful card; perfect for my home office and for gaming.

I was absolutely convinced I'd buy AMD this generation; all I wanted was a powerful, efficient, upper-midrange card. And I was hugely disappointed when the hot-running, inefficient 7900 XTX and XT cards came out and nothing else. I'm not interested in another GPU that doubles as a heater. And they weren't even that fast; if I wanted the most powerful card, why would I even consider AMD? I'd simply buy the Nvidia RTX 4090, which is faster, (surprisingly) more efficient, and cheaper long term.

But I live in Europe, where energy prices are higher and not all of us have air conditioning in every room. I suppose if you live in America, where energy prices are largely irrelevant and many rooms have air conditioning/climate control, then inefficient, hot-running GPUs can work. But for now the 4070 Ti works for me.
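To put a rough number on the efficiency point, here's a small sketch; the wattages, hours, and electricity price are illustrative assumptions (roughly typical gaming board power for a 4070 Ti vs. a 3090 Ti and a high European tariff), not measurements, so plug in your own figures:

```python
# Rough annual running-cost difference between two cards while gaming.
# Every input below is an illustrative assumption -- adjust to your own card, habits, and tariff.

power_4070_ti_w = 285     # assumed typical gaming board power, watts
power_3090_ti_w = 450     # assumed typical gaming board power, watts
hours_per_week = 15       # assumed gaming time
price_eur_per_kwh = 0.40  # assumed European electricity price

extra_kwh_per_year = (power_3090_ti_w - power_4070_ti_w) / 1000 * hours_per_week * 52
print(f"~{extra_kwh_per_year:.0f} kWh/year extra, roughly EUR {extra_kwh_per_year * price_eur_per_kwh:.0f}/year")
```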
 
3080 user since Dec 2020 reporting in, no issues yet. Hogwarts won't run for me at 4K with max settings and max RT at 60 fps anyway, even if I had the VRAM. The 10 GB has yet to hold me back in a game I could otherwise play at max settings with RT maxed, except for that one (and Doom Eternal needs textures two notches down, which just enables texture streaming but still keeps max-resolution textures). I game at 4K60 and am eyeing an eventual upgrade to 4K144 with a new card (raw horsepower).

Nice try. I knew VRAM would eventually be an issue for me, but it has yet to be one in actuality, so my expectations have been exceeded. 12 GB on a 4070 Ti is plenty for its horsepower.
 
Putting 12 GB of VRAM on an $800 card is absurd and no one should excuse it. Everything at 10 GB or below is getting absolutely hammered. Even without RT, games will use over 10 GB of VRAM soon (as in, we'll see it in the next couple of years). You might get lucky with 12 GB being the limit, but chances are you won't.
 
If you have a 3080 or higher video card, there's no need to upgrade to the 4 series at all. Wait till the 5 series and spend $25,000 on a video card; that sounds logical.
 
They always have to compare bad value cards to other even worse values to justify things. Just ignore the linear increase in price/performance that nvidia has caused. No, now we need to compare it to limited edition trucks and enterprise hardware solutions.

How is that a win?
I think you missed my point. I cannot expect to pay the same cost for the same type of item forever; cost increases occur whether we like it or not. $800 for a midrange card is too much for me, so I don't plan to buy one. Inflation is a thing. Enterprise gets stuck paying much higher costs than I do for PC hardware. Perspective is important. We can all opine about the good old days when gas was $0.99 a gallon, but those days are not coming back. Do gamers always need a new video card? They can vote with their wallets and not buy the latest and greatest until they cannot buy last gen anymore. Can Nvidia sell the 4070 Ti for $600 and still make money? More than likely they can, but why bother when people are willing to spend more? None of your statements nullify my arguments or make them less valid. What applies to other parts of the economy with material goods also applies to PC hardware. Trying to change that so you can buy a 4070 Ti for $300, or having sour grapes because the cost increased, is a fool's errand. I cannot afford to buy new vehicles, so I buy 5-year-old cars that I can afford. Such is life.

Edit: As I put it for someone on Reddit, an xx60 series card is a solid 1080p card; it could play at 1440p with older, less demanding games, but it's a 1080p card for the next few years before fidelity concessions have to be made.
The xx70 series is a solid 1440p card, with potential for 4K depending on the graphical requirements of the game, and 1080p gaming in the future if you intend to keep it for 5-plus years.
The xx80/90 series are 4K cards for current games, then either drop to 1440p for newer games a few years from now or lower graphical fidelity to maintain 4K.

Yet there are people who think their xx60 or xx70 series should play 4K with everything at max detail and should cost $200, $300 at most. See the problem? You expect too much from a card that wasn't designed specifically for pwning games in 4K, unless you play the original Crysis or other such games from 3 or more years ago.
 
Is there a case of a mainstream game pushing over 12 GB of VRAM at settings where a 16 GB 4070 Ti would comfortably stay above 60 fps?

TSMC's 5nm special Nvidia edition does cost a lot, but the under-300 mm² die size (close to the 276 mm² of a 3060, and roughly 75% the size of a 3070's) could take care of a lot of the price increase.
The whole point of tiering/segmenting your product stack is that you get performance in line with what you pay for. Could Nvidia make 4070 Ti cards with 16 GB of memory? Possibly. I read an article saying that the memory bus scales with the number of SMs for Nvidia, so the conclusion of the article was that Nvidia cannot make a 16 GB card unless they designed it that way from the beginning. It was a technical article and some of it was over my head without understanding GPU architecture better. They used the 3060 as an example: either 12 GB or 8 GB were the only options based on memory bus design and memory channels. I'm not a GPU engineer, so I don't intend to argue about it; I'll leave it up to those who design them and choose a product that works for me in terms of price/performance, because complaining about it won't change anything. I vote with my wallet for the product that works for me, even if some concessions are made, such as wishing my 3060 Ti had 12 GB of memory.

Maybe, but that bleeding-edge node hardware to make wafers is getting exponentially more expensive, which means we are not likely to see prices drop anywhere near the levels we were used to 4 years ago. The price tag for 3nm-capable wafer etching machines is eye-watering and only going to get worse unless some chip breakthroughs occur that lower the cost to manufacture wafers.
 
I think AMD will just fall behind in performance as more and more games start implementing RT. AMD really messed up this gen.
AMD didn't mess up; their RT performance with the 7000 series is close to Ampere, so they are a generation behind Nvidia. AMD doesn't see RT performance as a big value or as a driver for card sales. Those who want RT performance can buy Nvidia; those who care more about rasterization can buy AMD. Each company has different priorities, and that influences design direction for future products.
 
Putting 12 GB of VRAM on an $800 card is absurd and no one should excuse it. Everything at 10 GB or below is getting absolutely hammered. Even without RT, games will use over 10 GB of VRAM soon (as in, we'll see it in the next couple of years). You might get lucky with 12 GB being the limit, but chances are you won't.

Your statement doesn't exactly hold water. Look at the 6650 XT with a mere 8 GB; it is doing OK at 1080p. Why is that? It isn't strictly VRAM; there are other factors, such as architecture design, and it seems more cache helps even when VRAM may be limiting. Cache has much higher bandwidth than VRAM. Even a 3080 with 10 GB gets beat by an 8 GB card. A 3060 with 12 GB can beat a 3070. This is not as cut and dried as you are trying to make it sound. Not only that, but Hogwarts is not fully optimized.
 
Yeah, GPU memory capacity options are fundamentally related to the design of the core. Not SMs exactly (the memory controller is related to the ROPs, though), but the possible GDDR configs are dictated by the number of memory channels from the IMC.

GA106 / AD104 = 6x32-bit channels = 192b bus, so 6GB or 12GB with current 8Gb/16Gb GDDR chips
GA104 / AD103 = 8x32b channels = 256b bus, 8GB or 16GB
GA102 / AD102 = 12x32b channels = 384b bus, 12GB or 24GB

Not all of the channels need to be filled (which is how configs like the 10GB GA102 or 8GB GA106 are possible), but doing that cuts GDDR bandwidth proportionally (see the 8GB GA106 3060 being terrible while the 8GB GA104 3060 Ti is great). AD104 could theoretically have 16GB with not-yet-available 32Gb GDDR chips on a 128b / 4-channel bus, but that would be terribly bandwidth-starved.

The reduced memory channel count on AD104 compared to GA104 does seem to be a play to reduce die size (I suppose the increased L2 cache takes up less space than two GDDR channels?), plus it means lower BOM for board partners.
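To make the channel math above concrete, here's a small Python sketch; the bus widths are the figures listed above, and the 21 Gbps GDDR6X speed used for the bandwidth line is just an assumed example value:

```python
# Sketch: possible VRAM capacities (and rough bandwidth) from memory-channel count and GDDR density.
# Bus widths are the public figures quoted above; chip densities and the 21 Gbps speed are assumptions.

GDDR_DENSITIES_GBIT = (8, 16)   # common GDDR6/6X chip densities today; 32Gbit chips don't exist yet
ASSUMED_SPEED_GBPS = 21         # example GDDR6X data rate, per pin

DIES = {
    "GA106 / AD104": 192,  # 6 x 32-bit channels
    "GA104 / AD103": 256,  # 8 x 32-bit channels
    "GA102 / AD102": 384,  # 12 x 32-bit channels
}

def capacities_gb(bus_width_bits: int) -> list[int]:
    """One GDDR chip per 32-bit channel; capacity = channel count x chip density."""
    channels = bus_width_bits // 32
    return [channels * density // 8 for density in GDDR_DENSITIES_GBIT]  # Gbit -> GB

for die, bus_bits in DIES.items():
    caps = " or ".join(f"{c}GB" for c in capacities_gb(bus_bits))
    bandwidth_gbs = bus_bits * ASSUMED_SPEED_GBPS / 8  # bits/s -> bytes/s
    print(f"{die}: {bus_bits}-bit bus -> {caps}, ~{bandwidth_gbs:.0f} GB/s at {ASSUMED_SPEED_GBPS} Gbps")

# Leaving channels unpopulated (e.g. the 10GB GA102 fills 10 of 12) trims capacity and bandwidth together.
```

Which lines up with the 12 GB and ~504 GB/s the 4070 Ti actually ships with.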
 
Thank you for explaining this better than I could! It highlights what some fail to realize: you cannot just slap as much memory on a card as you want given a specific architecture design. Sure, they could have increased the bus to 256-bit, but that increases cost, and as AMD already demonstrated with Infinity Cache, they can get by with less bandwidth by using a larger cache.
 
Everything at 10 GB or below is getting absolutely hammered.
It is a bit strange how different the results are from publication to publication:
https://www.techspot.com/review/2627-hogwarts-legacy-benchmark/#RT_1080p-color

A 7900 XT beating a 4090 with ray tracing on seems to indicate a larger issue going on. Could it have been a different part of the game being played? Or some driver/game issue that has since been fixed? Because in TechSpot's test, Hogwarts runs below 60 fps even on a 3090 before 10 GB of VRAM becomes an issue, it seems.
 
The specific benchmark numbers posted above your comment are from the same article, but from the Hogsmeade section, which shows the 3080 10GB performing rather poorly without DLSS to boost framerates to more playable levels. Decko87 used the worst-case scenario to try and prove his point; however, Hogwarts is not optimized very well, nor are drivers fully optimized for the game. Revisit it at the end of the year after some patches and driver updates and see how things shake out. Currently, Hogwarts does not give an accurate picture of future games because it is not well optimized yet. It is somewhat useful for seeing what VRAM-constrained scenarios could look like.
 
I had the chance to buy an EVGA 3090 FTW3 Ultra Gaming card (24 GB) for $750, or fight the herds and get a 4070 Ti at $850 minimum, if lucky, online.

I bought the 3090 and am going to wait until the 5 series. All the games I play at 4K run with plenty of fps. I'd rather do a full upgrade. I may even wait until the 6 series. I am just afraid to see the price of 6 series cards.

I am thinking a 6090 or 6080 will cost $2.5k to $3k. If that happens I will quit PCs as my pastime hobby and find a new direction for passing time.
 
If PC hardware becomes that expensive and wages don't increase to offset inflation, then the loss of sales due to affordability may force manufacturers to figure out how to bring prices down to more affordable levels. As long as people keep shelling out cash for whatever the manufacturer offers at the price they ask, we are not going to see a market-driven change in costs for our benefit.
 
Ended up buying 7900 XTX.

Wise move.

The 7900 XTX competes with at least a 4080 and up at all resolutions, especially 2K/4K+. At ~800 there's nothing close. (A bit surprised the media doesn't make more noise about this...)

I would have also considered the 3090, since it is around that price range, but it just gets absolutely smoked by the XTX. DLSS/RT is good and all, but when your card is down ~20-30% or more in general at your playing resolution... it ain't that good.
 
After seeing the VRAM issues in The Last of Us Part 1 and Diablo 4... I think you made the right call!
 
A 7900 XT beating a 4090 with ray tracing on seems to indicate a larger issue going on.
This is not the only title where Nvidia struggles with RT at lower resolutions. Nvidia has way more overhead on the CPU; add in more draw calls due to RT, since you have to cull far less off-screen geometry because of reflections and lighting. Also, when using resolution scaling you run smack into this limitation.

Nothing is ever cut and dried; Nvidia's RT hardware is superior, it is other aspects that can hold it back.
 
As for the value of the 4070 Ti at current pricing, I consider it laughable at best. No DP 2.1, so much for those monitors when they become available. 12 GB is already showing weakness.
 
It is a bit strange how different the results are from publication to publication:
RAM, CPU, cooling, and how it is all configured play into it, site to site.
Clearly, though, the 7000 series is a lot more formidable at RT than the 6000 series was, and it carries unquestionable raster heft.
Unfortunately for me, looking for a VR GPU upgrade, everyone sucks: Nvidia on performance/price and AMD on just performance (why regress, you twats!).
 
The 4070 Ti doesn't have a castrated bus; it is in line with the resolution a 4070 Ti should target, 1440p. It has a larger cache (compared to Ampere), much like AMD's Infinity Cache. A larger cache works well until you overrun it, so 1440p and below is where the 4070 Ti shines; the 4080/4090 are the 4K cards for modern video games. AMD lowered the main VRAM bus width because of Infinity Cache too: it works great at lower resolutions, sometimes not as well at 4K. Just because it is different from what you are used to doesn't mean it is castrated or worse. Are you a GPU engineer? Clearly Nvidia didn't think it was a problem. Too many want to have their cake and eat it too on a midrange card...

Did Jensen Huang write this?
 
Did Jensen Huang write this?
No, but what I am saying is stop expecting a 4K rendering monster from a midrange card. Everyone acts like the 4070 Ti should render like a 4090 and complains when it does not. Tiered products, tiered prices, tiered performance. This isn't rocket science. Who doesn't want a 4K monster GPU for $200-300? The reality is companies have BOMs, overhead, melting 4090 power connector issues, etc. to cover financially. Nvidia will charge what people seem willing to pay. Don't like the price? Buy last gen or wait until next gen and hope for better prices. I keep hoping AMD will become a larger competitor to give Nvidia real competition; that would help lower prices a bit, but many prefer Nvidia to AMD, so here we are.
 
Who's expecting the 4070 Ti to render like a 4090? No, what's mostly being said is that paying $800 for midrange 1440p is nonsense.

Midrange for $800 is the issue. Nobody ever said it should be like a 4090.
 
I agree that $800 seems a bit much for 1440p, but I cannot buy a 2023 model Ford F-150 for 2003 prices either. Inflation is a thing and stuff gets more expensive. I am not privy to Nvidia's financials to see if they are really price gouging people. Gas doesn't cost 99 cents a gallon anymore either. Prices go up. Unfortunately, I think the ETH crypto boom caused too severe a price change, as GPU manufacturers saw consumers willing to open their wallets pretty wide for Ampere-gen cards. The more people buy AMD or wait on Nvidia, the greater the pressure to lower prices to more palatable levels, but I doubt that will happen.

Edit: What I am seeing in Reddit posts is people who don't like the 12 GB of VRAM and don't like that they cannot play 4K maxed out.
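As a purely illustrative take on the inflation argument (the 2017 launch MSRP is the well-known figure; the cumulative inflation factor is an assumed round number, so check current CPI data yourself):

```python
# Illustrative only: what a 2017 flagship MSRP looks like in ~2023 dollars.
gtx_1080_ti_msrp_2017 = 699           # launch MSRP, USD
assumed_cumulative_inflation = 0.24   # assumed ~24% US CPI change, 2017 -> early 2023

adjusted_price = gtx_1080_ti_msrp_2017 * (1 + assumed_cumulative_inflation)
print(f"$699 in 2017 is roughly ${adjusted_price:.0f} in 2023 dollars under that assumption")
# Whether inflation alone justifies $800 for a midrange part is exactly the debate in this thread.
```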
 
Well tbf, that's reddit. Every subreddit there is full of people who can't figure out why item A that costs 20% of item B can't do everything that item B can.

*Edit* I will say that yes, I'd be bummed about paying $800 for 12 GB of VRAM. The difference is I know what my expectations are with 12 GB. Can't say the same for them :D
 
What does "cannot play maxed out" even mean? I think you hit the nail on the head regarding prices. I blame those people. I don't want to get attacked or have mods after me, but that is the reality: lots of people paid crazy prices, especially for "crypto cards", and now BOTH NVIDIA AND AMD know that people will pay big bucks for these cards. The new gen is expensive even though it's more readily available, and the 3080-and-higher-tier Ampere cards are either out of circulation or priced way above MSRP, almost at crypto-era prices. Also, many are third-party sellers hoping that buyers have no clue about all that and will have the cash to spend at those insane prices.
If only people had just waited and not bought, prices probably would have gone down.
I bought a 3060 new (and later sold it) and bought a 3080 used; I am only concerned about the VRAM thing. It can do some 4K games and I mostly hit 60 fps.
 