(Not my pic) Taken from the Denver thread on the Micro Center discord server this evening. No one wants them. 

> Well, AMD's own numbers say 2% better than a 5070 Ti in raster, and 8% or so less in RT, was it? Something like that. I am going to bet RT is going to be all over the place... with some games like Indiana Jones showing AMD maybe even winning, and some games like Cyberpunk showing a bigger gap for NV than others. RT is one of those things where, on Medium-High settings, there may be almost zero difference between them, and if you start talking about insane path tracing, NV pulls ahead. I think AMD is going to look pretty good... because x70-class hardware isn't path tracing hardware from either team.

RT could be Nvidia's new Tessellation over-use situation from yesteryear.
Tech Jesus today made a point of including a few cards like the 3090 Ti in his RT benchmarks, saying "for some reason we added this, and it might make sense tomorrow." Wink wink.
> Cyberpunk isn't even a game at this point. It's essentially a synthetic benchmark. Does anyone seriously still play Cyberpunk?

I finally got around to starting it just a few weeks back; I hope to finish my first playthrough this upcoming weekend.
> RT could be Nvidia's new Tessellation over-use situation from yesteryear.

They have been trying for just that. Their Remix stuff was an attempt to take a bunch of old games and push path-traced lighting into them. I mean, who doesn't want to play a 25-year-old game at 25 fps? They also pushed hard for path tracing in games like Cyberpunk.
Push RT until it stresses your own cards, so long as it hurts the competition more! With a product stack as varied as Nvidia's, from shit to God, Nvidia has every logical reason to repeat this playbook while they hold the top-spec cards.
> I finally got around to starting it just a few weeks back; I hope to finish my first playthrough this upcoming weekend.

Finally got a sale you bit on?
> They have been trying for just that. Their Remix stuff was an attempt to take a bunch of old games and push path-traced lighting into them. I mean, who doesn't want to play a 25-year-old game at 25 fps? They also pushed hard for path tracing in games like Cyberpunk.
It's mostly not really worked out, imo. The truth is most developers have been, and want to be, adding just enough RT that it can still run on AMD console hardware, which is obviously not doing path tracing.
Even with the 7000-series cards, in raster games with tasteful, light RT, AMD held their own just fine. It seems now AMD can handle a bit more. Still, the crazy path-tracing settings aren't realistic for the 5070 Ti either.
Hopefully now that AMD is at 85-90% of the same RT performance in the 70 class, gamers can stop using that as the reason to stick with NV despite the gouging, melting cables, and whatever else people are annoyed with NV about. Can't even point to DLSS anymore. As long as FSR 3.1/4 finds its way into more games, it seems like it's basically on par now. They bested DLSS from two months ago... and are just a tad off NV's new transformer model. Would love to see them side by side at some point. From what I have seen, I prefer FSR4. The transformer model does look to have a little more resolved detail; however, I know it's not real, and it looks to me like a lot of it is "AI" over-sharpening. It probably looks good most of the time, but it's not really the artistic intent of the game developers either. IMO FSR retains that better while still clearly looking at least as good as native, with cleaner AA.
> The truth is most developers have been, and want to be, adding just enough RT that it can still run on AMD console hardware

Or a 3060 GPU.
> Or a 3060 GPU.

Well, there is no way they could have crammed enough expensive tensor cores into lower-end parts to make "high-end" RT a thing. Even now, a 5070 Ti (and AMD's 9070 by extension) is not really a path-tracing-capable card. Heck, even the 5080 isn't really a path tracing card without DLSS Performance mode... and even then I doubt many people play at those settings. The upgrade in visuals over a medium-high RT setting just isn't enough to justify playing at low FPS with higher latency. I'm sure some people do play some games like that. Always the one guy playing at 30 FPS with everything set to ultra. lol
Let's just say they did not price high RT performance in a way that could ever destroy the competition that lacks it. That's one way they played fair with the competition: a very high price.
> (Not my pic) Taken from the Denver thread on the Micro Center discord server this evening. No one wants them.
> View attachment 714902

HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA!!!!!!!!
> Finally got a sale you bit on?

Bought it at least a year ago, just never had the time.
Once you finish it, I'm sure you'll shelve it unless you need to bench something as well. lol
That's because they're starting with too low a framerate... starting at 25-30 fps is a scenario no one would try in actual usage. EDIT: Is that at 1440p? It looks like 4K, which this card isn't meant for.
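The "too low a framerate" complaint above comes down to simple arithmetic: frame generation raises the displayed FPS, but input latency still tracks the base (actually rendered) rate. A minimal sketch, using a hypothetical 28 fps base and 2x frame generation; the numbers are illustrative, not benchmarks:

```python
# Why frame generation at a low base framerate feels bad: smoothness
# follows displayed FPS, but latency follows the rendered (base) FPS.
# The 28 fps base and 2x multiplier here are hypothetical examples.

def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

base_fps = 28                  # hypothetical path-traced base framerate
generated_fps = base_fps * 2   # 2x frame generation doubles displayed FPS

print(frame_time_ms(base_fps))       # ~35.7 ms between real frames (latency floor)
print(frame_time_ms(generated_fps))  # ~17.9 ms between displayed frames (smoothness)
```

So a 28 fps base shown at 56 fps still feels like ~36 ms of input lag, which is why reviewers say to start from a healthy base framerate before enabling frame generation.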
> Ouch. For the price, that'll be a TUF sell…

Yep. To be clear, people bought up the 5090 Astrals that also showed up yesterday, from what I saw on Discord. The 5070, though? Nobody wants a 5070 at $190 over MSRP.
> Yep. To be clear, people bought up the 5090 Astrals that also showed up yesterday, from what I saw on Discord. The 5070, though? Nobody wants a 5070 at $190 over MSRP.

I have to wonder if that's also partially explained by the fact that if someone is willing to pay $2,000 for a GPU, then they'll also likely pay $3,000 for the GPU. But $550 is already a lot of money to many people, who simply can't afford another $200 on top of that. The 5090 and 5070 are aimed at two very different types of consumers.
> I have to wonder if that's also partially explained by the fact that if someone is willing to pay $2,000 for a GPU, then they'll also likely pay $3,000 for the GPU. But $550 is already a lot of money to many people, who simply can't afford another $200 on top of that. The 5090 and 5070 are aimed at two very different types of consumers.

Yeah, that would be my guess. The 5090 has a market regardless of price, really. The buyer closer to the $500-$600 range more likely has a budget. Also, it's not good optics to be $10 below the supposed MSRP of the next tier up of card, which is vastly better than a 5070.
> Do you think Nvidia will eventually pull an AMD with a price drop? Or is that just crazy talk? If I am honest, $400-$500 would make them a lot more attractive, $500 for a good AIB.

An official price drop in the first year? I doubt it; a good SKU at MSRP should still be able to move well enough. A 5070 Super released ahead of schedule with an aggressive price seems more like what they do than "admitting" to a price drop.
> An official price drop in the first year? I doubt it; a good SKU at MSRP should still be able to move well enough. A 5070 Super released ahead of schedule with an aggressive price seems more like what they do than "admitting" to a price drop.

Why 18GB? Don't they have to make it 24 using double-density chips?
They have enough room to justify a Super version: only 4% more cores, but give it 2-3% more clock at the same time plus 18GB of VRAM, and AMD would need to cut the 9070's price to sell them.
> Why 18GB? Don't they have to make it 24 using double-density chips?

They will have 3GB modules available soon instead of just 2GB (the RTX 6000 and some laptop parts use them already, I think), which opens the option of adding 50% more VRAM without changing anything else or adding any cost beyond the VRAM modules.
> Why 18GB? Don't they have to make it 24 using double-density chips?

GDDR7 modules will be available in 2, 3, 4, 6, and 8 GB capacities, and the answer to your question is pricing.
> Why 18GB? Don't they have to make it 24 using double-density chips?

12GB is 6x 2GB chips (the "double density" you're referring to). The next step up is 3GB chips, which would be 6x 3GB = 18GB.
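The VRAM arithmetic in the posts above can be sketched in a few lines. The module count is fixed by the bus width (each GDDR7 module occupies a 32-bit slice of the bus), so capacity only scales with module density. This assumes a 192-bit bus, which the thread doesn't state but matches the 12GB = 6x 2GB figure quoted:

```python
# VRAM capacity from bus width and module density: each GDDR7 module
# sits on a 32-bit slice of the bus, so a 192-bit bus means 6 modules.
# The 192-bit bus width is an assumption consistent with 6x 2GB = 12GB.

def vram_gb(bus_width_bits: int, module_gb: int, bits_per_module: int = 32) -> int:
    """Total VRAM in GB for a given bus width and per-module capacity."""
    modules = bus_width_bits // bits_per_module
    return modules * module_gb

print(vram_gb(192, 2))  # 6 modules x 2GB = 12 (current config)
print(vram_gb(192, 3))  # 6 modules x 3GB = 18 (the rumored Super bump)
print(vram_gb(192, 4))  # 6 modules x 4GB = 24 (needs denser chips again)
```

That's why 18GB is the natural next step rather than 24GB: going from 2GB to 3GB modules changes nothing else about the board, while 24GB would require 4GB modules or a wider bus.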