
RTX 5070 Founders Edition Review & Benchmarks

(Not my pic.) Taken from the Denver thread on the Micro Center Discord server this evening. No one wants them. :ROFLMAO:

 
Maybe that $740 SKU (the TUF Gaming OC, it seems) still isn't fully sold out on their website, if it's up to date:

https://www.microcenter.com/product...d-triple-fan-12gb-gddr7-pcie-50-graphics-card
Limited Availability at Denver Store - Buy In Store

$740 is really pushing it... the moment 5070 stock improves in any way, the overpriced SKUs will probably be easy to find like that, and not just for the Micro Center crowd.
 
Well, AMD's own numbers say 2% better than a 5070 Ti in raster, and 8% or so behind in RT, was it? Something like that. I am going to bet RT is going to be all over the place... with some games like Indiana Jones showing AMD maybe even winning, and some games like Cyberpunk showing a bigger gap for NV than others. RT is one of those things where, on medium-high settings, there may be almost zero difference between them, and if you start talking about insane path tracing, NV pulls ahead. I think AMD is going to look pretty good... because x70-class hardware isn't path tracing hardware from either team.
Tech Jesus today made a point of including a few cards like the 3090 Ti in RT benchmarks, saying "for some reason we added this, and it might make sense tomorrow." Wink wink.
RT could be Nvidia's new tessellation-overuse situation from yesteryear.
Push RT until your own cards strain, but it hurts the competition more! With a product stack as varied as Nvidia's, from shit to god-tier, Nvidia has every logical reason to repeat this playbook, as they hold the top-spec cards.
 
I am hopeful for the 9070 cards. I'll likely grab one to pair with a 5950 that needs a GPU, but it will depend on how many of the cards stay around MSRP; those OC AIB models are not nearly as interesting and not likely to end up in my shopping cart.
 
Cyberpunk isn't even a game at this point. It's essentially a synthetic benchmark. Does anyone seriously still play Cyberpunk?
I finally got around to starting it just a few weeks back, I hope to finish my first play through this upcoming weekend.
 
RT could be Nvidia's new tessellation-overuse situation from yesteryear.
Push RT until your own cards strain, but it hurts the competition more! With a product stack as varied as Nvidia's, from shit to god-tier, Nvidia has every logical reason to repeat this playbook, as they hold the top-spec cards.
They have been trying for just that. Their Remix stuff was an attempt to take a bunch of old games and push path-traced lighting into them. I mean, who doesn't want to play a 25-year-old game at 25 fps? They also pushed hard for path tracing in games like Cyberpunk.

It's mostly not really worked out, IMO. The truth is most developers have been, and want to be, adding just enough RT that it can still run on AMD console hardware, which is obviously not doing path tracing.
Even with the 7000-series AMD cards, in raster games with tasteful, light RT they held their own just fine. It seems AMD can now handle a bit more. Still, the crazy path tracing settings aren't realistic for the 5070 Ti either.

Hopefully, now that AMD is at 85-90% of the same RT performance in the 70 class, gamers can stop using that as the reason to stick with NV despite the gouging, melting cables, and whatever else people are annoyed with NV about. Can't even point to DLSS anymore. As long as FSR 3.1/4 finds its way into more games, it seems like it's basically on par now. They bested the DLSS of two months ago... and are just a tad off NV's new transformer model. Would love to see them side by side at some point. From what I have seen, I prefer FSR4. The transformer model does look to have a little more resolved detail; however, a lot of it looks to me like "AI" over-sharpening rather than real detail. It probably looks good most of the time, but it's not really the artistic intent of the game developers either. IMO FSR retains that better while still clearly looking at least as good as native, with cleaner AA.
 
I finally got around to starting it just a few weeks back, I hope to finish my first play through this upcoming weekend.
Finally got a sale you bit on? :)
Once you finish it, I'm sure you'll shelve it unless you need to bench something as well. lol
 
They have been trying for just that. Their Remix stuff was an attempt to take a bunch of old games and push path-traced lighting into them. I mean, who doesn't want to play a 25-year-old game at 25 fps? They also pushed hard for path tracing in games like Cyberpunk.

It's mostly not really worked out, IMO. The truth is most developers have been, and want to be, adding just enough RT that it can still run on AMD console hardware, which is obviously not doing path tracing.
Even with the 7000-series AMD cards, in raster games with tasteful, light RT they held their own just fine. It seems AMD can now handle a bit more. Still, the crazy path tracing settings aren't realistic for the 5070 Ti either.

Hopefully, now that AMD is at 85-90% of the same RT performance in the 70 class, gamers can stop using that as the reason to stick with NV despite the gouging, melting cables, and whatever else people are annoyed with NV about. Can't even point to DLSS anymore. As long as FSR 3.1/4 finds its way into more games, it seems like it's basically on par now. They bested the DLSS of two months ago... and are just a tad off NV's new transformer model. Would love to see them side by side at some point. From what I have seen, I prefer FSR4. The transformer model does look to have a little more resolved detail; however, a lot of it looks to me like "AI" over-sharpening rather than real detail. It probably looks good most of the time, but it's not really the artistic intent of the game developers either. IMO FSR retains that better while still clearly looking at least as good as native, with cleaner AA.

In my eyes, FSR4 seems so close to the DLSS transformer model that you could probably cover the difference with a little of AMD's own sharpening filter, or some other low-noise sharpening filter like LumaSharpen. All the details seem to be there in FSR4; they just do not stand out as strongly as with the DLSS transformer model.
 
The truth is most developers have been, and want to be, adding just enough RT that it can still run on AMD console hardware
Or a 3060 gpu ;)

Let's just say they did not price high RT performance in a way that could ever destroy the competition that lacks it. That's one way they played fair with the competition: a very high price.
 
Or a 3060 gpu ;)

Let's just say they did not price high RT performance in a way that could ever destroy the competition that lacks it. That's one way they played fair with the competition: a very high price.
Well, there is no way they could have crammed enough expensive tensor cores into lower-end parts to make "high-end" RT a thing. Even now, a 5070 Ti (and AMD's 9070 by extension) is not really a path-tracing-capable card. Heck, even the 5080 isn't really a path tracing card without DLSS performance mode... and even then I doubt many people play at those settings. The upgrade in visuals over just a medium-high RT setting isn't enough to justify playing at low FPS with higher latency. I'm sure some people do play some games like that. There's always the one guy playing at 30 FPS with everything set to ultra. lol

Damn consoles, though. They have held game development back in terms of pushing hardware for a lot longer than just RT. Though with mid-range cards now pushing a grand, perhaps we should just be OK with that. At least we don't have any games that won't run on five-year-old cards.
 
Finally got a sale you bit on? :)
Once you finish it, I'm sure you'll shelve it unless you need to bench something as well. lol
Bought it at least a year ago, just never had the time.
I'll likely do an additional play-through or two for the different starting bits and some variation in builds, but it is a very pretty game.
 
That's because they're starting with too low a framerate... starting at 25-30 fps is a scenario no one would try in actual usage. EDIT: Is that at 1440p? It looks like 4K, which this card isn't meant for.

Wrong, this is a 4K card. Jensen very specifically told us that the 5070 is a 4090 for $549, and the 4090 is a 4K card; therefore, the 5070 is a 4K card with 12GB of VRAM, all thanks to the wonders of AI and MFG! He said this while standing in front of a giant picture of a 5070 with a $549 price tag, so therefore, it's true. Our expectations and the differentiation of the product stack are based on the explanation of performance provided by the CEO of Nvidia himself. Are you suggesting you know better than Jensen Huang about the resolution at which you game with the cards he builds and sells to you?
 
What e-waste. I didn't think the card would be that bad until I saw 15 fps in Indiana Jones and the Great Circle. It really is a shame there are no 5090 or 5080 cards to be had at retail; if there are, they're scalped in seconds.
At least AMD seemingly has more in stock, and I'm sure the drivers are way better.
 
Yep. To be clear, people bought up the 5090 Astrals that also showed up yesterday, from what I saw on Discord. The 5070, though? Nobody wants a 5070 at $190 over MSRP.
I have to wonder if that's also partially explained by the fact that if someone is willing to pay $2,000 for a gpu, then they'll also likely pay $3,000 for the gpu. But, $550 is already a lot of money to many people, who simply can't afford another $200 on top of that. The 5090 and 5070 are aimed at two very different types of consumers.
 
I have to wonder if that's also partially explained by the fact that if someone is willing to pay $2,000 for a gpu, then they'll also likely pay $3,000 for the gpu. But, $550 is already a lot of money to many people, who simply can't afford another $200 on top of that. The 5090 and 5070 are aimed at two very different types of consumers.
Yeah, that would be my guess. The 5090 has a market regardless of price, really. The buyer closer to the $500-$600 range more likely has a budget. It's also not good optics to be $10 below the supposed MSRP of the next tier up, a card which is vastly better than a 5070.
 
Yep. To be clear, people bought up the 5090 Astrals that also showed up yesterday, from what I saw on Discord. The 5070, though? Nobody wants a 5070 at $190 over MSRP.

5090 also has the halo effect, so they can get away with a higher premium. AMD really can’t.

Also, I think there's a case to be made for using the 5090 as a workstation card. Maybe home AI-type stuff. That won't exist with the 9070 XT, which would actually benefit the AMD consumer, since it would translate to a better likelihood of affordable inventory.
 
Saw several 5070s in stock at our Eastern Canada Best Buy.

Building a new 9950X3D PC, coming from a 4790K with a 1080 Ti. Told my son we'll stick with the 1080 Ti for now until 5080 prices are a little better, or the 9070 XT is in stock. He said the 5070 price isn't bad, but I explained that 12GB is too low. Then he came back and saw these reviews of the 5070. Oof! It should be less than half the price they want for it, about a grand Canadian at the store. Amazing the gap in performance between the 5070 and the 5070 Ti!

The 5080 MSI Suprim Liquid has been in stock online, but at CDN$2,400. It was initially $1,999. I have not seen any 9070 XT anywhere.
 
My observations of my local Micro Center's stock, which keeps a fairly accurate stock list online:

5070s at MSRP are selling, but still not immediately. The last big stock they got in still took 2-3 days to sell through.

5070s at high AIB markups ($699 or more) are selling, but much more slowly. It takes days or a week for them to sell through.

5070 Tis at any price are selling fairly fast.

5080s and 5090s are selling out immediately as they come in.

On the AMD side, 9070 XTs are selling out pretty much immediately as well.
 
Do you think Nvidia will eventually pull an AMD with a price drop? Or is that just crazy talk? If I'm honest, $400-$500 would make them a lot more attractive; $500 for a good AIB card.
 
Do you think Nvidia will eventually pull an AMD with a price drop? Or is that just crazy talk? If I'm honest, $400-$500 would make them a lot more attractive; $500 for a good AIB card.
An official price drop in the first year? I doubt it. A good SKU at MSRP should still be able to move well enough. A 5070 Super released ahead of schedule with an aggressive price seems more like what they'd do than "admitting" to a price drop.

They have enough room to justify a Super version: only 4% more cores, but give it 2-3% more clock at the same time plus 18GB of VRAM, and AMD would need to cut the 9070's price to sell them.
 
An official price drop in the first year? I doubt it. A good SKU at MSRP should still be able to move well enough. A 5070 Super released ahead of schedule with an aggressive price seems more like what they'd do than "admitting" to a price drop.

They have enough room to justify a Super version: only 4% more cores, but give it 2-3% more clock at the same time plus 18GB of VRAM, and AMD would need to cut the 9070's price to sell them.
Why 18GB? Don't they have to make it 24 using double-density chips?
 
Why 18GB? Don't they have to make it 24 using double-density chips?
They will soon have 3GB modules available instead of just 2GB (the RTX 6000 and some laptops use them already, I think), which opens the option to add 50% more VRAM without changing anything else or adding cost beyond the VRAM modules themselves.

We can expect 12GB versions of 8GB cards and 18GB versions of 12GB cards to be possible as well.
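For anyone curious about the arithmetic behind those numbers, here's a quick sketch in Python. The bus widths below are the commonly reported figures and are assumptions on my part; each GDDR7 module occupies one 32-bit channel, so total VRAM is just module count times module capacity:

```python
# Quick sketch: total VRAM for a fixed bus width, by module density.
# Each GDDR7 module sits on a 32-bit channel, so module count = bus width / 32.
# The bus widths used in the examples are assumptions (commonly reported figures).

def vram_gb(bus_width_bits: int, module_gb: int) -> int:
    """Total VRAM = number of 32-bit modules x capacity per module."""
    modules = bus_width_bits // 32
    return modules * module_gb

# A 192-bit card (5070-class) with 2GB vs 3GB modules:
print(vram_gb(192, 2))  # 12
print(vram_gb(192, 3))  # 18
# A 128-bit card: 8GB today, 12GB with 3GB modules:
print(vram_gb(128, 2), vram_gb(128, 3))  # 8 12
```

Same silicon, same board, just denser chips: that's where 18GB on a 12GB card comes from.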
 
Why 18GB? Don't they have to make it 24 using double-density chips?
GDDR7 modules will be available in 2, 3, 4, 6, and 8 GB capacities, and the answer to your question is pricing.
GDDR7 and HBM3e use the same manufacturing nodes, and those nodes are at capacity; making more of one takes away from the other.
To alleviate this, Nvidia is pushing up Rubin, which uses HBM4. TSMC has been contracted by SK Hynix to produce HBM4 modules on its N5 nodes, which will free up SK Hynix's existing facilities to ramp up GDDR7 production.
Once GDDR7 production picks up, we can look forward to prices coming down and cards starting to adopt the higher-density chips. Currently, those chips are priced so that nobody orders them, letting the memory makers focus on fulfilling their existing Nvidia supply agreements.

Had Samsung managed to get its house in order, things would have been different, but they didn't, so here we are.
 
I understand it now, thanks guys! 18GB makes a pretty good middle ground; it's not as absurd as 24GB in a $600 card.
 
I wonder how long until we start seeing cards using fewer channels and higher densities?

I mean, the memory interface itself is one of the more expensive parts of the GPU silicon, so using 4x4GB chips to get to 16GB instead of 8x2GB would be a significant cost saving.
Start pairing that with fewer PCIe lanes and more still; I'm thinking that's the only way we see GPU price reductions any time soon, which doesn't inspire me any...
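To put numbers on that trade-off, here's a rough sketch in Python. The 28 Gbps per-pin GDDR7 data rate is an illustrative assumption, not any specific card's spec; the point is that the same 16GB with half the chips means half the bus width and half the peak bandwidth:

```python
# Rough sketch of the cost/bandwidth trade-off: fewer, denser chips shrink the
# memory bus, which cuts peak bandwidth proportionally. The 28 Gbps per-pin
# data rate here is an illustrative assumption, not a specific card's spec.

def bus_width_bits(num_chips: int) -> int:
    """One 32-bit channel per memory chip."""
    return num_chips * 32

def bandwidth_gb_s(num_chips: int, gbps_per_pin: float) -> float:
    """Peak bandwidth (GB/s) = (bus width in bits / 8) x per-pin rate (Gbps)."""
    return bus_width_bits(num_chips) / 8 * gbps_per_pin

# 16GB of VRAM either way, very different bandwidth:
print(bandwidth_gb_s(8, 28.0))  # 8 x 2GB chips, 256-bit bus -> 896.0 GB/s
print(bandwidth_gb_s(4, 28.0))  # 4 x 4GB chips, 128-bit bus -> 448.0 GB/s
```

So the cost saving on the die is real, but a narrower bus has to be bought back with faster (pricier) memory or bigger caches.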
 