Guess the price of 24GB RTX 3090 FE

Guess the Price of 24GB RTX 3090


  • Total voters: 272
  • Poll closed.
I have only bought Nvidia cards in the 2010s. With the extreme price gouging now, I might consider moving to the Red Team and Big Navi for the first time if the price/performance is right.

Nvidia right now is exploiting the market and taking advantage of consumers far too heavily. I LOVE their technology but I HATE their corporate GREED over the past few years.
Are you really this new to capitalism, that you see success as failure?

This is seriously off-topic.
 
I have only bought Nvidia cards in the 2010s. With the extreme price gouging now, I might consider moving to the Red Team and Big Navi for the first time if the price/performance is right.

Nvidia right now is exploiting the market and taking advantage of consumers far too heavily. I LOVE their technology but I HATE their corporate GREED over the past few years.
AMD will be doing the same. They have no bargaining material to be the profit-margin leader, but trust that they will price their products to make more than they did in the past. Look at the launch of Navi: they wanted to charge more but couldn't, so they had to reduce its price because it wasn't faster than what they initially targeted. The launch of the XT lines of CPUs happened simply because they saw the original chips constantly on sale for lower than what they wanted, so they capitalized on the process, putting out better chips to get the old margin back. If they could still brand them as the non-XT models and end the sale prices, they would. Realize these companies aren't here for the hardware or gaming community; they are here for the wallets tied to those. Don't get the difference between an acquaintance and a friend twisted.
 
It just feels like they are catering far too heavily to the elite 1% with these prices and pricing middle-class consumers out of these products, though.

Am I the only one who feels Nvidia has gotten far too GREEDY over the past couple of product cycles?
 
It just feels like they are catering far too heavily to the elite 1% with these prices and pricing middle-class consumers out of these products, though.

Am I the only one who feels Nvidia has gotten far too GREEDY over the past couple of product cycles?
I was a top-card purchaser before, but now I'm just buying the performance level that matches what I'm willing to pay. I won't be paying for the halo cards of either brand unless they are in the $800-900 range. I know they won't be, so I will get whatever performs better in that range. There will need to be a price reset on these, but that won't happen until performance hits a point where we are talking under 10% top to bottom, or if they are attacked on all fronts, even if one company attacks the high end while the other attacks the lower price points.
 
It just feels like they are catering far too heavily to the elite 1% with these prices and pricing middle-class consumers out of these products, though.

Am I the only one who feels Nvidia has gotten far too GREEDY over the past couple of product cycles?
This doesn't make any sense.

Are you saying that Nvidia has stopped making GPUs at lower price points, or just that you can't afford the level of performance that you think you deserve?

Please just take it to r/amd.
 
The price will be way too high for what it is, is my prediction. If these prices hold, I find it more likely that I will stick with my 1080, as it still works just fine for me. The 290X still remains the most I ever paid for a video card.
 
It just feels like they are catering far too heavily to the elite 1% with these prices and pricing middle-class consumers out of these products, though.

Am I the only one who feels Nvidia has gotten far too GREEDY over the past couple of product cycles?

What? Nvidia makes cards for every budget. Your entitlement is showing.
 
It just feels like they are catering far too heavily to the elite 1% with these prices and pricing middle-class consumers out of these products, though.

Am I the only one who feels Nvidia has gotten far too GREEDY over the past couple of product cycles?
These are luxury time-wasting products. Not food and water.

The fight is not between Nvidia or AMD, or between Nvidia/AMD and consumers. If there's a beef then it's between consumers and the shareholders of these companies, whom Nvidia/AMD are legally obligated to create the most profit and value for.

Yes, there's probably something to be said for these companies choosing to go public in the first place and putting themselves in this predicament. But you see, this is the American way.
 
If you can’t afford something, get something you can afford or get a better job. Every day of our lives we don’t buy something because we can’t afford it.


As much as I hate wccftech, I'm inclined to believe those prices (for 12gb). We're close enough now that the ballpark will be known. It's not like Nvidia is Apple, with end-to-end ownership or direction of the supply chain and MSRP. FE is a wildcard, but there are a ton of points stuff will start leaking from now. It's impossible to keep a lid on things past a certain timeframe. Unless you're Apple, and even they struggle these days.
 
And the highest polling number got it correct. Assuming Jensen just gave us the price for his FE models:
 
So way more CUDA cores than everyone thought, I guess? Now it's starting to make sense. Rumors were all shit. Looks like Nvidia planted them lol. The 3080 has 8704 CUDA cores.

Also, the 3080 is a steal at this point. I mean, a 6 TFLOPS difference and more memory on the 3090, and you pay more than double the cost?

Might be better to wait for the 20GB models though, if they are coming out.
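For what it's worth, the "steal" framing checks out on paper. A quick back-of-envelope using the announced FE prices and the peak FP32 figures from the announcement (29.8 TFLOPS for the 3080, 35.6 for the 3090; these are quoted peaks, not measured results):

```python
# Dollars per peak FP32 TFLOP, using announced FE prices and the
# TFLOPS figures from the announcement (assumptions, not benchmarks).
cards = {
    "RTX 3080": {"price": 699, "tflops": 29.8},
    "RTX 3090": {"price": 1499, "tflops": 35.6},
}
for name, c in cards.items():
    print(name, round(c["price"] / c["tflops"], 1), "$/TFLOP")
# RTX 3080 comes out around 23.5 $/TFLOP, the RTX 3090 around 42.1
```

On paper the 3090 costs nearly double per TFLOP, so the extra memory is most of what you're paying for.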
 
Okay now. Looking at Nvidia's website, they might just be basing performance numbers on RTX and DLSS being on.
Borderlands 3, Minecraft, and Control are all showing double the performance going from the 2080 to the 3080 with RTX and DLSS on at 4K.
 
They might just be basing performance numbers on RTX and DLSS being on.
If they're talking about a game that supports at least DLSS, do we have a reason to test without it, other than answering academic questions?

I still have some reservations with respect to DLSS showing false detail where there really isn't any, or missing small things that would otherwise be seen, but I haven't seen any talk of analyzing for that stuff, especially in competitive environments.
 
So way more CUDA cores than everyone thought, I guess? Now it's starting to make sense. Rumors were all shit. Looks like Nvidia planted them lol. The 3080 has 8704 CUDA cores.

Also, the 3080 is a steal at this point. I mean, a 6 TFLOPS difference and more memory on the 3090, and you pay more than double the cost?

Might be better to wait for the 20GB models though, if they are coming out.

It looks more like the SMs each have 2x FP32 units, and until very recently Nvidia was using half the number they are now reporting in the slides leaked from partners, including one that was live on a partner site in the last 24 hours. So this change to reporting double the CUDA cores is a VERY recent one.
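A rough sketch of the arithmetic, assuming the widely reported 68-SM configuration for the 3080 (an assumption; no whitepaper is out yet):

```python
# How the same chip yields 4352 or 8704 "CUDA cores" depending on
# whether you count one or two FP32 units per SM.
sms = 68                  # rumored SM count for the RTX 3080 (assumption)
fp32_old_counting = 64    # Turing-style tally per SM
fp32_new_counting = 128   # two FP32 datapaths per SM, as now reported

print(sms * fp32_old_counting)  # 4352, the figure in the earlier leaks
print(sms * fp32_new_counting)  # 8704, the figure Nvidia now quotes
```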
 
If they're talking about a game that supports at least DLSS, do we have a reason to test without it, other than answering academic questions?

I still have some reservations with respect to DLSS showing false detail where there really isn't any, or missing small things that would otherwise be seen, but I haven't seen any talk of analyzing for that stuff, especially in competitive environments.

Dude. Yes there is. I wanna see true rasterization performance. Unless you are going to guarantee me DLSS in every game, there is no point. Until then I don't wanna take an overall number with RTX and DLSS on.
 
Starting at $1499: is that the Nvidia price, or some price you will never see, or some low-quality version that an AIB will sell in very limited quantities?

That's the NVidia FE price on their website next to the "Notify Me" button. It looks like there isn't an "FE Tax" this time:
https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3090/
GEFORCE RTX 3090
THE BFGPU
The GeForce RTX™ 3090 is a big ferocious GPU (BFGPU) with TITAN class performance. It’s powered by Ampere—NVIDIA’s 2nd gen RTX architecture—doubling down on ray tracing and AI performance with enhanced Ray Tracing (RT) Cores, Tensor Cores, and new streaming multiprocessors. Plus, it features a staggering 24 GB of G6X memory, all to deliver the ultimate gaming experience.

STARTING AT $1,499.00
Founders Edition

$1,499.00
 
so where's the traversal processor? :D:D:rolleyes::rolleyes:

BTW if the 1.9 performance/watt figure was real, wouldn't it make the 3070 almost twice as fast as the RTX2080Ti? Not that I'm not impressed by the 3070 being faster/cooler/cheaper than the RTX2080Ti.
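Quick sanity check on the "almost twice as fast" reading, assuming the announced board powers (250 W for the 2080 Ti, 220 W for the 3070; exactly which cards the 1.9x compares is an assumption here):

```python
# Perf/watt gain times the power ratio gives the implied relative performance.
perf_per_watt_gain = 1.9
tdp_2080ti = 250  # watts, announced
tdp_3070 = 220    # watts, announced

relative_perf = perf_per_watt_gain * tdp_3070 / tdp_2080ti
print(round(relative_perf, 2))  # ~1.67x, a bit short of "almost twice"
```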
 
If they're talking about a game that supports at least DLSS, do we have a reason to test without it, other than answering the academic questions?

I still have some reservations with respect to DLSS showing false detail where it really isn't, or missing small things that would otherwise be seen, but I haven't seen any talk of analyzing for that stuff especially in competitive environments.
I'm more than happy with DLSS adding detail that's not shown even at native resolution.
BTW I was expecting DLSS 3.0, maybe next year...
 
so where's the traversal processor? :D:D:rolleyes::rolleyes:

BTW if the 1.9 performance/watt figure was real, wouldn't it make the 3070 almost twice as fast as the RTX2080Ti? Not that I'm not impressed by the 3070 being faster/cooler/cheaper than the RTX2080Ti.

Lots of unanswered questions right now. We should hear more leaks soon. Looks like they might be mentioning performance numbers with RTX and DLSS on. They have it on their website for 3 games showing the 3080 vs the 2080 at twice the performance with RTX and DLSS on. Probably the same for the 3070.

If performance is truly double and they are not just quoting RTX-and-DLSS-on performance, then the 3080 20GB as rumored might be a buy for me. The 3090 has 24GB but the performance is not worth the price.
 
I'm more than happy with DLSS adding detail that's not shown even at native resolution.
BTW I was expecting DLSS 3.0, maybe next year...

You were making fun of the Traversal Processor, but you believed the "DLSS 3.0 works with every game" nonsense?
 
BTW, don't expect 20GB versions of 3080 cards anytime soon. Micron only has 8Gb GDDR6X chips for now. Sure, they can add twice the chips (like the 3090), but that would increase the cost a lot.
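The chip math behind that, assuming 8 Gb (1 GB) per GDDR6X chip; the board layouts here are inferred from the announced capacities, not confirmed:

```python
# GDDR6X chip counts implied by each capacity at 8 Gb (= 1 GB) per chip.
gb_per_chip = 1  # 8 Gb density -> 1 GB per chip

print(10 // gb_per_chip)  # 3080 10 GB: 10 chips, one per 32-bit channel
print(24 // gb_per_chip)  # 3090 24 GB: 24 chips, clamshell (two per channel)
print(20 // gb_per_chip)  # a 20 GB 3080 would need 20 chips, double the 10 GB board
```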
 
so where's the traversal processor? :D:D:rolleyes::rolleyes:

BTW if the 1.9 performance/watt figure was real, wouldn't it make the 3070 almost twice as fast as the RTX2080Ti? Not that I'm not impressed by the 3070 being faster/cooler/cheaper than the RTX2080Ti.
It's there, built inside of the GPU :D
 
You were making fun of the Traversal Processor, but you believed the "DLSS 3.0 works with every game" nonsense?
Every game that works with TAA? Well yeah. DLSS is a work in progress; I didn't think DLSS 2.0 would be this good, and here we are.
 
I'm more than happy with DLSS adding detail that's not shown even at native resolution.
Really depends on whether that detail even fits.

Think of something like a battle royale shooter, where either
  • DLSS adds detail for an enemy / item that isn't really there, due to cuing off of some partial detail from the low-res sample, and this provides an opportunity for the player to make a costly mistake
  • DLSS doesn't pick up on some low-res micro-detail and the player misses something important like an enemy or item, also providing an opportunity for a costly mistake
I'm not saying that this is or isn't happening; I'm saying that I haven't seen how they're addressing the problem. This is something that becomes a problem with computational photography, for example, when 'ground truth' is required for a decision. Obviously much less critical for gaming, but it's something that I think Nvidia and participating developers need to resolutely address.
 


Well, that's not bad. So it looks like Big Navi is probably going to target around the 3080, given the specs that are rumored. Unless of course they can pack more cores in there.

About 40-50% faster than the 2080 Ti in some games he mentioned.
I might just pick one up for the price. The 3090 is just not much of a performance increase for the price.
 
Really depends on whether that detail even fits.

Think of something like a battle royale shooter, where either
  • DLSS adds detail for an enemy / item that isn't really there, due to cuing off of some partial detail from the low-res sample, and this provides an opportunity for the player to make a costly mistake
  • DLSS doesn't pick up on some low-res micro-detail and the player misses something important like an enemy or item, also providing an opportunity for a costly mistake
I'm not saying that this is or isn't happening; I'm saying that I haven't seen how they're addressing the problem. This is something that becomes a problem with computational photography, for example, when 'ground truth' is required for a decision. Obviously much less critical for gaming, but it's something that I think Nvidia and participating developers need to resolutely address.
So they should address a problem you don't even know exists? There's proof that DLSS 2.0 can have even better quality than native res; there are several examples with Control and Death Stranding. (Jensen even showed it with Death Stranding during the stream.)

I really didn't like DLSS at all when it was introduced and I really thought it was doomed to fail, it was blurry as hell and it looked worse in many cases than regular upscaling. But DLSS 2.0 proved me wrong. Faster performance, better IQ and free AA. What's not to like?
 
I said $1500...guess I won?

I won't be getting one, but still.

I'm probably going 3080 after all. I don't believe the price increase/performance is there to justify the 3090 over the 3080. If I still played games like I used to I would buy the 3090, but I don't play enough now to justify the cost.
 


Well, that's not bad. So it looks like Big Navi is probably going to target around the 3080, given the specs that are rumored. Unless of course they can pack more cores in there.

About 40-50% faster than the 2080 Ti in some games he mentioned.
I might just pick one up for the price. The 3090 is just not much of a performance increase for the price.

And that's with preliminary drivers I guess, as nvidia hasn't provided even AIB partners with full drivers.
 
So they should address a problem you don't even know exists? There's proof that DLSS 2.0 can have even better quality than native res; there are several examples with Control and Death Stranding. (Jensen even showed it with Death Stranding during the stream.)
The potential for the problem should be addressed, yes. It's a problem with image reconstruction regardless of domain.

Note that I'm interested in Nvidia and developers explaining how they're handling the potential for false detail causing problems.
 
Nobody who cares about the price wonders what the cost basis for raw materials is; you are paying for the development costs. I bet they make each of them for 1/3 of the price.
 
And that's with preliminary drivers I guess, as nvidia hasn't provided even AIB partners with full drivers.

I don't really expect there to be much difference. Nvidia wouldn't even have allowed him to do that if they wanted to gimp it; it would look really bad. I am sure he has approved drivers.
 
Nobody who cares about the price wonders what the cost basis for raw materials is; you are paying for the development costs. I bet they make each of them for 1/3 of the price.
It is shocking how people fail to understand this. Also the software cost: drivers, developer support, and all the fancy new features don't come free.
 
I'm probably going 3080 after all. I don't believe the price increase/performance is there to justify the 3090 over the 3080. If I still played games like I used to I would buy the 3090, but I don't play enough now to justify the cost.

For the difference in cost I can upgrade everything else in my rig. Which I was going to do anyway, but $1500 on a card is not worth it to me.
 
I don't really expect there to be much difference. Nvidia wouldn't even have allowed him to do that if they wanted to gimp it; it would look really bad. I am sure he has approved drivers.
I find it weird they are letting DF do this. There are many bigger YouTubers out there.
 