Guess the price of 24GB RTX 3090 FE

Guess the Price of 24GB RTX 3090


  • Total voters: 272
  • Poll closed.
The high number of skeptics saying over $2000 may be influenced by a few on this site who kept posting charts, ad nauseam, of how smaller nodes will hugely increase prices. If true, then everything Nvidia has this time around will be much higher in cost. Add in the increased cost of GDDR6X, more complex circuit boards for the added power capability and faster (if not more) memory, and more expensive coolers for the extra heat. Unless Nvidia decides to take a smaller profit margin on the higher-end cards, I expect higher prices, just marketed in new and even more irritating ways.

Console pricing, plus AMD pricing and availability (if AMD is effectively competitive), could reduce prices, but I doubt AMD will want to significantly undercut Nvidia in $/performance, however measured, for a number of reasons. Mostly availability, i.e. the number of GPUs they can get built. AMD's quietness, waiting to see what Nvidia has to offer, will I'd guess give them a good chance to properly price their next-generation GPUs to maximize profits (that is their job).
 
I doubt AMD will want to significantly undercut Nvidia $/performance
Agreed. This has been the assumption for many years, but AMD seem to have abandoned the strategy. And why shouldn't they - being the $200 leader since the 480 days hasn't brought any meaningful improvement to their market share. If they're not going to sell significantly more for being the better-value (subjective, I know) $/perf option, they might as well make as much money on each card as the competition does.
 
The high number of skeptics saying over $2000 may be influenced by a few on this site who kept posting charts, ad nauseam, of how smaller nodes will hugely increase prices. If true, then everything Nvidia has this time around will be much higher in cost. Add in the increased cost of GDDR6X, more complex circuit boards for the added power capability and faster (if not more) memory, and more expensive coolers for the extra heat. Unless Nvidia decides to take a smaller profit margin on the higher-end cards, I expect higher prices, just marketed in new and even more irritating ways.

Pessimist might be a better word than skeptic. But from what I have seen, a lot of the people who think the 3090 will cost more than $2000 think 3090 is just the new name for the Titan RTX.

I don't. IMO, 3090 clearly signifies a gaming performance tier above the 3080, so I think it will have sub-$2K pricing. But adding an x90 tier also signifies that the price moves to a new tier, which I expect will be $1500 or more.
 
Big memory = big money. $1500 for the 3090 is probably the best we can hope for.


[Image: xbox-memory-cost.png]
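A back-of-the-envelope view of "big memory = big money": the memory bill of materials scales roughly linearly with capacity. The $/GB figures below are illustrative placeholders only - actual GDDR6X contract pricing is not public.

```python
def memory_cost(capacity_gb, price_per_gb):
    """Rough memory BOM estimate: capacity times unit price."""
    return capacity_gb * price_per_gb

# Illustrative $/GB values only; real GDDR6X contract prices are not public.
for price in (8.0, 12.0, 16.0):
    print(f"24 GB at ${price:.0f}/GB -> ${memory_cost(24, price):.0f} in memory alone")
```

Even at modest assumed unit prices, 24GB of bleeding-edge memory is a large chunk of the board cost before the GPU, PCB, VRM, and cooler are counted.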
 
Pessimist might be a better word than skeptic. But from what I have seen, a lot of the people who think the 3090 will cost more than $2000 think 3090 is just the new name for the Titan RTX.

I don't. IMO, 3090 clearly signifies a gaming performance tier above the 3080, so I think it will have sub-$2K pricing. But adding an x90 tier also signifies that the price moves to a new tier, which I expect will be $1500 or more.
Well, since Nvidia already made the price move with Turing, without a significant node change, maybe they were testing the waters to see if expensive cards would sell well enough for them to proceed with Ampere (best-case scenario). A lot depends on the quantity Nvidia figures they can produce and sell: if they can produce as many cards as they want, with economies of scale, lower prices may be the better route for them. If supply will be tight, then I expect higher prices. Unless AMD has some hidden secret sauce, I don't think AMD has much influence over Nvidia pricing unless they come out strong on perf/$. Nvidia looks like they are setting themselves up for Super options if need be; AMD's yearly updates may push the issue sooner rather than later. The PC gaming market has matured a lot - long-term players, media support like YouTube channels with millions of viewers, esports, general awareness, and so on - so I think Nvidia was very smart in broadening its range of products.
 
LOL:

https://wccftech.com/nvidia-rtx-300...or-799-rtx-3070-for-599-and-rtx-3060-for-399/

Long story short:
RTX 3090 for $1399
RTX 3080 for $799
RTX 3070 for $599
RTX 3060 for $399

Not that I trust WCCFTech, but if these prices materialize and we're looking at a minimum of $400 for an RTX card... yeah, I'll keep passing on Nvidia and keep my 1060 for now, and see what AMD reveals at the end of the year in the $200-300 range. Although it'd make sense, following Nvidia's potential pricing scheme, to release a 3050 for $199, which I'd expect to be similar in performance to the 2060. That I'd be OK with. $400 being the minimum price of entry = NOPE.
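For what it's worth, the step-ups between those rumored tiers show where the big premium would sit. A minimal sketch using only the WCCFTech numbers quoted above:

```python
# Rumored prices from the WCCFTech article quoted above (unconfirmed).
prices = {"3060": 399, "3070": 599, "3080": 799, "3090": 1399}

# Walk adjacent tiers and print the absolute and relative jump.
tiers = list(prices.items())
for (lo_name, lo), (hi_name, hi) in zip(tiers, tiers[1:]):
    jump = hi - lo
    pct = (hi / lo - 1) * 100
    print(f"{lo_name} -> {hi_name}: +${jump} (+{pct:.0f}%)")
```

If those numbers hold, the 3080-to-3090 jump (+$600, roughly +75%) dwarfs the lower steps, which fits the view elsewhere in the thread that the top tier is where the gouging happens.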
 
$400 to buy with an ~all-new~ system, sure.

But if I'm going to just upgrade, I wouldn't want to spend more than $300.
 
if I'm going to just upgrade, I wouldn't want to spend more than $300.
We're in the same boat. Hopefully $300 doesn't relegate us to a GTX range... because I'll jump to a DXR-enabled AMD card at that price bracket faster than you can say Radeon. DLSS is very tempting, but other than that, I kinda miss my 480. I had to sell it during the crypto bubble; it would have been stupid not to make money on the card and buy a cheap 1060 instead. But still, I'd be happy to switch sides again, for the right DXR performance at the right price.
 
The $1,400 price is believable especially given that they have high quality photographs of the card to back it up.

And the price difference between the 3080 and 3090 is typical Nvidia. They heavily gouge the top hoping people will just say fuck it and spend the cash.

Based on the naming it's so obvious that there's going to be a 3090ti within months. Might be worth waiting.
 
It's not news anymore. I think Nvidia has a price reduction built in already: gouge the early adopters, who will buy no matter what and are loyal to Nvidia. Then AMD comes out with Big Navi; if it's competitive, drop prices $100... or not, I guess, lol.

It feels like every generation Nvidia is adding $100 on the mid-high to high end, and $200 on the Ti and its replacement.
 
Glad MS FS2020 just came out. If the halo product substantially outperforms the 2080 Ti at 4k in 'the new Crysis,' then it might very well justify a very high price tag, because it will be baller for 5+ years. I just don't see 8k replacing 4k for gaming until later this decade.
 
Glad MS FS2020 just came out. If the halo product substantially outperforms the 2080 Ti at 4k in 'the new Crysis,' then it might very well justify a very high price tag, because it will be baller for 5+ years. I just don't see 8k replacing 4k for gaming until later this decade.
Yep... I always adopted higher resolutions early (1680x1050 in 2004, 2560x1600 in 2008, 1440p 120Hz in 2010 or so, 4K60 in 2014, etc.), but I'm more interested in 4K120 on an OLED now than in pushing for 8K 60Hz. The quality and motion clarity, I feel, outstrip the need for more pixels right now. I doubt I'll go 8K until it's available at high refresh late this decade.
 
At 24GB it’s going to be competing against the Titan and I would expect it to have the Titan RTX price tag + inflation + new feature set so at least $1,800 USD.
 
Based on the naming it's so obvious that there's going to be a 3090ti within months. Might be worth waiting.

Wouldn't doubt it. Word is these are running on a Samsung manufacturing process which is significantly less efficient than TSMC's. Would not be surprised if a TSMC-fabbed 3090 Ti comes out later.
 
Wouldn't doubt it. Word is these are running on a Samsung manufacturing process which is significantly less efficient than TSMC's. Would not be surprised if a TSMC-fabbed 3090 Ti comes out later.

I highly doubt it. If it's called 3090, it's likely the top end. They usually tend to do an xx80 Ti, so you are more likely to see a 3080 Ti and a 3070 Ti as the refresh, and the 3090 will remain top end.
 
I highly doubt it. If it's called 3090, it's likely the top end. They usually tend to do an xx80 Ti, so you are more likely to see a 3080 Ti and a 3070 Ti as the refresh, and the 3090 will remain top end.

I don't follow your logic. Ti is a top end version of any card. Why can't there be a 3090Ti?
 
Any sense to change from 2080 Ti Waterforce to 3090 ?:) hehe
i am on 1440P
 
Any sense to change from 2080 Ti Waterforce to 3090 ?:) hehe
i am on 1440P

Only if you donate the Waterforce to me. =p

Otherwise, I would recommend you wait until you can either buy used, or until the 4000 series comes out in a few years.

Unless you can sell the Waterforce for a very good price. Which probably isn't happening at this point because no one is going to buy something so high-end, but so last-gen, when they could just buy the newest card instead, even if it's a 3080 or something.

The question is always: What is the most demanding game you want to play and at what framerate?
Then the question becomes: So do you want to game now, or game later?

You'll play all your favorites eventually, right? So when it comes down to it, it's sometimes more financially smart to wait a little.
 
LOL:

https://wccftech.com/nvidia-rtx-300...or-799-rtx-3070-for-599-and-rtx-3060-for-399/

Long story short:
RTX 3090 for $1399
RTX 3080 for $799
RTX 3070 for $599
RTX 3060 for $399

Not that I trust WCCFTech, but if these prices materialize and we're looking at a minimum of $400 for an RTX card... yeah, I'll keep passing on Nvidia and keep my 1060 for now, and see what AMD reveals at the end of the year in the $200-300 range. Although it'd make sense, following Nvidia's potential pricing scheme, to release a 3050 for $199, which I'd expect to be similar in performance to the 2060. That I'd be OK with. $400 being the minimum price of entry = NOPE.

Wasn't that $1399 rumor only for the 12GB version of the 3090 though? 24GB would be AIB only and that's where you'd see the larger price tag (think Colorful's $1999 3090).

I agree. I wonder what the sub $300 market is going to look like. Obviously we won't get normalized pricing in that segment until probably Q2'21 if there's a Q1'21 launch for those parts.
 
I don't follow your logic. Ti is a top end version of any card. Why can't there be a 3090Ti?

What I should have said is that it's not possible as-is. The card is already maxed out on power. They would have to shrink this to something like 5nm first. Maybe in a little over a year you will see this as a refresh, if they are already working on it.
 
What I should have said is that it's not possible as-is. The card is already maxed out on power. They would have to shrink this to something like 5nm first. Maybe in a little over a year you will see this as a refresh, if they are already working on it.

Did you actually read my post before you responded? I addressed exactly that. TSMC's node is far more efficient than Samsung's, and word is these initial 3000 series cards are using Samsung.
 
:rolleyes: FFS, clickbait is not a source of facts.

Believe what you want. He's been spot-on leading up to everything so far. We can bookmark this thread and see who says "I told you so" when the mid-cycle refresh happens. If you think 400 watts for 30-40% faster than a 2080 Ti is a TSMC node, prepare to be surprised.

Why isn't RDNA2 rumored to be anywhere near this power hungry while still being quite competitive performance-wise? Could it possibly be that they're on TSMC?
 
Believe what you want. He's been spot-on leading up to everything so far. We can bookmark this thread and see who says "I told you so" when the mid-cycle refresh happens. If you think 400 watts for 30-40% faster than a 2080 Ti is a TSMC node, prepare to be surprised.

Why isn't RDNA2 rumored to be anywhere near this power hungry while still being quite competitive performance-wise? Could it possibly be that they're on TSMC?

We don't know the performance of the 3090, nor of Big Navi. If you want to go by rumors, Big Navi only competes with the 3080, not the 3090.
 
We don't know the performance of the 3090, nor of Big Navi. If you want to go by rumors, Big Navi only competes with the 3080, not the 3090.

Do you think when Ampere drops it will be using TSMC or Samsung? And do you think one is more efficient than the other?

I know you don’t know, but there’s enough rumors and evidence out there for you to make an educated guess.
 
Do you think when Ampere drops it will be using TSMC or Samsung? And do you think one is more efficient than the other?

I know you don’t know, but there’s enough rumors and evidence out there for you to make an educated guess.

It could go either way. The evidence for Samsung is that the same Twitter accounts that said it also accurately revealed the 3090 name, which was backed up by a Micron document. If it does use Samsung, it won't be without garnering some advantage from it. They aren't going to shoot themselves in the foot, and they wouldn't be forced out of TSMC.

As far as MLID being accurate for the rumors he sourced (not just repeating those from the Twitter accounts): the one unique contribution I remember was DLSS 3.0 automatically working with every game that has TAA (which is every game these days), and a claim that it would be forced on every game. :rolleyes:

I would say that is absolute nonsense. So his unique contributions look like BS to me.
 
It could go either way. The evidence for Samsung is that the same Twitter accounts that said it also accurately revealed the 3090 name, which was backed up by a Micron document. If it does use Samsung, it won't be without garnering some advantage from it. They aren't going to shoot themselves in the foot, and they wouldn't be forced out of TSMC.

As far as MLID being accurate for the rumors he sourced (not just repeating those from the Twitter accounts): the one unique contribution I remember was DLSS 3.0 automatically working with every game that has TAA (which is every game these days), and a claim that it would be forced on every game. :rolleyes:

I would say that is absolute nonsense. So his unique contributions look like BS to me.

It uses Samsung because they couldn't get TSMC for the price they wanted, especially with TSMC running at capacity. So yes, they got something out of it: an Ampere card that isn't delayed. They are not going to gain any performance from it. It's using 60% more power for what looks like it will amount to 30-40% more performance than a 2080 Ti. If you think that's what a node advantage looks like, you haven't been paying attention.
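A quick way to sanity-check that claim: relative perf/watt is (1 + performance gain) / (1 + power gain). A minimal sketch using the rumored figures from the post above (rumors, not measurements):

```python
# Sanity check on the rumored figures: ~60% more power for
# ~30-40% more performance vs. a 2080 Ti.
def perf_per_watt_ratio(perf_gain, power_gain):
    """Perf/watt relative to the older card, given fractional gains."""
    return (1 + perf_gain) / (1 + power_gain)

for perf in (0.30, 0.40):
    ratio = perf_per_watt_ratio(perf, 0.60)
    print(f"+{perf:.0%} perf at +60% power -> {ratio:.2f}x perf/watt")
```

Both rumored cases land below 1.0x, i.e. a perf/watt regression versus the 2080 Ti, which is the argument being made here that this doesn't look like a full node advantage.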
 
It uses Samsung because they couldn't get TSMC for the price they wanted, especially with TSMC running at capacity. So yes, they got something out of it: an Ampere card that isn't delayed. They are not going to gain any performance from it. It's using 60% more power for what looks like it will amount to 30-40% more performance than a 2080 Ti. If you think that's what a node advantage looks like, you haven't been paying attention.

Again, that is just pulling numbers from someone's ass. No one knows what the performance or power numbers are.

If they wanted TSMC they would have TSMC, so if they are using Samsung it won't be that far behind, because they aren't going to sabotage all the work they did.
 
It uses Samsung because they couldn't get TSMC for the price they wanted, especially with TSMC running at capacity.

Yeah I think we can accept Ampere is being built at Samsung. We are a week away from announcement and there are chips and cards in the wild. There have been zero rumors of the chips being built at TSMC.

Whatever negotiating Nvidia did with TSMC happened at least 3-4 years ago as the chip design had to start around then. If Nvidia had any regrets after seeing RDNA2 on TSMC 7nm it would have been far too late anyway. If I had to guess I would say Nvidia went with a relatively mature Samsung process before knowing that 7nm would turn out as well as it did.
 
Yeah I think we can accept Ampere is being built at Samsung. We are a week away from announcement and there are chips and cards in the wild. There have been zero rumors of the chips being built at TSMC.

Whatever negotiating Nvidia did with TSMC happened at least 3-4 years ago as the chip design had to start around then. If Nvidia had any regrets after seeing RDNA2 on TSMC 7nm it would have been far too late anyway. If I had to guess I would say Nvidia went with a relatively mature Samsung process before knowing that 7nm would turn out as well as it did.

GA100 is on TSMC, which is a counter to the rumors. If you actually listen to what Nvidia says about fabs, it's that they usually work with and qualify both Samsung and TSMC each generation, and decide from there.

This is not a case of Nvidia failing to book capacity. If they go with Samsung, it will be for sound reasons.

If there is actually a very tight supply constraint and a very significant price difference but weaker performance for Samsung, then it would be reasonable to expect they would build GA102 parts on TSMC and the lower-end, higher-volume ones on Samsung.
 
GA100 is on TSMC which is a counter to the rumors.

Not really. GA100 doesnt have anything to do with where GA102 is fabbed.

This is not a case of NVidia failing to book capacity. If they go with Samsung it will be for sound reasons.

I’m sure they had their reasons but Nvidia can’t predict the future. They couldn’t know for sure what would be happening in 2020.

If there is actually a very tight supply constraint and a very significant price difference but weaker performance for Samsung, then it would be reasonable to expect they would build GA102 parts on TSMC and the lower-end, higher-volume ones on Samsung.

That assumes Nvidia knew what 7nm supply and performance would look like today when they started designing GA102 years ago. I’m pretty sure they didn’t.
 
GA100 is on TSMC, which is a counter to the rumors. If you actually listen to what Nvidia says about fabs, it's that they usually work with and qualify both Samsung and TSMC each generation, and decide from there.

This is not a case of Nvidia failing to book capacity. If they go with Samsung, it will be for sound reasons.

If there is actually a very tight supply constraint and a very significant price difference but weaker performance for Samsung, then it would be reasonable to expect they would build GA102 parts on TSMC and the lower-end, higher-volume ones on Samsung.

There's a reason their non-consumer part is built on the more advanced node from TSMC. I know you were trying to prove a point here, but you probably didn't realize you'd be proving the other guy's point.
 
It's not news anymore. I think Nvidia has a price reduction built in already: gouge the early adopters, who will buy no matter what and are loyal to Nvidia. Then AMD comes out with Big Navi; if it's competitive, drop prices $100... or not, I guess, lol.

It feels like every generation Nvidia is adding $100 on the mid-high to high end, and $200 on the Ti and its replacement.

This is precisely what I think will happen. Nvidia knows they have a captive market in the ultra tier who will pay through the nose, no questions. They're going to push the envelope on price there.

For the high end and high-mid, they have a small window when they can charge a premium before big Navi is available. Expect an $800-900 3080 and a $600 3070 on release, then reductions early next year.
 
Not really. GA100 doesn't have anything to do with where GA102 is fabbed.

Sure it does. It provides valuable experience with a new fab process, which makes it much more likely that they are planning additional parts on that process. The same conditions that influence where GA100 gets fabbed also influence where GA102 gets fabbed. In the absence of credible rumors, it would only be natural to assume that GA102 would follow GA100 at TSMC.

It's only a rumor from a Twitter leaker who has got some other details correct that muddies the waters.

That assumes Nvidia knew what 7nm supply and performance would look like today when they started designing GA102 years ago. I’m pretty sure they didn’t.

They would have been operating under the same conditions when they started designing GA100.
 
They would have been operating under the same conditions when they started designing GA100.

The economics of GA100 are very different. Nvidia is likely less cost conscious there and far more willing to pony up the cash for the best process.

Either way GA102 is on Samsung. No real reason to doubt that now.
 
The economics of GA100 are very different. Nvidia is likely less cost conscious there and far more willing to pony up the cash for the best process.

Either way GA102 is on Samsung. No real reason to doubt that now.

The economics of GA102 are not going to be very fab-cost sensitive, with the RTX 3090 rumored to cost anywhere between $1400 and $2000.

Sure there is reason to doubt. It's merely a rumor right now. There is no confirmation from anywhere.
 
These low prices for the 3080 are wishful thinking, I bet.

My guess is the 3090 will be $1499 and the 3080 will be $999. If they make a 3080 Ti and the 3090 is a Titan, expect it to be $1299.

All those lower-end cards like the 1660s they started creating were there to let them spread the price range more, so they can inflate the top end.

Could I be wrong? Sure, but I doubt it. They saw people would still buy the 2080 Ti at $1200 and saw the $$$.

I also bet they try to stagger pricing and performance so they don't undercut their own sales with people buying 2080 Tis - the way buyers used 1080 Tis and 980 Tis to undercut the cost of the new cards. Expect the closest previous-gen equivalent to be priced similarly enough to give you the incentive to upgrade, but not a cheaper route to equal performance.
 