Guess the price of 24GB RTX 3090 FE

Nvidia's price GOUGING is NEVER gonna stop until AMD can compete.

Here are my guesses for Nvidia's prices.

3090 24GB $2000
3090 12GB $1600
3080 12GB $1200
3070 8GB $800
3060 8GB $400

6 months later.

RTX Titan 3000 series $3000

The people who bought the 24GB 3090s will be pissed, but they will just sell off their 3090s and pony up for the Titan.

If Nvidia doesn't go into full price-GOUGE mode and only takes one kidney instead of both of your kidneys:

3090 24GB $1500
3090 12GB $1200
3080 12GB $800
3070 8GB $600
3060 8GB $400

The Titan will be released 6 months later for between $2400 and $3000, and it will still be bought out by the e-peen must-have crowd.
 
The economics of GA102 are not going to be very sensitive to fab cost, with the RTX 3090 rumored to cost anywhere between $1400 and $2000.
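A rough per-die cost sketch makes the point concrete (a minimal sketch in Python; every figure is an illustrative assumption, not a known NVidia/TSMC/Samsung number):

```python
import math

# Rough GA102 die-cost sketch. Every figure below is an illustrative
# assumption, not a known NVidia/TSMC/Samsung number.
wafer_cost_usd = 6000    # assumed price of one 300mm 8nm-class wafer
die_area_mm2 = 630       # assumed GA102-class die area
wafer_diameter_mm = 300
yield_fraction = 0.6     # assumed fraction of dies that are usable

# Classic gross-dies-per-wafer approximation: wafer area over die area,
# minus an edge-loss term.
gross_dies = (math.pi * (wafer_diameter_mm / 2) ** 2 / die_area_mm2
              - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))
good_dies = gross_dies * yield_fraction
cost_per_good_die = wafer_cost_usd / good_dies

print(f"~{gross_dies:.0f} gross dies/wafer, ~${cost_per_good_die:.0f} per good die")
# -> ~86 gross dies/wafer, ~$117 per good die
```

Even doubling the assumed wafer cost only moves the per-die figure by roughly $100, small next to a $1400-$2000 card; that is the sense in which GA102's economics are not very fab-cost sensitive.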

Sure, there are reasons to doubt. It's merely a rumor right now. There is no confirmation from anywhere.

Heh, a DGX server with 4x A100s goes for a cool $75K. Just a few days left until we (hopefully) get the real lowdown.
 
Even if they use Samsung, we will never really know why.

When the better performing, more efficient TSMC variants come out later, you'll know what we are saying here is accurate. You may not want to admit it, but you'll know.
 
You mean just like the 7nm Turing cards that came out later?

Even if it went that way, we will never know why they chose an alternate fab.

Naturally, some people will jump to believe whatever half-baked theory is rumored, because they can't accept not knowing.
 
When the better performing, more efficient TSMC variants come out later, you'll know what we are saying here is accurate. You may not want to admit it, but you'll know.

The reason in the end is always simple: MONEY. If it's cheaper to produce on Samsung and they end up using it instead of TSMC, well, that's just how corporations think. Real-world decisions are based on the bottom line. $$$.
 
By the gods, I hope better performing, more efficient cards come out 'later'. I'd hate to be stuck at this level of performance forever!

When I say later, I mean a mid-cycle refresh with “Super” and “Ti” cards. If you bother reading what's been discussed leading up to my comment, you'd have some context.
 
You mean just like the 7nm Turing cards that came out later?

Even if it went that way, we will never know why they chose an alternate fab.

Naturally, some people will jump to believe whatever half-baked theory is rumored, because they can't accept not knowing.

Deductive reasoning skills would come in handy. Maybe when you pass by a car wreck where the vehicle is mangled beyond recognition, you need someone who was there to actually tell you “speed was involved”; not everyone needs that level of confirmation. Most can take pieces of evidence and do a good job of putting the puzzle together.
 
When I say later, I mean a mid-cycle refresh with “Super” and “Ti” cards. If you bother reading what's been discussed leading up to my comment, you'd have some context.
I don't doubt what you meant but didn't say; rather, I'm pointing out that 'Super' and 'Ti' cards are not at all a given, and as you should well know, they are marketing tools used when there's sufficient motivation for their employ. Like getting better yields with higher performance.

That may or may not happen for Nvidia here, behind the scenes, so we may just get a new architecture instead, on a new node even. We don't even have a clear picture of Ampere release 1 yet!
 
Did you actually read my post before you responded? I addressed that, and that specifically. TSMC's node is far more efficient than Samsung's, and word is these initial 3000 series cards are using Samsung.
Yes. That's why I was saying they are not going to fab this at TSMC. If they do come out with a refresh, it will likely be on Samsung 5nm next year, if that works out for them. I don't see them going TSMC, because if that was an option they would have done it already.
 
Deductive reasoning skills would come in handy. Maybe when you pass by a car wreck where the vehicle is mangled beyond recognition, you need someone who was there to actually tell you “speed was involved”; not everyone needs that level of confirmation. Most can take pieces of evidence and do a good job of putting the puzzle together.

Bad analogies aren't helpful when you are implying the same nonsense that people argued when Turing was released on 12nm (really 16nm+).

The claim back then was that 12nm Turing would be a "short release" stop-gap part, and that in 6-12 months there would be 7nm parts to replace them.

Now you are essentially implying the same thing: that some kind of mid-cycle refresh will have a different process and fab.

The problem I pointed out in answer to those nonsense claims is the same as with your nonsense this time:

Chip tape-out and masking is now absurdly expensive. It's a massive one-time, up-front cost. They will want to maximize the usage of those tape-outs, to spread the cost over as many chips for as long as possible.

It's much more reasonable to expect that the masks NVidia is using for the new Ampere chips will get a full GPU cycle's usage, and won't be redone at massive expense in some kind of mid-cycle refresh.
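To put that amortization argument in concrete terms (a toy sketch; both numbers are illustrative assumptions, not actual NVidia figures):

```python
# Toy mask-set amortization sketch. Both numbers are illustrative
# assumptions, not actual NVidia figures.
mask_set_cost_usd = 100_000_000  # assumed one-time tape-out + mask cost
chips_per_month = 500_000        # assumed output for one GPU die

def amortized_cost_per_chip(months_in_production: int) -> float:
    """Spread the fixed mask cost over every chip made in that window."""
    return mask_set_cost_usd / (chips_per_month * months_in_production)

print(amortized_cost_per_chip(24))  # full ~2-year cycle: ~$8.33 per chip
print(amortized_cost_per_chip(12))  # retired at a mid-cycle refresh: ~$16.67 per chip
```

Halving the mask set's production life doubles its per-chip burden, and a refresh on a different fab would mean paying that nine-figure up-front cost a second time.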
 
You can regurgitate the word “nonsense” over and over again. This isn't the same thing. NVidia wouldn't push the card to 400 watts unless they needed to in order to retain the performance crown. GA100 is using TSMC, so converting the rest of their consumer lineup to TSMC isn't as much work as you think, since most of that work has already been done. Sounds like you're upset because nVidia is going to be charging a premium for a subpar node on their top-end consumer part.
 
It's exactly the same thing. It was an ignorant suggestion 2 years ago, and it's an ignorant suggestion today for exactly the same reasons.

You need all new masks for each GPU die, to switch fabs.

It would be wasting over a hundred million dollars. Just to redo it early.

Whatever fab they choose, they are in it for a full life-cycle, to make back the tens of millions of dollars in up-front costs for each mask set.
 
So you went from not believing they were using Samsung to begin with to not believing they’ll move away from Samsung. I suppose that’s progress.
 
Reading comprehension problem? I said: "Whatever fab they choose, they are in it for a full life-cycle..." Not Samsung.

It's still up in the air as far as I am concerned. Because I believe evidence, not rumors.
 
The rumors are based on evidence. You think they’re a completely random occurrence?
 
So you just automatically believe all rumors?

Nope, just the ones that are based on a series of events and evidence that makes sense.

I will agree that whether nVidia moves to TSMC or stays on Samsung mid-cycle is a big unknown, but there's plenty of evidence around for me to believe they are in fact on Samsung dies, and that it's nowhere near as efficient as TSMC.
 
What evidence? A bunch of people repeating the same rumor that they are on Samsung 8nm? Do you think more people repeating the same rumor makes it more realistic?
 
What evidence do you have to the contrary? We are less than a week away from the announcement, and since this is clearly the first time you've been around for a GPU launch, I'll let you know what's happened in the past... This close to release, the rumors tend to be pretty accurate.

Would you like to place a friendly bet? I'll PayPal/Venmo you $100 if they are using TSMC... You PayPal/Venmo me $100 if they're on Samsung.

Let's see if you REALLY believe these rumors are as unsubstantiated as you're purporting they are. Enough typing, put your money where your keyboard is.
 
Reversing the burden of proof.
A favoured tactic of religious people and others with no data/evidence... fallacies != arguments...
 
What evidence do you have to the contrary? We are less than a week away from the announcement, and since this is clearly the first time you've been around for a GPU launch, I'll let you know what's happened in the past... This close to release, the rumors tend to be pretty accurate.

Would you like to place a friendly bet? I'll PayPal/Venmo you $100 if they are using TSMC... You PayPal/Venmo me $100 if they're on Samsung.

Let's see if you REALLY believe these rumors are as unsubstantiated as you're purporting they are. Enough typing, put your money where your keyboard is.

I don't need contrary evidence. There is an absence of evidence, which leaves this an open question.

You now seem to be assuming that I am sure it's TSMC. I never indicated that. In fact, I stated my position just a few posts back: "It's still up in the air as far as I am concerned."

Why would I bet on something I think lacks sufficient evidence in either direction? That's a fool's bet either way. Like betting on a coin toss. Just because you are convinced it's heads, and I think it's random, is no reason for me to bet on tails.
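In expected-value terms (a minimal worked example, assuming the 50/50 coin-toss odds and the $100 stakes described above):

$$E[\text{bet}] = 0.5 \cdot (+\$100) + 0.5 \cdot (-\$100) = \$0$$

A bet with zero expected value offers neither side an edge, whatever each party happens to believe.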

I just consider it an unknown. I am comfortable with an unknown until there is sufficient evidence.

It seems some people are so uncomfortable with unknowns they just rush to believe any story that completes a picture. This is why we have so many crazy conspiracy theories that won't go away.

It would be much better if people weren't prone to leaping to conclusions absent real evidence.
 
Ahh ok. So “I have no idea and no opinion on what's going to happen, but I'm gonna argue about what you think is gonna happen.”

So basically you're here to talk a lot and say absolutely nothing. I kinda figured as much based on the “evidence.” I'm not surprised you ignored the bet.
 
Golf clap... Geez, I wouldn't bet on something I claimed was an unknown. Why would anyone take such a stupid bet? If I had claimed that the fab would definitely be TSMC, then that would have been the time to challenge me to a bet. But I didn't say that, so why would I bet on that? A basic logic failure to go with your multiple reading comprehension failures.

I have opinions, and I will take bets on things where I have evidence/knowledge to tilt the playing field sufficiently.

Which fab is being chosen is, as I have said all along, something I don't see there being sufficient evidence for, so naturally I wouldn't bet on it.

If you want to bet on the MLID claim that Ampere comes with DLSS 3.0 that just works with every game with TAA (pretty much every game), then I will bet against that.

Since you believe what MLID says, you naturally must believe this one is true??
 
Let me guess. You fold each hand of poker unless the dealer tells you the next card. Lol

Sorry that making educated guesses is a foreign concept to you. When the cards drop and you get your confirmation, I'll be sure to update this thread with an “I told you so,” and you can respond that it was merely a lucky guess and I had a 50/50 chance of being right. Lmao.
 
We do know the A100 TDP is 400W, with 6912 CUDA cores, 40GB of HBM2, manufactured on TSMC 7nm, boost clock ~1.41GHz.

Will a smaller die with fewer CUDA cores (~78% of A100) and a boost clock ~30% higher also take 350W+? GDDR6X, 12GB to 24GB?
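For what it's worth, a naive back-of-the-envelope scaling of those numbers (a rough sketch assuming power scales linearly with core count and clock; it ignores node, voltage, memory, and architectural differences):

```python
# Naive power scaling from the A100 figures above. Assumes power scales
# linearly with core count and clock; ignores node, voltage, memory type,
# and architectural differences, so treat it as a sanity check only.
a100_tdp_w = 400
core_ratio = 0.78   # rumored consumer die at ~78% of A100's 6912 cores
clock_ratio = 1.30  # rumored boost clock ~30% higher than ~1.41GHz

estimate_w = a100_tdp_w * core_ratio * clock_ratio
print(f"~{estimate_w:.0f} W")  # ~406 W, so a 350W+ figure is at least plausible
```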

There are just too many differences between A100 (more FP units, no RT cores, GDDR6X vs HBM, etc.) for me to guess what actual node is being used.

Then again, who cares if it's Samsung or TSMC? Once the hardware hits, the only things of interest are performance, features, will it melt or blow up, can I benefit from it? And so on.
 
Let me guess. You fold each hand of poker unless the dealer tells you the next card. Lol

Sorry that making educated guesses is a foreign concept to you. When the cards drop and you get your confirmation, I'll be sure to update this thread with an “I told you so,” and you can respond that it was merely a lucky guess and I had a 50/50 chance of being right. Lmao.

So you won't take the DLSS 3.0 bet? :D
 
DLSS isn't a topic I argued with you about. You on the other hand are arguing with me when I say these cards will be on Samsung dies. See the difference?

I know you're the kind of guy that likes a sure thing, so feel free to scroll up and read the subject matter if you don't believe what I just claimed we were arguing about ;)

Seems the "I told you so's" are gonna start sooner than I expected.
 
So you won't take the DLSS 3.0 bet? :D
Yeah, I would bet Ampere RTX cards will support DLSS :D

Now, if DLSS can be used on any game that supports TAA, with the same benefits as the games that are programmed to use it -> AMD should then just pack their bags and go home, or have something equivalent. Nvidia would automatically own the performance/IQ crown and price their cards higher.
 
DLSS isn't a topic I argued with you about. You on the other hand are arguing with me when I say these cards will be on Samsung dies. See the difference?

I know you're the kind of guy that likes a sure thing, so feel free to scroll up and read the subject matter if you don't believe what I just claimed we were arguing about ;)

So you agree that MLID is spouting made-up BS about DLSS 3.0?

I have no evidence; I am just making an educated guess about DLSS 3.0. The thing you claimed I couldn't do.

You would have the "evidence" of a MLID rumor on your side...
 
I don't need contrary evidence. There is an absence of evidence, which leaves this an open question.

You now seem to be assuming that I am sure it's TSMC. I never indicated that. In fact, I stated my position just a few posts back: "It's still up in the air as far as I am concerned."

Why would I bet on something I think lacks sufficient evidence in either direction? That's a fool's bet either way. Like betting on a coin toss. Just because you are convinced it's heads, and I think it's random, is no reason for me to bet on tails.

I just consider it an unknown. I am comfortable with an unknown until there is sufficient evidence.

It seems some people are so uncomfortable with unknowns they just rush to believe any story that completes a picture. This is why we have so many crazy conspiracy theories that won't go away.

It would be much better if people weren't prone to leaping to conclusions absent real evidence.


Pretty sure that is bunk, given that NVidia has been almost exclusively TSMC for a decade or more, and that Big Ampere is on TSMC 7nm (working as a test part and providing extensive learning for chips to follow).

There is no way it makes any sense that consumer Ampere is running on some kind of second-rate Samsung process.

People making up theories that NVidia was somehow shut out of TSMC 7nm don't understand that there would be long-term, years-in-advance reservation of capacity by NVidia. They aren't going to be suddenly caught without capacity at their long-term fab.

Now, it is possible NVidia would voluntarily switch to Samsung for Ampere, but there would have to be a very significant benefit involved.
Questionable if a "3090" will ever launch. This is just a rumor name. Much salt needed.

Pretty sure we will get a 3080 and 3080 Ti in September.


Pretty sure you don't want me digging some more for quotes on that.
 
Bored?

Is that supposed to be some kind of huge gotcha? That I said much salt is needed for an unverified rumor from an unknown source?

This is how rumors work. Some turn out to be true, some false. Until they do, it's better to assume they are all nonsense than to assume they are all true.
 
Linus is also speculating the 3090 will cost $1400 on their livestream today, so my guesstimate of $1500 looks awfully close.
 
My guess is the 3090 will be $1500 MSRP for the base dual-fan model, $1600 for the FE and other aftermarket cards. That said, "MSRP" is really just a number nVidia and AIBs put on press junkets nowadays, and the actual street price could be higher.

It's definitely going to be a price bump from the bottom up this round, more than I can recall in the past 25 years. For most, it's worth waiting to see what the competition has before purchasing.
 
I have only bought Nvidia cards in the 2010s. With the extreme price GOUGING now, I might consider moving to the Red Team and Big Navi for the first time, if the price/performance is right.

Nvidia right now is exploiting the market and taking advantage of consumers far too heavily. I LOVE their technology, but I HATE their corporate GREED of the past few years.
 