NVIDIA GeForce RTX 3090 Ampere GPU to sport 4,992 CUDA cores and 12 GB of 18 Gbps VRAM

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
10,785
Really? They couldn't fit in 8 more cores to make it 5000? C'mon!!!!

"Referred to as "THE ULTIMATE GEFORCE" by @Ragdoll_Kitties, the GA102 is said to include 12 GB of GDDR6 VRAM running at 18 Gbps. Apparently due to arrive by "August or later", this ties in with previous rumours about consumer Ampere GPUs slipping to Q3 or Q4 2020.

However, Ampere is also said to be based on a 10 nm process. Hence, it is no longer necessarily clear what performance levels to expect from the upcoming architecture. If true, then AMD could have a big advantage over NVIDIA headed into the next-generation mainstream desktop GPUs."


https://www.notebookcheck.net/NVIDI...cores-and-12-GB-of-18-Gbps-VRAM.458939.0.html
 
The technical differences between 7 nm and 10 nm processes aren't as large as most think, but more plants make 10 nm, so it's a cheaper process with better yields and less demand on capacity. AMD could very well reach performance parity because of the node difference, but they're guaranteed to be on the more expensive process with much smaller production volume. Even with a superior product, getting enough cards to market could well be an issue for AMD for the first year; it won't be a paper launch, but it will face severe supply constraints. By going with 10 nm, nVidia is picking a process that, while inferior, will likely be very available and allow them to compete against AMD on feature sets and price tags. DLSS 2.0 could very well be the game winner for nVidia this year: even if AMD manages to take the top end by as much as 20%, DLSS would still let nVidia reach performance parity in most upcoming titles for a fair bit less. 10 nm was the smarter move. Next year, once a few more plants have been upgraded to 7 nm and Apple moves to 5 nm, a lot of fab time will open up and then we can see what both teams bring to the table. Intel should also be entering the race then, also on 7 nm, and then we get the proper three-way we have all been wanting.
 
12gb VRAM should be mainstream in my opinion now. 16gb should be on flagship. Nvidia has been doing the 11/12gb since Maxwell.


....if true, of course. It's about those margins.
 
Where is this 10nm stuff coming from? What fab?
TSMC has had a 10nm node since 2017. I don't know what uses it, though. It supposedly has twice the logic density of their 16/12nm node with 15% higher performance.
 
Where is this 10nm stuff coming from? What fab?

The twitter post that's the "source" for this "news" claims that it's Samsung 10nm for all gaming chips.

Would be interesting. AMD has gone all in on 7nm for cpus and gpus. If Nvidia goes conservative with 10nm it'll be interesting times ahead for sure.
 
TSMC has had a 10nm node since 2017. I don't know what uses it, though. It supposedly has twice the logic density of their 16/12nm node with 15% higher performance.

Hmmmm, wonder if it's a high performance process? Appears Apple used it for some of their iPhones, but not seeing anything else.
 
The twitter post that's the "source" for this "news" claims that it's Samsung 10nm for all gaming chips.

Would be interesting. AMD has gone all in on 7nm for cpus and gpus. If Nvidia goes conservative with 10nm it'll be interesting times ahead for sure.

Could it be it's a mature node? Maybe they plan on building LARGE chips for now.
 
The twitter post that's the "source" for this "news" claims that it's Samsung 10nm for all gaming chips.

Would be interesting. AMD has gone all in on 7nm for cpus and gpus. If Nvidia goes conservative with 10nm it'll be interesting times ahead for sure.
Samsung's 10nm is a low-power node for SRAM in cell phones :ROFLMAO:. Someone didn't do their research before posting that rumor.
 
The twitter post that's the "source" for this "news" claims that it's Samsung 10nm for all gaming chips.

Would be interesting. AMD has gone all in on 7nm for cpus and gpus. If Nvidia goes conservative with 10nm it'll be interesting times ahead for sure.

Could be an issue of fab space? Supposedly AMD is now the #1 customer of TSMC's 7nm node. When you're the #1 customer, you can command more wafers (much like more recently Apple).
 
Hmmmm, wonder if it's a high performance process? Appears Apple used it for some of their iPhones, but not seeing anything else.
As far as I'm aware, no one makes a high-performance process any longer. Too costly and not enough difference from lower-power types.
 
16 or more then. You have a Titan... so I guess you deserve 32 GB.


I guess you can't do math?

Titan RTX uses a 384-bit bus at 14 Gbps. 12 GB or 24 GB of VRAM divide evenly into that bus width.

32 GB divides evenly into a 256-bit bus (not wide enough for an Ampere Titan) or a 512-bit bus. I don't think Nvidia is ready to return to the added costs of the 512-bit bus.

They will likely continue using 24 GB at 384-bit for the Ampere Titan. The 30% faster memory clocks should be able to handle the performance increase.

I think the biggest question on everyone's mind is whether the 3080 Ti will be 11 or 22 GB. It would be nice to see the rest of the mainstream cards top out at 16 GB!
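The divisibility argument above can be sketched as a back-of-the-envelope model. This assumes one 32-bit channel per GDDR6 chip and the common 1 GB / 2 GB chip densities of the time; the function name is mine, not from the post:

```python
# Rough model: a GDDR6 card uses one 32-bit channel per memory chip,
# so chip count = bus width / 32, and capacity = chips * per-chip density.
def vram_configs(bus_width_bits, chip_densities_gb=(1, 2)):
    """Possible VRAM sizes (GB) for a given bus width and chip densities."""
    chips = bus_width_bits // 32
    return [chips * d for d in chip_densities_gb]

print(vram_configs(384))  # [12, 24] -> the 12 GB / 24 GB options on a 384-bit bus
print(vram_configs(256))  # [8, 16]  -> 16 GB forces a narrower 256-bit bus
print(vram_configs(512))  # [16, 32] -> 32 GB needs a costly 512-bit bus
```

This is why the post pairs 11/22 GB with a cut-down 352-bit bus (11 chips) and 12/24 GB with the full 384-bit bus.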
 
For how much these cards cost, the next 3080 Ti should give us a minimum of 22 GB and the Titan 32 GB. Everything 3070 and below at least 8-16 GB.
 
Yeah, if you're bored with Nvidia, how is AMD gonna help? They haven't done anything interesting since async compute 5 years ago.

You don't know how to read between the lines, man.

It's not about actual bullshit. It's like when Intel was the ONLY viable option for years.

I'm tired of just nV nV nV nV... let's see some second choices. Competition. Something to compel nV to not charge $1300 for a consumer GPU. Something to challenge their bullshit "we're superior and the only one so fuck it" attitude with pricing. Hence AMD Big Navi might finally be the one to cut nV's bullshit down a little.
 
But can it make me a sandwich? Wash my car?

1080ti never leaves my hands until something truly spectacular happens. Is this it? Who knows.
 
You don't know how to read between the lines, man.

It's not about actual bullshit. It's like when Intel was the ONLY viable option for years.

I'm tired of just nV nV nV nV... let's see some second choices. Competition. Something to compel nV to not charge $1300 for a consumer GPU. Something to challenge their bullshit "we're superior and the only one so fuck it" attitude with pricing. Hence AMD Big Navi might finally be the one to cut nV's bullshit down a little.

Let's be real here. If AMD were to release a card that competes favorably with Nvidia's high-end it would be priced accordingly. You'd still be looking at $1000+ for a top end consumer GPU.

Edit: Anyone expecting Big Navi to offer 2080 Ti-level performance for 2080 prices is fooling themselves. If it's priced at $800, then it will have performance comparable to current $800 video cards.
 
Let's be real here. If AMD were to release a card that competes favorably with Nvidia's high-end it would be priced accordingly. You'd still be looking at $1000+ for a top end consumer GPU.

I don't understand where people get the notion that AMD will price significantly lower. Times have changed, AMD doesn't want low margins anymore.
 
You don't know how to read between the lines, man.

It's not about actual bullshit. It's like when Intel was the ONLY viable option for years.

I'm tired of just nV nV nV nV... let's see some second choices. Competition. Something to compel nV to not charge $1300 for a consumer GPU. Something to challenge their bullshit "we're superior and the only one so fuck it" attitude with pricing. Hence AMD Big Navi might finally be the one to cut nV's bullshit down a little.

Well yeah only AMD can help you with that. Nvidia is doing what they're supposed to do.
 
Let's be real here. If AMD were to release a card that competes favorably with Nvidia's high-end it would be priced accordingly. You'd still be looking at $1000+ for a top end consumer GPU.

Edit: Anyone expecting Big Navi to offer 2080 Ti-level performance for 2080 prices is fooling themselves. If it's priced at $800, then it will have performance comparable to current $800 video cards.

Ehh, maybe.

The 5700 XT brought better-than-2070-level performance down to $399 from $499. That's 20%.
 
12gb VRAM should be mainstream in my opinion now. 16gb should be on flagship. Nvidia has been doing the 11/12gb since Maxwell.


....if true, of course. It's about those margins.

If they are pushing GDDR6 to 18 Gbps, it tells me two things: using a 512-bit GDDR6 bus is REALLY expensive, and these cards are getting bandwidth starved even with a 384-bit bus.

16 GB is just not happening unless they configure it with a 256-bit bus, where it will nearly always lose to the 384-bit card.
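For reference, peak memory bandwidth is just bus width (in bytes) times the per-pin data rate, which shows why a 256-bit bus at 18 Gbps still trails a 384-bit one. A quick sketch (the helper name is mine):

```python
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s: (bus bits / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * gbps_per_pin

print(bandwidth_gbs(384, 14))  # 672.0 GB/s -- Titan RTX today
print(bandwidth_gbs(384, 18))  # 864.0 GB/s -- rumoured 18 Gbps on 384-bit
print(bandwidth_gbs(256, 18))  # 576.0 GB/s -- a 16 GB 256-bit config would lag
```

The 14 -> 18 Gbps jump is the "30% faster memory clocks" mentioned earlier in the thread (18/14 is roughly 1.29x).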
 
So saying that competition does not lower prices is inaccurate.

I never said prices wouldn't go down. I said it would be priced in line with Nvidia's prices (similar to, in the same ballpark, etc.). I also specifically said they wouldn't offer 2080 Ti performance at 2080 prices. There is a world of difference between a slightly lower price and several hundred dollars' difference.
 
Let's be real here. If AMD were to release a card that competes favorably with Nvidia's high-end it would be priced accordingly. You'd still be looking at $1000+ for a top end consumer GPU.

Edit: Anyone expecting Big Navi to offer 2080 Ti-level performance for 2080 prices is fooling themselves. If it's priced at $800, then it will have performance comparable to current $800 video cards.

That's what you guys said about Ryzen.

And lookie lookie.
 