RTX 3xxx performance speculation

As expected, nowhere near $700. In Sweden the 3080 Strix is over €1000, which is about $1,200.
Demand, I guess, but I don't see this price going down much. The 2080 Ti Strix was $1,700 in Sweden. Actually, it still is.

It is cheaper than I thought it would be in Denmark:

$1,905.31 (12,000 DKK) for the RTX 3090 FE.
I had assumed $2,381.63 (15,000 DKK).

Customs fees and VAT add up... and it always makes me smile when I see people from the US being hyperbolic about US prices... spoiled indeed ;)
 
Man, that game is ridiculously stupid. I was never a fan of Doom. Unreal and Half Life is th
3080 is $900/€760 directly from Nvidia. I might buy that and order a custom card just to be on the safe side. Then re-sell the worse card when reviews are out.
DK prices:

RTX 3090: 12,000 DKK ($1,905.31 / €1,612.88)
RTX 3080: 5,650 DKK ($897.08 / €759.40)
RTX 3070: 4,050 DKK ($643.04 / €544.35)

Cheaper than expected... oh well, more RAM for my new rig then :D
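For anyone curious how those DKK figures map to the dollar/euro prices, it's just a straight conversion; the exchange rates below are my assumption, back-calculated from the quoted figures, not official numbers:

```python
# Rough DKK -> USD/EUR conversion at the rates implied by the figures
# quoted above (~6.30 DKK/USD, ~7.44 DKK/EUR; assumed, not official)
DKK_PER_USD = 6.2982
DKK_PER_EUR = 7.4401

for name, dkk in [("RTX 3090", 12000), ("RTX 3080", 5650), ("RTX 3070", 4050)]:
    print(f"{name}: {dkk:,} DKK = ${dkk / DKK_PER_USD:,.2f} / €{dkk / DKK_PER_EUR:,.2f}")
```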
 
Q: Any idea if the dual-airflow design is going to be messed up in inverted cases, more than previous designs? Seems like it would blow the air down onto the CPU. But the CPU cooler would still blow it out of the case. Maybe it's not so bad.
Q: Second question: is the 3090 being "10x quieter than the Titan" more or less quiet than a 2080 Super (EVGA Ultra FX, for example)?


The new flow through cooling design will work great as long as chassis fans are configured to bring fresh air to the GPU, and then move the air that flows through the GPU out of the chassis. It does not matter if the chassis is inverted.
The Founders Edition RTX 3090 is quieter than both the Titan RTX and the Founders Edition RTX 2080 Super. We haven’t tested it against specific partner designs, but I think you’ll be impressed with what you hear… or rather, don’t hear.

Can anyone confirm how loud those cards were/are? A dB measurement reading only goes so far.

Q: Will customers find a performance degradation on PCIE 3.0?

System performance is impacted by many factors and the impact varies between applications. The impact is typically less than a few percent going from a x16 PCIE 4.0 to x16 PCIE 3.0. CPU selection often has a larger impact on performance. We look forward to new platforms that can fully take advantage of Gen4 capabilities for potential performance increases. Note also, all benchmarks and performance data on our website and presented by Jensen was from a PCIE 3.0 platform.
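For context on that answer, the raw link bandwidth gap between the two generations can be worked out from the per-lane signaling rates; this is a quick sketch using the standard 128b/130b encoding overhead for both Gen3 and Gen4:

```python
# Peak one-direction PCIe bandwidth:
# per-lane rate (GT/s) x 128b/130b encoding efficiency x lanes / 8 bits-per-byte
def pcie_bw(gt_per_s, lanes=16):
    """Approximate usable bandwidth in GB/s."""
    return gt_per_s * (128 / 130) * lanes / 8

print(f"x16 Gen3: {pcie_bw(8):.2f} GB/s")   # ~15.75 GB/s
print(f"x16 Gen4: {pcie_bw(16):.2f} GB/s")  # ~31.51 GB/s
```

Double the raw pipe, but as the answer says, games rarely saturate the link, which is why the real-world impact is typically only a few percent.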

Good, I won't have to swap out my couple-month-old B450 board for an X570. Also, is this a backhanded remark at AMD? You know, since they have PCI-E 4.0 boards on the market and Intel is still MIA.
 

I'm still buying PS5 on launch day, but this is too funny not to post :D

Both sides of this narrative are so tired and lame.

Great new Console comes out: PC gaming is doomed!

Great new GPU comes out: Console gaming is doomed!

The reality is that there are three fairly set groups that don't move much between them:

1: Console only: No GPU is going to move this group. They game on consoles and want nothing to do with PC complexity.
2: PC only: No console is going to move this group. They game entirely on PC and dislike the dumbed-down, limited, non-moddable console experience.
3: PC + console gamers: This is the group that shifts what they buy each cycle based on what great HW is coming out, but they continue to game on both and buy both.

There are of course minor exceptions but this is mainly what is going on. New consoles aren't going to convince many PC only gamers to switch, and new GPUs aren't going to convince many console only gamers to switch...
 
I've been trying to get my husband to switch to PC gaming for years. He plays Civilization games on consoles, and I've tried to build him an awesome 4K60 rig so he could play those at perfect quality. Instead he insists on playing on the Switch now, connected to the TV. I don't get it; it looks like ass at around 720p and runs at about 30 FPS, but he keeps saying consoles are turn-on-and-go and he just doesn't want to deal with more complication. Like turning on a PC and starting a game is so hard? But people like him have zero interest in how much our GPUs improve vs the ease of using a console.
 

lol, game cheating might become a thing.
Get another friend to 'game' with :D
 
I have a feeling that this will not be the last time we see a process labeled xN.
If we go back to the GTX 480, NVIDIA started taking a deep look at the transistors they used.
AFAIR they went from 2 transistor types in the GTX 480 to 3 in the GTX 580 (alongside a redesign of the datapaths in the GPU).

Just like with the G80 and CUDA, I think NVIDIA laid the foundation for how they handle process nodes after the GTX 480.
TSMC's 12N and Samsung's 8N indicate to me that they have their own unique view on the process node characteristics they want.

I have a strong feeling that the next node NVIDIA uses will also be an xN node.
 

Probably. Their next chip should already be in progress. Wonder if they're targeting 7nm or 5nm. Seems they're ok with being a node behind AMD right now but that may change depending on Big Navi.
 
All I have heard about "Hopper" is MCM... I think it will be a few years before we know more.
 

Not really a node behind. Node adjacent. Transistor Density of GA102 appears to be about 45 Million/mm2, which is in the same ballpark as 7nm Navi.
 

Density is one factor, but it's also dependent on architecture. Power is going to be the real issue.
 

TSMC is not magic. The 3080 is about double the performance of the TSMC-7nm-based 5700 XT.

2x 5700 XT = 450 Watts.
 
Any gen after RDNA 2 is going to go multi-chip. We know they don't like monolithic dies, and they've already perfected it on the CPU side. If NVIDIA is able to continue adding 'threads' to their CUDA architecture, AMD will have to ratchet it up somehow.
 

Both still need R&D funds.
And with NVIDIA grabbing 80% of all GPUs shipped last quarter, AMD really has to conquer market share from NVIDIA or fall even further behind in features/performance.

A vocal fanbase but diminishing sales is a bad "roadmap" for success.
 

No, it's not magic, but there are power savings going from 7nm to 7nm+. It's also a new arch, and AMD is claiming up to 50% better performance per watt.
 
Damn, guys, I can't wait for the RTX 3090. I want to hibernate and wake up when it's ready to buy.
 
How valuable will PCIe 4 be for a 3080? Am I completely insane to consider a B550 over an X570 just because I hate the inevitable annoyance of a chipset fan?
 

Sure, AMD is claiming a 50% perf/watt improvement, just like NVIDIA is claiming a 90% perf/watt improvement. Those are marketing games.

TSMC says 15% power saving; that gets you down to about 380W.
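To put rough numbers on that back-of-the-envelope estimate (the ~225 W per-card figure is my assumption for a 5700 XT's board power; the 15% is TSMC's quoted node saving):

```python
# Back-of-the-envelope: two 5700 XTs' worth of board power,
# minus TSMC's quoted 7nm -> 7nm+ power saving
base_power = 2 * 225      # ~225 W board power per 5700 XT (assumed figure)
node_saving = 0.15        # TSMC's stated N7 -> N7+ power reduction
print(base_power * (1 - node_saving))  # 382.5
```

That lands right around the ~380 W figure, before accounting for any architectural perf/watt gains.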
 
Can we please agree that Doom Eternal is NOT a good benchmark? That game is not a benchmark at 1080p, 4K, or 8K. I love the game to death, and it is gorgeous, but it does not push the hardware. I saw a video of Doom Eternal at 4K and frames were dipping below 120 FPS on the 3080. That is NOT impressive. What is even going on in this game except a lot of corridor crawling?

I would, on the other hand, like to see GTA V on a 3080 or 3090.
 
Gonna hold off a bit. Gotta save; it'll probably be next summer. I have 1080 Tis in SLI and I'm not in a huge rush because there are very few RTX games. I do want the 3090, but I should get a 3080, or a Ti if one makes it down the road.
 
I can't believe that people willing to pay $3,070 + tax/shipping on GPUs alone are the only ones blessed with SLI. Talk about a fast way to kill off the technology. It's now the 1% of the 1%. What company (gaming or even business related) would bother spending $1 in man-hours optimizing for SLI/MGPU for probably <0.01% of their customer base? Why did Nvidia even bother taking the time and resources? Seems like they even had to re-design the connector.
 

Nvidia didn't kill off SLI. DX12 killed it off.

SLI for 3090 is really for professional use.
 
If true, that's 3 days to decide whether you want the FE version or not, but it gives you a good idea of where AIB cards will land, unless you're MSI, mucking up the thermal pads on their VRMs.
 
New info dump from NVIDIA on the Ampere architecture; several sites are reporting on it:
https://hothardware.com/reviews/nvidia-geforce-rtx-30-series-ampere-details
https://videocardz.com/newz/nvidia-details-geforce-rtx-30-ampere-architecture

Verifies die size and density: ~44 million transistors/mm2, up from about ~25 MT/mm2 on Turing. Very similar to AMD's RDNA 7nm chips.

Code:
GPU             Die Size       Transistors  
GA102-300       628 mm2        28    B
GA104-300       392 mm2        17.4 B
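The ~44 MT/mm2 figure falls straight out of the numbers in that table:

```python
# Sanity check: transistor density from the die sizes and transistor
# counts in the table above (MT = million transistors)
chips = {
    "GA102-300": (28.0e9, 628),   # transistors, die area in mm^2
    "GA104-300": (17.4e9, 392),
}
for name, (transistors, area_mm2) in chips.items():
    density = transistors / area_mm2 / 1e6  # MT per mm^2
    print(f"{name}: {density:.1f} MT/mm^2")
```

Both chips land at ~44.4-44.6 MT/mm2, consistent with the density claim.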

The 3070 is the card that looks the most questionable. They are listing only 441 GB/s of memory bandwidth vs 616 GB/s on the 2080 Ti, yet they claim matching performance. It seems the 2080 Ti, with almost 40% more memory bandwidth, would sometimes leave the 3070 behind. Maybe some miscommunication about GDDR6 memory speeds.
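Those bandwidth figures follow from per-pin data rate and bus width; the 14 Gbps / 256-bit combination for the 3070 below is my assumption from the rumored specs, and notably it gives 448 GB/s rather than the listed 441, which hints the listed rate is slightly below 14 Gbps:

```python
# Peak GDDR6 bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8
def mem_bw(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

print(mem_bw(14, 256))  # 448.0 -- assumed 256-bit / 14 Gbps 3070-class config
print(mem_bw(14, 352))  # 616.0 -- the 2080 Ti's 352-bit bus at 14 Gbps
```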

I keep thinking that snapping up a used 2080 Ti, if it goes for anywhere near 3070 price, could be a very good idea.
 
To wait or not to wait for Big Navi? I'm planning to upgrade from a 1070 Ti I paid about $400 for to either a 3070 or some RDNA2 card. Probably going to pick up a 1440p 144Hz+ IPS monitor next (maybe the upcoming ASUS 175Hz). Can't lie, the pricing of the 3070 does look very attractive. I'm not the guy who likes to spend much on a GPU; today I'd put ~$500 as my rough limit. It keeps growing slightly year by year as GPUs have gotten so expensive, which is why I totally skipped the 2000 series.

Either way, it seems like you can get a massive performance jump coming from a Pascal card without having to spend big bucks. What a refreshing feeling for a change.
 
If you can wait on Big Navi, do it.
But if AMD is still quiet when the 3080/3090 launch, it probably means they have nothing to compete with.
The problem is, by that time the 3080/3090 might be out of stock.
 
At those prices it is going to be difficult for AMD, I'm afraid.

Even if it were faster than the 2080 Ti, it has to be considerably so to compete with the 3080, and IMO cheaper. Tall order.

We shall know soon enough. Not like one couldn't sell his 3080 for a Big Navi either.
 
If Nvidia's claims of better performance than a 2080 Ti hold, regardless of how slight, then that 500 bucks will leave you satisfied with your purchase. Having gone from a 1070 with a strong OC to a 2080 Ti to power an ultrawide 1440p 120Hz monitor, pretty much everything I could throw at that card crumbled under it. The price sucked, but getting a $200-off email promotion was too tempting.

Waiting for Big Navi comes down to whether you need to upgrade now or can wait several months for a product, and stock, to materialize. A 3070 with the claimed performance will be great; it's up to AMD to present something that will make Nvidia's products easier to pass on. Right now, though? Sure is a lot of silence going around on team red...
 