RTX 3xxx performance speculation

I have a feeling these coolers became necessary once they got an idea of where RDNA2 was going to land. Their wattages scream of a process being pushed to the limit, and I suspect these will not overclock well.

I think this probably started when they realized they were going to have more than 2x the CUDA cores of a 2080 Ti and needed a quiet way to cool 400W even in crappy cases.

Based on the recent history of CPU and GPU overclocking, I don't think we'll see a lot left on the table anymore. The only chance is if they undervolt/underclock the 3090 to keep the power at 400W.

I BIOS-modded and hard-modded my 2080 Ti and it topped out around 2140 MHz and 400W under water. That's less than a 10% OC.
 
Could have stopped typing after this.
The chip may be capable, but cooling might be the limiting factor for acceptability. If they are this power-hungry now, I doubt they are in the conservative part of the voltage/performance range.
 
Snowdog - putting the fan on the back side of the cooler creates a low-pressure area, which can allow more even airflow through the fins from the front side to the back. Fans have uneven airflow across their surface depending on where the blades are, meaning some fins get more airflow than others when air is pushed directly into the fins. Pulling instead is most likely a good engineering choice for creating more even airflow through the fins and maximising it; more airflow generally means a lower temperature rise. It still directs that heat toward other components in a concentrated way, and we do not know what percentage of the heat will be exhausted out the back compared to what passes through the much bigger back fan. Testing is all that is needed on this.

I too presume but don't know; Nvidia should have done a number of design tests with different configurations. Their airflow diagram is utterly stupid, but that is beside the point, marketing. For that diagram to even represent some sort of reality, the back fan would have to be exhausting virtually all the air going into the case and have significantly more airflow than the GPU fan, which is rarely the case.

In the other thread dealing with this HSF design I think I explained the concerns and backed them up with an image of a TR system (albeit still on my desktop for testing).

I would advise just waiting to see testing done and feedback from folks with different configurations on how effective this design is in each. I have my concerns, but it may also work rather well. Waiting may also mean no card for a while.

As for general statements about AMD reference coolers, in general they did fall short. The two that stick out are the Radeon Nano design, which cooled the GPU rather well but did pose some heat-buildup issues in small cases, and the Vega 64 LC, which removed 345W+ of heat outside your case, was pretty quiet at normal settings, and was not too bad even when pumping 450W through the card. Just because AMD made some shitty coolers, and some good ones, does not indicate one way or another whether RDNA2 reference cooling will be good or bad.

AMD, like before, will most likely sell reference boards to the AIBs, which they can put their own cooler designs on. Custom boards take about three months, not reference boards. AMD may have already given AIBs blueprints with the reference board dimensions and the heat areas that need to be cooled, with values. In other words, AIBs can have full cooling solutions ready before receiving the first reference board for sale. Then again, I am sure we will see reference boards/coolers with your favorite brand's label on them, like ASUS, so folks can pay a little more for them :). AIBs can also have customized BIOSes. I would not expect custom AIB boards early on, but I would expect custom-cooled versions of reference boards early on. Also, in general, AMD reference boards usually kick ass for quality in power delivery and components.
 
What is going on with the 3070?

"More importantly, the use of 14Gbps GDDR6 across a 256-bit wide memory bus gives the RTX 3070 448 GB/s of memory bandwidth, which is the same as the Turing RTX 2070 SKU. "
 
The fan on the back: I see no functional reason this couldn't be a push fan on the front with the exact same result. I have seen test results on flow-through coolers, and push on the front vs. pull on the back makes no significant difference.

I think it's there for the marketing splash. The fan on the back is an attention grabber. Put the fan on the front, and it looks like a normal cooler.

Thoughts?

I don't pretend to be an engineer, but I would assume the fan on the back and the cold air it draws in help cool the 3090's VRAM on the back side. NVIDIA then likely took that design and applied it to the rest of the lineup regardless of whether the cards have VRAM on the back or not. None of us knows how hot this new Micron GDDR6X gets, so I'd be glad a fan on the bottom was cooling it if I were a 3090 FE owner.
 
What is going on with the 3070?

"More importantly, the use of 14Gbps GDDR6 across a 256-bit wide memory bus gives the RTX 3070 448 GB/s of memory bandwidth, which is the same as the Turing RTX 2070 SKU. "

That's disappointing, and 2080 Ti performance seems out of reach now.

I just can't see the RTX 3060 being on GA104 now. More than likely it will be on a 192-bit GA106.

Nvidia will most likely streamline and balance their die lineup. Last gen, Nvidia had TU102, TU104, TU106, TU116, and TU117 (5 total). The TU102 was used only for the 2080 Ti, and the TU117 was sort of phased out.

This gen, they are using GA102 for a much larger share, covering both the 3080 and 3090; the GA104 will most likely be 3070 only; GA106 will go in the 3060 and the 1660 Ti replacement; and maybe a GA116 in everything below that.

(Speculating)
 
Do you know what a full GA104 die has for CUDA cores? I thought the 3070 was using most of them, but I'm not sure now. If the 3070 is a significantly cut-down GA104 then you're probably right; Nvidia will just have a higher-end SKU, maybe a 3070 Ti using the full GA104 die or most of it. I hope we don't have another round of 6GB cards at the $300-$400 mark.
 

Unsure, but GA102 was reported at 628 mm² and GA104 at 395 mm².

Here is the 3090 with 82 of 84 SMs available.
Imagine it being about 30 x 21 mm.
[attached die-shot image: 20200906_030621.jpg]


Now the GA104. Without the NVLink, they might be able to squeeze 5 clusters into the die.
[attached die-shot image: 20200906_030606.jpg]


Ignoring the upper right, that should come to 7680 CUDA cores for a perfect die. That leaves some room for a 3070 Ti with GDDR6X, but it's probably too big for a 3060.
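As a sanity check on that estimate, here's the SM arithmetic, assuming GA104 keeps 128 FP32 CUDA cores per SM like GA102 does (the 7680 figure would imply a 60-SM die):

```python
# Ampere GA10x packs 128 FP32 CUDA cores per SM (GA102: 84 SMs -> 10752 cores).
CORES_PER_SM = 128

def cuda_cores(sm_count: int) -> int:
    return sm_count * CORES_PER_SM

print(cuda_cores(84))         # full GA102:          10752
print(cuda_cores(82))         # RTX 3090 (82 SMs):   10496
print(cuda_cores(68))         # RTX 3080 (68 SMs):    8704
print(cuda_cores(46))         # RTX 3070 (announced): 5888
print(7680 // CORES_PER_SM)   # the estimate above -> 60 SMs
```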
 
I liked my G12 setup; it was shorter than my 1080 Ti FTW3 Hybrid.
I threw a Noctua 92mm on it and a mini PWM-to-PWM adapter so Afterburner could override the stock fan curve when I was gaming.
That kept the core and memory cooled independently, since the 1080 Ti had better gains from overclocking VRAM vs the core.
I think the raw-power-equals-everything attitude is going to be slowly changed by games that are stupidly dark with black-hole flat shadows, like MW 2019.
Things may change over time to where you need to start flipping on ray tracing and cinematic FX to see into dark corners or on top of stacks.

 
I am going off of facts: nVidia launched the 2000 series at an unheard-of price point. Now they are rolling all of that back, and it isn't out of the goodness of their little hearts; it's out of competition.
That may be part of it; we don't know yet. But I think another reason is that they looked at the financial situation and found that another extremely high-priced product would hurt their sales and image enough to warrant lowering the price this time around. Perhaps component prices play a role too.
Also, sales in the enterprise market would keep the unit price down, so I'm sure they are well aware of what they are doing, and are not doing it out of goodness.

As no commercial company is.
 
It is cheaper than I thought it would be in Denmark:
[attachment 276132]
$1,905.31 (12,000 DKK) for the RTX 3090 FE.
I had assumed $2,381.63 (15,000 DKK).

Customs fees and VAT/taxes add up... and it always makes me smile when I see people from the US being hyperbolic about US prices... spoiled indeed ;)

As an American living in the UK, Americans are indeed spoiled on price. After you've netted out sales taxes vs VAT and considered the exchange rate, we pay ~20% more in the UK than in the US. In Denmark, it looks like you guys take it even more on the nose. Even places where the cards are manufactured have more expensive retail prices than the US.
 

I think Europe has mandated support and return periods that are longer? Also, don't forget the US has to pay sales tax that's not included in the prices you see.
 
Your whole defense of the FE cooler post comes to mind: no facts, no tests, just biased speculation in favor of nVidia and some very dubious understanding of thermodynamics and airflow.

I am going off of facts: nVidia launched the 2000 series at an unheard-of price point. Now they are rolling all of that back, and it isn't out of the goodness of their little hearts; it's out of competition.

AMD has a vested interest in achieving here; they've recently done so in the CPU market, Lisa Su seems incredibly competent, and now their Navi refinement comes. I'm no AMD fan (their driver issues drove me nuts on the HD5900), but all indications from nVidia are that Navi 2 is a competitor, not just a card that only tops nVidia's last gen.

Well, this post certainly ran off and left the facts behind.

While it is true that prices went up for the RTX 2000 series, I don't know where you come up with "isn't out of the goodness of their little hearts"

"It's out of competition" from AMD?
What competition? I know you are not suggesting that Nvidia had to lower prices because of the 5700 that simply doesn't even touch Nvidia's currently gen high end. Big NAVI is not out, it doesn't exist on the market now and there is absolutely no reason to assume it will be here before or at the time the 3080 can be bought. As a matter of fact, the best guesses would be just before the end of the year. That's not even for sure, as it could easily be later, January or even past that. Who knows exactly when. It is certainly not in the realm of fact.

But guessing... making up... imagining that Nvidia not only knows exactly when AMD is launching Big Navi, but also knows exactly its performance. That, without any facts and absolutely no information, Nvidia had to lower the price on the entire 3000 series because there will be competition from the future. We know it's not out of the kindness of their hearts, so it must be exactly what you suggested: Nvidia had no choice, AMD's new GPU that is coming months away, in the future, is so amazing, so powerful, that the impact blast broke the laws of space and time and forced Nvidia to lower the price of the 3000 series. And like you already said, you only go off facts. So it must be fact.

Certainly could not be any other reasoning, no other possible explanation, right?

Well... you really don't have to invent, imagine, or make up a bunch of stuff. There are a lot of possibilities. Even possible reasons, not from the future, but standing right there, I mean, right out in the open.

See..
Watching Nvidia's unveiling video, it seemed pretty clear that they were targeting and promoting directly to Pascal owners. There is no questioning or guessing; it is in no uncertain terms. Jensen Huang looks straight into the camera and says it out loud, calling out to Pascal owners by name, saying this is the upgrade they have been waiting for. It was directly marketed to Pascal owners: these 3000 cards were made for them.

It just doesn't take much imagining or imagination. Not even a belief that cards from the future can have an impact on the past. Really, since you already know that prices went up for the 2000 series, those prices went up from the 1000 series. There is no doubt Nvidia started off on day one marketing the 3000 series directly to 1000 series owners, even calling to them by name. Hmmm...

We may never know the true answers to these questions. But I feel pretty comfortable stating that it's at least reasonable to think that perhaps Nvidia priced their 3000 series cards to entice Pascal owners who didn't upgrade to the more expensive 2000 series GPUs. And you know... there are an awful lot of Pascal owners out there... That might be why they marketed these cards directly toward them.

Special note:
Late night sleep deprived post, smiling... just trying to pass time. ;)
 
I think Europe has mandated support and return periods that are longer? Also, don't forget the US has to pay sales tax that's not included in the prices you see.

Yes, the consumer protection is much better in the EU/UK. I suppose you're paying some for that.

Yes, I've done the math and understand the dynamic intimately, having moved back and forth between the US, the UK, and France. Even with NYS+NYC sales tax at 8.88%, you pay ~20% more in the UK when considering FX. This disparity in EU/UK vs US retail prices is par for the course.
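To make the comparison concrete, here's a rough sketch of the kind of math involved; the UK price and exchange rate below are placeholder assumptions, so the exact premium depends entirely on what you plug in:

```python
# Normalise a UK (VAT-inclusive) price and a US (pre-sales-tax) price into USD.
# The UK price and FX rate below are placeholder assumptions for illustration.
US_MSRP_USD = 699.00        # US list price, before sales tax
NYC_SALES_TAX = 0.08875     # NYS + NYC combined rate
UK_PRICE_GBP = 649.00       # hypothetical UK retail price, 20% VAT included
USD_PER_GBP = 1.30          # assumed exchange rate

us_total = US_MSRP_USD * (1 + NYC_SALES_TAX)
uk_total = UK_PRICE_GBP * USD_PER_GBP

print(f"US out-the-door: ${us_total:.2f}")
print(f"UK out-the-door: ${uk_total:.2f}")
print(f"UK premium over US: {uk_total / us_total - 1:.1%}")
```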
 
I'm taking sales tax (e.g. NYS/NYC 8.8%) into account in the equation. We also have extended service and returns regulations here.


Yes, I've done the math and understand the dynamic intimately, having moved back and forth between the US, the UK, and France. Even with NYS+NYC sales tax at 8.88%, you pay ~20% more in the UK when considering FX. This disparity in EU/UK vs US retail prices is par for the course.

Sounds like the UK needs to step up their game! ;)
 
Can't wait to see Canadian prices. 700 USD would translate to 915 CAD. In my experience it's usually cheaper than a straight conversion, so 900 would be great.
 
Everything you wanted to know about GDDR6X, but were too afraid to ask:

https://www.tomshardware.com/news/m...ls-the-future-of-memory-or-a-proprietary-dram

"Micron admits that GDDR6X chips are costlier to produce than previous-generation GDDR6 devices. Furthermore, they require a very clean and stable signal, which is why the memory controller of Nvidia's GA102 GPU that powers GeForce RTX 3080/3090 cards now sits on its own power rail to ensure very clean and stable power."
 
Can't complain about local pricing here either: at 529 € and 739 €, the 3070 and 3080 are slightly cheaper than the regular RTX 2070 and RTX 2080 typically are right now. I hope that pricing stays at Nvidia, though, and that it's not sold out for the next couple of months (half a year?) where you have to be really quick to get one because the supply on Nvidia's site is very small. Somehow I doubt aftermarket cards will be quite that cheap; especially here the e-tailers need their profit, so I expect most will be like 599 € and 799 € or more expensive, but I hope to be wrong.

That's why I'm slightly unsure whether it's worth the Big Navi wait, especially since it looks like I might pick up a G-SYNC compatible monitor next. I paid 399 € for a GeForce 1070 Ti on Black Friday, which I thought was a decent buy; the 3070 does look very appealing at that price too, to say the least. I don't know, trying to hunt a card from Nvidia's site on launch day feels like a legit good option this time around.

EDIT: I don't know if it's a mistake done on purpose, but on the Finnish Nvidia site the link to the 2000 series cards goes to the Pascal 1000 series cards, as if they didn't want you to see the pricing of those cards. xD
 
As an American living in the UK, Americans are indeed spoiled on price. After you've netted out sales taxes vs VAT and considered the exchange rate, we pay ~20% more in the UK than in the US. In Denmark, it looks like you guys take it even more on the nose. Even places where the cards are manufactured have more expensive retail prices than the US.

I wouldn't use the word spoiled. Americans don't pay VAT. Enjoy your expat country and its faults.
 
Can't complain about local pricing here either: at 529 € and 739 €, the 3070 and 3080 are slightly cheaper than the regular RTX 2070 and RTX 2080 typically are right now. I hope that pricing stays at Nvidia, though, and that it's not sold out for the next couple of months (half a year?) where you have to be really quick to get one because the supply on Nvidia's site is very small. Somehow I doubt aftermarket cards will be quite that cheap; especially here the e-tailers need their profit, so I expect most will be like 599 € and 799 € or more expensive, but I hope to be wrong.

That's why I'm slightly unsure whether it's worth the Big Navi wait, especially since it looks like I might pick up a G-SYNC compatible monitor next. I paid 399 € for a GeForce 1070 Ti on Black Friday, which I thought was a decent buy; the 3070 does look very appealing at that price too, to say the least. I don't know, trying to hunt a card from Nvidia's site on launch day feels like a legit good option this time around.

EDIT: I don't know if it's a mistake done on purpose, but on the Finnish Nvidia site the link to the 2000 series cards goes to the Pascal 1000 series cards, as if they didn't want you to see the pricing of those cards. xD

Are you looking to buy the 3070 or the 3080?

I automatically want to say wait for third-party reviews before deciding anything. But I guess you are thinking that a huge wave of demand could drive up prices if the supply is limited.
It is probably safe to assume that there will not be endless stock; they could sell out, becoming too hard to get or way too expensive.

So are you thinking of pre-ordering or something?

As for AMD, Big Navi...
All the talk about it just seems a little silly. Big Navi is not even an option yet, so what is there to be conflicted over? This is a unicorn that has been talked to death for months on end now. It seems like the drumming up of this amazing Big Navi has been getting plastered across boards and forums for years now. It goes on and on, and still we are waiting. There are no real signs confirming it will even launch by the end of the year. Sure, people hope and want it to. There is a lot of wishing for it. But let's get real.

- It is going to be costly to make huge 7nm chips at TSMC. Wafers are not cheap, and AMD cannot get around this. The 5700 XT was not especially cheap; while some claim that AMD made a conscious choice not to be seen as the cheaper alternative, I think TSMC could also have played a role in the end price.

- Big dies are difficult to produce and the yields are problematic. To salvage enough useful chips for mass production, it's typical to have to use cut-down chips.

- Piling on resources does not automatically come with perfect scaling. That is honestly very rare, especially when talking about real-world performance, and especially now that GPUs are fast enough to be bottlenecked by CPUs even at normal resolutions.

So in every way, it is a tall order just to expect AMD to come out with a Big Navi that has double the 5700 XT's cores and double its performance. Not impossible, not at all. But to have them be cheap and in abundance, anytime soon or by the end of the year? If there is a Big Navi launch, what are people expecting?
It cannot be cheap, it cannot be in vast quantity, and it's really wishful thinking to believe it's going to make Ampere irrelevant.

Plus, there is absolutely no reason to think that AMD will price Big Navi at a significant undercut to Nvidia's Ampere pricing. If it comes at all, it's going to be priced, at the very least, to the price/performance scale of Ampere. They will not be underselling them. There is a serious possibility that, if it even comes, it will be more expensive, price vs performance. And it's a good bet it won't even match the 3080. But if somehow it does, it's not going to be cheaper. Just not possible.
 
The price difference between the 3080 and 3090 is insane for a mere (likely) ~20% increase in performance...
I would go with the 3080, but I find the VRAM to be too limited. Interestingly, the 3090 has a whopping 24GB, which is arguably too much... I do not want to pay for something that I am not going to use.

Consequently, I have decided to wait for the 3080 Ti, which I expect to be released in early 2021... I expect its performance to be 5-10% lower than the 3090 for a much lower price.
In the meantime, I will pick up a 2080 Ti for a good price as many have shown up recently on Kijiji...
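Putting rough numbers on that complaint: at the announced FE MSRPs, the 3090 costs more than twice as much as the 3080, so even a 20% uplift looks bad on a price-per-performance basis. A quick sketch (the 20% figure is speculation, not a measured result):

```python
# Dollars per unit of 3080-relative performance at announced FE MSRPs.
# The 3090's +20% performance figure is speculation, not a benchmark.
cards = {
    "RTX 3080": (699, 1.00),
    "RTX 3090": (1499, 1.20),
}

for name, (price_usd, rel_perf) in cards.items():
    print(f"{name}: ${price_usd / rel_perf:.0f} per unit of 3080-level performance")
```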
 
I think I'm going to run 2x SLI 3090s for that 4K 120 fps minimum goodness, but I'm fairly confident this will be my last SLI purchase ever. I don't believe a single game released since late 2019 has SLI support. I still have 19 games in my Steam library or on my wishlist that I know for a fact support SLI, and which I'm sure will need a 2nd 3090 to get up to that magic 4K 120 fps number. I highly doubt any new SLI-supported games will be added to my wishlist by the time I beat these ~19 games, since DX12 pretty much killed it off and Nvidia just made it worse by allowing SLI only on the 3090. I guess by that time I can sell off my 2nd GPU and downgrade to a more compact system. Here's the list of SLI games from my library/wishlist that I still haven't played or bought yet (2015-2019 release dates):

Dying Light
Rise of the Tomb Raider
Homefront: The Revolution
No Man's Sky
Deus Ex: Mankind Divided
Watch_Dogs 2
Mass Effect: Andromeda
Sniper: Ghost Warrior 3
The Evil Within 2
Final Fantasy XV
We Happy Few
Yakuza Kiwami 2
Star Wars Jedi Fallen Order
Shadow of the Tomb Raider
X4: Foundations
Metro Exodus
Blair Witch
The Outer Worlds
Journey to the Savage Planet
Borderlands 3

Blah, I don't know if it's worth it. $1,570 for an extra RTX 3090 + NVLink bridge divided by 19 games comes out to around $83/game (less, I guess, when you factor in the money I'd get after selling off the 2nd GPU). I wonder if I should just punch out early and go with a single-GPU solution. I've been rockin' 2-4 way SLI for almost 10 years now (I had a 2nd 2080 Ti, but it recently blew up and I got a full MSRP refund since the model was no longer being manufactured to be replaced).
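For what it's worth, here's that per-game math in one place, treating the eventual resale value of the second card as a guess:

```python
# Effective cost per SLI-capable game for a second RTX 3090 + NVLink bridge,
# net of whatever the card resells for later (the resale value is a guess).
EXTRA_CARD_AND_BRIDGE_USD = 1570.00
SLI_GAMES = 19

def cost_per_game(resale_value_usd: float) -> float:
    return (EXTRA_CARD_AND_BRIDGE_USD - resale_value_usd) / SLI_GAMES

print(f"${cost_per_game(0):.0f}/game with no resale")
print(f"${cost_per_game(700):.0f}/game if the second card resells for $700")
```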
 

mGPU is dead in the consumer space, FYI.
The only reason the RTX 3090 has NVLink is that it is a producer card (replacing the Titan line).
 
I'm going to speculate a bit more.
The 3090 is about 20-25% faster than the 3080, both overclocked. Maybe even less, because the 3080 has a lower TDP and the same PCB (and the same cooler design, just smaller), so probably more OC headroom.
A 3080S/Ti comes in the late first or early second quarter of 2021, with 20GB of RAM.
About 20% faster than the 3080. At $999.
Pretty sure we'll see Pascal history repeat itself, at slightly higher prices. The 1080 Ti was in most cases even faster than the Titan.
 
As an American living in the UK, Americans are indeed spoiled on price. After you've netted out sales taxes vs VAT and considered the exchange rate, we pay ~20% more in the UK than in the US. In Denmark, it looks like you guys take it even more on the nose. Even places where the cards are manufactured have more expensive retail prices than the US.

As an American, for most of my life, up until a year or two ago, we never had to pay any tax on online electronics. I saved thousands of dollars buying flagship cards every year until recently. We're now starting to feel the tax pinch, and it's painful. I don't know if Newegg is still charging tax for my state; I haven't bought from them yet this year... Might find out on the 17th.

I'm going to speculate a bit more.
The 3090 is about 20-25% faster than the 3080, both overclocked. Maybe even less, because the 3080 has a lower TDP and the same PCB (and the same cooler design, just smaller), so probably more OC headroom.
A 3080S/Ti comes in the late first or early second quarter of 2021, with 20GB of RAM.
About 20% faster than the 3080. At $999.
Pretty sure we'll see Pascal history repeat itself, at slightly higher prices. The 1080 Ti was in most cases even faster than the Titan.

Since the 3090 is essentially a prosumer card now, I doubt the FE is really focused on overclocking so much as just delivering a crisp 350W. AIB cards may eventually give us powerful 400W boards with additional headroom, which could be barn burners.
 
As an American, for most of my life, up until a year or two ago, we never had to pay any tax on online electronics. I saved thousands of dollars buying flagship cards every year until recently. We're now starting to feel the tax pinch, and it's painful. I don't know if Newegg is still charging tax for my state; I haven't bought from them yet this year... Might find out on the 17th.

Way off topic, but the "issue" is that, since 2018, states can deem all online commerce as effectively in-state, and thus can charge state tax. Most states have now jumped on the bandwagon, doing this under the auspices of levelling the playing field for non-online retailers (which is correct, in my opinion), but the real motivation is, of course, added tax revenue for states. And as is tradition, companies/online retailers have just balanced the potential loss in sales due to these taxes with further "creative" tax avoidance schemes -- so it's questionable whether these new online taxes have actually brought in any revenue at all, net-net. In any case, last I checked, there are only 6-7 states that still do not charge taxes for online purchases where the retailer does not maintain a physical presence. It was fun while it lasted...

As an individual, there are legal ways to claim state taxes back, or get reimbursed if you're exporting, but honestly it's normally more hassle than it's worth (likely by design) unless you're dealing in volume. My time is worth more than the $40-50 I'd save on a 3080 purchase, TBH. Better to just grit your teeth and pay your state taxes, in my view.
 
Does anyone know if 3090 benchmarks will be officially out before the 3080 is out?

Good question. I have a feeling that, since nV knows the 3090 will be top dog and many people are going to buy it simply for that reason, the NDA may end along with the 3080's at 9am EDT on Sept 17. If nV is worried the 3090's performance is not up to snuff, reviews will appear on Sept 24.
 
As an American living in the UK, Americans are indeed spoiled on price. After you've netted out sales taxes vs VAT and considered the exchange rate, we pay ~20% more in the UK than in the US. In Denmark, it looks like you guys take it even more on the nose. Even places where the cards are manufactured have more expensive retail prices than the US.
Imagine living in the States, having to pay 9% sales tax, and 24% income tax, VAT = lol.
 
Good question. I have a feeling that, since nV knows the 3090 will be top dog and many people are going to buy it simply for that reason, the NDA may end along with the 3080's at 9am EDT on Sept 17. If nV is worried the 3090's performance is not up to snuff, reviews will appear on Sept 24.

I am just worried about 3090 benchmarks being underwhelming and, to your point, not being released until the 24th, at which point it may be hard to get a 3080. I am assuming a 20% improvement, but it's hard to make decisions based on assumptions without hard data (especially in RTX / DLSS scenarios).
 
We all hear that going back to a 512-bit memory bus is unlikely.
However, I wonder whether AMD would do something a bit wider than 384-bit, like a 416- or 448-bit memory bus, for 13GB or 14GB cards.
That would be enough to match the RTX 3000 series in memory bandwidth without going to GDDR6X or HBM2.
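A rough sketch of how bus width maps onto capacity and bandwidth with plain GDDR6, assuming the usual one chip per 32-bit channel with 1GB (8Gb) chips and a 16 Gbps data rate (both assumptions):

```python
# Capacity and bandwidth for plain GDDR6 at various bus widths,
# assuming one 1GB (8Gb) chip per 32-bit channel and 16 Gbps per pin.
GDDR6_GBPS = 16
GB_PER_CHIP = 1   # use 2 for 16Gb chips

for bus_bits in (256, 320, 384, 416, 448, 512):
    chips = bus_bits // 32
    capacity_gb = chips * GB_PER_CHIP
    bandwidth_gb_s = GDDR6_GBPS * bus_bits / 8
    print(f"{bus_bits:3d}-bit: {chips:2d} chips, {capacity_gb:2d} GB, {bandwidth_gb_s:5.0f} GB/s")
```

By that math, a 448-bit bus at 16 Gbps lands at 896 GB/s, between the 3080's 760 GB/s and the 3090's 936 GB/s.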

I do wonder how much the extra 8GB of GDDR6X on the 3090 costs Nvidia over the 2080. I would guess at least USD 100+ more per card.
 