Who is buying a 3080 and 3090?

According to a review of the new cards on TomsHardware.com: "Nvidia has also added PCIe Gen4 support to its Ampere GPU. It's worth pointing out that this probably won't matter much for gaming performance, as the large 24GB of VRAM means there should be less data going back and forth over the PCIe bus. The other problem of course is that the fastest gaming CPUs still come from Intel, and Intel doesn't have a desktop PCIe Gen4 solution yet. That will come with next year's Rocket Lake processors, which will yet again use Intel's 14nm++(++) process. Intel's Alder Lake will also support PCIe Gen4 and will be the first SuperFIN (or post-SuperFIN) desktop CPU from Intel." I am not certain that upgrading my motherboard and processor just to be able to use this card is a viable solution for me in the near future. I think I will upgrade to a watercooled 2080 Ti solution until the dust settles sometime next year (around Christmas, maybe).
 
I just upgraded to a cheap 1080 Ti for my 1440p 60 Hz monitor. I don't think my AMD 2700X has enough juice to keep up with a faster card. By the time ray-tracing games are cheap and plentiful, I plan to upgrade. Maybe two years or so. You 4K and high-refresh gamers can enjoy, though! I gotta say, that 3090 looks beautiful. My 1000-watt power supply could use a new friend...
 
Planning on a 3080 FE. Then when an ITX version of a 3060/3070 arrives, I'll get that for my HTPC.

I run 1440p G-SYNC and I'm hoping for BF1 at a locked 165 FPS on Ultra settings.
 
I'm going to stick with my 2080 Ti for now because it's awesome, until an AIB card comes out that I like enough to replace it. I really doubt yields will be there for everyone who wants a new card on launch day. Not trying to be a downer, but I'd be surprised if supply met demand.
 
Planning on a 3080 FE. Then when an ITX version of a 3060/3070 arrives, I'll get that for my HTPC.

I run 1440p G-SYNC and I'm hoping for BF1 at a locked 165 FPS on Ultra settings.

Hey Revenant, it's Isaac from the BF1 server. I'm also upgrading from my Titan XP to the 3080.
 
As for me, I've been looking at either the EVGA RTX 3080 XC3 Ultra Gaming or the MSI RTX 3080 Ventus 3X. I'm debating whether or not I want to wake up before 6 AM Thursday morning knowing that it (the first sale) will likely be over in less than 75 seconds because... BOTS! Out!
 
As for me, I've been looking at either the EVGA RTX 3080 XC3 Ultra Gaming or the MSI RTX 3080 Ventus 3X. I'm debating whether or not I want to wake up before 6 AM Thursday morning knowing that it (the first sale) will likely be over in less than 75 seconds because... BOTS! Out!

Nvidia did mention that they have measures in place to keep the bots from accessing the store this time around.
 
I like to push things and I like challenges, so the 3090 is the only one on my radar.

It'll be an interesting adventure to see if I can get Skyrim SE and Fallout 4 to use 24GB VRAM.
 
What time does the Nvidia store begin selling the GPUs on Thursday?
Anybody got an idea?
I was thinking 9 AM Pacific, is that too late?
I'm really dreaming that I might actually score a GPU, but hey.
I can start anytime, no work that day anyway.
 
What time does the Nvidia store begin selling the GPUs on Thursday?
Anybody got an idea?
I was thinking 9 AM Pacific, is that too late?
I'm really dreaming that I might actually score a GPU, but hey.
I can start anytime, no work that day anyway.

6AM PST
 
I'll end up getting a 3090, either an EVGA 3090 FTW3 or a Hybrid/Kingpin... not really sure yet, and I doubt I'll take part in the release rush; I'll probably see how it all pans out. I've got the cash now, but I can afford to wait it out and see what fits my needs best. I can still max all my games out at 4K for now... lol.
 
I have a 1080 Ti and honestly it's still a good GPU for me.
I skipped the 2080 series, now the urge has me by the throat.
If I get shut out, I can easily wait until the supply increases.
I'd also like to see what AMD is hiding......but if I get lucky, I'm good with that too.

I'll be spamming nvidia at 0600 hours......LoL.
 
All fanboy-ism or did you not read and comprehend the entire article?

OK, I've read the article a few times, and I don't see how this benefits nvidia as stated. So please feel free to explain for the cheap seats. You know the biz and I don't really, so what am I missing?

My take on the article.

First, the article says the margins aren't there and the cards cost over $150 to produce. Either they are deeply stupid or that is stated badly, because last I checked $699 is more than 4x $150. So are they talking chip cost but not card cost? They then blather about the cooler being expensive to make, but I highly doubt they could drop $150 on the cooler alone at manufacturer pricing. So first up we have to more or less blindly accept that the margin on the card is only 40%. I've seen that stated in multiple places, but I haven't seen any evidence presented to support it.

But let's take it as fact and assume that every employee at Nvidia has an explosive collar around their neck that blows up if they aren't making 60% margin by the end of the fiscal quarter. If $699 is a 40% margin, then to hit a 60% margin they would have to sell at $799.
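
A quick back-of-the-envelope on that, purely my own numbers and assuming "margin" here means markup over build cost (the only reading where $799 falls out):

# Hypothetical markup math, not from the article: assume "40% margin" means
# price = build_cost * 1.4, then see what a 60% markup would price at.
msrp = 699.0
build_cost = msrp / 1.40             # implied build cost, roughly $499
price_at_60 = build_cost * 1.60      # roughly $799
print(round(build_cost), round(price_at_60))   # 499 799

If margin is instead computed on the selling price, the same arithmetic points at roughly $1,050, so take the $799 as the markup-on-cost reading.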

If they had debuted the 3080 at $799, space-time would have ground to a halt, right? I mean, that was literally only $300-500 LESS than everyone's speculation. There was massive peril, apparently, in selling at a price that nobody would have batted an eye at and would likely still have called a great price so long as the performance was there, which it is. Since 60% margin isn't a scary price, why forego it? This needs a reason. So far we are supposed to accept "cause AMD."

We will ignore that and pretend there is something magical about $799 that would have caused great personal harm to all of Nvidia's staff somehow, or tanked their stock price or something. We'll call it a given that it is super bad.

OK, so we limit supply not because we don't know what the yield per wafer will be, but because we want to create scarcity. Stated as just a desire to create scarcity, I can believe it 1000%. Mainly because scarcity is great press. Nothing makes people want something more than some people having it and saying it is great (which is likely, given what we have seen of the 3080) while another group is vocally annoyed that they can't get it. I mean, Nvidia would have the bulk of the market share anyway, so I'm not sure why it would be necessary, but yeah, marketing MBAs would be all over making that happen. So, plausible. Another possibility is that Nvidia doesn't want it to be scarce at all, and nobody doubts the wafer yield, but the chip is BIG, and we have all just watched the last 4+ months of everyone and their brother lining up to swamp TSMC and Samsung, including Intel. We have Apple making phones and wanting laptop supply for their ARM chips, we have two new consoles, we have Intel conceding that to bypass the 10nm fuckery that has been going on they will be going to TSMC, we have new Nvidia cards, we have new AMD cards, etc. Take the names out of the headline, and SOMETHING was going to have to get scarce real soon now just due to fab capacity and scheduling.

So, AIBs costing more. OK, if Nvidia took a 40% margin, then the old relative AIB pricing would be off, because that was based on 60% margins. So AIB pricing would likely be higher than the FE cards unless they cut corners someplace. Sure, sounds reasonable... but... they say the cooler is super expensive. So in theory the bigger AIB cards with cheaper coolers should be able to claw back margin there. They appear to have bigger coolers, which the article author ASSURES me are not nearly as pricey as the FE cooler, and they are $50 more than the FE. That in theory should get you back to 60% or near it, especially if that $150 number from the article was referencing just the cooler.
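
Rough illustration with made-up numbers (the AIB cooler cost below is purely an assumption of mine, not from the article):

# Illustrative only: every figure here is an assumption except the $50 price bump.
fe_build_cost = 499.0        # implied by $699 at ~40% markup (see above)
fe_cooler = 150.0            # the article's claimed FE cooler cost
aib_cooler = 70.0            # hypothetical, "much cheaper" AIB cooler
aib_build_cost = fe_build_cost - fe_cooler + aib_cooler    # about $419
aib_price = 699.0 + 50.0                                   # "$50 more than the FE"
markup_pct = (aib_price / aib_build_cost - 1) * 100
print(round(aib_build_cost), round(markup_pct))            # 419 79

Which, if anything, lands past 60%, so the "AIBs can't make money near $699" story only holds if their non-cooler costs are also meaningfully higher.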

So then they claim the AIBs will have bad performance compared to the FE. Totally flipping the script on EVERY card released to date. Because shitty AIB numbers will never see the light of day or something? I mean, there are really only three kinds of shoppers for gaming cards: people who get whatever the system builder put in there, for whom reviews won't enter into it; enthusiasts, who will sure as hell notice that a lot of these cards suck and will tear the AIBs a new one, be they AMD or Nvidia fanboys; and the bang-for-your-buck group, who will actually look at the worse numbers and go, "yeah, but is it a good deal?" Yes, the whining fanboys who need to whip out their e-peen and measure whose brand loyalty is the best brand loyalty would quote overblown numbers when arguing in forums, but everyone will know what's up when spending cash, except people buying from system integrators. The vast unwashed masses who might be swayed by some pull quotes from old reviews will be buying laptops, new-gen consoles, and phones. They aren't even part of this math. I don't see any version of reality where AIBs costing more and performing worse is good for Nvidia except one... that is, one where Nvidia doesn't want AIBs selling cards they could be selling instead.

Then they claim that this is some kind of ploy for price anchoring. Except that publishing $699 has already done that. All future prices will be referenced off that. That is THE anchor. Why a vastly elaborate scheme to do something they could have done very simply at launch by just saying "$799," with literally no ill effect? On top of that, the scheme involves some underwear-gnomes-type middle step where they need to do this because, if they released cards with double the amount of expensive RAM, there is no way they could justify higher prices for them without some elaborate shell game? This part makes no sense and just sounds like tinfoil-hat wargarble in the article. Nvidia wants that, but they won't sell them themselves. Only the AIBs, who they apparently were kicking in the nuts a couple of bullet points up, will get to sell them. For a much higher price.

I read all this and see the following two possible scenarios.

First one: for various reasons NVIDIA sees this generation of AMD's launch as potentially disruptive, especially coinciding with the new console generation coming this quarter as well. NVIDIA had to account for this, needed an attractive price, and was forced to stick to a very aggressive release schedule while running into a very (and increasingly) tight market for fab capacity that is of any actual use. I'm going to guess that GDDR6 RAM is also going to be expensive and kind of scarce, given the consoles eating it up for Christmas inventory too. I'm guessing here that Nvidia didn't buy lots of fab capacity because they didn't know if they would actually be done and ready to roll in quantity by the time they had to commit to paying and sign off on clawbacks and other fun stuff. I'm guessing that we WILL see AIBs do cards with more RAM first, because that means getting that RAM is THEIR problem, AND it lets them recoup some profit from Nvidia's aggressive pricing against AMD and the consoles by going there first. Scarcity will persist until Q1 2021 because everyone is trying to get fab capacity and components for the new iPhone launch, the new console launches, two new graphics card launches, etc. Then we will get the tock next year, and maybe that will be a new, higher price, but with gobs of RAM.

Second one: we are going to run into the long tail HARD. We seem to be on the verge of being able to do 4K really well, and the leap to 8K has some structural problems that will keep it from being something big or soon. When you have 4K done well enough or better by consoles on up, it becomes like word processing or web browsing: you don't care much about the next-gen processor underpinning it. It's all good enough. At that point Nvidia might want to trade margin for revenue by shooting their AIBs in the back. This could be the setup for the ambush later. Heck, forget the long tail. Let's say there's room for massive improvement year after year. All the cool kids are going vertical and bringing stuff in house, except Intel, who is farming things out trying to escape the 10nm hellscape they built for themselves. Nvidia may just be following the cool kids and trying not to look like Intel over in the corner there with the 10nm cancer. Business is all about vertical integration right now, so why not make your stock price go up by stabbing the AIBs in the back? I mean, they were super nice to them by taking a shit all over their gaming brands, right? Nvidia loves their AIBs.

Wargarble... scarcity... master plan for higher prices. It just makes no sense as presented there. Yeah, no shit there will be scarcity. And yeah, there will be unreasonable prices, because that has been everything since the 1080 launched. Predicting that is like predicting water will feel wet. Calling it some grand conspiracy to benefit Nvidia because somehow uttering "$799" would have been unfathomable or something? That makes no sense. $699 being genuine, and Nvidia trying to be cost-competitive while facing capacity issues to manufacture things, makes some sense, and the market being hostile to accepting the retail price AND scarcity also makes sense. Nvidia facing scarcity and wanting to make loud publicity noises while letting the AIBs take the heat on pricing and the hassle of sourcing scarce components while Nvidia churns up a feeding frenzy, then shooting them in the back once scarcity eases and they can bring the best and fastest in house in a way the AIBs can't touch? Especially after putting your thumb on the scale with the cost-downs? That's more sinister, and I can see how it might happen.
 
Getting a 3080, and will flip it if a 3080 Ti version comes out. The FE is sexy. Regardless, it's an upgrade for me coming from 980 Ti's. This upcoming release window is a nice one. This & AMD Zen3. Good times.
 
OK, I've read the article a few times, and I don't see how this benefits nvidia as stated. [...] That's more sinister, and I can see how it might happen.
You do know that nothing you said proved anything, right?
 
Getting a 3080, and will flip it if a 3080 Ti version comes out. The FE is sexy. Regardless, it's an upgrade for me coming from 980 Ti's. This upcoming release window is a nice one. This & AMD Zen3. Good times.
Damn... It's going to be a night and day transition for you...
 
I have a 1080 Ti and honestly it's still a good GPU for me.
I skipped the 2080 series, now the urge has me by the throat.
If I get shut out, I can easily wait until the supply increases.
I'd also like to see what AMD is hiding......but if I get lucky, I'm good with that too.

I'll be spamming nvidia at 0600 hours......LoL.

Honestly I'd keep my Titan XP if I didn't need to replace another GPU. Since I'm buying anyway, I figured I might as well get a 3090 and move the Titan XP to the other PC.
 
I would like to buy an RTX 3080 if one is available, but I can't go with the Founders Edition, because I think it has the worst fan setup for air cooling, since it blows hot air toward the processor.

I need to find one that does it like a blower edition.

However, the price here is way more than the price stated by Nvidia; it goes for around US$970 retail, which is 38% more than Nvidia's MSRP.

Availability is a bit of a concern though, especially with the bitcoin price being quite stable nowadays; I assume miners will switch from their 2xxx flagships to the new 3080/3090.
 
I would like to buy an RTX 3080 if one is available, but I can't go with the Founders Edition, because I think it has the worst fan setup for air cooling, since it blows hot air toward the processor.

I need to find one that does it like a blower edition.

However, the price here is way more than the price stated by Nvidia; it goes for around US$970 retail, which is 38% more than Nvidia's MSRP.

Availability is a bit of a concern though, especially with the bitcoin price being quite stable nowadays; I assume miners will switch from their 2xxx flagships to the new 3080/3090.
If you are talking about an air cooler for the CPU, you can always reverse how the air goes through the cooler, so it comes out at the same place the card's backside fan blows, and then exhaust the hot air out the top or front of the case. Just an option. We don't know the temperature increase from the GPU yet. If you're water cooling the CPU, this design may make little difference. It just needs to be tested.
 
If you are talking about an air cooler for the CPU, you can always reverse how the air goes through the cooler, so it comes out at the same place the card's backside fan blows, and then exhaust the hot air out the top or front of the case. Just an option. We don't know the temperature increase from the GPU yet. If you're water cooling the CPU, this design may make little difference. It just needs to be tested.

Yes, I was talking about an air cooler for the CPU. Here is what will happen in my case if I'm still using air cooling on the CPU when buying an RTX 3080:

[Image: RTX 3080 case airflow diagram]

We don't know how hot the RTX 3080 runs yet, but I know it will make my air-cooled CPU run hotter with the Founders Edition.
If I change the CPU fan direction, I must change the direction of the other case fans as well.

I assume the diagram also fits most air-cooled CPU setups as well.
 
Yes, I was talking about an air cooler for the CPU. Here is what will happen in my case if I'm still using air cooling on the CPU when buying an RTX 3080:

[Image: RTX 3080 case airflow diagram]

We don't know how hot the RTX 3080 runs yet, but I know it will make my air-cooled CPU run hotter with the Founders Edition.
If I change the CPU fan direction, I must change the direction of the other case fans as well.

I assume the diagram also fits most air-cooled CPU setups as well.
You do have the option of going the opposite way with your CPU cooler: left-to-right airflow, using your front fans as exhaust and your top and bottom fans as intake. I use my top fans as intake, blowing over the VRMs and memory and shielding them from GPU heat (though the card presently in the case is a blower card), and blow the heat out the front through a 360mm radiator. Works great; the memory and VRMs run very cool, allowing for at least very good RAM OCs. Yes, heated air will normally rise if no fans are running, since it is less dense, but with fans you can direct that heat any way you want. When I was running SLI with two 1080 Ti's, talking 500W+ of heat, way more than a 3080, this worked out rather well. Since the 3080's backside air goes up, you could also use your top fans as exhaust and front fans as intake, but I would reverse your back fan to feed the CPU cool air. Personally, I like the blanket of cool air on the motherboard for better cooling.
 
OK, I've read the article a few times, and I don't see how this benefits nvidia as stated. [...] That's more sinister, and I can see how it might happen.

The "win" for Nvidia is that they can advertise a $699 price point with glowing reviews, but there are such limited quantities at $699 that the real cost of the 3080 is going to be closer to $799+ after the AIBs get their margins. It's a PR move after the backlash of the Turing pricing. They have their "price anchor," however, it is virtually a made up number that very few consumers will ever see over the 2 year lifespan of the Ampere generation. Nvidia wants you to buy as many 3080s as you can possibly buy, but not at the $699 price point from them. The "artificial" scarcity isn't the GPU, but the FE card. But they want you to reference the reviews for that magical $699 card that you can't buy, but then buy one for $799+ from an AIB. THAT'S THE POINT. I'm sure that AMD cards played a role in the pricing. They are anticipating more competition in the 3080 class (no competition in the 3090 class...hence the extravagant price).

Glowing Day 1 reviews - Check
FE cards with extremely limited availability - Comes tomorrow

Another thing to note is that the $150 price for the cooler was a low estimate. It could actually be higher. Also, there will be AIB 3080s that match the $699 price point; however, they will be nowhere near as good as the FE card. Kind of like last generation: Nvidia said there would be $999 2080 Tis, but they were the poorest-binned ones that didn't boost like the $1,199 ones. I'd be willing to gamble that the $699 non-FE AIB 3080 is going to be essentially a 3080 "Lite" with much, much poorer boost clocks and higher temps.

Honestly, I'm moving more toward the wait and see camp rather than the F5 mashing group tomorrow morning. A $500 Big Navi/3070 card doesn't sound so bad for 1440p.
 
You do have the option of going the opposite way with your CPU cooler: left-to-right airflow, using your front fans as exhaust and your top and bottom fans as intake. I use my top fans as intake, blowing over the VRMs and memory and shielding them from GPU heat (though the card presently in the case is a blower card), and blow the heat out the front through a 360mm radiator. Works great; the memory and VRMs run very cool, allowing for at least very good RAM OCs. Yes, heated air will normally rise if no fans are running, since it is less dense, but with fans you can direct that heat any way you want. When I was running SLI with two 1080 Ti's, talking 500W+ of heat, way more than a 3080, this worked out rather well. Since the 3080's backside air goes up, you could also use your top fans as exhaust and front fans as intake, but I would reverse your back fan to feed the CPU cool air. Personally, I like the blanket of cool air on the motherboard for better cooling.

Thanks for the info, worth a try. You make me rethink buying the Founders one :) .

It would be a lot easier if I just had water cooling though :) . I don't know how to reverse the direction of the front case fans on my Phanteks Pro M SE, so I should search how to do that now.

According to a review of the new cards on TomsHardware.com: "Nvidia has also added PCIe Gen4 support to its Ampere GPU. It's worth pointing out that this probably won't matter much for gaming performance..." [...] I am not certain that upgrading my motherboard and processor just to be able to use this card is a viable solution for me in the near future.

As far as I know from RTX 3080 reviews, running it on PCIe Gen3 vs. Gen4 only yields a <3% difference in real-world gaming, so it will not make a big difference for Gen3 motherboard users.
Not the case for synthetic benchmarks, which show a 100% improvement from Gen3 to Gen4.

You can see screenshots I took from a YouTube review comparing Gen3 vs. Gen4 performance of the RTX 3080 in synthetic tests vs. real-world games.
Real-World Games
[Image: PCIe Gen3 vs Gen4, RTX 3080 real-world game results]

Synthetic Benchmark

[Image: PCIe Gen3 vs Gen4, RTX 3080 synthetic benchmark results]

Based on that review, I think PCIe bandwidth is still not saturated even on PCIe 3.0 in real-world use.
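
For reference, the raw spec math (mine, not from that video) shows why a pure transfer test can roughly double while games barely move; games just aren't anywhere near saturating the link:

# Theoretical per-direction bandwidth of an x16 link (both gens use 128b/130b encoding)
lanes = 16
gen3_gbs = lanes * 8 * (128 / 130) / 8     # 8 GT/s per lane  -> ~15.75 GB/s
gen4_gbs = lanes * 16 * (128 / 130) / 8    # 16 GT/s per lane -> ~31.51 GB/s
print(round(gen3_gbs, 2), round(gen4_gbs, 2), round(gen4_gbs / gen3_gbs, 1))   # 15.75 31.51 2.0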
 
Thanks for the info, worth a try. You make me rethink buying the Founders one :) .

It would be a lot easier if I just had water cooling though :) . I don't know how to reverse the direction of the front case fans on my Phanteks Pro M SE, so I should search how to do that now.



As far as I know from RTX 3080 reviews, running it on PCIe Gen3 vs. Gen4 only yields a <3% difference in real-world gaming, so it will not make a big difference for Gen3 motherboard users.
Not the case for synthetic benchmarks, which show a 100% improvement from Gen3 to Gen4.

You can see screenshots I took from a YouTube review comparing Gen3 vs. Gen4 performance of the RTX 3080 in synthetic tests vs. real-world games.
Real-World Games
[Image: PCIe Gen3 vs Gen4, RTX 3080 real-world game results]

Synthetic Benchmark

[Image: PCIe Gen3 vs Gen4, RTX 3080 synthetic benchmark results]

Based on that review, I think PCIe bandwidth is still not saturated even on PCIe 3.0 in real-world use.
Just flip the fan around to push air into the case. I would measure temperatures and do tests to see if the change is beneficial when reversing airflow in your case (back, CPU, front fans, etc.). Take data using something like HWiNFO.

I don't know too many things that actually push PCIe bandwidth except the 3DMark PCI Express test. It does not appear to be that critical at this time. Whether Ryzen 4 sees some benefit with the next generation of GPUs remains to be seen.
 