Jensen Huang announces Ampere prices

Did you watch the DF video? He didn’t use RT in a bunch of the games he tested. The lowest increase was close to 70% compared to the 2080.

I watched it. I'm in love. Nvidia added some slides to their website too. Doom Eternal outdoor scenes were hitting 100% faster than the 2080 (on a 3080), if I saw it right in the DF video.

This means I'll be throwing my 1080p 144Hz monitor straight in the trash... any good 240Hz 1440p monitors around yet? :p
 
DF already got the hands-on.


Christ. It slaughters the 2080. 60-80% performance gains are nuts for ANYTHING over a single gen.

There is going to be zero chance at getting one of these anywhere near MSRP at launch. Everyone is going to want one.
 
3070 looking good as a 1070 replacement for me. I'll wait for benchmarks and probably AIB cards to be sure though. Hope AMD's new GPU is decent to help keep competition going...
 
3070 looking good as a 1070 replacement for me. I'll wait for benchmarks and probably AIB cards to be sure though. Hope AMD's new GPU is decent to help keep competition going...


My only issue is it still has 8 GB of VRAM! It should have 10 GB, and the 3080 should have 12 GB minimum. We really need to move on from 8 GB in mid-to-high-end GPUs. I know you can argue that the 3070 is a midrange card, but it's $500, which is not a midrange price.
 
Times like this I'm grateful I went with water cooling. It will be a while before there are any options for a block, or pre-blocked cards like my EVGA Hydro Copper. A little disappointed with the RAM amounts; my 1080 Ti is coming up on three years old and it has 11 GB. I might be interested in a 3080 with a block.
 
There is going to be zero chance at getting one of these anywhere near MSRP at launch. Everyone is going to want one.

Yeah, that's my only worry, as I'll be trying to go from a 2080 to a 3080. If I have to wait several months to get one at or under MSRP, I guess that's what I'll be doing. I got my 2080 for $650 last April, so I hope to do the same at some point for a 3080.
 
Did you watch the DF video? He didn’t use RT in a bunch of the games he tested. The lowest increase was close to 70% compared to the 2080.

I was giving an example from the past, not claiming anything one way or the other regarding 3000 series RT performance.
 
I was giving an example from the past, not claiming anything one way or the other regarding 3000 series RT performance.
Re-read what I said. DF didn’t use RT in most of their testing and the 3080 was >70% faster than the 2080.
 
Probably still going to hold out until November (maybe there will be stock by then), but I am VERY interested in the 3090 right now. I can't wait to see reviews, and I'm excited to see GN's inevitable teardown. I wonder if this FE will be easier for Steve to rip apart compared to the 20-series.

I can't believe I'm seriously considering a $1500 GPU, but if the reviews pan out then I am on fucking board.

I hope AMD can deliver something competitive with Big Navi/RDNA 2 but it looks like they have their work cut out for them.
 
As an eBay Associate, HardForum may earn from qualifying purchases.
Everyone is "panic" selling (not sure if that's the right word) on eBay. I literally just scored a 2080 Ti FTW3 on eBay for $545, and there have been a ton of others that I missed.

This guy had 7 that he listed and they all sold in 8 minutes, like wtf?? LOL, I scored one of them.

https://www.ebay.com/itm/EVGA-GeForce-RTX-2080-Ti-FTW3-ULTRA-11G-P4-2487-KR-7x-available/114387037944?ssPageName=STRK:MEBIDX:IT&_trksid=p2057872.m2749.l2649
Either that guy's an idiot or it's some kind of hacked account scam.

Hope you get the card in any case.
 
Re-read what I said. DF didn’t use RT in most of their testing and the 3080 was >70% faster than the 2080.

Re-read what I said. My mention of RT was an example of a previous generational gap where RT highlighted something the 2080 was good at, that the 1080ti was not good at. I'm not nor ever was implying that RT is key to the 3000 series numbers being thrown around, only an example of how numbers have been cherry-picked in the past based on the differences in past architectures. With a new architecture, I'm sure there will be new things to cherry-pick numbers based on. I'm not saying the new cards won't be faster, but I'll get excited when I see benchmarks for games that I actually care about and play.
 
Either that guy's an idiot or it's some kind of hacked account scam.

Hope you get the card in any case.

I quickly looked at his feedback before pressing Buy It Now; he had 7 other 2080 Tis that he sold within the past year that he got positive feedback for.

Also, his username is walrax, and in the pictures there are custom boxes with walrax branding on them. So I'm certain that it's real.

I hope I get the card, lol, and it's not a scam.
 
Marketing IS nothing but sponsored rumors until there is 3rd party verification.

You really don't understand how tech companies work, do you? "Sponsored rumors"? Sure, marketing departments always try to put their best foot forward in making their company's product shine, but those departments ARE PART OF THE SAME COMPANY and have direct access to the performance information. They are using data that is provided to them by their engineers, NOT some random website blogger. They know full well that if they post blatantly false information, they will get called out for it.
 
3070 looking good as a 1070 replacement for me. I'll wait for benchmarks and probably AiB to be sure though. Hope AMD new GPU is decent to help keep competition going...
I am with you on that. I held out on the 2070 because of the pricing and I don't play anything that used RT, but the jump from a 1070 to a 3070 looks too good to pass up.
 
You really don't understand how tech companies work, do you? "Sponsored rumors"? Sure, marketing departments always try to put their best foot forward in making their company's product shine, but those departments ARE PART OF THE SAME COMPANY and have direct access to the performance information. They are using data that is provided to them by their engineers, NOT some random website blogger. They know full well that if they post blatantly false information, they will get called out for it.

Every slide I've seen seems intentionally vague about where those performance metrics actually came from. That's almost certainly because the data was cherry-picked. Not false or embellished, necessarily. Like the example I gave before of trying to compare the 1080 Ti and the 2080 using an RTX benchmark: the 2080 would stomp it, and those numbers would not be "embellished", they would be real, but they would still paint a false picture from the perspective of most gamers if used to represent the card's "performance".

If you think that marketing numbers are as good as 3rd party benchmarks, then I think it's you who "doesn't understand how tech companies work".
 
I'm still going to wait for 3rd-party data instead of gobbling up Nvidia marketing as gospel, but when talking about the 3070, wouldn't "previous gen that is one notch higher" be the 2080, not the 2080 Ti? Seems to me like you went up two "notches".

It gets muddied by how you view the fact that the 2080 and 2080 Ti were released at the same time. But you like being a troll, and before, you were all "they raised prices and just moved everything down a notch," when apparently, according to you now, neither was true.

Stop behaving like an ass.
 
Sweet merciful fuck that price jump from the 3080 to 3090, yikes.

Performance looks very impressive but needs more 3080ti.
 
3080 absolutely wins it for me as an upgrade to my 1080ti.

Still gonna wait for reviews and see what the aftermarket coolers offer. If they somehow manage to come out with a 3080 Ti or 3080 Super for less than $1000, that may be a consideration, if I haven't bought the 3080 yet.
 
Didn't see any USB-C on any of these cards. Did I just miss it?
 
Sweet merciful fuck that price jump from the 3080 to 3090, yikes.

Performance looks very impressive but needs more 3080ti.
Yep, I *could* splurge for a 3090, but I feel like I'd be getting more for my money with a 3080 overclocked to high hell while waiting out a 3080 Ti in spring.

And Cyberpunk having DLSS 2.0 definitely makes that decision easier, since it reduces the need for brute force power to achieve 4K60 with max eye candy.
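DLSS helps precisely because the GPU only shades a fraction of the output pixels internally, then upscales. A quick illustration, assuming the commonly cited DLSS 2.0 internal resolutions for a 4K output (my assumed numbers, not from this thread):

```python
# Pixel work at each DLSS 2.0 mode relative to native 4K.
# Internal resolutions are the commonly cited ones (assumed here):
#   Quality mode renders at 1440p, Performance mode at 1080p.
modes = {
    "Native 4K": (3840, 2160),
    "DLSS Quality (1440p internal)": (2560, 1440),
    "DLSS Performance (1080p internal)": (1920, 1080),
}

native_pixels = 3840 * 2160
for name, (w, h) in modes.items():
    # Fraction of native 4K's pixels the GPU actually shades per frame.
    print(f"{name}: {w * h / native_pixels:.0%} of native pixel work")
```

Under those assumptions, Quality mode shades roughly 44% of native 4K's pixels and Performance mode 25%, which is where the headroom for 4K60 with max settings comes from.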
 
Didn't see any USB-C on any of these cards. Did I just miss it?
It appears VirtualLink is pretty much dead since it didn't really catch on. I definitely liked it on Turing!
 
Didn't see any USB-C on any of these cards. Did I just miss it?
You didn't miss it. It's simply not there on these reference designs.

Doesn't mean AIBs won't add it after the fact. It might mean you have to wait a bit longer though...
 
Maybe someone will give me 20 bucks for my 1080

LOL... that's what I had in mind too. If I had a backup card I'd just sell the 1080 while I can... eBay prices won't be dropping any time soon, I'm sure.

The only backup card I have for the gap, though, is an R9 290... sigh.
 
Well, I was going to go all in on a 3090... but my monitor is a 1440p G-Sync, and if I can get BF1 at a locked 144 FPS with the 3080, I'm good. I just hope they have some 20GB 3080s available around launch. Adobe likes memory.
 
I think they purposely left that big gap in price between the 3080 and 3090 so they can slot in a 3080 Ti in 6 months... an $1100 3080 Ti splits the difference.

Maybe, although there's also the 3090's big jump in memory in addition to the overall increase in complexity. I'm sure NVIDIA is thinking about a 3080 Ti or a similar model -- just that the specs for the 3090 play a large role in its price.
 
These prices look good to me. That said, it also confirms what we instinctively knew all along, that Nvidia priced their cards as they did because they could, not because they needed to. :cautious::p:eek:

I think it's back to them being normal-sized GPU dies. Based on die shots seen, GA102 is a 500 mm²-class die vs. 754 mm² for TU102. Turing was ridiculously expensive because cramming in even proof-of-concept-level ray tracing on the process used required ridiculously huge, and thus expensive, GPU dies. Ampere is back to normal, except for the 3090/Titan replacement, because last generation's sales results have convinced Nvidia they can shake down money-is-no-object gamers even harder. Since the windfall they get from those people covers a larger share of R&D than the cards the rest of us buy, I'll shed no tears for their wallets.


https://www.anandtech.com/show/1605...re-for-gaming-starting-with-rtx-3080-rtx-3090
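The die-size point is the core of the cost argument: cost per good die rises faster than area, because a big die both fits fewer copies on a wafer and yields worse. A rough sketch using the standard dies-per-wafer approximation and a Poisson yield model; the wafer size and defect density are made-up round numbers for illustration, not actual Samsung or TSMC figures:

```python
import math

# Why a 754 mm^2 die (TU102) costs far more than a ~500 mm^2 die (GA102).
# Assumed round numbers: 300 mm wafer, 0.1 defects per cm^2.

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Classic dies-per-wafer approximation, accounting for edge losses."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2: float, defects_per_cm2: float = 0.1) -> float:
    """Poisson yield model: probability a die has zero defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

for name, area in [("TU102", 754), ("GA102", 500)]:
    good = dies_per_wafer(area) * yield_rate(area)
    print(f"{name}: {dies_per_wafer(area)} candidates, "
          f"{yield_rate(area):.0%} yield, ~{good:.0f} good dies per wafer")
```

Under these toy assumptions, the smaller die nets roughly twice as many good dies per wafer, which is the mechanism behind Turing's pricing and Ampere's return to (relative) normalcy.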
 
Maybe, although there's also the 3090's big jump in memory in addition to the overall increase in complexity. I'm sure NVIDIA is thinking about a 3080 Ti or a similar model -- just that the specs for the 3090 play a large role in its price.


I definitely see them holding a space for the Ti/Super cards. They're missing that $999 price segment, which would be filled by the 3080 Ti. 20 GB / 32 TF, probably.
 
I think they purposely left that big gap in price between the 3080 and 3090 so they can slot in a 3080 Ti in 6 months... an $1100 3080 Ti splits the difference.
If you take in all the information given today and then take a step back, you'd realize that the 3090 is a completely different tier, and thus GPU, altogether. The 3090 uses the GA102-300-A1 GPU, while the 3080 uses the GA102-200-K1-A1. Both are different enough that they might as well be sister GPUs (kind of like how Titan V was based on Volta, not Pascal). The 3090 is pretty much an RTX Titan (Jensen's own words, I believe).

Jensen calling the 3090 a "Beast" is no exaggeration. The 3080 could never have been a cut-down 3090, as, again, they are far too different. So realistically, a 3080 Ti would have to be invented. But where do you begin? An overclocked 3080 with more VRAM? Or do you cut down from a 3090 that is more expensive for Nvidia to manufacture?

In other words, the gap may simply just exist because nothing actually exists to fill it. We might not see that gap filled until the next gpu refresh.
 
If you take in all the information given today and then take a step back, you'd realize that the 3090 is a completely different tier, and thus GPU, altogether. The 3090 uses the GA102-300-A1 GPU, while the 3080 uses the GA102-200-K1-A1. Both are different enough that they might as well be sister GPUs. The 3090 is pretty much an RTX Titan (Jensen's own words, I believe).

Both the 3090 and 3080 use the GA102 design and the same Samsung 8nm process... so they are more alike than different, especially if the 3080 comes in a 20 GB variant.
 
Both the 3090 and 3080 use the GA102 design and the same Samsung 8nm process... so they are more alike than different, especially if the 3080 comes in a 20 GB variant.

Those don't matter -- it's about RAM, core counts, clock speed and the like. If a more complex design has lower chip yields, it's more expensive to make.
 
both the 3090 and 3080 are using the GA102 design and both use the same Samsung 8nm process...so they are more alike then different...especially if the 3080 comes in a 20GB variant
They are based on the same architecture (Ampere), which makes it a bit less extreme than Volta vs. Pascal, but there are significant differences at the hardware level that cannot be made up by just bumping clocks and adding more VRAM. The 384-bit vs. 320-bit bus width disparity cannot be overlooked, regardless of the VRAM configuration. And unless the 3080 has 1792 CUDA cores disabled (i.e. lasered off), you can't make up that difference either.
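To put the bus-width gap in perspective, here's a quick back-of-envelope bandwidth calculation. The 19.5 and 19 Gbps GDDR6X data rates are assumed from the launch spec sheets, not from this thread:

```python
# Peak memory bandwidth from bus width and per-pin data rate.
# Assumed announced specs:
#   3090: 384-bit bus, 19.5 Gbps GDDR6X
#   3080: 320-bit bus, 19.0 Gbps GDDR6X

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

rtx_3090 = bandwidth_gbs(384, 19.5)  # 936.0 GB/s
rtx_3080 = bandwidth_gbs(320, 19.0)  # 760.0 GB/s
print(f"3090: {rtx_3090:.0f} GB/s, 3080: {rtx_3080:.0f} GB/s, "
      f"gap: {rtx_3090 / rtx_3080 - 1:.0%}")
```

By these assumed specs, the 3090 has about 23% more memory bandwidth than the 3080, on top of the CUDA core gap, which is not something a VRAM bump alone can close.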
 
So long, GTX 1080, you served me well for nearly four years. RTX 3080, here I come.

Same here. I still game at 1440p so the 3080 should utterly destroy anything at that resolution.

I also do VR, and the 1080 has been lacking a bit lately since I got an Index. Though that may be due to my older CPU as well (CPU/mobo/RAM upgrade incoming as soon as we see what AMD brings with Ryzen 4000).
 