Jensen Huang announces Ampere prices

Well, the pricing on the 3070 and 3080 is a pleasant surprise. I'll wait for benches and probably for the higher RAM models and decide what to do. The 3090 isn't completely off the table, but if the 3080 isn't too far off, that's what I'll go with to replace my 1080 Ti. Now if AMD would just release Zen 3....
 
Feel free to substantiate your performance claim.

Slides always have a bit of performance stretching going on, but do you really think they would go so far as to claim 3070 > 2080 Ti when it's not so? I'm sure raster-vs-raster in a slew of benchmarks will lessen the gap, but after a claim like that the whole market would rage if the 2080 Ti ends up faster.
 
You said rumors, I gave you non-rumor performance data. Is it perfect? No, but it’s what we have until 9/17 and October.

Marketing IS nothing but sponsored rumors until there is 3rd party verification.
 
I like how he said it's time for Pascal owners to upgrade. I mean... maybe? $500 is still a lot, and anyone who owns a GTX 1080 is probably still comfortable with it. I was hoping to see the 3060 announced with pricing, but nothing. If the RTX 3060 is as fast as the RTX 2080 Ti or a bit slower, then consoles may have a hard time selling, but that's assuming a lot, including pricing. Guess I'll be picking up a used RTX 2070 by Christmas time for cheap.
Jen is banking on those Pascal owners who chose not to spend their money on a 2000 series to finally cave in. It's far more tempting fruit given the large performance gap for a price bracket they are already comfortable with.

The real bait Jen and company are fishing with is the introduction of the 3090 and its pricing. It is cleverly not called the 3080 Ti for the very reason of introducing a new pricing bracket. If they had introduced it as a 3080 Ti instead, with that very same price bump, it would undoubtedly have made its target audience uncomfortable. There is a psychological effect in play by distancing the top of the product stack from the previous generation's.
 
do you really think they would go so far as to claim 3070 > 2080Ti when it's not so?

I'm sure they could find something that the 3070 is really good at, that the 2080Ti is not so good at, and craft a benchmark number around that advantage.

Like comparing a 1080Ti and a 2080 using an RTX benchmark and claiming that represents the comparative gaming performance of the two cards.
 
How would that be typical? My 2080 is faster than a 1080 Ti more often than not.
Because you can't read.

The 2070 runs about on par with the 1080. The 2080 is faster. That was last gen. It should beat the 1080 all the time.

This gen the 3070 is on par with the 2080 Ti. That is the typical launch process: a card runs close to, or better than, the one from the previous gen that is one notch higher in the lineup.
 
Even if the 3rd-party reviewers show slightly less, or even a lot less, performance than the slides, I'm not seeing how this is a price INCREASE as GotNoRice claims. Even the example given of a 2080 beating a 1080 Ti doesn't bear this out. His 2080 example shows an ACTUAL price increase, unlike this time around.

1080 Ti ($699 FE) --> 2080 ($799 FE) -- roughly equal performance
2080 Ti ($1199 FE) --> 3070 ($499 FE) -- (NVIDIA says it's roughly equal)

That's a price increase? I'm not seeing it.

EDIT: Am I just being trolled? I should stop taking the bait.
 
So if I get this right the 3070 is faster than a RTX 2080 Ti and $500? Good luck AMD.
The benchmarks I've seen are with raytracing enabled. So the 3070 is faster with RTX enabled, but in normal raster performance I think the 2080 Ti is still faster. The big performance improvement here is on the raytracing side, with another 20-30% bump in raster. So on the RTX side the older cards will be obsolete fast, but not on the normal raster side.
 
I need more info on the 3090... I need that memory, and I haven't seen their Quadro refresh yet. Now I just have to wait for the new Threadrippers... It's a good morning.
 
This gen the 3070 is on par with the 2080 Ti. That is the typical launch process: a card runs close to, or better than, the one from the previous gen that is one notch higher in the lineup.

I'm still going to wait for 3rd-party data instead of gobbling up NVIDIA marketing as gospel, but when talking about the 3070, wouldn't "previous gen that is one notch higher" be the 2080, not the 2080 Ti? Seems to me like you went up two "notches".
 
Ok... Pascal-level performance increase at relatively reasonable prices. I'm impressed. I'm glad he basically called the 3090 a Titan.
 
Ok... Pascal-level performance increase at relatively reasonable prices. I'm impressed. I'm glad he basically called the 3090 a Titan.

There's still room for a Titan when 2GB GDDR6X modules land next year. They could do a 48GB card with a fully enabled chip for $3000.
 
Well shit, I was expecting way higher prices, making it easy to hold off on upgrading. My GTX 1080 has been holding its own but really shows its age with my newer 1440p 144Hz screen; the 3080 is looking like a reasonable upgrade.
 
Equal at what?

They actually claimed slightly higher performance, and I would suspect they are right that the 3070 will be a 2080 Ti equivalent. The 3070 has more CUDA cores (5888) vs the 2080 Ti (4352)... Unless their per-core performance is going backwards by a lot, it will beat out the 2080 Ti.

Edit: it's also clocked higher, at 1.73GHz vs 1.63GHz for the Founders Edition.
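For what it's worth, the core counts and clocks above can be turned into a back-of-the-envelope peak-throughput comparison. A rough sketch only; Ampere doubled the FP32 units per SM (sharing a datapath with INT32), so theoretical TFLOPS overstate real-world gaming gains:

```python
def peak_fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    """Theoretical peak: each CUDA core can do one FMA (2 FP32 ops) per clock."""
    return cuda_cores * 2 * boost_ghz / 1000.0

# Founders Edition core counts and boost clocks from the post above.
rtx_3070 = peak_fp32_tflops(5888, 1.73)     # ~20.4 TFLOPS
rtx_2080_ti = peak_fp32_tflops(4352, 1.63)  # ~14.2 TFLOPS
print(f"3070: {rtx_3070:.1f} TFLOPS vs 2080 Ti: {rtx_2080_ti:.1f} TFLOPS")
```

On paper that's a 40%+ lead, which is exactly why games won't show it: half of Ampere's FP32 units compete with INT32 work, so the realized gap lands much closer to parity.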
 
The 2080 Ti has a TDP of 260W; the 3070 is 220W. Unless there is zero improvement going from 12nm to 8nm, I'd say that should be enough evidence.

Power consumption numbers are a substitute for actual benchmarks?
 
They actually claimed slightly higher performance, and I would suspect they are right that the 3070 will be a 2080 Ti equivalent. The 3070 has more CUDA cores (5888) vs the 2080 Ti (4352)... Unless their per-core performance is going backwards by a lot, it will beat out the 2080 Ti.


It's not unprecedented. The 1070 basically matched the 980 Ti.
 
Told ya that a new process node would shrink the chips down enough that we would get some lower-end price drops. Like the 1070 Ti being $100 less than the GTX 980. This massive price cut only came a year after the cutting-edge process node had been optimized, but it did happen!

The 8nm process is tried and tested, so prices only had one direction to head (pre-launch price cuts).

The 3090 is the exception to that rule (likely because it uses overclocked parts and the fastest-binned GDDR6X).
 
Power consumption numbers are a substitute for actual benchmarks?
There's past precedent that it seems like you're ignoring for the sake of keeping an internet argument going. Nvidia doesn't - to my recollection - have a habit of embellishing marketing slide claims, and then real-world benchmarks telling a drastically different story.

So a company's track record is obviously going to be part of the purchase calculus for many people.
 
There's past precedent that it seems like you're ignoring for the sake of keeping an argument going. Nvidia doesn't - to my recollection - have a habit of embellishing marketing slide claims, and then real-world benchmarks telling a drastically different story.

So a company's track record is obviously going to be part of the purchase calculus for many people.

Digital Foundry has had early access to the 3080. Some real performance info there.

EDIT: GOT NINJA'd -- Removed LINK
 
There's past precedent that it seems like you're ignoring for the sake of keeping an internet argument going. Nvidia doesn't - to my recollection - have a habit of embellishing marketing slide claims, and then real-world benchmarks telling a drastically different story.

So a company's track record is obviously going to be part of the purchase calculus for many people.

Holy rose colored fucking glasses! :eek:
 
I am super curious what kind of pricing Samsung gave them for their business, and given their foundry's current loads, they should be able to get these things out in actual numbers. I'm already contacting my rep to try to get my hands on a 3090; figure if I can order it day 1, I may see it by April.
 
Feel free to cite examples where Nvidia presentation slides claimed performance that wasn't backed by benchmarks.

Not wasting my time with fanboys. They have a history of pulling shenanigans and tricks. I want real benchmarks, otherwise you're just spitting shit. You could very well be right, but I learned my lesson a long time ago to wait for 3rd-party confirmation.
 
There's past precedent that it seems like you're ignoring for the sake of keeping an internet argument going. Nvidia doesn't - to my recollection - have a habit of embellishing marketing slide claims, and then real-world benchmarks telling a drastically different story.

Every slide I've seen seems intentionally vague in regards to where those performance metrics actually came from. That's almost certainly because the data was cherry-picked. Not false or embellished necessarily. Like the example I gave before of trying to compare the 1080Ti and the 2080 using an RTX benchmark. The 2080 would stomp it, and those numbers would not be "embellished", they would be real, but they would still paint a false picture from the perspective of most gamers if used to represent the card's "performance".

I'm not trying to argue. I hope the new generation is amazing. What I'm really saying is, I'd rather wait for benchmarks than suckle on the marketing teat. Pessimism has generally paid off more for me than latching onto sponsored optimism.
 
Is it safe to assume right now that the 3080 Ti will come next year, and that the 3090 is actually the Titan replacement? All speculation, but maybe $1199 for a 3080 Ti with 16GB of VRAM instead of the 24GB the 3090 has now. Thoughts?

The 3080 Ti will be 20GB.
 
To summarize the DF video: the 3080's performance matches NVIDIA's claims well on the limited number of titles Digital Foundry has tested so far. A 1.8X increase over the 2080 with no DLSS and no RT matches well with the ~2X claim from NVIDIA, and they saw close to 2X on one RT/DLSS title. From the DF numbers, this means the 3080 at $699 is roughly 1.4X the 2080 Ti in rasterized titles. The 3070 matching a 2080 Ti seems very reasonable based on these initial results.
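The 1.4X figure there is just the ratio of the two multipliers. A quick sketch of the arithmetic; the ~29% gap between the 2080 Ti and the 2080 is an assumed review-average, not a number from the DF video:

```python
# DF's observed 3080 uplift over the 2080 (rasterized, no DLSS/RT).
uplift_3080_vs_2080 = 1.8
# Assumed typical 2080 Ti advantage over the 2080 (~29%, not from the video).
gap_2080ti_vs_2080 = 1.29

uplift_3080_vs_2080ti = uplift_3080_vs_2080 / gap_2080ti_vs_2080
print(f"3080 vs 2080 Ti: ~{uplift_3080_vs_2080ti:.2f}x")  # ~1.40x
```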
 
Every slide I've seen seems intentionally vague in regards to where those performance metrics actually came from. That's almost certainly because the data was cherry-picked. Not false or embellished necessarily. Like the example I gave before of trying to compare the 1080Ti and the 2080 using an RTX benchmark. The 2080 would stomp it, and those numbers would not be "embellished", they would be real, but they would still paint a false picture from the perspective of most gamers if used to represent the card's "performance".

I'm not trying to argue. I hope the new generation is amazing. What I'm really saying is, I'd rather wait for benchmarks than suckle on the marketing teat. Pessimism has generally paid off more for me than latching onto sponsored optimism.
Did you watch the DF video? He didn’t use RT in a bunch of the games he tested. Lowest increase was close to 70% compared to 2080.
 
I didn't expect the 3090 would be that massive of a jump over the 2080 Ti. I had planned on skipping a year with my launch 2080 Ti, but this is a big enough jump that I'll do it.

Looks like the 3090 will finally give us 4K above 60FPS in almost every title.
 
Just watched the presentation; first time I'm impressed by anything NVIDIA in years. Looks like a 3080 for me.
 
Not wasting my time with fanboys. They have a history of pulling shenanigans and tricks. I want real benchmarks, otherwise you're just spitting shit. You could very well be right, but I learned my lesson a long time ago to wait for 3rd-party confirmation.
All good, we're all friends here. Maybe it's the kid in me that gets excited about new tech, and the candy-coated marketing is just fun ritual. When AMD has their presentation I'll be rooting for them too because the whole space benefits. I've already got a slot open in my new Hackintosh build for a 6800XT.
 