What did you think of the NVidia 3000 Series Launch?

  • Massively Disappointing: Expected Unicorn Power.

    Votes: 8 4.4%
  • Slightly Disappointing: Wanted/Expected more.

    Votes: 9 4.9%
  • OK: Met Expectations.

    Votes: 21 11.5%
  • Very nice: Better than expected.

    Votes: 124 67.8%
  • Mind blowing: Unicorn Power is here!

    Votes: 21 11.5%

  • Total voters
    183
  • Poll closed.
It's kind of how halo products work. I get a chuckle out of the people who get upset that halo products are poor perf/$. I have even seen people complain that the 3090 is too expensive, so they are switching to consoles. :confused:

Ignore the halo product and get the next one down the line, which performs close to it for half the price. "Problem" solved.

I'm sure a lot of people bought 2080 Ti's and Titans and never actually used them for anything. It's all about ego with some folks. If you flee to console land there's no need to feel inferior because nobody has something better than you. Go figure.
 
The hype around this launch so far is unreal. Good luck finding a card in stock.
 
I'm kinda surprised they didn't put a USB-C port on it. I thought that's the 'future' of VR headset connectivity?

I'll see if one of the aftermarket cards tacks one on. Also, it should give me time to assess whether we get some other space-invaders kookiness like last time...
 
So far much better than expected.

I thought we would see a 3080 at 2080 Ti performance for $899 and a 3090 at +30% over the 2080 Ti for $1399.

I'm pleased to see that I can get double the performance of my 1080 Ti for $699, but I'll wait for the reviews to see if it really pans out that way...
 
I'm kinda surprised they didn't put a USB-C port on it. I thought that's the 'future' of VR headset connectivity?

I'll see if one of the aftermarket cards tacks one on. Also, it should give me time to assess whether we get some other space-invaders kookiness like last time...

Check out Anand's article and Ctrl-F for "VirtualLink"

https://www.anandtech.com/show/1605...re-for-gaming-starting-with-rtx-3080-rtx-3090

Meanwhile VirtualLink ports, which were introduced on the RTX 20 series of cards, are on their way out. The industry's attempt to build a port combining video, data, and power in a single cable for VR headsets has fizzled, and none of the big 3 headset manufacturers (Oculus, HTC, Valve) used the port. So you will not find the port returning on RTX 30 series cards.
 
I don't see how any of us can comment on it when all we have is Nvidia PR and YouTube shills who regurgitate said Nvidia PR.

I want real benchmarks. For all we know, real-world performance is dog shit.
 
Every market? I haven't seen a sub-$500 card listed yet, so how can you say every market? I will say they're priced about in line with my expectations, although I was pleasantly surprised the 3080 was about $100 less than I was expecting. Now to see what else comes out, and when, and how the availability is going to be.

I wouldn't say that you're wrong, but I have an RTX 2060 KO Ultra right here in my computer, and I think that's a sign that Nvidia is ready at every price point and can release a card that's slightly more powerful, but slightly cheaper, than anything AMD can release.

If AMD releases a card that has 100% performance at 100% price, Nvidia will immediately respond with a card that has 101% performance and 99% price.

Internally, within the companies, this is a true war of the ages. And I think they are ready for anything.

So that's how I can say "every market." They *will* release a sub-$500 card like they always do, and if not them, then their closest partners will.
 
I don't see how any of us can comment on it when all we have is Nvidia PR and YouTube shills who regurgitate said Nvidia PR.

I want real benchmarks. For all we know, real-world performance is dog shit.

What video card are you using? Just curious.
 
Is the price of the 3080 Founders Edition actually $699, or will they add $200 on top of it like last time?
 
Um, no, not at all... lol

The 3090 has 10496 CUDA cores and 24GB of GDDR6X on a 384-bit bus; the 3080 has 8704 CUDA cores with 10GB of GDDR6X on a 320-bit bus. Is it the same die? Yeah, most likely, but the extra money isn't just buying you more memory. It's basically an RTX Titan 2 without the Titan branding, only this time they left a lot more room for a 3080 Super and a 3080 Ti without stepping on the toes of the card above. Same with the 3070 and 3080.
Memory-wise, sure, there's a lot more room. Performance-wise there is only about 20-25% between the 3080 and the 3090, so I am not sure how the 3090 isn't just killed off by anything faster than a 3080. Heck, even a 20GB 3080 would do that.
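For a rough sense of what those paper specs translate to, here's a small back-of-the-envelope sketch. The core counts and bus widths are the ones quoted above; the GDDR6X data rates (19.5 Gbps on the 3090, 19 Gbps on the 3080) are my assumptions and may not match the shipping cards, so treat the output as ballpark only.

```python
# Rough spec comparison of the 3090 vs the 3080 (ballpark only).
# Core counts / bus widths are from the post above; the GDDR6X data
# rates are assumed and may differ from the final cards.
cards = {
    "RTX 3090": {"cores": 10496, "bus_bits": 384, "gbps_per_pin": 19.5},
    "RTX 3080": {"cores": 8704,  "bus_bits": 320, "gbps_per_pin": 19.0},
}

for name, c in cards.items():
    # Memory bandwidth (GB/s) = data rate per pin (Gb/s) * bus width (bits) / 8
    bandwidth_gbs = c["gbps_per_pin"] * c["bus_bits"] / 8
    print(f"{name}: {c['cores']} CUDA cores, ~{bandwidth_gbs:.0f} GB/s")

core_gap = cards["RTX 3090"]["cores"] / cards["RTX 3080"]["cores"] - 1
print(f"3090 has ~{core_gap * 100:.0f}% more CUDA cores")  # ~21%, in line with the 20-25% cited above
```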
 
Is the price of the 3080 Founders Edition actually $699, or will they add $200 on top of it like last time?

No "FE Tax" this time. Something that probably doesn't make AIB partners happy:
https://www.nvidia.com/en-us/geforc...id=nv-int-cwmfg-49069#cid=_nv-int-cwmfg_en-us
 
Cautiously excited about what was shown yesterday.

Will wait on actual reviews (plural) of the AIB cards (personally interested in the EVGA 3080 Hybrid).

I must say, though, that a 3080 Ti seems almost inevitable, and to me that sounds like it might be THE move. Close to 3090 performance on a pure gaming card for 850-900 would be awesome. Even at 1000 I'd be all over it.

And going by the size those things are getting I just might have to finally try out watercooling to cram it in there (ITX).
What do you mean? The 3090 IS a gaming card.
 
All of the RTX 3000 series cards announced so far will be a lot more expensive than I can bring myself to spend on a GPU upgrade. That's bad news for economical GPU upgrade buyers, as the sub-$500 GPUs will remain on Turing quite a bit longer. And I wouldn't pay the asking price for a mid-range part whose production is about to be put out to pasture.
 
All of the RTX 3000 series cards announced so far will be a lot more expensive than I can bring myself to spend on a GPU upgrade. That's bad news for economical GPU upgrade buyers, as the sub-$500 GPUs will remain on Turing quite a bit longer. And I wouldn't pay the asking price for a mid-range part whose production is about to be put out to pasture.

This is how it goes every generation: top cards first, then midrange later on. I would expect x60 and x50 cards by April next year, maybe even earlier given the 2060 isn't super popular right now.
 
All of the RTX 3000 series cards announced so far will be a lot more expensive than I can bring myself to spend on a GPU upgrade. That's bad news for economical GPU upgrade buyers, as the sub-$500 GPUs will remain on Turing quite a bit longer. And I wouldn't pay the asking price for a mid-range part whose production is about to be put out to pasture.

The 3070 through 3090 aren't really for low- or mid-range users, anyway. Allegedly, the 3070 is on par with a previously $1400 part.
 
I thought it was brilliant, because they pulled a bit of sleight of hand and everyone bought it. By comparing the 3080 to the 2080, not the 2080 Ti, they made the Ampere jump seem WAY higher than it actually is. I think everyone would have expected about a 30-40% increase, and that is right about what the 3080 has over the 2080 Ti. It is still good, certainly, but right around where I think everyone thought it would be.
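To make the baseline point concrete, here's a tiny illustration of how the choice of comparison card changes the headline number. The relative-performance figures are rough assumptions picked only to match the ~30-40% estimate above, not measured results.

```python
# Illustration only: how the baseline changes the headline uplift.
# Relative performance is assumed (2080 = 1.0); these are not benchmarks.
perf = {"2080": 1.00, "2080 Ti": 1.35, "3080": 1.80}

def uplift(card: str, baseline: str) -> float:
    """Percent improvement of `card` over `baseline`."""
    return (perf[card] / perf[baseline] - 1) * 100

print(f"3080 vs 2080:    +{uplift('3080', '2080'):.0f}%")    # ~+80% -> sounds huge
print(f"3080 vs 2080 Ti: +{uplift('3080', '2080 Ti'):.0f}%") # ~+33% -> a normal generational jump
```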
 
I thought it was brilliant, because they pulled a bit of sleight of hand and everyone bought it. By comparing the 3080 to the 2080, not the 2080 Ti, they made the Ampere jump seem WAY higher than it actually is. I think everyone would have expected about a 30-40% increase, and that is right about what the 3080 has over the 2080 Ti. It is still good, certainly, but right around where I think everyone thought it would be.
I mean, I'm not a huge fan in general, but it makes sense they would compare the 3080 to the 2080, as they are similarly priced. Why would you compare a $700 card against a $1200 card? Even if it can beat it, it's competing for the people who are willing to spend around $700-$800, which is what it's replacing at that cost. Of course it's marketing, but really, why wouldn't you compare it to the same price bracket? Just understand what card you're looking at and what its performance is relative to what you have (or what you want to compare it against).

If they wanted the numbers to be super inflated they could have compared it to a 2060, but they didn't; they compared it against one in the same price bracket. I'm not sure why this is sleight of hand. It's what is being replaced by this model. Even if I'm not a fan of Nvidia's antics in general, I still don't see how this is misleading in any way. Now, if they were comparing the 3090 to the 2080, I would see your point.

They are always going to show themselves in the best light, which is why they chose Digital Foundry to run benchmarks for them, so it looked like it came from a third-party reviewer, even though the benchmarks, the settings and method of displaying results, and what to compare against were all directed by Nvidia (it's in DF's disclaimer; this isn't just hypothetical). I'll wait for real comparisons to come out, but I still think comparing it to the 2080 makes sense if that's the price bracket.
 
I mean, I'm not a huge fan in general, but it makes sense they would compare the 3080 to the 2080, as they are similarly priced. Why would you compare a $700 card against a $1200 card? Even if it can beat it, it's competing for the people who are willing to spend around $700-$800, which is what it's replacing at that cost. Of course it's marketing, but really, why wouldn't you compare it to the same price bracket? Just understand what card you're looking at and what its performance is relative to what you have (or what you want to compare it against).

If they wanted the numbers to be super inflated they could have compared it to a 2060, but they didn't; they compared it against one in the same price bracket. I'm not sure why this is sleight of hand. It's what is being replaced by this model. Even if I'm not a fan of Nvidia's antics in general, I still don't see how this is misleading in any way. Now, if they were comparing the 3090 to the 2080, I would see your point.

They are always going to show themselves in the best light, which is why they chose Digital Foundry to run benchmarks for them, so it looked like it came from a third-party reviewer, even though the benchmarks, the settings and method of displaying results, and what to compare against were all directed by Nvidia (it's in DF's disclaimer; this isn't just hypothetical). I'll wait for real comparisons to come out, but I still think comparing it to the 2080 makes sense if that's the price bracket.
  • He compared the 3070 to the 2080 Ti, a $500 card to a $1,200 card. Why not the 2070 Super? That would be about 35% more performance, around a 2080 Ti FE: less impressive, a normal generational increase.
  • Why not the 2080 Super, the same price as the 3080? The 2080 was $100 more, I believe, for the FE. The 2080 Ti is about 16% faster than a 2080 Super.
  • Comparing this launch to Pascal, which was a huge leap in performance over the previous generation, does not appear equivalent. Some people, I think, will just mentally conclude that it is. We will find out shortly.
It will also probably come down to what exactly is being tested and whether it really matters in the end for most gamers. DLSS 2.1 looks awesome! But if only 15 to 20 games have it, out of the thousands of games people play, it becomes irrelevant for many. Same with RT, which may still be further down the road before it really matters. For some of us it will matter sooner; at least for me it does.
 
For people complaining that the RTX 3090 isn't a "gaming" card due to the high VRAM or whatnot:

3 points:

-The 3090 is 100% being marketed as a gaming card this time. It's on the GeForce website, with GeForce drivers, in GeForce packaging, with custom AIB variants, etc. It's not the infamous "TITAN" branding that blurred the lines between a gaming card and a professional graphics card. If you're telling yourself it's not a gaming card, then you're just in denial and probably upset because you can't afford it.

-Every single GPU generation, people say "you don't need more than X amount of RAM or VRAM." Guess what? That never ages well. VRAM requirements have nearly doubled every GPU generation for decades. 24GB of VRAM won't be enough in 5 years, or when the next generation of consoles releases, almost guaranteed. I play many games that nearly cap the 11GB of VRAM on my 2080 Ti at 4K. Why on god's earth would I buy anything with 10GB of VRAM or less in 2020, especially with next-gen games around the corner?

-You're either too young or too shortsighted to see what's going on if you haven't been following GPU technology over the last 10 years or so. There was no such thing as "Ti" or "TITAN" or "SUPER" for many generations of Nvidia GPUs. All Nvidia did was shuffle the traditional naming scheme of the original x60, x70, x80 chips to get customers to pay higher prices. This isn't a conspiracy theory when you follow the chip numbers, power-pin requirements, tech specs, etc. It's just what happens in an oligopoly market with no competition on the high end. I could type a 10-page essay about it. Long story short, Nvidia took its top GPU from $500 in 2010 (the GTX 580) to $750, then $1,000, then $1,200, then $3,000, then $2,500, and now it's back at $1,500. Call it whatever you want: a TITAN, a Ti, a SUPER, a rainbow unicorn. The RTX 3090 is currently the best gaming GPU Nvidia has to offer, at three times the price of the best gaming GPU from just 10 years ago.
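Taking the poster's own price points at face value, the trajectory works out as below. This is a quick sketch; the dollar figures are the ones listed above and aren't independently verified here.

```python
# Price trajectory of Nvidia's top consumer GPU, using the figures from
# the post above (not independently verified).
prices = [500, 750, 1000, 1200, 3000, 2500, 1500]

multiple = prices[-1] / prices[0]
print(f"Peak: ${max(prices)}")  # the $3,000 Titan-class era
print(f"Today vs 2010: {multiple:.1f}x ({(multiple - 1):.0%} increase)")
# -> 3.0x, i.e. a 200% increase over the $500 GTX 580
```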
 
The RTX 3090 is currently the best gaming GPU Nvidia has to offer, at three times the price of the best gaming GPU from just 10 years ago.

It's unclear to me why people feel entitled to the "best" at a particular price. The definition of "best" has changed in the past 10 years. Why not compare the 3080 to what you could get 10 years ago for $699? That tells a whole different story.
 
SLI is also essentially dead as a practical solution, unlike 10 years ago, and today's top cards are priced like two of the old top cards combined. Still expensive, but a lot has changed. Not least, the cost per transistor is no longer falling on new process nodes, which means we are in for price hikes long term.
 
For people complaining that the RTX 3090 isn't a "gaming" card due to the high VRAM or whatnot:

3 points:

-The 3090 is 100% being marketed as a gaming card this time. It's on the GeForce website, with GeForce drivers, in GeForce packaging, with custom AIB variants, etc. It's not the infamous "TITAN" branding that blurred the lines between a gaming card and a professional graphics card. If you're telling yourself it's not a gaming card, then you're just in denial and probably upset because you can't afford it.

-Every single GPU generation, people say "you don't need more than X amount of RAM or VRAM." Guess what? That never ages well. VRAM requirements have nearly doubled every GPU generation for decades. 24GB of VRAM won't be enough in 5 years, or when the next generation of consoles releases, almost guaranteed. I play many games that nearly cap the 11GB of VRAM on my 2080 Ti at 4K. Why on god's earth would I buy anything with 10GB of VRAM or less in 2020, especially with next-gen games around the corner?

-You're either too young or too shortsighted to see what's going on if you haven't been following GPU technology over the last 10 years or so. There was no such thing as "Ti" or "TITAN" or "SUPER" for many generations of Nvidia GPUs. All Nvidia did was shuffle the traditional naming scheme of the original x60, x70, x80 chips to get customers to pay higher prices. This isn't a conspiracy theory when you follow the chip numbers, power-pin requirements, tech specs, etc. It's just what happens in an oligopoly market with no competition on the high end. I could type a 10-page essay about it. Long story short, Nvidia took its top GPU from $500 in 2010 (the GTX 580) to $750, then $1,000, then $1,200, then $3,000, then $2,500, and now it's back at $1,500. Call it whatever you want: a TITAN, a Ti, a SUPER, a rainbow unicorn. The RTX 3090 is currently the best gaming GPU Nvidia has to offer, at three times the price of the best gaming GPU from just 10 years ago.

A couple of things are sort of off here. The Ti moniker has been around since the GeForce3, which split into the Ti 200 and Ti 500, and Ti has been part of card names since the GeForce4 Ti 4400. So Ti isn't "new" at all; it's older than both RTX and GTX by a good margin. And Nvidia has always had odd fucking naming around their cards: you had the 6800, 6800 NU, 6800 256MB, 6800 GT, 6800 Ultra, 6800 Ultra Extreme, and 6800 Ultra Extreme 512MB cards. They've always charged an arm and a leg for their top-end card.

The TITAN did have a point at the start. It had a few extra things turned on that didn't matter for gaming and was sold as a compute card; even as recently as the TITAN V this was the case. That it turned into a gaming thing was due to the drive to own the best at all costs, and it showed they could sell gaming cards at that price. But it didn't start out that way.

This isn't just Nvidia, either. Monitor prices have run away as well, and you can find similarly stupid storage products (Optane, the Samsung Pro series) where people pay double or more for something that is marginally better than the tier below and that almost nobody actually needs. It's not just computers, either; everything works that way now.
 
AMD is desperately panicking right now.

When it comes to sheer features, performance, and even price, Nvidia just can't be legitimately defeated. They corner every segment of the market, and then deliver a KO punch. And it's just brutal.

I really hope AMD has forged a sword that can be dubbed "Nvidia Killer."

Otherwise, Nvidia has already won this round. Again.



THIS.
Signed, a former (crap drivers) Gigabyte RX 5700 owner, owner of a $699 CAD Asus 'WHITE' Radeon Nano bought at launch at NCIX, a Radeon Fury X at full MSRP at launch, a 470 4GB, and a few Nvidia cards. This Nvidia launch was very impressive.
 
I think my reaction to Nvidia's launch of the RTX 30 series is a combination of surprise and excitement, really. They've given the 3 cards they announced an interesting set of roles in terms of what they will do for the gamer.

Let's start at the top: the 3090. It is now the new Titan, not the new 3080 Ti. It is, as I named it when they announced it, "MegaWeapon". This thing can do 8K/60FPS. Who saw that coming? Compared to the Titan RTX, it got a $1000 price cut. However, $1500 is still a lot of money. You could build an entire PC (which could include an RTX 3080 or 3070 if you picked your parts wisely) for that much. It'll attract its crowd and its customer, but it is still a lot of money.

Now on to the 3080 and 3070. The prices left me very surprised. They kept the same prices for the 3080 and 3070 that they did for the 2080 Super and 2070 Super, when they easily could have raised them heavily. I take it as a little bit of a gift from Nvidia. And the performance you get from the 2 cards when directly comparing them to the last generation? If you believe Nvidia's own presentation slides, you'd say this is near a quantum leap. But Nvidia's own numbers can only say so much. I'm patiently awaiting BitWit, Paul's Hardware, JayzTwoCents, Hardware Unboxed, and Gamers Nexus' video reviews to understand what's really going on here.

As for me, the EVGA RTX 3080 XC3 Ultra is looking really good right now. Hopefully (as I've had great luck with EVGA, and video cards from them not having coil whine) I get another quality card from them next week. Out!
 
I think the 3060 will kill it in sales.
1060 owners will see a ton of reasons to get off the most-owned card on Steam.
The generational improvement, especially for the 3GB 1060 owners, will be all over YouTube.
 
I think it's really obnoxious that two weeks after these cards "launched" we still have nothing that shows how these cards perform other than spoon-fed Nvidia marketing and a few obscure youtube videos. Sure, I'd love to believe that a company with a huge financial incentive to make their products look as good as possible is going to be 100% truthful with their marketing... but it's just as likely, if not more likely, that they are simply trying to set and cement the narrative about the performance of these cards before 3rd parties even have a chance to actually benchmark them using the real games that people actually play.
 
GotNoRice makes a good point about only being able to speculate from what Nvidia told us about their cards and having to wait until launch day's eve (Wednesday the 16th) to see reviews/benchmarks for the 3080 with the games we play. However, there's one game I play (Anthem) that hasn't been benchmarked by anyone since the open beta/demo launched in February of 2019. I get the feeling Anthem will be left out of benchmarks again this round, even though benchmarks from that February 2019 demo showed the game is a massive system-resource eater. Out!
 
Looks like the reviews didn't really add a whole lot to the leaks and early speculation. Performance in real-world and ideal conditions is pretty close to what we were told, and heat and power are in line too, compared to what was revealed and expected.
The biggest negative after this morning is the lack of cards on launch day, and the fact that no one saw a single FE card on Nvidia's site; it went from "Notify Me" to "Out of Stock" at 6am PT sharp.
 
Looks like the reviews didn't really add a whole lot to the leaks and early speculation. Performance in real-world and ideal conditions is pretty close to what we were told, and heat and power are in line too, compared to what was revealed and expected.
The biggest negative after this morning is the lack of cards on launch day, and the fact that no one saw a single FE card on Nvidia's site; it went from "Notify Me" to "Out of Stock" at 6am PT sharp.
I have one in my cart at Nvidia, but it won't get past the "Checking Out..." pop-up :(:mad:
 
Before, Nvidia would launch the FE first and the AIBs would have to wait. This launch, everyone was at the starting line.
 
I found other 30x0 features useful.
Hopefully I won't find them critical for work in 2022.
I think this guy was the only one who dove into the features.
 