RTX 3xxx performance speculation

OK - but you misrepresented the quote. He specifically said 3080. He did not say 3000 series. He is talking about how the 3080 FE is a mythical $699 card, because it is. All you can really find out there are the AIB alternatives for much more money. That's one card out of the dozens and dozens that are available.
How do you know that he was talking about the 3080 FE and not all 3080s? If there are only a few 3080s available I would assume that there are even fewer 3090s. I don't know how many 3070s are out there though.
 
How do you know that he was talking about the 3080 FE and not all 3080s? If there are only a few 3080s available I would assume that there are even fewer 3090s. I don't know how many 3070s are out there though.
He said 3080 FE in the second paragraph.
 
So again the 20GB VRAM rumors start up, this time with the 3080 Ti?... 8GB is too little, 10GB is on the fringe, and 20GB+ is too much... 12-16GB would be perfect.
 
I really thought with the RDNA2 launch we'd see a driver update from NVIDIA to address 1440p scaling issues with the 3080. Anyone hearing anything about this or is it really an architectural decision to go after 4K? Kind of stuck between a rock and a hard place with 1440p/144.
 
EVGA rebranded their 3060 Ti XC3 Ultra as FTW3 Ultra, presumably to charge more for it.
Interested to see how prices compare to the 3070 XC3.

 
So again the 20GB VRAM rumors start up, this time with the 3080 Ti?... 8GB is too little, 10GB is on the fringe, and 20GB+ is too much... 12-16GB would be perfect.

Will it hurt the card if there is too much?

12 GB (384-bit) will look bad against AMD's 16 GB offering, and there is no way it will be 16 GB (256-bit).

With AMD's SAM and NVIDIA's upcoming version, having extra VRAM will come in handy for those who are limited on system RAM.
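For anyone wondering where these bus-width/capacity pairings come from, here's a rough sketch of the usual GDDR6X math. Assumptions on my part: 1 GB (8 Gb) chips, one chip per 32-bit channel, and clamshell mode doubling the chip count (2 GB GDDR6X chips aren't shipping yet):

```python
# Rough sketch: possible VRAM capacities from memory bus width.
# Assumes 1 GB (8 Gb) GDDR6X chips, one chip per 32-bit channel;
# "clamshell" puts a second chip on each channel (both sides of the PCB).
for bus_bits in (256, 320, 384):
    chips = bus_bits // 32  # one chip per 32-bit channel
    print(f"{bus_bits}-bit bus: {chips} GB single-sided, "
          f"{chips * 2} GB clamshell")

# Output:
# 256-bit bus: 8 GB single-sided, 16 GB clamshell
# 320-bit bus: 10 GB single-sided, 20 GB clamshell
# 384-bit bus: 12 GB single-sided, 24 GB clamshell
```

Which is why a 16 GB card implies a 256-bit bus, and why the 20 GB rumors imply the 3080's 320-bit bus in clamshell.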
 
Will it hurt the card if there is too much?

12 GB (384-bit) will look bad against AMD's 16 GB offering, and there is no way it will be 16 GB (256-bit).

With AMD's SAM and NVIDIA's upcoming version, having extra VRAM will come in handy for those who are limited on system RAM.

20+GB of VRAM won't hurt performance, but it does hurt price/performance.
 
20+GB of VRAM won't hurt performance, but it does hurt price/performance.
I suppose, but not significantly when we're talking about this price bracket.

Again, 16 GB is out of the question, but I would still lean towards 20 GB over 12 GB even though 12 GB would perform better in most scenarios (bandwidth).
 
the current best value/performance cards for most people are the 3060 Ti and 3080...
 
I think the 3060 Ti is looking to be an excellent card. No way I would pay the money for a 3080 and its VRAM. I would go 6800 XT on that choice.
Ditto, especially when they're already neck and neck in terms of performance. The 16GB is the icing on the cake. Unfortunately I couldn't get my hands on a 6800 XT for 2 weeks after launch despite basically being on deck 18 hours of the day with Discord notifications, being on AMD's site every single morning at their restock time, AND camping out at Micro Center several times, to no avail.

I did finally score a 3080 from Micro Center after those 2-3 weeks, and unless I can find an in-stock 6800 XT in the next week or two before my return period ends, 3080 it is... as much as I wanted to give my money to AMD.
 
I think the 3060 Ti is looking to be an excellent card. No way I would pay the money for a 3080 and its VRAM. I would go 6800 XT on that choice.
I'm currently going for a 3080, but the lower VRAM on it does bother me a bit. However, the worse RT performance of the 6800 XT also bothers me. In theory, if a heavy hitter game comes out that both uses more than 10GB of VRAM at max settings and heavily uses RT, then whichever card you go with, you'll have to turn down a setting to get optimal performance out of it. It comes down to what the buyer prefers. The rumored 3080 Ti with 20GB of VRAM could be the perfect high-end card depending on how it's priced.
 
The rumored 3080 Ti with 20GB of VRAM could be the perfect high-end card depending on how it's priced.
Why does everyone just assume a rumor is true? I keep seeing comments from this board and others that treat the rumor like it's gospel truth. For all we know that rumor may never come to fruition. Until Nvidia themselves announces it, I roll my eyes at the rumors.
 
Why does everyone just assume a rumor is true? I keep seeing comments from this board and others that treat the rumor like it's gospel truth. For all we know that rumor may never come to fruition. Until Nvidia themselves announces it, I roll my eyes at the rumors.

So much this. It's all wishful thinking at this point.
 
So it looks like at 4K neither the 3080 nor the 3090 can do 60fps comfortably in Cyberpunk. Frame rates are more like the 20s without DLSS and the high 50s with DLSS. Which just reinforces that this tech is in its infancy no matter how much VRAM you put on it, and it will be years before next-gen gaming with RTX at 4K is a reality.

If you game at 4K you will want a new card in two years even if you get a 3090.

There is probably little point agonizing over VRAM since the 3090 has it in spades and still can’t deliver.

Or I suppose you could argue games like Cyberpunk are just ahead of the hardware, which is the way it always used to be. It is only recently that the cards got so far ahead of the software. If Cyberpunk is a preview of what is to come, in any event, I am more interested in what Ampere’s successor will do than Ampere itself.

Of course, RTX was developed precisely to create this new upgrade cycle.

Edit: DLSS appears absolutely necessary for playable framerates.
 

Why does everyone just assume a rumor is true? I keep seeing comments from this board and others that treat the rumor like it's gospel truth. For all we know that rumor may never come to fruition. Until Nvidia themselves announces it, I roll my eyes at the rumors.
Fair point. In my opinion the 3080 Ti is pretty likely at this point since the leaks seem to come from credible sources, but yes, we won't know for sure until it's announced or pictured by a reliable source. This is why I specified in my post that the 3080 Ti is “rumored”.
 
So it looks like at 4K neither the 3080 nor the 3090 can do 60fps comfortably in Cyberpunk. Frame rates are more like the 20s without DLSS and the high 50s with DLSS. Which just reinforces that this tech is in its infancy no matter how much VRAM you put on it, and it will be years before next-gen gaming with RTX at 4K is a reality.

If you game at 4K you will want a new card in two years even if you get a 3090.

There is probably little point agonizing over VRAM since the 3090 has it in spades and still can’t deliver.

Or I suppose you could argue games like Cyberpunk are just ahead of the hardware, which is the way it always used to be. It is only recently that the cards got so far ahead of the software. If Cyberpunk is a preview of what is to come, in any event, I am more interested in what Ampere’s successor will do than Ampere itself.

Of course, RTX was developed precisely to create this new upgrade cycle.

Edit: DLSS appears absolutely necessary for playable framerates.
That's also DLSS Performance Mode, which renders the game internally at 1080p. DLSS Quality Mode would be 1440p internal res.
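For reference, a quick sketch of where those internal resolutions come from, using the commonly cited DLSS 2.0 per-axis scale factors (approximations on my part, not an official NVIDIA table):

```python
# Sketch: DLSS internal render resolution per mode.
# Scale factors are the commonly cited per-axis values (approximate).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the internal (pre-upscale) render resolution for a DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "Performance"))  # (1920, 1080) -> 1080p internal
print(internal_res(3840, 2160, "Quality"))      # (2560, 1440) -> 1440p internal
```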
 
That's also DLSS Performance Mode, which renders the game internally at 1080p. DLSS Quality Mode would be 1440p internal res.
I was going to say 1440p would probably be OK, but I missed that it was in Performance Mode. So what's the performance at 1440p DLSS Quality Mode, 60 FPS+ with all Ultra settings? Well, the game will be out shortly, and I would think optimizations will still be happening. The question is always whether the RT use actually makes for a beneficial gaming experience. This game may take 6+ months before it's fully optimized, with bug fixes, gameplay adjustments, etc.
 
So I should be able to find a 3080Ti to purchase some time in 2022?
 
That's also DLSS Performance Mode, which renders the game internally at 1080p. DLSS Quality Mode would be 1440p internal res.
That's also with some extreme settings enabled, like "ray-traced global illumination".
The game was designed to scale up for future hardware, but that is not to say you can't play the game at 4K today.

Today's high settings are yesterday's ultra settings. Combined with some common sense in the settings and DLSS, this game is more than playable at 4K.
 
Why does everyone just assume a rumor is true? I keep seeing comments from this board and others that treat the rumor like it's gospel truth. For all we know that rumor may never come to fruition. Until Nvidia themselves announces it, I roll my eyes at the rumors.

Some random HP microcode, but still more weight than a simple rumor:
https://www.guru3d.com/news-story/h...-3080-tirtx-3070-tiand-rtx-3080-variants.html

Most likely 12 GB since it is a 384-bit bus. The card will be a direct competitor to the 6900 XT.
 
Some random HP microcode, but still more weight than a simple rumor:
https://www.guru3d.com/news-story/h...-3080-tirtx-3070-tiand-rtx-3080-variants.html

Most likely 12 GB since it is a 384-bit bus. The card will be a direct competitor to the 6900 XT.
12GB makes sense to me since the board already has pads for it, it can use the same cooler/configuration, it would not need memory on both sides of the board, and it takes less of a bite out of Nvidia's GDDR6X inventory. It also would not make the 3090 virtually pointless.
 
12GB makes sense to me since the board already has pads for it, it can use the same cooler/configuration, it would not need memory on both sides of the board, and it takes less of a bite out of Nvidia's GDDR6X inventory. It also would not make the 3090 virtually pointless.

There are non-gaming use cases where 24 GB comes in handy. But yeah, basically a Titan vs. x80 Ti scenario. Can't imagine anyone saw that coming...
 

Well, that's an incredibly dumb reason to ban them from reviewing their FE cards. It's not like Ampere sucks at rasterization; it's actually pretty much equal to the competition depending on the resolution.

But I guess being equal is not good enough, so reviewers are expected to showcase Nvidia products in ways that emphasize what they're best at. Such a shame that reviewers can't review a card in ways that they believe are more beneficial to their audience without penalty.
 
Well, that's an incredibly dumb reason to ban them from reviewing their FE cards. It's not like Ampere sucks at rasterization; it's actually pretty much equal to the competition depending on the resolution.

But I guess being equal is not good enough, so reviewers are expected to showcase Nvidia products
This. I imagine they send scripts with bullet points to cover, and banning access for people who don't follow them closely enough is the only way to enforce that; reviewers need to be net positive on the products, after all, for a company to have an incentive to work with them.

Such a shame that reviewers can't review a card in ways that they believe are more beneficial to their audience without penalty.
It is a shame, but in such a fragmented media world, asking everyone to refuse all NVIDIA coverage until a serious reviewer like Hardware Unboxed gets its card access back is asking a lot.

A bit like movie reviewers who agree to post Twitter reactions to a movie before the embargo lifts and actual critics are allowed to talk about it. Serious media in the past would never have accepted something like that, but for the modern click-driven, self-employed reviewer, it is harder to ask.
 
HUB has been poking the bear for years now.
Based on what we know about Nvidia, it doesn't surprise me at all.
 
This is silly. HUB's latest reviews of the 3060 Ti have been very favorable, and their reviews of the 6900 XT not so much.

Plenty of RT performance shown as well, with Nvidia walking all over AMD both with and without DLSS.

Perhaps they should have mentioned reviewer fraud, and how the 3090 stole the sELECTION due to fake emails with improper IP addresses...

OK, I'm done.
 
Yeah well, scaring off the reviewers just makes me not trust the current ones.
It means either that they are bought or afraid to tell the truth in fear of being banned from receiving samples.
From now on I'll rely on experiences from normal users.
 
Yeah well, scaring off the reviewers just makes me not trust the current ones.
It means either that they are bought or afraid to tell the truth in fear of being banned from receiving samples.
From now on I'll rely on experiences from normal users.
Welcome to PR and marketing, circa decades ago.
 
I think Steve “Hardware Jesus” from GN usually doesn't accept review samples or sign NDAs for exactly this reason, and I feel like I've seen videos from him before about not being sent review samples or being banned. I would trust his reviews, and LTT and AT as well.
 