Ok so what do you guys really think of the 3080 reviews?

GN actually saw a performance degradation in a few scenarios when OC'd

The FE cards may have the superior cooling solution, but with such a compact PCB, I imagine power delivery is at its limit even at default settings.

I was doing some reading on how memory OC works with this new GDDR6X stuff. It has some interesting error-"correction" behavior: instead of artifacts and all the other usual signs of a bad OC, you simply see performance loss, as the memory controller keeps re-requesting good data instead of passing bad data on to the pipeline.
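A rough way to picture why an unstable memory OC shows up as lost performance rather than artifacts: model each transfer as having some error probability, with errored transfers silently retried instead of returned. A toy sketch, not Micron's actual mechanism, with every number invented for illustration:

```python
import random

def effective_bandwidth(base_gbps, error_rate, n=100_000, seed=0):
    """Simulate EDR-style retry: a failed transfer is re-requested,
    so errors cost extra transactions instead of producing artifacts."""
    rng = random.Random(seed)
    transactions = 0
    for _ in range(n):                        # we need n clean transfers
        transactions += 1
        while rng.random() < error_rate:      # retry until the data is good
            transactions += 1
    return base_gbps * n / transactions

# Hypothetical OC points: clock rises, but so does the error rate
for clock, err in [(19.0, 0.00), (20.0, 0.05), (21.0, 0.25)]:
    print(f"{clock} Gbps, error rate {err:.0%}: "
          f"~{effective_bandwidth(clock, err):.1f} Gbps effective")
```

Past some point the retries eat more bandwidth than the higher clock adds, which would explain clocks going up while benchmark scores go down.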
 
GN actually saw a performance degradation in a few scenarios when OC'd

The FE cards may have the superior cooling solution, but with such a compact PCB, I imagine power delivery is at its limit even at default settings.

Compact LOL?
 
350W+ is a lot to move through an AIO. Which one are you looking at? I'll be blocking mine. Just gotta see what I can snag tomorrow morning.
I've used a Kraken X41 for a 980 Ti, 1080, 1080 Ti, and now a 2080 Super. I used a Kraken X62 for the second 1080 Ti, and due to where it was mounted it ran warmer than the X41. I'll likely end up using the X62 (with non-stock fans) though, since even EVGA is using a 240 mm AiO on their hybrid model.
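For a rough sense of what 350 W means for a loop, the steady-state coolant rise across the block is just heat over (mass flow × specific heat). A back-of-envelope sketch, with the flow rates assumed rather than measured from any particular AIO:

```python
# Coolant temperature rise across the block: dT = Q / (m_dot * c_p)
WATER_CP = 4186.0        # J/(kg*K), specific heat of water
WATER_DENSITY = 1.0      # kg/L (close enough for coolant)

def coolant_delta_t(watts, flow_lpm):
    """Steady-state water temperature rise from block inlet to outlet."""
    m_dot = flow_lpm * WATER_DENSITY / 60.0   # L/min -> kg/s
    return watts / (m_dot * WATER_CP)

# Typical AIO pumps push maybe 0.5-1.5 L/min (assumed figures):
for flow in (0.5, 1.0, 1.5):
    print(f"350 W at {flow} L/min -> {coolant_delta_t(350, flow):.1f} K rise")
```

The per-pass water rise is only a few kelvin; the real constraint is how much heat the radiator can shed at a sane fan speed, which is why a 240 mm AiO can cope with 350 W but runs the whole loop warmer.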
 
Which FE card, link?


The ACTUAL FE card. Both the 3080 and 3090 FE boards are the same PCB. The 3090 has the extra RAM filled in and a few extra capacitors.

 
I think the marketing team did overhype the cards, but that's normal marketing, and the reason you can't buy/preorder anything just off Nvidia's marketing promises. Overall, though, this is the card to upgrade to for anyone who has held onto a 1080 Ti; most games are showing a 50-80% improvement. The reviews definitely show there's room to improve with drivers, since some games are on the very low end of the improvement range. For anyone who has a 2080 Ti, I feel bad for the ones dumping them so cheap. The reviews show the 2080 Ti is still a strong card, and the 3080 is 25-30% faster in most areas, which is great for a $700 card, but no reason to just throw the 2080 Ti out and dump it. Of course there will be people who can just throw away $1200, and those would probably be buying the 3090 anyway, but for most people below the 2080 Ti, this 3080 looks like a great card for $700 that will give everyone a very strong bump in performance.
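To put that value argument in numbers: the MSRPs and the ~27% uplift come from the post above, while the perf-per-dollar framing is just a quick sketch of my own:

```python
def perf_per_dollar(relative_perf, price_usd):
    """Performance units per dollar, with 2080 Ti raster perf normalized to 1.00."""
    return relative_perf / price_usd

# 3080 ~27% faster than a 2080 Ti per the reviews cited above
cards = {
    "2080 Ti ($1200)": perf_per_dollar(1.00, 1200),
    "3080 ($700)":     perf_per_dollar(1.27, 700),
}
for name, ppd in cards.items():
    print(f"{name}: {ppd * 1000:.2f} perf per $1000")
```

On those assumed numbers the 3080 delivers more than twice the performance per dollar, which is the whole reason used 2080 Ti prices collapsed.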
 
350W+ is a lot to move through an AIO. Which one are you looking at? I'll be blocking mine. Just gotta see what I can snag tomorrow morning.

Alphacool just stated their Eiswolf and Eiswolf 2 can handle the 3080 and 3090 TDP. Only the full-block Aurora is available now, but the Eiswolf pump/block is due in the next few days.
 
Maybe by the end of 2021, with where DLSS will be, how much RTX gets used in games, where the drivers end up for this generation (and what you can do with HDMI 2.1), the gap will be impressive. But right now it's a bit of a disappointment (and maybe in 2020 many scenarios will have a 2080 Ti faster than a 3070 after all).

There are some scenarios with full or almost-full RTX where it benches around 50% higher than a 2080 Ti after all, for a cheaper price tag, and maybe after a while games will take advantage of this, plus the loading times with the Xbox-like DirectStorage API that we have yet to see in action.
 
The raster performance boost is excellent.

I think it pretty much confirms what I said in some other threads about raytracing performance.

I am underwhelmed with the RTX boost here, and it's simply because they didn't add more RT cores. They have only allowed per-core improvements to be the driving force of the RTX performance gain. So we get a decent boost to RTX with the same number of cores as a 2080 Ti, making the 3080 an OK 1440p RTX card. But RTX is still a large hit to overall performance.

And the 3070 has proportionately fewer RT cores, which means its RTX will only be about as good as a 2080 Ti's. Which means it's either going to be nearly worthless for RTX in the next gen of games, or raytracing will continue to be simply a little garnish for better reflections and certain shadows.

I don't think DLSS is great right now. It is A LOT better than it was in the past, but I think there are still too many image-quality issues for it to be the driving force behind such expensive cards. Reviews say amazing things, but actually playing Death Stranding, I noticed issues with DLSS I've only seen mentioned by one publication.
 
I think it is about what was expected pre-launch, but lower than what people thought post-launch. Solid card, but it seems like it is at a speed that AMD can realistically beat with their upcoming cards without going crazy. I suspect Big Navi slots in between the 3080 and 3090, and Nvidia releases a 3080 Super in response.
 
They had to do another generation where raster was the priority, and they did what they could to uplift RTX performance with the same core count.

There will have to be a shift in game design for GPUs to equally prioritize both, which I hope we are on the verge of in the next 3 years, given how heavily the consoles are pushing RT marketing-wise.

I'm also pretty disappointed with RTX performance. It's literally 10-15 FPS over a 2080 Ti at 4K native, and that's in year-old games. Anything modern that pushes effects will once again cripple the card.

4K Control DLSS with RTX is still sub 60FPS and I expect Cyberpunk to be even worse.
 
idk, haven't watched any yet. I just can't seem to get as excited for new hardware after the [H] shutdown...
 
I think it is about what was expected pre-launch, but lower than what people thought post-launch. Solid card, but it seems like it is at a speed that AMD can realistically beat with their upcoming cards without going crazy. I suspect Big Navi slots in between the 3080 and 3090, and Nvidia releases a 3080 Super in response.

So far the most reliable sources are saying that biggest navi is somewhere between 3070-3080 in raster games, and if they absolutely push the wattage (ie, you will never see an OC), it will be just below 3080
 
Seems more like a normal generation now, unless ray tracing is involved. Cool improvement, further solidifies me in waiting a while longer to see what AMD has.
 
2080 to 3080, over 60% faster (up to 90%) for the same price? (At 4K; I know it drops when you look at 1440p and 1080p, but I've gamed at 4K for many years now.)

Fan-fucking-tastic.

This. And not to mention it drops prices of 20xx series on the used market.
 
4K Control DLSS with RTX is still sub 60FPS and I expect Cyberpunk to be even worse.

I could be misunderstanding something, but
Here:
https://www.theverge.com/21435926/nvidia-geforce-rtx-3080-review
And here:


Do seem to show a bit above 60 average fps for the 3080 at 4K highest, Max RTX, DLSS On for Control

With this copy pasted from the review:
Control was unplayable a lot of times with the RTX 2080 at max settings on 4K resolution, thanks to its physics-defying gameplay and the game’s demanding specs. Control is very GPU-intensive and uses a variety of graphics techniques to produce some visually stunning gameplay. That’s why gameplay regularly dipped below 30fps on the older RTX 2080, adding horrible input lag to the fast-paced shooting scenes. But on the RTX 3080, it averaged 72fps, with only occasional dips below 60fps. It’s an 80 percent jump, which is staggeringly good.
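The quoted 80 percent figure is internally consistent with the averages given: 72 fps on the 3080 against a 2080 that "regularly dipped below 30fps" implies a 2080 average of about 40 fps. Quick arithmetic check (the 40 fps baseline is inferred from the quote, not stated by the review):

```python
fps_3080 = 72.0
fps_2080 = fps_3080 / 1.8                 # an 80% jump implies baseline = 72 / 1.8
print(f"Implied RTX 2080 average: {fps_2080:.0f} fps")   # 40 fps

uplift = (fps_3080 - fps_2080) / fps_2080
print(f"Uplift: {uplift:.0%}")            # 80%
```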
 
Not impressive enough for me to try to get one tomorrow morning. I'll wait for big Navi and some reviews of the AIB boards. I'll probably still get a 3080 though.

DLSS has been great with Control, though I see some issues. The paintings are all very, very low-res unless I look at them from a sharp angle, at which point they go high-res for about two seconds. I'm still quite impressed, and I'm able to run it at ultra everything with DLSS.
 
This is one of the best channels out there for info about streaming, video capture, and some general stuff on video/content creation. And he did not disappoint, doing this video on the 3080:




He confirmed the NVENC encoder is unchanged from Turing. Something I had been wondering about.
 
I think the performance numbers were really in line with what I expected. I sensed NVIDIA's marketing was telling a moderate fish tale or two about what it could do. I knew it was going to be a strong boost; the question was how strong. I'm coming from an EVGA GTX 1080 SC, so I am due for a new card. The 3080 spanks the 1080 at 1440p Ultra settings, so that was another factor in me getting up tomorrow hoping to grab a 3080.

Two other things played into it for me, though: 1. Card size. The EVGA 3080 XC3 Ultra is 0.7 inches longer than my current EVGA 1080 SC, so it will fit in my Phanteks Eclipse P400A with pretty good ease. 2. Power draw and PCIe power connectors needed. Early AIB marketing led me to believe that all AIB 3080s would need 3x 8-pin PCIe connectors to run. The EVGA 3080 XC3 Ultra only needs 2. While my 850W PSU can handle either a 3x- or 2x-connector card, cable-managing 3 of those cables is a bit harder than 2, so 2 it is. I can live with that. Can't wait for tomorrow morning. Out!
 
Youtubers and streamers can read a script that reaches more buyers than hardcore component reviews.
Shroud and Ninja can yammer on about 360 Hz monitors, the 10900K, and how the 3090 is the only way to game, and people will buy.
It's sad, but they will open up the advertised component specs and have at it.
 
I could be misunderstanding something, but
Here:
https://www.theverge.com/21435926/nvidia-geforce-rtx-3080-review
And here:


Do seem to show a bit above 60 average fps for the 3080 at 4K highest, Max RTX, DLSS On for Control

With this copy pasted from the review:
Control was unplayable a lot of times with the RTX 2080 at max settings on 4K resolution, thanks to its physics-defying gameplay and the game’s demanding specs. Control is very GPU-intensive and uses a variety of graphics techniques to produce some visually stunning gameplay. That’s why gameplay regularly dipped below 30fps on the older RTX 2080, adding horrible input lag to the fast-paced shooting scenes. But on the RTX 3080, it averaged 72fps, with only occasional dips below 60fps. It’s an 80 percent jump, which is staggeringly good.


Depends where you test. Most of these reviewers run through hallways at the start of the game. Later on, the 3080 drops to the mid-50s.
 
If good-overclocking 2080 Tis are getting that close to a 3080, then I don't think the 3070 has a chance at beating the 2080 Ti, but they should trade scores and be close.
 
So far the most reliable sources are saying that biggest navi is somewhere between 3070-3080 in raster games, and if they absolutely push the wattage (ie, you will never see an OC), it will be just below 3080

Well, we will see. It would be almost impossible for it to be that low in performance; just a 5700 XT refresh on RDNA 2 on the new process with 80 CUs would be faster than that. And the reliable sources, especially the German guy, are saying AMD will be competitive at all levels and better than the 3080.
 
So far the most reliable sources are saying that biggest navi is somewhere between 3070-3080 in raster games, and if they absolutely push the wattage (ie, you will never see an OC), it will be just below 3080

Also consider that 3080 and 3070 performance is WAY below what NVIDIA claimed, so if the "between 3070 and 3080" estimate was based on the early hype, it could end up at the 3080's actual speed. Big Navi should be 30-40% faster than a 2080 Ti at a minimum.
 
Well, we will see. It would be almost impossible for it to be that low in performance; just a 5700 XT refresh on RDNA 2 on the new process with 80 CUs would be faster than that. And the reliable sources, especially the German guy, are saying AMD will be competitive at all levels and better than the 3080.

Call a majority of us gun shy but we have heard that from AMD and "reliable" AMD sources before. From Hawaii to now, AMD always seems to be behind NV especially at the top tier.

Don't get me wrong, I would love AMD to smack NV around like in the past. Just won't hold my breath or get excited until they do.
 
Call a majority of us gun shy but we have heard that from AMD and "reliable" AMD sources before. From Hawaii to now, AMD always seems to be behind NV especially at the top tier.

Don't get me wrong, I would love AMD to smack NV around like in the past. Just won't hold my breath or get excited until they do.

Nvidia's power-efficiency gain wasn't as substantial this gen with Ampere; AMD likely has a competitor, based on the math.
 
Call a majority of us gun shy but we have heard that from AMD and "reliable" AMD sources before. From Hawaii to now, AMD always seems to be behind NV especially at the top tier.

Don't get me wrong, I would love AMD to smack NV around like in the past. Just won't hold my breath or get excited until they do.

I get that. Raja promised the moon and didn't even deliver a cloud. But Lisa Su seems completely different: she will tease a product but does not seem to overhype, and she has REALLY delivered on the Ryzen side. She still seems super confident that RDNA 2 will be an NVIDIA killer, and combined with some back-of-the-napkin math, I just really do not think they are going to disappoint. I wouldn't buy a card till the RDNA 2 announcement. Bit surprised there wasn't another teaser today.
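For what it's worth, that napkin math might look something like this naive scaling sketch, where every factor is an assumption: linear-ish CU scaling with an efficiency penalty, a guessed clock bump, and a guessed per-clock gain for RDNA 2:

```python
# Naive Big Navi estimate, normalized to a 5700 XT = 1.00 (all inputs assumed)
cu_scale    = 80 / 40      # rumored 80 CUs vs the 5700 XT's 40
clock_scale = 2.1 / 1.9    # assumed game-clock bump, GHz
ipc_scale   = 1.10         # assumed per-clock gain for RDNA 2
scaling_eff = 0.80         # CUs never scale linearly; assumed efficiency

big_navi = cu_scale * clock_scale * ipc_scale * scaling_eff
print(f"Big Navi ~ {big_navi:.2f}x a 5700 XT")

# Ballpark reference points from reviews: 2080 Ti ~1.5x a 5700 XT at 4K,
# 3080 ~1.27x a 2080 Ti
print(f"vs 2080 Ti: {big_navi / 1.5:.2f}x,  vs 3080: {big_navi / (1.5 * 1.27):.2f}x")
```

With those guesses it lands around 1.3x a 2080 Ti and roughly even with a 3080, which is why the "between 3070 and 3080" leaks and the "NVIDIA killer" confidence can both sound plausible; small changes in the assumed factors swing the answer a lot.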
 
Call a majority of us gun shy but we have heard that from AMD and "reliable" AMD sources before. From Hawaii to now, AMD always seems to be behind NV especially at the top tier.

Don't get me wrong, I would love AMD to smack NV around like in the past. Just won't hold my breath or get excited until they do.

This. The last time I bought AMD it was a 7970, and in the 5% of cases where the drivers worked, the software I was running worked, and it wasn't overheating and throttling my performance, it was fucking fantastic. But eventually I got fed up with all the (majority-case) caveats and bought a GTX 680 not long after. Quieter, faster, less power... I didn't look back. And from what I understand it was only downhill for AMD from there (barring mining, which I did dabble in with the 290s).

Even after generations of trying to support ATI/AMD and having them ultimately fail me (the last card they made that truly satisfied me was the Radeon 9800 Pro), I hold little hope. But I really WANT to believe.
 