Has your G-Sync display locked you into the Nvidia EcoSystem?

Total voters: 84
It hasn't locked me in, but I wouldn't switch to AMD because they don't have equivalent technology.

GSYNC was and is amazing. I bought the conversion kit and upgraded my monitor to use it before you could even get a standalone monitor with it. That was 7 years ago. AMD just recently caught up in VRR tech if you can even really say they caught up. I'm getting a LG CX OLED and AMD's new cards should be able to run it at 4k@120hz with VRR just like NVIDIA's new cards can.

Now NVIDIA has DLSS, ray tracing, rtx voice, CUDA, and all sorts of really good technology that AMD doesn't have, and the stuff that AMD does have is typically vastly inferior.

And then there's the fact that AMD's highest end cards aren't nearly as good as NVIDIA's high end cards, and that's what I buy. If AMD comes out with a new card that performs better at 4k than NVIDIA's best I would definitely consider it. But I wouldn't if it just has equivalent performance.
 
I'm curious about the RX 6000. Initially, I felt as though it would probably slot somewhere close to an RTX 3070, but I'm starting to think Big Navi may surprise us. If some of the rumors are true, anyway.
This.

I'm slowly upgrading my gaming rigs. My HTPC is the worst of the lot... It needs some love. I've got it hooked up, through a Denon AVR, to a 1080p projector and an LG 4K OLED, their 65" C9. If AMD can bring some "ooomph" to the game, with reasonable pricing and power/cooling needs, then I'm in for several of them. Nvidia's release fiasco with the 3000 series is just more proof to me that they're more about posturing these days. (Sure, they have some impressive tech and impressive market share, but that's not everything.)

A week or so to go...
 
The newer G-Sync modules now work with both G-Sync and Freesync GPUs. The old G-Sync modules (like the one in the AW3418DW) are still locked to NVIDIA GPUs only and will not provide VRR on AMD GPUs. I don't know if it would be possible for NVIDIA to easily unlock that capability on the old modules. I have a feeling it would require a firmware update, which would need to be done by the screen manufacturers... doubtful we'll ever see that.
Thanks for sharing.

I had read something about this before, thinking it wouldn't apply to my AW3420DW, but after a little more searching it turns out it supports both. There's a post over on the techpowerup forums specifically about this model. Moreover, when I'm in the Nvidia settings in Linux it says G-Sync unvalidated, and in Windows it says G-Sync Capable. I used to think maybe I didn't have a real G-Sync monitor with a module, but now it all makes sense. Time to think about AMD for the first time in a long time.

Edit - Added some links below. FYI, I saw many replies in several threads (on Reddit) confirming Freesync was active when pairing an AMD card with the AW3420DW.

Techpowerup thread cited above - https://www.techpowerup.com/forums/threads/freesync-working-on-g-sync-monitors-since-when.261025/
Post from r/hardware which further confirms compatibility - https://www.reddit.com/r/hardware/comments/gl3suy/reminder_nvidia_is_in_the_process_of_adding_vesa/
 
Thanks for sharing.

I had read something about this before, thinking it wouldn't apply to my AW3420DW, but after a little more searching it turns out it supports both. There's a post over on the techpowerup forums specifically about this model. Moreover, when I'm in the Nvidia settings in Linux it says G-Sync unvalidated, and in Windows it says G-Sync Capable. I used to think maybe I didn't have a real G-Sync monitor with a module, but now it all makes sense. Time to think about AMD for the first time in a long time.

There was talk about these monitors being able to do FreeSync and G-Sync, but I understood that hardware G-Sync displays couldn't do both; maybe they can. I haven't really looked into it. I have an AW3418DW, but I'm not stuck on keeping it over the long haul. I bought it on sale a while back as a kind of stopgap until I could get what I really wanted, which is 120Hz, G-Sync, HDR, and 4K. I was also waiting for GPUs that could handle that.
 
No issue swapping here; if the performance is there I would buy AMD. I probably wouldn't swap the PG279Q, though. Well... I will most likely buy a new monitor when I get back to the US, or if something pops up locally.

However, if I have to deal with warranty work, it's generally tied to the region you purchased the monitor in.
 
Technically, it supports Freesync and HDMI Forum VRR; it doesn't support Gsync as it doesn't have the Gsync module. That's why it's "Gsync Compatible," and also why NVIDIA had to backport HDMI Forum VRR into HDMI 2.0 a few months back.

Hey guys, check out the hater over here telling people that a Gsync compatible TV isn't actually letting you run Gsync!

Haters gonna hate!
 
There was talk about these monitors being able to do FreeSync and G-Sync, but I understood that hardware G-Sync displays couldn't do both; maybe they can. I haven't really looked into it. I have an AW3418DW, but I'm not stuck on keeping it over the long haul. I bought it on sale a while back as a kind of stopgap until I could get what I really wanted, which is 120Hz, G-Sync, HDR, and 4K. I was also waiting for GPUs that could handle that.
Interesting, because it says here (in March 2020) that they will not be getting updates.

https://www.dell.com/community/Alienware-Desktops/FreeSync-over-G-Sync-module-Nr-2/td-p/7505287
 
Interesting, because it says here (in March 2020) that they will not be getting updates.

https://www.dell.com/community/Alienware-Desktops/FreeSync-over-G-Sync-module-Nr-2/td-p/7505287

That's what I said. I didn't think they would be able to support FreeSync on monitors with hardware G-Sync support. Dell only confirms that. I was responding to someone who seemed to think that you could do FreeSync on hardware G-Sync displays. Something I didn't think was the case, but admitted I hadn't looked into it. AMD hasn't made a GPU I'd be interested in for a very long time.
 
I don't see why anyone would purposely lock themselves into Gsync when Freesync2 exists. I have had multiple Nvidia cards work perfectly fine with FS.
 
Interesting, because it says here (in March 2020) that they will not be getting updates.

https://www.dell.com/community/Alienware-Desktops/FreeSync-over-G-Sync-module-Nr-2/td-p/7505287
That's what I said. I didn't think they would be able to support FreeSync on monitors with hardware G-Sync support. Dell only confirms that. I was responding to someone who seemed to think that you could do FreeSync on hardware G-Sync displays. Something I didn't think was the case, but admitted I hadn't looked into it. AMD hasn't made a GPU I'd be interested in for a very long time.

Perhaps Dell isn't allowed to officially give this specific monitor Freesync certification. Thread linked above is probably Dell's way of wiping their hands clean of anything having to do with it.

PS - I edited my last reply and added some relevant links which confirm it can do both. I even saw someone who claimed that Freesync can utilize the G-sync module for an even smoother experience.
 
I don't see why anyone would purposely lock themselves into Gsync when Freesync2 exists. I have had multiple Nvidia cards work perfectly fine with FS.

The last AMD card I had was 18 years ago, and the last card they made that could have interested me was probably a decade ago. So really, HAVING to go Nvidia for G-Sync has been kind of a non-issue so far. Big Navi looks to be shaking things up, but it's all speculation at this point.
 
I don't see why anyone would purposely lock themselves into Gsync when Freesync2 exists. I have had multiple Nvidia cards work perfectly fine with FS.
Most of us, I’d expect, bought our G-Sync monitor before Freesync compatible was a thing.
I had three 32” Omen monitors that were Freesync only when I had my Fury X, but when the Vega launch disappointed me and I switched to a 1080 Ti, the lack of Freesync, despite the higher frame rates on the 1080 Ti, was jarring. I sold the three Omens so I could have adaptive sync again and bought a G-Sync monitor in the AW3418DW. At that time there was no monitor that would do both. You had to pick your ecosystem.

What was it, only a couple of years ago Gsync Compatible was announced? That’s not that far back. Before that you had to pick a display to match your video card - there wasn’t the luxury of getting one that would do both.
 
Most of us, I’d expect, bought our G-Sync monitor before Freesync compatible was a thing.
I had three 32” Omen monitors that were Freesync only when I had my Fury X, but when the Vega launch disappointed me and I switched to a 1080 Ti, the lack of Freesync, despite the higher frame rates on the 1080 Ti, was jarring. I sold the three Omens so I could have adaptive sync again and bought a G-Sync monitor in the AW3418DW. At that time there was no monitor that would do both. You had to pick your ecosystem.

What was it, only a couple of years ago Gsync Compatible was announced? Before that you had to pick a display to match your video card.

I get that. I mean more currently. I wouldn't buy a new G-Sync monitor right now. I would even think about selling an older display for the compatibility.
 
AW3418DW here, similar dilemma, and a huge part of the rationale for upgrading to Samsung's Odyssey G9 (LC49G97TSSNXDC, directly from Samsung; $1243, $1349 after sales tax with a military veteran discount; arriving Oct 20th).

Between how underwhelming Nvidia's 8nm EUV process is (RTX 3080 = 10% more performance vs an overclocked 2080 Ti at the same power draw; the 3090 = 25% @ 1440p), the intended scarcity economics / paper launch, Nvidia encouraging scalping, and the overstated performance difference versus the last generation (going by the Digital Foundry review, the 3080 is 65% faster than the 2080 Ti, when in reality it's 10% at 1440p and 20% at 4K when run at the same power draw), this was what pushed me over the edge toward the upgrade to the newly relisted G9 directly from Samsung. Now I am freed from NV's ecosystem and can choose either AMD or Nvidia.

https://www.3dmark.com/spy/14187325

Coming from the AW3418DW (non-existent blacks, 120 Hz limit), I'm extremely excited.

VA ghosting is a thing of the past with Samsung's new Quantum Dot QLED VA technology.

Check this out: https://www.tftcentral.co.uk/reviews/samsung_odyssey_g7_c32g75t.htm#mbr

https://www.tftcentral.co.uk/images/samsung_c32g75t/va_comparison.png

This is seriously next level panel technology, we have:

TN responsiveness / input latency

IPS color / near IPS viewing angles

VA blacks and contrast

0-10% black smearing (for reference, the PG35VQ is at 60%).

I'm extremely excited to get my G9. It's the newer re-listed model directly from Samsung.

Coming from AW3418DW, I can't wait to see actual dark blacks and better contrast, not to mention not being capped at 120 Hz, more curvature, HDR1000, and TN-like input latency.
 
AMD needs to do more than make fps graphs longer. Their feature set just isn’t up to scratch.

I have a shield connected to the TV in my bedroom and regularly stream games at 4K from my desktop in another room with extremely low latency. It’s a niche use case for sure but one that AMD has no answer for right now. I would consider AMD if they truly knocked it out the park but it would have to be a massive win for me to switch ecosystems.
 
I bought a Freesync 2 monitor that works with G-Sync, so...nah. I'm good either way.
 
AMD needs to do more than make fps graphs longer. Their feature set just isn’t up to scratch.

I have a shield connected to the TV in my bedroom and regularly stream games at 4K from my desktop in another room with extremely low latency. It’s a niche use case for sure but one that AMD has no answer for right now. I would consider AMD if they truly knocked it out the park but it would have to be a massive win for me to switch ecosystems.
Actually, I think they did have an answer for a while (although I only tried it with one game). I'm not sure if it's updated/supported anymore, and I forget the name, but it was part of their gaming software suite at some point. Might just be thinking of their video capture software...
 
Hey guys, check out the hater over here telling people that a Gsync compatible TV isn't actually letting you run Gsync!

Haters gonna hate!
Because it's not using Gsync. That was the whole deal with NVIDIA backporting HDMI 2.1 features (HDMI Forum VRR) into its drivers a few months ago, so LG owners could take advantage of the VRR that the TV does support. Rtings confirmed as much when they reviewed it.
 
Because it's not using Gsync. That was the whole deal with NVIDIA backporting HDMI 2.1 features (HDMI Forum VRR) into its drivers a few months ago, so LG owners could take advantage of the VRR that the TV does support. Rtings confirmed as much when they reviewed it.
And I'm sure what you say makes it not G-Sync, as obviously Nvidia doesn't know what they're doing when they allow a company to slap G-Sync on the box.
 
AW3418DW here, similar dilemma, and a huge part of the rationale for upgrading to Samsung's Odyssey G9 ...

What enthusiast cares about 50-75 watts on an $800-1800 card? That's ~$15 at the end of the year (at 8 hrs/day); it's nothing.

What I do care about is that the 3080/3090 are much faster, especially in RT, where they are ~50% and ~65% faster than a 2080 Ti.

Totally agree on the new Samsung screens. Absolutely amazing.
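For context, here's a rough sketch of that electricity math. The hours/day figure comes from the post above; the per-kWh rate is an assumption, so adjust it for your own utility:

```python
# Back-of-the-envelope yearly cost of an extra 50-75 W of GPU draw at 8 hours/day.
# RATE_PER_KWH is an assumed electricity price (roughly US average), not a quoted figure.
RATE_PER_KWH = 0.12  # USD per kWh, assumed

def yearly_cost(extra_watts, hours_per_day=8, rate=RATE_PER_KWH):
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    return kwh_per_year * rate

for watts in (50, 75):
    print(f"{watts} W extra -> ~${yearly_cost(watts):.0f}/year")
# ~$18/year at 50 W and ~$26/year at 75 W with this rate; cheaper electricity lands near the ~$15 figure.
```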
 
What enthusiast cares about 50-75 watts on an $800-1800 card? That's ~$15 at the end of the year (at 8 hrs/day); it's nothing.

What I do care about is that the 3080/3090 are much faster, especially in RT, where they are ~50% and ~65% faster than a 2080 Ti.

Totally agree on the new Samsung screens. Absolutely amazing.

No they aren't. Here, have a look; I've spent considerable time on the analysis. Scroll down for a breakdown of the Gamers Nexus 3090 review comparing the 3090 against an overclocked 2080 Ti and a 3080 (OC.net handle "Mooncheese")

https://www.overclock.net/threads/official-nvidia-rtx-3090-owners-club.1753930/page-60

My 2080 Ti does 10,300 Port Royal @ 340w.
https://www.3dmark.com/pr/318529

RTX 3080 does 11.5k Port Royal at 350w

We are talking about a 10% difference here.

All of the benchmarks compare the 3080 @ 320w against the 2080 Ti @ 260w, roughly 25% more power. When you run the 2080 Ti at the same power draw, the numbers shrink from 20% @ 1440p and 30% @ 4K to 10% at 1440p and 20% at 4K (see the Gamers Nexus 3090 review and use the OC 2080 Ti Strix for comparison).



8nm EUV Ampere is straight trash. If you really believe that the 3080 is 50% faster in RT than a 2080 Ti at the same power draw, you are a smooth brain for whom Digital Foundry infotainment "reviews" are effective.

Here's an idea: Nvidia is a multi-billion dollar corporation. If you don't think they can approach a tech-tuber and offer them exclusive content two weeks before a product goes live, before anyone else, in exchange for showing specially curated "benchmarks" exaggerating the performance, or outright buy said tech-tuber, you're incredibly naïve.

3080 50% faster than 2080 Ti. LMFAO.

Good lord people are braindead.

Try 10% at 1440p and 20% at 4K.

Here: https://www.overclock.net/threads/official-nvidia-rtx-3090-owners-club.1753930/page-60

Krzych04650 said:
Sold out completely, even here where good models are like 3x the average monthly salary :p Not getting any for a few months, it seems.

Performance isn't that great though; even more favorable reviews like TPU have a 45% average gain vs the 2080 Ti before considering the zero OC potential of Ampere, so the real-world average OC vs OC is likely less than 35%. In practice this is often enough to let you hit your performance target instead of being far from it, and the performance target is kind of a 0-1 situation, especially in the most demanding scenarios, so all of these percentages don't tell the full story, but still, one would hope for more than another 35% gen-on-gen leap.
Try 20-25%.

Do the math, compare 2080 Ti at the same power draw (the OC Strix on the chart) to both the 3080 and 3090.

Here, let me help you:

RDR2 4K
3090 FE Stock: 92 FPS, 3080 FE OC: 90 FPS, 2080 Ti OC: 72 FPS, Percentage change: 27% and 25%

RDR2 1440p
3090 FE Stock: 134, 3080 FE OC: 131, 2080 Ti OC: 118, Percentage change: 14% and 11%

Rainbow Six Siege 4K
3090 FE Stock: 203, 3080 FE OC: 184, 2080 Ti OC: 161, Percentage change: 26% and 14%
Rainbow Six Siege 1440p
3090 FE stock: 358, 3080 FE OC: 337, 2080 Ti OC: 299, Percentage Change: 20% and 13%
SOTTR 4K
3090 FE Stock: 104, 3080 FE OC: 94, 2080 Ti OC: 82, Percentage Change: 27% and 15%
SOTTR 1440p
3090 FE Stock: 169, 3080 FE OC: 158, 2080 Ti OC: 138, Percentage Change: 22% and 14%


Now, while we are at it, let's have a look at overclocked 2080 Ti vs overclocked 1080 Ti performance for comparison, including synthetics. Because I'm tired, and because the percentage change between the 1080 Ti and 2080 Ti isn't meaningfully different at 4K, I will only look at 1440p below. You're more than welcome to look at 4K on your own.

RDR2 1440p
2080 Ti OC: 118, 1080 Ti OC: 74, Percentage Change: 59%
Rainbow Six Siege 1440p
2080 Ti OC: 299, 1080 Ti OC: 198, Percentage Change: 51%
SOTTR 1440p
2080 Ti OC: 138, 1080 Ti OC: 95, Percentage Change: 45%
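For anyone who wants to check the percentage figures above, they're just (new - old) / old on the Gamers Nexus numbers quoted in this post; a quick sketch of that arithmetic (the helper name is mine, the FPS values are the ones listed above):

```python
# Percentage-change helper applied to a few of the Gamers Nexus FPS pairs quoted above.
def pct_gain(new, old):
    return (new - old) / old * 100

comparisons = {
    "RDR2 4K, 3090 FE vs 2080 Ti OC":       (92, 72),
    "RDR2 4K, 3080 FE OC vs 2080 Ti OC":    (90, 72),
    "SOTTR 1440p, 3090 FE vs 2080 Ti OC":   (169, 138),
    "RDR2 1440p, 2080 Ti OC vs 1080 Ti OC": (118, 74),
}

for label, (new, old) in comparisons.items():
    print(f"{label}: +{pct_gain(new, old):.0f}%")
# Roughly +28%, +25%, +22%, and +59%, in line with the percentage changes listed above.
```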
 
No they aren't. Here, have a look; I've spent considerable time on the analysis comparing the 3090 against an overclocked 2080 Ti and a 3080 ...

It's absolute shit, boys and girls. Don't buy a 3090 because it's literally as fast overclocked on air as SLi watercooled overclocked 1080 Tis, and blows the tits off of the 2080 Ti too.

https://www.3dmark.com/compare/spy/14215251/spy/11470743

Oh, and a 3090 overclocked on air is still faster than the fastest Ln2 3080.

https://www.3dmark.com/compare/pr/319857/pr/356777

Absolute shit, I agree. Don't buy one and let others enjoy the performance increase that the 20 series should have offered.
 
No they aren't. Here, have a look; I've spent considerable time on the analysis comparing the 3090 against an overclocked 2080 Ti and a 3080 ...

You're comparing hilariously CPU-constrained situations. I pulled my numbers from multiple Guru3D games benchmarked at 4K, which is my use case.

If your use case is Rainbow Six Siege at 358 fps, then I guess those numbers are true for you. People generally don't use CPU-constrained situations for comparisons, though.
 
I'm still on an old 4K G-Sync-only display, so swapping probably isn't in the cards for me yet. I also enjoy Nvidia's superior NVENC ShadowPlay and am hoping DLSS catches on well. I'm trying to become a 3080 owner ASAP.
 
Yes, but at the time, and still now, I don't see myself buying an AMD card anytime soon. 3x Alienware 34" curved G-Sync.
 
And I'm sure what you say makes it not G-Sync, as obviously Nvidia doesn't know what they're doing when they allow a company to slap G-Sync on the box.
I think you missed the entire point of "Gsync Compatible". "Gsync Compatible" is what NVIDIA calls anything that does non-Gsync VRR that would otherwise meet its Gsync certification requirements. Example: Gsync over DisplayPort is really just a re-implementation of VESA Adaptive Sync; it's not Gsync (as, again, the proprietary Gsync module is not present).

NVIDIA is just throwing every VRR option under "Gsync Compatible" for PR purposes; if it's not using the Gsync module NVIDIA developed, it's not actually using Gsync.

Also note that at no point has LG ever claimed Gsync support; that's all NVIDIA. The CX (and C9) series only supports HDMI Forum VRR and Freesync over HDMI (via firmware update). LG and NVIDIA made an agreement last year for NVIDIA to backport HDMI Forum VRR to HDMI 2.0 so users wouldn't have to wait for Ampere/Big Navi to get VRR support. The fact that NVIDIA throws all this under "Gsync Compatible" is just marketing fluff; the TVs don't support Gsync, and never will.

tldr: Don't fall for marketing fluff.
 
CPU constrained at 4K? Are you retarded? I pulled everything in that post directly from the Gamers Nexus 3090 review. There is no CPU bottleneck at 4K. We are talking about 90 FPS here, not 150, not 300: 90 FPS. You don't see a CPU bottleneck unless you're at 1080p, even with a 3090. Also, they used a 10700K @ 5.1 GHz. There's no CPU bottleneck; you're just regurgitating idiotic statements made by other morons who think that the 3080 and 3090 are faster than they are when compared to the outgoing 80 Ti card with an overclock. Why don't you actually click on the various segments of the benchmark and do the math yourself? You see "RDR2 4K" below? Yeah, it's a hotlink to that section of the 3090 review.

Let's see, who has more credibility: Gamers Nexus, or an obviously bought-out Digital Foundry?


RDR2 4K
3090 FE Stock: 92 FPS, 3080 FE OC: 90 FPS, 2080 Ti OC: 72 FPS, Percentage change: 27% and 25%

RDR2 1440p
3090 FE Stock: 134, 3080 FE OC: 131, 2080 Ti OC: 118, Percentage change: 14% and 11%

Rainbow Six Siege 4K
3090 FE Stock: 203, 3080 FE OC: 184, 2080 Ti OC: 161, Percentage change: 26% and 14%
Rainbow Six Siege 1440p
3090 FE stock: 358, 3080 FE OC: 337, 2080 Ti OC: 299, Percentage Change: 20% and 13%
SOTTR 4K
3090 FE Stock: 104, 3080 FE OC: 94, 2080 Ti OC: 82, Percentage Change: 27% and 15%
SOTTR 1440p
3090 FE Stock: 169, 3080 FE OC: 158, 2080 Ti OC: 138, Percentage Change: 22% and 14%


Now, while we are at it, let's have a look at overclocked 2080 Ti vs overclocked 1080 Ti performance for comparison, including synthetics. Because I'm tired, and because the percentage change between the 1080 Ti and 2080 Ti isn't meaningfully different at 4K, I will only look at 1440p below. You're more than welcome to look at 4K on your own.

RDR2 1440p
2080 Ti OC: 118, 1080 Ti OC: 74, Percentage Change: 59%
Rainbow Six Siege 1440p
2080 Ti OC: 299, 1080 Ti OC: 198, Percentage Change: 51%
SOTTR 1440p
2080 Ti OC: 138, 1080 Ti OC: 95, Percentage Change: 45%

RDR2 is a terrible engine and is known for CPU constraints... games can definitely be CPU-constrained at 4K. Your own numbers show this. Why do you think a card with ~20% more cores and bandwidth gets ~1% more fps? I also have zero idea why you're talking about Digital Foundry.
 
Not related to G-Sync at all, but Nvidia GameStream and Moonlight have locked me into the Nvidia ecosystem. It's great to be able to game from my laptop without needing to spend gaming-laptop money or live with a gaming laptop's qualities.

I would still consider an AMD card, but it would need to be a decisive winner, not their typical "close enough to Nvidia for $50-100 less."
 
I think you missed the entire point of "Gsync Compatible". "Gsync Compatible" is what NVIDIA calls anything that does non-Gsync VRR that would otherwise meet its Gsync certification requirements. Example: Gsync over DisplayPort is really just a re-implementation of VESA Adaptive Sync; it's not Gsync (as, again, the proprietary Gsync module is not present).

NVIDIA is just throwing every VRR option under "Gsync Compatible" for PR purposes; if it's not using the Gsync module NVIDIA developed, it's not actually using Gsync.

Also note that at no point has LG ever claimed Gsync support; that's all NVIDIA. The CX (and C9) series only supports HDMI Forum VRR and Freesync over HDMI (via firmware update). LG and NVIDIA made an agreement last year for NVIDIA to backport HDMI Forum VRR to HDMI 2.0 so users wouldn't have to wait for Ampere/Big Navi to get VRR support. The fact that NVIDIA throws all this under "Gsync Compatible" is just marketing fluff; the TVs don't support Gsync, and never will.

tldr: Don't fall for marketing fluff.
And you don't understand that I understood that from the beginning. Quite possibly before you even knew about the tech.

What you fail to comprehend is that it is still G-Sync per Nvidia, even without the module. It might not have the same range, but G-Sync it is, just like all of the other G-Sync Compatible displays that do it over DP.

If you still don't want to come to terms with that, may I suggest you take your firefighting skills directly to Nvidia so that they can update their big ol list of Gsync displays to more properly conform to your world view? https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/
 
CPU constrained at 4K? Are you retarded? I pulled everything in that post directly from the Gamers Nexus 3090 review. There is no CPU bottleneck at 4K. ...
Don't look here, people. Keep listening to butthurt FUD:

https://www.techpowerup.com/review/asus-geforce-rtx-3090-strix-oc/32.html
 
It's absolute shit, boys and girls. Don't buy a 3090 because it's literally as fast overclocked on air as SLi watercooled overclocked 1080 Tis, and blows the tits off of the 2080 Ti too.

https://www.3dmark.com/compare/spy/14215251/spy/11470743

Oh, and a 3090 overclocked on air is still faster than the fastest Ln2 3080.

https://www.3dmark.com/compare/pr/319857/pr/356777

Absolute shit, I agree. Don't buy one and let others enjoy the performance increase that the 20 series should have offered.

Oh yeah, 25% is really "blowing the tits" off the outgoing 80 Ti card. And just to be clear, the 3090 is 100% the 80 Ti card, it's not a rebadged Titan.

Here's overclocked 2080 Ti vs overclocked 1080 Ti for comparison:

https://www.3dmark.com/compare/spy/14187325/spy/2949106

60%.

Overclocked 2080 Ti at 373w power draw = 17k Timespy GPU
RTX 3090 FE @ 370w = 19k Timespy GPU

What is this, 20%?

Golf clap?

This is your definition of "blowing the tits" off of something? I suppose Digital Foundry's marketing is proving to be very effective.


Also, I paid $900 for my 2080 Ti in a local Craigslist sale, no sales tax, with 2 years remaining on its transferable warranty.

How much did you pay for your 3090? $1800 after taxes for 25% gain?

Ampere is straight trash.

Edit:

I noticed you were careful not to compare the 3090 to 2080 Ti SLI; best not to do that, especially in Port Royal. But since you're in the mood for comparing the new 80 Ti card against an 80 Ti card from two generations ago, here's an overclocked 2080 Ti against overclocked 980 Ti SLI:

https://www.3dmark.com/compare/spy/14187325/spy/11472204

That's a 30% difference, not the 3% difference between RTX 3090 and GTX 1080 Ti SLI.
 
Nah, it wouldn't take much for me to get another CX 48. I bought and returned one when it turned out I wasn't going to end up with an HDMI 2.1 GPU anytime soon, but I'd just get another and ditch my G-Sync display if I want to buy AMD this gen.
 

You're not proving anything. All of the 2080 Ti benchmark comparisons have the 2080 Ti limited to 260w, whereas the 3080 is at 320w (roughly 25% more) and the 3090 is at 350-370w (roughly 35-40% more). When you simply run the 2080 Ti at the same power draw, the performance disparity looks like this:

2080 Ti = 10% slower than the 3080 and 25% slower than the 3090 @ 1440p, and 20% slower than the 3080 and 35% slower than the 3090 at 4K.

I've already proved this in my last post, and that was against an OVERCLOCKED 3080. The 3090 has NO overclocking headroom; try 7-10% @ 480w, tops.

Meanwhile 2080 Ti has a 30% overclock, taking Timespy GPU from 13,600 all the way to 17k no problem, a 3.5k increase here and in games.

The 3080 goes from 17.5k to 18.5-19k, a 1-1.5k increase.
The 3090 goes from 19k to 21.5k, a 2.5k increase.

8nm EUV is straight trash.

Here, you apparently didn't read my analysis:

https://www.overclock.net/threads/official-nvidia-rtx-3090-owners-club.1753930/page-60

Do the math, compare 2080 Ti at the same power draw (the OC Strix on the chart) to both the 3080 and 3090.

Here, let me help you:

RDR2 4K
3090 FE Stock: 92 FPS, 3080 FE OC: 90 FPS, 2080 Ti OC: 72 FPS, Percentage change: 27% and 25%

RDR2 1440p
3090 FE Stock: 134, 3080 FE OC: 131, 2080 Ti OC: 118, Percentage change: 14% and 11%

Rainbow Six Siege 4K
3090 FE Stock: 203, 3080 FE OC: 184, 2080 Ti OC: 161, Percentage change: 26% and 14%
Rainbow Six Siege 1440p
3090 FE stock: 358, 3080 FE OC: 337, 2080 Ti OC: 299, Percentage Change: 20% and 13%
SOTTR 4K
3090 FE Stock: 104, 3080 FE OC: 94, 2080 Ti OC: 82, Percentage Change: 27% and 15%
SOTTR 1440p
3090 FE Stock: 169, 3080 FE OC: 158, 2080 Ti OC: 138, Percentage Change: 22% and 14%


Now, while we are at it, let's have a look at overclocked 2080 Ti vs overclocked 1080 Ti performance for comparison, including synthetics. Because I'm tired, and because the percentage change between the 1080 Ti and 2080 Ti isn't meaningfully different at 4K, I will only look at 1440p below. You're more than welcome to look at 4K on your own.

RDR2 1440p
2080 Ti OC: 118, 1080 Ti OC: 74, Percentage Change: 59%
Rainbow Six Siege 1440p
2080 Ti OC: 299, 1080 Ti OC: 198, Percentage Change: 51%
SOTTR 1440p
2080 Ti OC: 138, 1080 Ti OC: 95, Percentage Change: 45%

Before going forward I feel the need to emphasize, as I have stated repeatedly, that GA-102 is right at the very edge of the performance-efficiency curve, and more power does not translate linearly into performance gain. We see this in someone here commenting that the Strix @ 480w yields a 3% improvement over the FE @ 390w, in the fact that GA102-200 has a whopping 7% overclock, and in the fact that all of the figures above compare an overclocked 2080 Ti to an overclocked 3080, which shrinks the 20% at 1440p and 30% at 4K averages down to 11-14% at 1440p and 14-25% at 4K. I will include both non-overclocked and overclocked performance gains between architectures in the following synthetics. We will be looking at both Timespy and Port Royal, to be fair to the meaningful RT gains of the new architecture, with the exception of the 980 Ti to 1080 Ti comparison, where Firestrike is used instead:

3DMark Timespy GPU
3090 FE: 20,144 (390w) Overclocked: (no Timespy GPU data here, but it seems 3090 FE overclocked is good for another 5%)

3080 FE: 17,832 (320w)

2080 Ti FE: 13,610 (260w)

2080 Ti FE OC on air at same power draw as 3080: 15,500 (320w)

1080 Ti FE: 9521 (260w) (earlier benches have this card at 8660, they may have re-benched this with newer drivers)

Percentage change normalizing for power output:

1080 Ti FE to 2080 Ti FE: 43% (with newer bench, 57% with original bench of 8660)

2080 Ti FE to 3080 FE @ 320w: 15%

2080 Ti FE to 3090 FE @ comparable power draw (373w vs 390w): 20% and 25% (3090 overclocked)


GTX 1080 Ti vs 980 Ti @ 300w (3DMark site is slow right now, but it's 20k vs 30k, or 50%)

Port Royal: 2080 Ti @ 373w = 10,500, 3080 @ 320w = 11,450, 3090 @ 390w = 13,500, percentage change: 9% and 30%
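For anyone who wants to check those deltas, here's the same arithmetic run on the scores quoted above (a rough sketch only; the helper name is mine and the scores are the ones listed in this post):

```python
# Percentage gaps between the Timespy GPU and Port Royal scores quoted above,
# pairing cards at comparable power draw where the text does.
def pct_gain(new, old):
    return (new - old) / old * 100

pairs = {
    "Timespy: 1080 Ti FE -> 2080 Ti FE":            (13610, 9521),
    "Timespy: 2080 Ti OC @ 320w -> 3080 FE @ 320w": (17832, 15500),
    "Port Royal: 2080 Ti @ 373w -> 3080 @ 320w":    (11450, 10500),
    "Port Royal: 2080 Ti @ 373w -> 3090 @ 390w":    (13500, 10500),
}

for label, (new, old) in pairs.items():
    print(f"{label}: +{pct_gain(new, old):.0f}%")
# Roughly +43%, +15%, +9%, and +29%, in line with the normalized-for-power figures above.
```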

The 3090 shows a meaningful gain in RT, but imagine this metric with a 628mm2 7nm TSMC die, or about 30-40% higher than this for the same amount of money. But don't worry, NGreedia saved $26 per yield going with Samsung 8nm EUV and no, the queue wasn't full and wasn't forecasted to be full (current 7nm TSMC capacity is at 30%) because crApple was switching their demand to 5nm and freeing up that much capacity.

This is a good one, I think HyperMatrix made this statement:

"The 3080 is unrivaled in efficiency, we have never seen a gain of 25% over the outgoing 80 Ti card at the same power draw".

GTX 1080 = 23% faster than 980 Ti @ 36% less power draw.

GTX 1080 Ti = 50% faster than 980 Ti @ the same power draw.

AND THEY ONLY DROPPED THE NODE SIZE 43%.

12nm FFN to 8nm EUV = 50%.

And we are celebrating this mediocrity?

Additionally, not only is this a mediocre performance uplift considering the node shrink, but imagine the performance uplift between a 425mm2 3080 (same performance as now) and a 628mm2 3090 on 7nm TSMC.

10% and most of you will buy this crap, telling NV "hey, that's great, keep making garbage and slap whatever price-tag on it, as long as it's 10% faster than the next fastest card I will buy it at any price because I have more money than sense and don't understand that I've created this very situation by buying your overpriced Titan Xp and your overpriced 2080 Ti at full price and then some"

It's just disgusting honestly.

And please save your "you can't afford it, so you're complaining" nonsense. I could buy this if I wanted to, but with the water block bringing the total to $1850 after taxes (FE + Bykski block), I would probably have to cancel my HP Reverb G2 pre-order, and for what, a 25% gain at 1440p? Maybe 45% with DLSS on?

This is garbage.

I will wait to see what AMD has in store.

Ampere is basically Nvidia's Comet Lake moment, where the value proposition becomes absurd. Like Comet Lake, Ampere is on an expensive, inferior node and needs an incredible amount of power for marginal gains over whatever came before it. Meanwhile, AMD is taking over as the CPU industry leader, if they haven't done so already.


$26 per yield for this mediocrity. That's all NV is saving having gone with 8nm EUV over 7nm TSMC and they knew that with slick marketing, a dumbed down consumer-base and a growing niche "boutique" market demographic (the .01%, the only demographic that can buy this crap) that they could make whatever and sell it.

They literally made you whatever, and you're buying it and cheering all along the way.

If AMD releases a card (full 80 CU Navi 21) that runs faster than the 3080, with 16GB of video memory, @ $1000, NV is done.

With any console port, which 90% of PC games are, NV is going to have to implement its exclusive technologies on top of an engine that has been heavily optimized around RDNA 2, the architecture of Big Navi. The 3080 may still be faster on paper, in 3DMark Timespy and Port Royal, or it may be slower, but watch Big Navi run the console games, I don't know, 25% faster, because the console-based path tracing, AI super-sampling and "RTX IO" will be done natively and therefore with less driver and API involvement.

See, this is how it works: when you keep buying whatever a manufacturer makes, you invariably end up in a situation where they rest on their laurels and don't push the envelope, because they know they have a captive market that will buy whatever is shoveled their way.

See: Intel

Jim from AdoredTV is right. This is Nvidia's Dumbest Decision because now AMD is going to do to Nvidia what they did to Intel. We are going to see a HUGE upset in the discrete GPU industry this year.

And guess what?

See Intel? They are getting off the 14nm node.

They have to.

Nvidia?

They will be forced to develop Hopper early.

I don't bear any allegiance to any brand or flag. As a combat veteran thrown under the bus by my country long ago I learned that lesson the hard way.

Stop being a fan boy.

Don't buy this garbage.

Compel NV to produce a quality product.

Remember,
NV is saving $26 per yield with 8nm EUV.

Imagine the 3090 with a 628mm2 7nm TSMC die that is 30-40% faster than the 425mm2 die (that is as fast as the current GA-102-200 628mm2 8nm EUV die).

Same price.

40% faster, easily.

Now THAT is a card worthy of a moniker denoting dual GPU; THAT would be a 3090.
 
You're not proving anything. All of the 2080 Ti benchmark comparisons have the 2080 Ti limited to 260w, whereas the 3080 is at 320w and the 3090 is at 350-370w ...
Yes, let's limit the cards artificially. That makes total sense.

(for those without a sarcasm meter, that was sarcasm)

As on the EVGA forums, I'm going to ignore future posts from you. Why? You have such a serious problem with this new series being faster than the 2080 Ti that it's fucking sad. I'm sorry that you likely spent too much on the new-tech 2080 Ti and now feel the need to crusade against the 3080, which is nearly half the cost for more performance.

While I am busy ignoring anything you might say in the future, I will be enjoying single-card performance that I previously only had in very few games with a watercooled SLI 1080 Ti setup, and ~75% more performance than that free 2080 Super (also watercooled) that I tried out.
 
That's what I said. I didn't think they would be able to support FreeSync on monitors with hardware G-Sync support. Dell only confirms that. I was responding to someone who seemed to think that you could do FreeSync on hardware G-Sync displays. Something I didn't think was the case, but admitted I hadn't looked into it. AMD hasn't made a GPU I'd be interested in for a very long time.

Viewsonic has the XG270QG, which uses the newer G-Sync hardware module that allows for Freesync support, and Viewsonic has said these monitors have the newer firmware installed, allowing them to work with AMD cards.
 
I hated the idea of getting a monitor that would lock me into a particular video card camp (Nvidia vs AMD). I waited long enough until Nvidia finally had their "G-Sync Compatible" initiative and bought a monitor that can do both FreeSync and G-Sync without issue.
This is basically what I did. My Dell S2719DGF actually isn't in this initiative, but it can still do both just fine. No way in hell am I locking myself into just one CPU or GPU vendor.
 
And you don't understand that I understood that from the beginning. Quite possibly before you even knew about the tech.

What you fail to comprehend is that it is still G-Sync per Nvidia, even without the module. It might not have the same range, but G-Sync it is, just like all of the other G-Sync Compatible displays that do it over DP.

If you still don't want to come to terms with that, may I suggest you take your firefighting skills directly to Nvidia so that they can update their big ol list of Gsync displays to more properly conform to your world view? https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/
Funny, given NVIDIA agrees with me and not you:

https://www.nvidia.com/en-us/geforce/news/g-sync-ces-2019-announcements/

There are good monitors out there though, and so to bring these monitors to GeForce gamers, and expand the G-SYNC ecosystem, we’re introducing “G-SYNC Compatible”. We will test monitors that deliver a baseline VRR experience on GeForce GTX 10-Series, GeForce GTX 16-Series and GeForce RTX 20-Series graphics cards, and activate their VRR features automatically, enabling GeForce gamers to find and buy VRR monitors that will improve their gaming experience.

G-SYNC Compatible testing validates that the monitor does not show blanking, pulsing, flickering, ghosting or other artifacts during VRR gaming. They also validate that the monitor can operate in VRR at any game frame rate by supporting a VRR range of at least 2.4:1 (e.g. 60Hz-144Hz), and offer the gamer a seamless experience by enabling VRR by default.
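To make that criterion concrete, here's a minimal sketch of the 2.4:1 VRR-range check described in NVIDIA's quote above; the function and the sample ranges are hypothetical illustrations, not NVIDIA's actual validation suite:

```python
# Illustrative check of the published G-Sync Compatible VRR-range requirement (at least 2.4:1).
# The example ranges below are made up for illustration; they are not certification results.
def meets_range_requirement(vrr_min_hz, vrr_max_hz, required_ratio=2.4):
    return vrr_max_hz / vrr_min_hz >= required_ratio

print(meets_range_requirement(60, 144))  # True  (144 / 60 = 2.4, the example NVIDIA gives)
print(meets_range_requirement(48, 75))   # False (75 / 48 is only about 1.56)
```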

"GSync Compatible" just means NVIDIA accepts that the monitor's VRR implementation meets its GSync branding criteria; nothing more and nothing less.

Second, as I noted and you ignored, NVIDIA had to specifically backport HDMI-VRR into Turing to enable VRR on LG's TVs. The XB1 did the same. See:

https://www.overclock3d.net/news/gp...i_2_1_vrr_support_to_its_rtx_20_series_gpus/1
https://www.overclock3d.net/news/gp...he_world_s_first_g-sync_compatible_oled_tvs/1

We are excited to bring G-SYNC Compatible support to LG’s 2019 OLED TVs and HDMI Variable Refresh Rate support to our GeForce RTX 20-Series GPUs. Gamers will be blown away by the responsiveness and the lifelike visuals on these TVs when playing the latest PC games featuring real-time ray-tracing powered by GeForce.

OC3D put it best:

Let's be clear here; these screens are merely G-Sync Compatible. There's no G-Sync modules or anything fancy here. These screens support HDMI 2.1's VRR (Variable Refresh Rate) standard and pass Nvidia's G-Sync Compatible test suite. Any graphics card that supports HDMI 2.1 VRR can enjoy variable refresh rates on these screens.

Nvidia has done what they have done with VESA Adaptive-Sync. They have put the feature under their G-Sync banner by adding HDMI 2.1 VRR screens to their G-Sync Compatible rating system. With this move, Nvidia has inserted its branding into what is nothing more than HDMI 2.1's VRR support, albeit what appears to be well-implemented VRR support.

You are simply buying into the branding, and ignoring the fact that every monitor marked as "Gsync Compatible" is using VESA Adaptive Sync or HDMI-VRR under the hood.
 