AMD Rockets Past 100 FreeSync Displays

Megalith
AMD just hit a milestone with Lenovo's Y27f, which happens to be the 101st display released that supports its FreeSync technology. As you probably already know, these monitors provide a stutter- and tear-free experience.

AMD’s game-smoothing FreeSync monitors launched a full year after Nvidia’s rival G-Sync displays, but they’ve been coming fast and furious ever since. Late Thursday, the company revealed that its technology surpassed not one, but two major milestones with the launch of the 27-inch Y27f ($400 on Amazon) earlier this month. This curved, 144Hz 1080p display is both Lenovo’s first-ever FreeSync display, as well as the 101st FreeSync display released overall. FreeSync and G-Sync monitors synchronize the refresh rate of your graphics card with your display. That eliminates stutter and tearing, resulting in gameplay so buttery smooth that you’ll never be able to use a traditional non-variable refresh rate monitor again.
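For the curious, here's a minimal sketch of what that synchronization buys you, with made-up frame times and plain Python (nothing here is AMD's or Nvidia's actual code): on a fixed 60Hz display with vsync, a frame that misses the ~16.7ms tick waits for the next one and shows up late, which you perceive as stutter; a variable-refresh display simply refreshes the moment the frame is ready.

```python
import math

# Hypothetical render times (ms) for a GPU hovering around 60fps.
frame_times_ms = [15.0, 16.0, 18.0, 17.0, 15.0, 25.0, 16.0]

REFRESH_HZ = 60
TICK_MS = 1000.0 / REFRESH_HZ  # a fixed 60Hz panel refreshes every ~16.7ms

t = 0.0
for ft in frame_times_ms:
    t += ft  # the moment the GPU finishes rendering this frame
    # Fixed refresh + vsync: the frame can only appear on the next tick,
    # so an 18ms frame slips a whole refresh; that's the stutter.
    fixed_show = math.ceil(t / TICK_MS) * TICK_MS
    # Variable refresh (FreeSync/G-Sync): the panel refreshes on demand.
    vrr_show = t
    print(f"done {t:6.1f} ms | fixed 60Hz shows at {fixed_show:6.1f} | VRR shows at {vrr_show:6.1f}")
```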
 
They sold 100? Woohooo! :p:D

 
It would be nice if they decided to make one of those FreeSync monitors a 144Hz 4K display.
 
Is this an old article? Because if I go on Newegg right now, there are 164 different FreeSync screens.

This whole FreeSync vs G-Sync thing really pisses me off. Well, actually, G-Sync and Nvidia are the ones that piss me off. It's another tool to try to lock people into Nvidia GPUs, because they know people replace their GPUs about 5 times more often than they do their screens.

Nvidia could easily implement FreeSync compatibility with their GPUs, but they won't, because it would counteract their lock-in philosophy.

I really hate Nvidia's business tactics. Everything from their proprietary "GameWorks" bullshit to G-Sync serves only to limit consumer choice and lock people in. I hate the fact that their GPUs are really my only choice at the high end right now. Nothing AMD makes is even remotely fast enough for my needs, so I'm forced to support a company that pisses me off with this bullshit.
 
I used to be 100% into AMD, and for a time things were "okay". I was with them up until Pascal came out. And the experience of going from Crossfire 290s on FreeSync to a 1070 with G-Sync has made me understand exactly why they can charge what they want and do what they want. The overall experience really was like night and day between the two; after the shit I went through with AMD, I'll happily pay more for the Nvidia stuff.

Still have a 270X in my media rig for light duty movie watching and such.
 
I have no problem with Nvidia charging what they want (after all, I have a Pascal Titan X :p )

My problem is with their other coercive tactics, like Nvidia-specific effects in games via GameWorks, and locking users into their GPUs by selling them G-Sync screens which won't work on non-Nvidia parts. That type of behavior really harshes my mellow. I should be able to pick between independent offerings communicating with each other over non-proprietary interconnects, so that each generation I can make the choice that is right for me, not be locked in or coerced by the fact that Nvidia makes certain in-game effects available only on their hardware.
 
Yet you and folks like you are perpetuating the problem by purchasing Nvidia hardware. So the ones at fault are not just Nvidia themselves but those who keep buying their stuff and giving them no reason to change. Your choice and your consequences, not my fault or anyone else's.
 
I'd happily buy AMD products, just like I have in the past. The only problem is, AMD isn't producing anything competitive from a performance standpoint and even from a price standpoint they're not that great.
 
Sad but true. I went 2600K after a decade or more of AMD because AMD couldn't compete.
Hopefully with Zen and Vega I can return to an AMD-powered rig again.

I must say AMD has its drivers down now for single cards; I have not had issues for many years.

Everything 4K at 40"+ is FreeSync, for some strange reason.
 
There's an easy answer for that: anything over 35" is considered a TV by most people, so monitors in that size range are low-volume. Adding the cost of the G-Sync module to such a niche product is a big risk, so you'll get FreeSync, if anything at all.

And G-Sync is not targeted at TV buyers, for obvious reasons (no DisplayPort).

Only people on small forums like this recognize that TVs are just monitors. All my friends are continually wowed that my computer is hooked up to my TV, and has been for years. But most of them lack the balls to do it themselves.
 
Yet you and folks like you are perpetuating the problem by purchasing Nvidia hardware. So the ones at fault are not just Nvidia themselves but those who keep buying their stuff and giving them no reason to change. Your choice and your consequences, not my fault or anyone else's.

Because people like "us" choose to buy the better product? Sorry, but just like with games, I don't care about the greater-good nonsense; I buy what works for me. I don't base my purchases on other people's wants/needs. If AMD is better, I buy AMD, like I did with my 7950s back in 2012/2013. If Nvidia is better, I buy Nvidia, like with the 980 Ti this year.
 
I don't love anti-competitive practices, but I do love my G-Sync monitor.

It's a shame AMD has been losing for so long; otherwise we might have both brands using FreeSync and top-end cards not costing $1200.
 
So AMD's got the monitors but not the GPUs to drive them, while Nvidia's got the GPUs but a monitor selection choked by the closed G-Sync standard.

What a day to be a high-end gamer.

If AMD can come up with Fury X-like performance for $300ish and couple that with a 1080p/144Hz+ FreeSync monitor, though, that should make a LOT of common gamers really happy.

Genuine question though: does FreeSync offer any function similar to ULMB? I am under the impression that ULMB is a lot more common on high-refresh G-Sync monitors than motion blur reduction is elsewhere.
 
I was looking at a 1080p 60Hz Samsung curved screen with FreeSync, and there are plenty of AMD cards that can push that no problem. But I have an Nvidia video card, and FreeSync is tied to AMD cards just like G-Sync is tied to Nvidia cards. They both do the proprietary thing, but G-Sync monitors cost twice as much as FreeSync monitors. I opted for a 144Hz 1ms monitor instead.
 
144Hz and 1ms really is the happy medium right now. I managed somehow, by the grace of whatever god I haven't pissed off yet, to find a G-Sync 144Hz 24" 1ms monitor on Craigslist for $150 to pair with my 1080. I've played with G-Sync vs vsync, and while in some shittier engines like CryEngine (MWO, which makes it worse) it's pretty noticeable, it's nothing you couldn't live without. The 144Hz is definitely more important than F/G-Sync.

And while I wholeheartedly believe FreeSync will win, as most open industry standards tend to (there are plenty of historical examples), I'm so done with ATI that I don't care if Nvidia ends up being a monopoly and charging twice as much for their cards. I'm done with the whole "wow, I REALLY want to play that game this weekend, but I have ATI, so I better wait a full 6 months."
 
That's cool, your choice. However, you must understand that if your point of view came to pass, game makers would just shift back to the much less expensive consoles, and the PC market would do nothing but suffer because of it. I am sure that even on here, most folks have better things to do with their money and time than to waste them on products that take twice as long to develop for minimal gains and only work well on the most expensive cards. (No one benefits from a monopoly for long, least of all the customers.) Sorry, but this is the same thing as saying you would have no issue with only one high-speed internet provider throughout the US.

Me, I have no issues with others owning Nvidia; they work well enough. However, I will not, and besides, AMD works better on my setup and monitor, even with downscaling. (Samsung 28-inch 4K monitor.) Oh, and I have no idea where you would have to wait 6 months to play a game on AMD hardware when you can do that from day one.
 
I love my FreeSync Acer (XR341CK) monitor. I'm driving it with an RX 480 and it's the perfect complement to it, IMO. Even with frames dropping into the mid-40s, everything feels rock solid.
 
Seeing that people usually swap GPUs way more often than they do monitors, what I'd like to see are monitors that are both G-Sync and FreeSync compatible, so that buying a monitor does not lock you into a GPU brand.

Knowing Nvidia, however, it's probably part of the G-Sync contract that you can't do this. :mad:
 
I love my FreeSync Acer (XR341CK) monitor. I'm driving it with an RX 480 and it's the perfect complement to it, IMO. Even with frames dropping into the mid-40s, everything feels rock solid.

Monitor looks beautiful. A bit beyond my budget, but still.
 
Fallout 4, as an example. I preordered it, and it was completely unplayable on Crossfire 7970s for probably 8 weeks after it came out. And SLI/Crossfire support in general these days is pure ass (DOOM, Fallout 4, MWO, hell, at least 2 others I tried in the last 2 years don't even support multiple GPUs), so going with a single-card solution, Nvidia wins and wins hard with the 1080. Even AMD's response was "well... we'll go back to our safe place and say we're cheaper!", which has been their fallback mantra since the K6 days.

I was exaggerating when I said I'd support a monopoly, but I'm done with the ATI driver headache. Done with the multi-card headache. Speaking of better things to do: when I pay hundreds of dollars (low thousands if you count monitors and playseats, etc.) to play video games, I don't want to be spending hours in ini files and forums, testing every driver version like a QA intern at AMD. Christ's sake, I'm busy in my professional and personal life. AMD has made a mockery of the phrase "free time" over the last few years. I feel like I need a 1099 check cut from a QA or dev department somewhere.
 
Mostly agree. I have generally found single-GPU AMD drivers to be fine, but even their top single GPU isn't fast enough for me right now.
 
Is there any way to OBJECTIVELY compare FreeSync vs G-Sync? All the subjective articles I've read seem to paint G-Sync as far superior. If that's true, then it makes sense for NV to charge more.

And correct me if I'm wrong, but Nvidia actually sells the G-Sync module as a physical hardware device that has to be built into the monitor by the manufacturer. Freesync is an open software/firmware standard that the manufacturer needs to incorporate in their design.
 
Objectively, no. I tried to describe it to my friends, and you just sound like somebody high trying to describe their favorite art. The best I could come up with was "it is to feel what graphics are to visuals." I'll merrily take 144Hz and whatever-sync over uber-god-HD-TypeR-SS resolution any day now that I've seen it, and my friends agree, 3 for 3. Resolution is heavily overrated after you've experienced 144Hz, 1ms, and adaptive sync.
 
Is there any way to OBJECTIVELY compare FreeSync vs G-Sync? All the subjective articles I've read seem to paint G-Sync as far superior. If that's true, then it makes sense for NV to charge more.

That seems to be the conclusion I have read as well, but it's really tough since you can't run them using the same GPU, so it's not an apples-to-apples comparison. Is FreeSync really worse, or are AMD's drivers and GPUs being implicated as well?

And correct me if I'm wrong, but Nvidia actually sells the G-Sync module as a physical hardware device that has to be built into the monitor by the manufacturer. Freesync is an open software/firmware standard that the manufacturer needs to incorporate in their design.

That is accurate. G-Sync is Nvidia hardware. It is unclear to me whether they manufacture it and sell it to monitor companies or just license it to them, but either way it is Nvidia proprietary hardware. The hardware offloads the screen-syncing work, which is one reason it is said to generally perform better, whereas with FreeSync that duty falls to software that loads the CPU and GPU slightly.

Personally, I would just prefer an open standard that works on ALL GPUs so I don't have to be locked into a GPU vendor when I choose my monitor.

Nvidia is always trying this proprietary lock-in bullshit, and it pisses me off.
 
Objectively, no. I tried to describe it to my friends, and you just sound like somebody high trying to describe their favorite art. The best I could come up with was "it is to feel what graphics are to visuals." I'll merrily take 144Hz and whatever-sync over uber-god-HD-TypeR-SS resolution any day now that I've seen it, and my friends agree, 3 for 3. Resolution is heavily overrated after you've experienced 144Hz, 1ms, and adaptive sync.


It's subjective.

IMHO, anything over 60fps puts you into audiophile-like placebo territory, but the human brain is a funny thing.

If you think something will be better, you will actually experience it that way. That's how audiophile companies can keep selling overpriced useless crap to people with too much money :p
 
But isn't the whole point of these technologies to keep things working well BELOW 60fps?
 
Uhm, no, believe me, it's not placebo, and it's not even something that requires a game to see; it's just much more apparent in games. Let me guess: you've never seen a 144Hz monitor?
 
That seems to be the conclusion I have read as well, but it's really tough since you can't run them using the same GPU, so it's not an apples-to-apples comparison. Is FreeSync really worse, or are AMD's drivers and GPUs being implicated as well?

That is accurate. G-Sync is Nvidia hardware. It is unclear to me whether they manufacture it and sell it to monitor companies or just license it to them, but either way it is Nvidia proprietary hardware. The hardware offloads the screen-syncing work, which is one reason it is said to generally perform better, whereas with FreeSync that duty falls to software that loads the CPU and GPU slightly.

Personally, I would just prefer an open standard that works on ALL GPUs so I don't have to be locked into a GPU vendor when I choose my monitor.

Nvidia is always trying this proprietary lock-in bullshit, and it pisses me off.

I'm not a fan of lock-in options, and I'd like to keep my options open myself. That being said, it might not be possible for Nvidia's solution to work without their hardware. If the overall design needs hardware on both the monitor and the GPU end, why wouldn't they charge for it? Nvidia might have tried a software-only solution, couldn't get the performance they wanted, and went to hardware instead. That's always a risk with the open-standards type of solution: how well it works may depend on how each manufacturer implements the design. If Dell does a crap job implementing FreeSync, it reflects badly on both AMD and Dell. Nvidia's answer avoids that problem by keeping control of all the hardware/interface elements.

That's why I'd love to see some sort of objective comparison between G-Sync and FreeSync. I know it'd be damn near impossible given that you're running different cards and different drivers, but damn, it'd be nice to know if that G-Sync module is worth the money.
 
But isn't the whole point of these technologies to keep things working well BELOW 60fps?

IMHO, yes, that is the main purpose.

With my current setup, my Pascal Titan X gives me 60+ fps in most titles I play at 4K, except for some pretty rare drops slightly below. Adaptive VSync handles this pretty well, but it's still annoying when it happens. If I had a G-Sync screen, those slight drops below 60fps would be less noticeable.

Still, my preferred solution to this problem would be to just get a faster GPU next generation and hopefully be able to stay above 60fps ALL the time.

The way many are using it, however, is to get the monitor refresh rate syncing at the max framerates their high-end GPU can produce on their 144Hz G-Sync screens.

To me this is just a recipe for more fan noise, more heat, and a higher power bill, while all you get in return is placebo-level frame rates.

I'd take 4K @ 60Hz over 1440p at 144Hz any day, hands down. Anything above 60fps is placebo in my book.
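A rough sketch of why those sub-60fps dips sting on a fixed-refresh screen (my own numbers and assumptions, not anything official): with plain vsync on a 60Hz panel, every frame rate between 30 and 60fps collapses to the same 33.3ms cadence, while Adaptive VSync keeps the natural cadence at the cost of tearing, and VRR keeps it with no tear.

```python
import math

REFRESH_HZ = 60
TICK_MS = 1000.0 / REFRESH_HZ  # 16.7ms per refresh at 60Hz

print("  fps | render ms | vsync cadence | adaptive vsync / VRR")
for fps in (60, 58, 55, 48, 40, 30):
    render_ms = 1000.0 / fps
    # Plain vsync: a finished frame still waits for the next whole tick.
    vsync_ms = math.ceil(render_ms / TICK_MS) * TICK_MS
    # Adaptive VSync shows it immediately but tears; VRR shows it
    # immediately with no tear, since the panel refreshes on demand.
    print(f"{fps:5} | {render_ms:9.1f} | {vsync_ms:13.1f} | {render_ms:.1f}")
```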
 
Uhm, no, believe me, it's not placebo, and it's not even something that requires a game to see; it's just much more apparent in games. Let me guess: you've never seen a 144Hz monitor?

I've never used a 144Hz LCD, no, but back in the CRT days I did play CS synced to high-refresh-rate CRTs, so I know what the difference feels like. Most of my CS days I spent synced to 100Hz at 1600x1200. I even tried 120Hz to see what it was like.

To me anything under 30fps is absolutely unplayable. 30fps looks OK, but mouse feel is crap due to the inherently long frame times. By the time you reach 60fps, mouse feel is great, and smoothness looks as good as it is going to get. There might be minute improvements above 60fps, but in my personal experience they are not worth writing home about. If there was any noticeable improvement at all, at best I'll say there were severe diminishing returns going on, but I do feel it's a whole lot of placebo as well.
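For what it's worth, the frame-time arithmetic backs the diminishing-returns feeling (a trivial sketch, just the math):

```python
# Frame time falls off hyperbolically with fps, so each step up buys less.
prev = None
for fps in (30, 60, 100, 120, 144):
    ms = 1000.0 / fps
    note = f" ({ms - prev:+.1f} ms vs the previous step)" if prev else ""
    print(f"{fps:>3} fps -> {ms:5.1f} ms per frame{note}")
    prev = ms
```

Going 30 to 60fps shaves 16.7ms per frame; going from 60 all the way to 144 shaves under 10ms more. Whether that remaining ~10ms is perceptible is exactly what's being argued here.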
 
I'm not a fan of lock-in options, and I'd like to keep my options open myself. That being said, it might not be possible for Nvidia's solution to work without their hardware. If the overall design needs hardware on both the monitor and the GPU end, why wouldn't they charge for it? Nvidia might have tried a software-only solution, couldn't get the performance they wanted, and went to hardware instead. That's always a risk with the open-standards type of solution: how well it works may depend on how each manufacturer implements the design. If Dell does a crap job implementing FreeSync, it reflects badly on both AMD and Dell. Nvidia's answer avoids that problem by keeping control of all the hardware/interface elements.

That's why I'd love to see some sort of objective comparison between G-Sync and FreeSync. I know it'd be damn near impossible given that you're running different cards and different drivers, but damn, it'd be nice to know if that G-Sync module is worth the money.

But the thing is, we know that an open standard exists, and it is built into the DisplayPort standard, so there isn't a need for a proprietary one. The only real purpose it serves is vendor lock-in.
 
You keep saying that, but I can only assume you haven't seen it. I said and thought the same thing until I found a cheapo monitor on Craigslist to test before dropping $1000, and I'm glad I did. You keep dropping "placebo this, placebo that," but you haven't said you've actually SEEN it.
 
Responses were slow to update, my bad. OK, so something you've never seen is placebo, and you made the comparison of LCDs to CRTs. OK then.
 
IMHO, yes, that is the main purpose.

With my current setup, my Pascal Titan X gives me 60+ fps in most titles I play at 4K, except for some pretty rare drops slightly below. Adaptive VSync handles this pretty well, but it's still annoying when it happens. If I had a G-Sync screen, those slight drops below 60fps would be less noticeable.

Still, my preferred solution to this problem would be to just get a faster GPU next generation and hopefully be able to stay above 60fps ALL the time.

The way many are using it, however, is to get the monitor refresh rate syncing at the max framerates their high-end GPU can produce on their 144Hz G-Sync screens.

To me this is just a recipe for more fan noise, more heat, and a higher power bill, while all you get in return is placebo-level frame rates.

I'd take 4K @ 60Hz over 1440p at 144Hz any day, hands down. Anything above 60fps is placebo in my book.

I run a pair of 980Tis and a 3440x1440@60Hz monitor, with no G-Sync myself. Almost everything I play stays above 60Hz, and I keep Vsync on too. So this whole discussion is really academic to me as well. I've played on a friend's 144Hz setup, and it looked smooth if I was really looking for it, but I couldn't really see it otherwise.

Now, if I could GET a >120Hz, 3440x1440 IPS monitor with G-Sync at a reasonable price, then I'd take it. I'd like a GPU setup that keeps me above 60Hz minimum, but I'd be OK with taking advantage of any extra FPS possible. I've watercooled, so I don't worry about heat and fan noise!
 
But the thing is, we know that an open standard exists, and it is built into the DisplayPort standard, so there isn't a need for a proprietary one. The only real purpose it serves is vendor lock-in.

I'd need to see a side-by-side comparison of the various technologies, then. Just because an open standard DOES exist doesn't mean it's GOOD, or that a proprietary solution might not be significantly better in the real world.
 
I run a pair of 980Tis and a 3440x1440@60Hz monitor, with no G-Sync myself. Almost everything I play stays above 60Hz, and I keep Vsync on too. So this whole discussion is really academic to me as well. I've played on a friend's 144Hz setup, and it looked smooth if I was really looking for it, but I couldn't really see it otherwise.

Now, if I could GET a >120Hz, 3440x1440 IPS monitor with G-Sync at a reasonable price, then I'd take it. I'd like a GPU setup that keeps me above 60Hz minimum, but I'd be OK with taking advantage of any extra FPS possible. I've watercooled, so I don't worry about heat and fan noise!

Oh, I'm with you. I'm really liking 4K though, and it's tough to get up to those framerates in anything but older titles. But if there were a decent 4K 40"+ screen (I don't see the point of using small 4K screens only to have to scale everything up) with higher-than-60Hz refresh rates and G-Sync, I'd at the very least be tempted. No such product exists, though, and if it did, I'd be going back and forth, as I'd be unhappy about the Nvidia lock-in.
 
I bought a 4K and this 34" 21:9 at the same time, and used them both for a few days. I returned the 4K and kept the ultrawide. I just really liked the "immersion" effect of the widescreen. Maybe I'm getting old, but I just really couldn't see the big difference with the 4K. It was a bit sharper in games, but it wasn't noticeable to me unless they were side by side. That's totally personal preference, though; I could see how people would want the 4K.

And I'm just not willing to drop $1300 on the new Acer/Asus G-Sync version of a 32" 4K OR a 34" UW. I'd rather spend the money on a Titan X Volta next year and keep the fps pegged at 60.
 
Well, that was my point earlier when I said I don't see the point in smaller 4K screens. They might be slightly sharper, but the difference is subtle.

4K really makes sense when you get up to 40+ inch screen sizes. Then the differences are no longer subtle, and you can see so much more.

Keep in mind, a 40" 16:9 screen is about 3.6" wider than a 34" 21:9 screen, so you are still getting all of that wide immersion without giving up any height; in fact, it is 6.2" taller.

If you go with the 48" version like I did (which I actually think is a little too big; the sweet spot is probably about 42"), you are getting 10.6" more width and 10.1" more height than a 34" 21:9 display.

I wonder, though: for some titles that are a little too demanding for full 4K resolution, could I create a custom 3840x1607 21:9 resolution and run them at that? It should have about 25% lower GPU demand than full 4K...
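The size and pixel numbers in that post check out, for anyone who wants to verify them; a quick sketch (plain geometry, my own helper function):

```python
import math

def dims(diagonal_in, w_ratio, h_ratio):
    """Width and height (inches) from a diagonal and an aspect ratio."""
    d = math.hypot(w_ratio, h_ratio)
    return diagonal_in * w_ratio / d, diagonal_in * h_ratio / d

w34, h34 = dims(34, 21, 9)  # 34" 21:9 ultrawide: ~31.3" x ~13.4"
w40, h40 = dims(40, 16, 9)  # 40" 16:9:           ~34.9" x ~19.6"
w48, h48 = dims(48, 16, 9)  # 48" 16:9:           ~41.8" x ~23.5"
print(f'40" vs 34" UW: {w40 - w34:+.1f}" wider, {h40 - h34:+.1f}" taller')
print(f'48" vs 34" UW: {w48 - w34:+.1f}" wider, {h48 - h34:+.1f}" taller')

# The custom-resolution idea: 3840x1607 vs full 3840x2160.
savings = 1 - (3840 * 1607) / (3840 * 2160)
print(f"pixel savings: {savings:.0%}")  # ~26%, i.e. roughly the 25% claimed
```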
 