Intel to support FreeSync

I'm mainly curious if my G3258 will support FreeSync, or if it would be relegated to newer Intel processors.

Never mind, I just found my answer...

Intel desktop GPUs currently support eDP 1.2, but the framebuffer and timing controller needed for Panel Self Refresh were only introduced in eDP 1.3.
 
Intel's iGPUs get better and better every generation. They're not the same crap that they were even five years ago.

Biggest problem still is that Intel is trying to package all that into CPUs. They need to stop fucking around and start planning a legit video card architecture. Imagine Intel swallowing AMD's resources and tech and doing something good with all of it... We can only dream. Then again, with AMD's brass giving out handouts to each other, I don't think it's far off.
 
Biggest problem still is that Intel is trying to package all that into CPUs. They need to stop fucking around and start planning a legit video card architecture. Imagine Intel swallowing AMD's resources and tech and doing something good with all of it... We can only dream. Then again, with AMD's brass giving out handouts to each other, I don't think it's far off.

Two things.

Intel will never go into the discrete GPU market, and Intel will never take over AMD.
 
FreeSync... G-Sync... I don't really care. All I want is an IPS monitor that supports one of these and isn't $800.
 
Maybe I should have also specified around $300 max, lol.

I'm not spending $500-$800 on a monitor, but thanks for the links/info.
 
Maybe I should have also specified around $300 max, lol.

I'm not spending $500-$800 on a monitor, but thanks for the links/info.

Not bashing you by any means, but I don't understand this. One's monitor probably gets the most use and longevity of any hardware purchase. It is also the gateway/window into seeing all that powerful hardware put to use.

I don't understand dropping $700-$1,000+ on a computer but not wanting to spend more than $300 on a monitor. I purchased 3x UltraSharps and have been using them since 2010, originally for Eyefinity. I replaced the middle monitor with a 24" 144 Hz BenQ last year, and shortly I'll be looking to upgrade to either 3x 27" 1440p IPS 144 Hz or 2-3 4K 27"+ IPS.

This is good news, and hopefully Nvidia will wise up and just support FreeSync along with the other two-thirds or more of the market. Though they've got to do something with all those useless Denver/ARM boards taking up warehouse space.
 
LG 29UM67P, 29" 21:9 UltraWide IPS, 60 Hz, 5 ms (GtG), Adaptive-Sync (FreeSync), built-in speakers, game mode - $349 @ Newegg right now

Acer KN242HYL, 23.8" IPS, 4 ms (GtG), HDMI, 250 cd/m2, built-in speakers - $189
 
FreeSync monitors have us covered pretty well right now, unlike G-Sync. All we need is FreeSync on Nvidia cards and we're good to go.
 
Not bashing you by any means, but I don't understand this. One's monitor probably gets the most use and longevity of any hardware purchase. It is also the gateway/window into seeing all that powerful hardware put to use.

I don't understand dropping $700-$1,000+ on a computer but not wanting to spend more than $300 on a monitor. I purchased 3x UltraSharps and have been using them since 2010, originally for Eyefinity. I replaced the middle monitor with a 24" 144 Hz BenQ last year, and shortly I'll be looking to upgrade to either 3x 27" 1440p IPS 144 Hz or 2-3 4K 27"+ IPS.

Indeed. A good monitor is going to last for several years.
 
Freesync monitors have us covered pretty well right now, unlike gsync. All we need is freesync on nvidia cards and were good to go

That would be nice. While G-Sync might be better, I would be perfectly happy with FreeSync.

Since adaptive sync is going to be part of the DisplayPort spec, would that be usable with an Nvidia card over DisplayPort?
 
That would be nice. While G-Sync might be better, I would be perfectly happy with FreeSync.

Since adaptive sync IS part of the DisplayPort spec, would that be usable with an Nvidia card over DisplayPort?

Fixed that for you :)
NV will need to add DP 1.2a+ support to their GPUs, but as others have pointed out, they already support eDP 1.3 (see laptops), so I suspect it's doable. Of course, NV won't allow that until they've wrung every possible cent from prospective buyers' pockets first.
 
Unfortunately, it's optional.

Nvidia doesn't have to include it to still be in spec.
 
Fixed that for you :)
NV will need to add DP 1.2a+ support to their GPUs, but as others have pointed out, they already support eDP 1.3 (see laptops), so I suspect it's doable. Of course, NV won't allow that until they've wrung every possible cent from prospective buyers' pockets first.

Well, Nvidia's laptop version still makes use of an on-display framebuffer, so they can support a wider range of framerates.

I don't think Nvidia is going to give in until FreeSync can find a way to support that.
 
Not bashing you by any means, but I don't understand this. One's monitor probably gets the most use and longevity of any hardware purchase. It is also the gateway/window into seeing all that powerful hardware put to use.

Fair points, but I have also never had to spend $500+ on a monitor in the past to have some of the latest technologies and resolutions. I expect this is just too "new", though G-Sync has been around for a while now without a huge drop in price.

To go with your analogy: no, I'm not willing to spend almost as much on a monitor as I did on the whole machine itself. *shrug*
 
Well, Nvidia's laptop version still makes use of an on-display framebuffer, so they can support a wider range of framerates.

I don't think Nvidia is going to give in until FreeSync can find a way to support that.

What are you talking about? The eDP spec requires a timing controller and a framebuffer, so Intel and AMD mobile parts that support eDP 1.3 have the same framebuffer.

It's the same with the desktop cards. For Nvidia, the scaler, framebuffer, etc. are all in the G-Sync module on the monitor.

With FreeSync, the framebuffer is on the GPU instead of the monitor.

The advantage this has for Nvidia is that they have complete control over the scaler, as it's part of the G-Sync module.

The range is decided by the scaler used. AOC, for example, are releasing FreeSync monitors with a full 30-144 Hz range.
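For anyone curious how that refresh window actually gets advertised to the GPU: one of the places a monitor publishes its minimum and maximum vertical refresh is the EDID "display range limits" descriptor, which the driver can read to decide the usable VRR range. Here's a minimal sketch of pulling those two bytes out, assuming a Linux box that exposes the EDID via sysfs; the connector name card0-DP-1 below is just an example and varies per system.

```python
# Read the min/max vertical refresh rate from the EDID base block.
# The four 18-byte descriptors live at offsets 54, 72, 90 and 108;
# a "display range limits" descriptor starts with 00 00 00 and tag 0xFD.

def vertical_refresh_range(edid: bytes):
    """Return (min_hz, max_hz) from the range-limits descriptor, or None."""
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            # Byte 5 = minimum vertical rate, byte 6 = maximum vertical rate.
            # (Byte 4 can flag +255 Hz offsets on newer EDIDs; ignored here.)
            return d[5], d[6]
    return None

if __name__ == "__main__":
    # Path is an example; pick whichever connector your monitor is on.
    with open("/sys/class/drm/card0-DP-1/edid", "rb") as f:
        edid = f.read()
    print(vertical_refresh_range(edid))  # e.g. (30, 144) on a wide-range panel
```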
 
FreeSync monitors have us covered pretty well right now, unlike G-Sync. All we need is FreeSync on Nvidia cards and we're good to go.

I have yet to see a FreeSync display that wouldn't be a compromise on quality.
 
Well, I haven't seen a FreeSync monitor that operates from 9 Hz to 240 Hz, so there are plenty of improvements to be made. :)
 
Well, I haven't seen a FreeSync monitor that operates from 9 Hz to 240 Hz, so there are plenty of improvements to be made. :)

Personally, I don't care if a monitor can go down to 9 Hz or above 144 Hz. Gaming below 30 fps (maybe 20 fps) is horrible even with G-Sync, and above 120 fps I would bet all my money that nobody could tell the difference between 120 and 144 fps. I can't see the point of a 240 Hz monitor.
 
I have yet to see a FreeSync display that wouldn't be a compromise on quality.

What quality, image quality or build quality? If you're referring to the VRR window, well, surprise surprise, 30 fps gaming sucks dick whether G-Sync is on or not, at least in shooters, based on my own experience. And Eurogamer agrees with me:

When the game operates in a 45-60fps "sweet spot", G-Sync gameplay is exceptional, but the variation in performance overall just didn't work for us when we moved out of this window. Turning TressFX off gave us a locked 60fps experience from start to finish in the same area, producing the optimal way to play the game - but not exactly the kind of G-Sync stress test we were hoping for.

To be clear, the experience is clearly and noticeably better than running the game on standard v-sync, and vastly superior to putting up with screen-tear, but the notion that G-Sync approximates the kind of consistency we get from a locked 60fps frame-rate doesn't really hold true when the underlying frame-rate can vary so radically, so quickly. We move down to 2x MSAA to improve matters and end up turning off the multi-sampling altogether as we reach the end of the Shanghai shoot-out. Frame-rates stay closer to our target 60fps, and inevitable dips in performance appear far less noticeable - G-Sync irons out the inconsistencies nicely, providing just the kind of consistent presentation we want.

G-Sync definitely helps to improve Crysis 3 over the existing options, but again, ramping things up to the very high settings we crave simply causes the delta between lowest and highest frame-rates to expand, encompassing a range that sits outside of the window Nvidia's new tech works best within. The situation is perhaps best exemplified by the initial level, which sees the player moving between internal and external environments, the latter saturated in a taxing stormy weather effect that can see frame-rate halve - even on a graphics card as powerful as a GTX 780. The jump between frame-rates in this instance is just too jarring for the G-Sync effect to truly work its magic.

And this pretty much nails the problem:

When we first looked at G-Sync at Nvidia's Montreal launch event, we marvelled at the sheer consistency of the experience in the pendulum and Tomb Raider demos. A drop down to 45fps incurred a little ghosting (frames were displayed on-screen for longer than the 60Hz standard 16.67ms, after all) but the fluidity of the experience looked very, very similar to the same demos running at 60fps - a remarkable achievement. However, the reason they looked so good was because of the regularity in the frame-rate - and that's not something typically associated with PC gaming. By running games completely unlocked, actual consistency while you play remains highly variable. G-Sync can mitigate the effects of this - but only to a certain degree.

There's a frame-rate threshold where the G-Sync effect begins to falter. It'll change from person to person, and from game to game, but across our testing, we found the sweet spot to be between 50-60fps in fast action games. Continual fluctuations beneath that were noticeable and while the overall presentation is preferable to v-sync, it still looked and felt not quite right


If you're talking about build quality, well, I have plenty to complain about with my Acer XB270HU: the sub-par build quality, cheap plastic feel, and poor QC are absolutely intolerable and should NOT happen for a panel at its price point. Oh, and did I mention the buttons controlling the OSD just recently quit on me, and now I can't even control overdrive or adjust brightness/contrast/color without a software solution? What a fucking joke.
 
As a current G-Sync user, I actually hope FreeSync takes off. Wider adoption and standardization across the board means that when I'm in the market for a new monitor a few years from now, the tech will be cheaper as well as improved.

This is good news.
 
Implying that the perfect combination of features is currently present in a G-Sync monitor? Which one is that, in your opinion?

The XB270HU has all these features. Also, good luck with that ghosting on FreeSync monitors...
 
The XB270HU has all these features. Also, good luck with that ghosting on FreeSync monitors...

What features? I asked a question, I didn't make a statement.

As far as ghosting goes, BenQ has done a firmware update to fix the overdrive/FreeSync issue, and newer models from all manufacturers have the issue sorted out. Thanks for your concern, though.
 
That's like posting NVIDIA SLI DRIVERS DONT WORK IN WINDOWS 10 OMGZZZ, a day after the patch was released.
 
What features? I asked a question, I didn't make a statement.

As far as ghosting goes, BenQ has done a firmware update to fix the overdrive/FreeSync issue, and newer models from all manufacturers have the issue sorted out. Thanks for your concern, though.

It still has ghosting with that firmware... starting to think this is Reddit, where people don't bother to research.
 
It still has ghosting with that firmware... starting to think this is Reddit, where people don't bother to research.

No, this is [H], where loudmouth fanboys like to blow things out of proportion for their own agenda. If I wanted to get into an argument like that, I'd reply to Prime1.

http://www.pcper.com/reviews/Displa...-Display-V002-Firmware-Tested-Overdrive-Fixed



You also didn't answer my "statement". What list of features makes the XB270HU the perfect, no-compromise display over any FreeSync monitor? That was the actual statement my question was in reply to:

I have yet to see a FreeSync display that wouldn't be a compromise on quality.
 
Did you read what you posted? It still has ghosting...

I did. I didn't say there was no ghosting. What I said was:

"loud mouth fanboys like to blow things out of proportion for their own agenda"

I stick by that statement. You apparently have either reading or comprehension issues, because you're continually misreading what I write, and apparently what the reviewers write as well.

Allyn Malventano said:
Here we see a very good overdrive implementation. This is actually the best we have seen a FreeSync display overdrive so far, as the ASUS IPS FreeSync panel showed the odd artifacts shown earlier.


Allyn Malventano said:
So we can now say there are two FreeSync displays with confirmed functional overdrive

Ryan Shrout said:
It is great to see a second monitor with FreeSync technology properly implement support for overdrive to improve the overall visual experience compared to NVIDIA's current crop of G-Sync monitors.

You're choosing to try and blow the remaining ghosting out of proportion, to make it appear to be a larger issue than it is.

Allyn @ PCPer has been very vocal in his criticism of FreeSync's implementation versus G-Sync's. If he feels the firmware has lowered the ghosting to an acceptable level, then that, coupled with the pictures they took, means I'm comfortable agreeing with that conclusion.

Also, on the topic of reading comprehension...

XB270HU?
 