Project Scorpio Supports FreeSync and Next-Gen HDMI

Megalith

And the hits keep coming for Microsoft’s 4K Xbox. It turns out that EuroGamer “held back” (man, they are really going to stretch this exclusive out, aren’t they) one pleasant detail about Project Scorpio: you will probably experience a lot less tearing and stutter, as the console supports AMD’s FreeSync. The system will even flaunt the latest HDMI (2.1) spec. Thanks to cageymaru for this one.

Adaptive refresh technology like AMD's FreeSync completely eliminates tearing and reduces stutter significantly by allowing the GPU to trigger the display refresh instead of adhering to a hard and fast 60Hz cycle. Essentially, the screen produces the next image immediately after the GPU finishes rendering it. The technology was pioneered by Nvidia's G-Sync, but it's the open standard variants - FreeSync and the upcoming HDMI 2.1 implementation - that Scorpio aims to support. In fact, Microsoft has actually implemented the FreeSync 2 standard, meaning compatibility with HDR and full support across the range of potential frame-rates.
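To make the "GPU triggers the refresh" idea concrete, here's a rough sketch comparing when frames reach the screen on a fixed 60Hz v-synced display versus a variable-refresh one. The render times are made-up numbers and buffering details are ignored; purely illustrative.

```python
# Purely illustrative: when does each rendered frame reach the screen?
# Made-up render times; buffering/back-pressure details are ignored.

render_times_ms = [14.0, 19.5, 22.0, 15.5, 18.0]  # hypothetical per-frame GPU times

def fixed_60hz_present(render_times, tick_ms=1000 / 60):
    """Fixed refresh + v-sync: a finished frame waits for the next 60 Hz tick."""
    t, presents = 0.0, []
    for r in render_times:
        t += r                              # frame finishes rendering at time t
        next_tick = -(-t // tick_ms)        # ceiling division: index of the next tick
        presents.append(next_tick * tick_ms)
    return presents

def variable_refresh_present(render_times):
    """FreeSync/G-Sync style: the display refreshes as soon as the frame is done."""
    t, presents = 0.0, []
    for r in render_times:
        t += r
        presents.append(t)                  # shown immediately
    return presents

print("fixed 60 Hz:", [round(p, 1) for p in fixed_60hz_present(render_times_ms)])
print("VRR        :", [round(p, 1) for p in variable_refresh_present(render_times_ms)])
```

With the fixed cadence, finished frames sit around waiting for the next tick (or tear if v-sync is off); with VRR each one goes out as soon as it's ready.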
 
So will TVs start supporting FreeSync, or should you buy a 27" 4K PC monitor to play on?
 
A monitor is just fine; TVs are overrated. I believe there are a few that support it, though.

FreeSync really only matters to me in FPS games, which is where monitors with near-nonexistent input lag shine.
TVs, however...
 
This is one of those small things that, long term, could be a big win for AMD: saturate the market with AMD products. Cheap FreeSync-enabled TVs that also make nice PC gaming monitors, coders more in tune with optimizing for the platform, etc.

Glad to see Microsoft throwing its hat in the ring, though. The XBone was kind of lackluster.
 
FreeSync over HDMI: I believe only a handful of monitors support it.
 
I applaud the attempt, but it really seems anticlimactic. Maybe this is a Zune scenario where Microsoft is simply ahead of its time?
 
TVs usually have crappier panels than monitors, and even with monitors you rarely find one with a wide enough gap between minimum and maximum refresh rate for FreeSync to work correctly. How is it going to be on TVs?
 
The system isn't out yet. It won't be out until Q4 this year, so the question of how many TVs support this is really head-scratching: it uses HDMI 2.1 for features that are new and expected in future devices. It's as if you can't deliver good news anymore without someone trying to pee in your coffee.
 
So will TVs start supporting FreeSync, or should you buy a 27" 4K PC monitor to play on?

HDMI 2.1 has the ability to do the same thing as FreeSync or G-Sync. TVs with the tech? Probably 2020 at the EARLIEST.
 
Yes.

YES.

It's about time this PC tech made its way to the TV and console crowd. Now how about starting to use DP in consoles? How about multi-screen on consoles as well? Then we just need more than a thimbleful of FreeSync TVs.

All I know of right now is Wasabi Mango.
 
Yes.

YES.

It's about time this PC tech made its way to the TV and console crowd. Now how about starting to use DP in consoles? How about multi-screen on consoles as well? Then we just need more than a thimbleful of FreeSync TVs.

All I know of right now is Wasabi Mango.

Well, DisplayPort for consoles probably won't happen unless TVs start supporting it; right now HDMI is king on TVs, and with HDMI 2.1 it will be for some time to come. And the TVs won't be "FreeSync" or "G-Sync", but a similar OPEN tech that Nvidia and AMD will have to support.
 
TVs usually have crappier panels than monitors, and even with monitors you rarely find one with a wide enough gap between minimum and maximum refresh rate for FreeSync to work correctly. How is it going to be on TVs?


Other way around. Commercial OLED displays were pushed heavily on the TV side well before the PC monitor side. Local dimming for improved contrast on LCD displays? Only on higher-end TVs; on monitors it was nowhere to be found. There are plenty of higher color bit-depth monitors, and monitors do tend to have lower latency and input lag, but I think it will be easier to build those things onto superior high-end TV panels than the other way around.


This is GREAT news, because now that an actual console can output variable refresh rates and has enough bandwidth to support more than 8-bit color at 4K and higher refresh rates, we are more likely to see TVs start offering such things as features.

I'd imagine letting an AMD GPU take over some of the color-mapping work for specific HDR displays with FreeSync 2 certification might also yield lower latencies than you would otherwise get on TVs, where underpowered ARM chips do that work. All of this means we will start to see FreeSync 2-capable TVs sooner than we otherwise might, and in smaller sizes with HDR: kids often have their own TV in their room for a console, and a "premium" console TV that supports variable refresh rates for smoother gameplay lets panel makers make more money.

Finally, the long dark age of low-refresh displays, with nothing really suited to gaming at larger sizes, resolutions, and quality levels, is nearing an end.
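On the bandwidth point above, here's a quick back-of-envelope comparison. Blanking intervals and protocol overhead are ignored, so treat these as ballpark figures only, not exact link requirements.

```python
# Rough pixel-data-rate math for the "more than 8-bit color at 4K" point.
# Blanking intervals and protocol overhead are ignored; ballpark only.

def video_gbps(width, height, fps, bits_per_channel, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

hdmi_2_0_payload = 18 * 8 / 10    # 18 Gbps link, 8b/10b coding -> ~14.4 Gbps usable
hdmi_2_1_payload = 48 * 16 / 18   # 48 Gbps link, 16b/18b coding -> ~42.7 Gbps usable

for label, fps, bpc in [("4K60  8-bit", 60, 8),
                        ("4K60 10-bit", 60, 10),
                        ("4K120 10-bit", 120, 10)]:
    print(f"{label}: ~{video_gbps(3840, 2160, fps, bpc):.1f} Gbps "
          f"(HDMI 2.0 carries ~{hdmi_2_0_payload:.1f}, HDMI 2.1 ~{hdmi_2_1_payload:.1f})")
```

10-bit 4K60 already pushes past what HDMI 2.0 can carry without chroma subsampling, which is exactly why HDMI 2.1 matters for HDR gaming.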
 
That is awesome news... if FreeSync 4K TVs are available before launch. Still, glad that they are thinking ahead. It will make a big difference in gameplay when the frame rate drops.

"Xbox one and Xbox 360 games will also benefit from the feature" Those games should have no problem running 60 FPS at 1080p so it will not be noticeable.
 
My only issue with this is that I worry we'll start seeing super-expensive "gaming TVs" instead of just TVs that support gaming. And we all know what happens to prices when equipment is marketed as "for gaming"... :-/
 
Well, DisplayPort for consoles probably won't happen unless TVs start supporting it; right now HDMI is king on TVs, and with HDMI 2.1 it will be for some time to come. And the TVs won't be "FreeSync" or "G-Sync", but a similar OPEN tech that Nvidia and AMD will have to support.
Panasonic had a line of 4K televisions that came equipped with DisplayPort 1.2 inputs back in 2014 (AX800, et al.). Doesn't look like they produce televisions anymore, though.
 
So is Scorpio truly an APU? As in, the CPU and GPU are integrated into a single chip?

If this is the case, then isn't that a BIG DEAL?! Not so much for console performance (for a dedicated console, the 6TF is a bit underwhelming to me), but for porting this kind of GPU/CPU technology into much smaller, thinner laptops. 6TF of processing power in a super-thin laptop chassis, or even a 7" tablet? That is a BIG deal!

I remember AMD and Intel hyping this type of technology for years, but it's always been "in the future." Is it here now?

Why don't we have any laptops like this, if this tech is actually available? Onboard video has been pretty much crap performance previously, right? To get high-performance GPUs in a laptop, you typically have to buy a "gamer laptop": big, hulking, thick, with big fans, noise, and heat.

This seems like a big deal in the context of it being a single-chip APU. Or am I just out of the loop?




Scorpio engine.png
Scorpio memory interface.png
 
Does this mean that games are going to have vsync options or (worse) start defaulting to vsync off unless you have a FreeSync monitor? I'm not a fan of either.
 
It's still going to have a pretty big cooler with vapor-chamber cooling. I don't think we will see this chip in a laptop anytime soon.

3079334-9075459317-xbox-.png
project-scorpio.png
 
It's still going to have a pretty big cooler with vapor-chamber cooling. I don't think we will see this chip in a laptop anytime soon.

3079334-9075459317-xbox-.png
project-scorpio.png

Perhaps an aluminum chassis like the MacBooks, where the chip could use the entire chassis as a heatsink? Maybe? Just wondering aloud.
 
Does this mean that games are going to have vsync options or (worse) start defaulting to vsync off unless you have a FreeSync monitor? I'm not a fan of either.

I'm under the impression that console devs choose what to use: either vsync or letting screen tearing happen.
 
I'm under the impression that console devs choose what to use: either vsync or letting screen tearing happen.

I know that's how it is now, but it seems like most choose to keep vsync on. I'm hoping this isn't an excuse for everyone to leave it off, since leaving it off boosts performance and can always be "fixed" if you happen to have a very, very specific monitor.
 
So is Scorpio truly an APU? As in, the CPU and GPU are integrated into a single chip?

If this is the case, then isn't that a BIG DEAL?! Not so much for console performance (for a dedicated console, the 6TF is a bit underwhelming to me), but for porting this kind of GPU/CPU technology into much smaller, thinner laptops. 6TF of processing power in a super-thin laptop chassis, or even a 7" tablet? That is a BIG deal!

I remember AMD and Intel hyping this type of technology for years, but it's always been "in the future." Is it here now?

Why don't we have any laptops like this, if this tech is actually available? Onboard video has been pretty much crap performance previously, right? To get high-performance GPUs in a laptop, you typically have to buy a "gamer laptop": big, hulking, thick, with big fans, noise, and heat.

This seems like a big deal in the context of it being a single-chip APU. Or am I just out of the loop?

Perhaps an aluminum chassis like the MacBooks, where the chip could use the entire chassis as a heatsink? Maybe? Just wondering aloud.


Can it do this? Can it do that? Does it go here? Does it go there? Huh? What! Jesus Christ, you are freaking me out. There is no magic physics in play here. If you want the most performance (for the time), you need surface area to dissipate the heat.
 
Can it do this? Can it do that? Does it go here? Does it go there? Huh? What! Jesus Christ, you are freaking me out. There is no magic physics in play here. If you want the most performance (for the time), you need surface area to dissipate the heat.

I don't think you understand what I'm trying to confirm.

Maybe Brent_Justice can confirm what I'm asking?
-------------
Let me break it down here:
Intel's best consumer graphics is built onto the CPU die: the Iris Plus Graphics 650, found on certain Kaby Lake CPUs. Its processing power is rated at a measly 0.8 teraflops. That's utterly useless for current-generation games, which is why if you want a gaming laptop (or gaming PC) you don't use onboard video. Instead you buy a laptop or desktop with a discrete Nvidia or AMD card. Of course, that option makes the laptop much bigger, louder, and heavier, and increases battery consumption.

In comparison, if I'm reading this right, the Xbox Scorpio's CPU die includes a GPU (an APU) rated at 6 teraflops?

If I'm understanding that correctly, that's a big dang deal. RX 480 or even RX 580-class GPU capability on a small CPU (APU) die? Is that accurate? Is the Scorpio CPU/GPU a single APU chip?

I had assumed there'd be a more traditional separate video card in the Xbox Scorpio, like traditional console and PC hardware. An APU with the power of an RX 480 is a big deal for products (laptops, tablets) beyond the Xbox Scorpio.
 
As much as I like FreeSync being on more devices, I always wonder whether it would have ANY impact on games that are locked to 30 fps.

One takeaway I've noticed with TVs in general is that the people who say 30 fps is indistinguishable from 60 fps may actually be comparing them on a TV, which adds post-processing that makes low-fps motion look a lot smoother than it does on a corresponding monitor (I checked this by comparing the same MKV file, played on the same device, on my 4K monitor and my 4K TV).

A locked 30 fps on the console will still be a locked 30 fps with FreeSync (i.e., nothing changes). The only time a game WOULD see a difference with FreeSync is when the fps sits somewhere between 30 and 60, but from what I know of consoles it's one or the other, rarely in between. My main complaint is that in some cases there is no way to undo the 30 fps lock.

Unless Scorpio can change that, I honestly don't see how FreeSync is ever going to work for consoles unless they completely do away with fps locks, at both the hardware and software level.
 
As much as I like FreeSync being on more devices, I always wonder whether it would have ANY impact on games that are locked to 30 fps.

One takeaway I've noticed with TVs in general is that the people who say 30 fps is indistinguishable from 60 fps may actually be comparing them on a TV, which adds post-processing that makes low-fps motion look a lot smoother than it does on a corresponding monitor (I checked this by comparing the same MKV file, played on the same device, on my 4K monitor and my 4K TV).

A locked 30 fps on the console will still be a locked 30 fps with FreeSync (i.e., nothing changes). The only time a game WOULD see a difference with FreeSync is when the fps sits somewhere between 30 and 60, but from what I know of consoles it's one or the other, rarely in between. My main complaint is that in some cases there is no way to undo the 30 fps lock.

Unless Scorpio can change that, I honestly don't see how FreeSync is ever going to work for consoles unless they completely do away with fps locks, at both the hardware and software level.

According to the article, FreeSync 2 broadens the range over which it works. Quoting the article:

In fact, Microsoft has actually implemented the FreeSync 2 standard, meaning compatibility with HDR and full support across the range of potential frame-rates. Paired with a supported screen, this will even eliminate tearing on games running with adaptive v-sync with frame-rates under 30fps, something not supported on most FreeSync 1 screens (VRR range varied on a per-screen basis, with 40Hz to 60Hz commonplace).

It also allows devs to enable higher frame-rate caps for users with displays that support the standard.

In the here and now, what we can say is that Scorpio's adaptive sync support is baked in at the system level - the developer doesn't need to worry about it (though they could enable higher frame-rate caps for VRR users if the overhead is there).
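The article doesn't say how sub-30 fps is actually handled, but on PC FreeSync displays the usual trick is Low Framerate Compensation: repeat each frame enough times that the effective refresh lands back inside the panel's VRR window. Here's a small sketch of that idea with made-up VRR ranges; whether FreeSync 2 TVs will do exactly this is my assumption, not something the article confirms.

```python
# Sketch of low-framerate compensation (LFC): below the panel's minimum VRR
# rate, show each frame multiple times so the refresh lands inside the window.
# The VRR ranges here are made-up examples, not specs of any real TV.

def lfc_refresh(fps, vrr_min, vrr_max):
    """Return (repeat_count, effective_hz), or None if no whole repeat fits."""
    for repeat in range(1, 6):
        hz = fps * repeat
        if vrr_min <= hz <= vrr_max:
            return repeat, hz
    return None

for fps in (24, 28, 35, 55):
    wide = lfc_refresh(fps, 30, 90)     # hypothetical wide-range panel
    narrow = lfc_refresh(fps, 40, 60)   # the 40-60 Hz range the article mentions
    print(f"{fps} fps -> wide range: {wide}, narrow 40-60 range: {narrow}")
```

Note how some rates simply don't fit a narrow 40-60 Hz window; in practice AMD only enables LFC when the panel's maximum is roughly double its minimum, which is the gap FreeSync 2 / Scorpio are supposed to close.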
 
According to the article, FreeSync 2 broadens the range over which it works.

Thanks, but I wasn't referring to the FreeSync range; I meant the actual FPS.

A locked 30 fps with FreeSync will still look like a locked 30 fps.
 
Thanks, but I wasn't referring to the FreeSync range; I meant the actual FPS.

A locked 30 fps with FreeSync will still look like a locked 30 fps.

Tried figuring out if anyone has tested this. I found a video from Nvidia showing off G-Sync's benefits vs. vsync and no vsync at 30 fps.

 
Tried figuring out if anyone has tested this. I found a video from Nvidia showing off G-Sync's benefits vs. vsync and no vsync at 30 fps.

My takeaway with G-Sync and FreeSync is that they essentially make the monitor run at a variable refresh rate on the fly, so that a frame is only drawn once the whole frame has been rendered; but unlike V-Sync, the monitor displays that frame immediately rather than holding it for an arbitrary fixed timing set by the monitor's controller. So 55 fps content on a 60 Hz G-Sync/FreeSync display won't be a mix of 30 and 60 Hz as with V-Sync; the monitor itself effectively becomes a 55 Hz display.

This has obvious benefits when the content has a variable fps, which games usually do.

My main problem is that 30 fps is already a whole-number divisor of 60, so on a typical 60 Hz panel, FreeSync or not, a locked 30 fps should feel the same. I usually avoid 30 fps in games entirely if at all possible; the only game I tolerate it in is FFX, because I have no choice, and I still notice its low frame rate even with G-Sync. But 30 fps is right at the cutoff of my G-Sync range, so I have no idea whether 31 fps would be noticeably better than 30 in that situation.

This is my primary question. If consoles were not locked to 30 fps but allowed to vary between 30 and 60 (like a normal PC), I wouldn't even be asking; the benefit of supporting FreeSync would be immediately obvious. It's the locked 30 fps that makes me wonder whether supporting FreeSync has ANY benefit here, especially given that consoles are generally played on TVs, which already apply post-processing that smooths out perceived low frame rates.
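For what it's worth, the arithmetic behind both points (the 55 fps case and the locked 30 fps case) is simple. A tiny sketch, with the caveat that real frame pacing is messier than this:

```python
# Frame pacing on a fixed 60 Hz panel vs. VRR (simplified arithmetic only).

tick_ms = 1000 / 60                    # one refresh tick: ~16.7 ms

# 55 fps content: frames arrive every ~18.2 ms. With v-sync each frame waits
# for the next tick, so on-screen times alternate between 1 and 2 ticks
# (16.7 ms / 33.3 ms); that uneven cadence is the judder. With VRR the panel
# simply refreshes every 18.2 ms, i.e. it behaves like a 55 Hz display.
print(f"55 fps frame time: {1000 / 55:.1f} ms vs 60 Hz tick: {tick_ms:.1f} ms")

# Locked 30 fps: 33.3 ms per frame, exactly two ticks, so the cadence is
# already perfectly even on a 60 Hz panel. VRR can't make it smoother,
# which is the point made above.
print(f"30 fps frame time: {1000 / 30:.1f} ms = {(1000 / 30) / tick_ms:.0f} ticks")
```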
 
My takeaway with G-Sync and FreeSync is that they essentially make the monitor run at a variable refresh rate on the fly, so that a frame is only drawn once the whole frame has been rendered; but unlike V-Sync, the monitor displays that frame immediately rather than holding it for an arbitrary fixed timing set by the monitor's controller. So 55 fps content on a 60 Hz G-Sync/FreeSync display won't be a mix of 30 and 60 Hz as with V-Sync; the monitor itself effectively becomes a 55 Hz display.

This has obvious benefits when the content has a variable fps, which games usually do.

My main problem is that 30 fps is already a whole-number divisor of 60, so on a typical 60 Hz panel, FreeSync or not, a locked 30 fps should feel the same. I usually avoid 30 fps in games entirely if at all possible; the only game I tolerate it in is FFX, because I have no choice, and I still notice its low frame rate even with G-Sync. But 30 fps is right at the cutoff of my G-Sync range, so I have no idea whether 31 fps would be noticeably better than 30 in that situation.

This is my primary question. If consoles were not locked to 30 fps but allowed to vary between 30 and 60 (like a normal PC), I wouldn't even be asking; the benefit of supporting FreeSync would be immediately obvious. It's the locked 30 fps that makes me wonder whether supporting FreeSync has ANY benefit here, especially given that consoles are generally played on TVs, which already apply post-processing that smooths out perceived low frame rates.

Did you see this quote in the article?
In fact, Microsoft has actually implemented the FreeSync 2 standard, meaning compatibility with HDR and full support across the range of potential frame-rates. Paired with a supported screen, this will even eliminate tearing on games running with adaptive v-sync with frame-rates under 30fps, something not supported on most FreeSync 1 screens (VRR range varied on a per-screen basis, with 40Hz to 60Hz commonplace).

Of course, how things work in reality may differ from this discussion, so we'll have to wait for FreeSync 2 / HDMI 2.1 TVs to test before we see what benefits actually materialize.
 
I don't think you understand what I'm trying to confirm.

Maybe Brent_Justice can confirm what I'm asking?
-------------
Let me break it down here:
Intel's best consumer graphics is built onto the CPU die: the Iris Plus Graphics 650, found on certain Kaby Lake CPUs. Its processing power is rated at a measly 0.8 teraflops. That's utterly useless for current-generation games, which is why if you want a gaming laptop (or gaming PC) you don't use onboard video. Instead you buy a laptop or desktop with a discrete Nvidia or AMD card. Of course, that option makes the laptop much bigger, louder, and heavier, and increases battery consumption.

In comparison, if I'm reading this right, the Xbox Scorpio's CPU die includes a GPU (an APU) rated at 6 teraflops?

If I'm understanding that correctly, that's a big dang deal. RX 480 or even RX 580-class GPU capability on a small CPU (APU) die? Is that accurate? Is the Scorpio CPU/GPU a single APU chip?

I had assumed there'd be a more traditional separate video card in the Xbox Scorpio, like traditional console and PC hardware. An APU with the power of an RX 480 is a big deal for products (laptops, tablets) beyond the Xbox Scorpio.

Now I understand. Yes, it is a single chip, and it far surpasses any APU in existence: it combines Jaguar CPU cores with a modified Polaris GPU, which is quite a feat. There is good news for the PC down the road in regard to APUs, too. Raven Ridge will combine a quad-core Ryzen with a smaller Vega GPU. Not sure if it will be as powerful as Scorpio, but it will be a HUGE improvement over current PC APUs, and it could make for an amazing SFF gaming machine. Sadly, it will not release until AFTER Scorpio. BTW, I actually thought Broadwell (Iris Pro 6200) had the best on-die graphics from Intel.
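If anyone wants to sanity-check the 6 TF number, the usual GCN back-of-envelope math lines up with the figures Digital Foundry reported (40 CUs at 1172 MHz). Treat the clocks below as reported numbers, not official spec sheets:

```python
# Back-of-envelope FP32 throughput for GCN-style GPUs:
# shaders * 2 ops per clock (fused multiply-add) * clock.
# Reported figures: Scorpio 40 CUs @ 1172 MHz; RX 480 boost 36 CUs @ 1266 MHz.

def gcn_tflops(compute_units, clock_ghz, shaders_per_cu=64, ops_per_clock=2):
    return compute_units * shaders_per_cu * ops_per_clock * clock_ghz / 1000

print(f"Scorpio: {gcn_tflops(40, 1.172):.2f} TFLOPS")   # ~6.0
print(f"RX 480 : {gcn_tflops(36, 1.266):.2f} TFLOPS")   # ~5.8
```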
 
Now I understand. Yes, it is a single chip, and it far surpasses any APU in existence: it combines Jaguar CPU cores with a modified Polaris GPU, which is quite a feat. There is good news for the PC down the road in regard to APUs, too. Raven Ridge will combine a quad-core Ryzen with a smaller Vega GPU. Not sure if it will be as powerful as Scorpio, but it will be a HUGE improvement over current PC APUs, and it could make for an amazing SFF gaming machine. Sadly, it will not release until AFTER Scorpio. BTW, I actually thought Broadwell (Iris Pro 6200) had the best on-die graphics from Intel.

Looks like I'll be re-buying AMD stock. That's a HUGE future step; Intel has nothing of the sort. An APU like that would go MUCH farther in the huge mid-to-upper market than an Intel + Nvidia 960M-type config. I wasn't impressed with Scorpio until now, and it's definitely not for the reasons the console fans are impressed.
 
Tried figuring out if anyone has tested this. I found a video from Nvidia showing off G-Sync's benefits vs. vsync and no vsync at 30 fps.

A few things:

- I think his point is that it doesn't matter if it has FreeSync if the GAME ITSELF is locked to 30 fps.

- Your video example is at 60 fps, not 30 fps, and 30 still tends to look bad. If you want to see what G-Sync looks like at 30 fps, change the video's resolution to 480p; that turns off 60 fps playback and you can see it at 30.

- Most of the benefits of FreeSync/G-Sync disappear if you can keep your frame rate above 60 fps. The video you linked would be typical of a game struggling to hit 60 fps; if it were at 60 or above, I think the two would be almost indistinguishable.
 
Perhaps an aluminum chassis like the MacBooks, where the chip could use the entire chassis as a heatsink? Maybe? Just wondering aloud.
Looks like I'll be re-buying AMD stock. That's a HUGE future step; Intel has nothing of the sort. An APU like that would go MUCH farther in the huge mid-to-upper market than an Intel + Nvidia 960M-type config. I wasn't impressed with Scorpio until now, and it's definitely not for the reasons the console fans are impressed.


All this to say I was right? :)
 