Radeon Technologies Group Update 2016: FreeSync and HDR (PCPer article)

cageymaru

Radeon Technologies Group Update 2016: FreeSync and HDR
http://www.pcper.com/reviews/Graphics-Cards/Radeon-Technologies-Group-Update-2016-FreeSync-and-HDR

AMD held a private symposium for select press members, including PCPer, to preview the new technology coming from RTG in 2016/2017. It looks like the R9 300 series of cards will get new life in 2016 when paired with HDR displays. DisplayPort 1.3 has 80% more bandwidth and is still not enough to push a 5K HDR display. FreeSync over HDMI is coming, and they were bragging about the new behavior of FreeSync in the Crimson drivers when you drop out of the VRR range (LFC).
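For a sense of scale on the bandwidth point, here is a rough back-of-the-envelope sketch (my own numbers and assumptions, not figures from the article): DisplayPort 1.3 runs four lanes at 8.1 Gbps with 8b/10b encoding, leaving roughly 25.9 Gbps for video, while uncompressed 5K at 60 Hz and 10 bits per channel already needs more than that before blanking overhead is even counted.

```python
# Back-of-the-envelope check: can DP 1.3 push a 5K HDR (10 bits/channel) display at 60 Hz?
# All figures here are illustrative assumptions, not numbers taken from the article.

DP13_EFFECTIVE_GBPS = 32.4 * 0.8  # 4 lanes x 8.1 Gbps, minus 8b/10b encoding overhead

def video_bandwidth_gbps(width, height, refresh_hz, bits_per_channel):
    """Uncompressed RGB pixel bandwidth in Gbps, ignoring blanking intervals."""
    bits_per_pixel = 3 * bits_per_channel
    return width * height * refresh_hz * bits_per_pixel / 1e9

needed = video_bandwidth_gbps(5120, 2880, 60, 10)
print(f"5K/60 at 10 bpc needs ~{needed:.1f} Gbps; DP 1.3 carries ~{DP13_EFFECTIVE_GBPS:.1f} Gbps")
# ~26.5 Gbps of active pixel data alone already exceeds DP 1.3's ~25.9 Gbps payload.
```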

1080p HDR displays look better than standard 4K displays. Repeat this, guys: 1080p can look better than 4K. And there's 80% more bandwidth to drive them. This is crazy cool!

Oh, and now you can go to Best Buy and purchase a Lenovo Y700 FreeSync notebook, which starts at $899.
 
HDR 1080p monitors that blow standard 4K monitors out of the water have my interest.
 
4K 120 Hz, and FreeSync just getting better and better. My monitor can use HDR at 4K; will it look better than 5K?

Consoles with FreeSync? Many monitors people already own will be updated with FreeSync capability (the true win). So much fucking win; I never thought FreeSync would become so amazing.
 
HDR 1080p monitors that blow standard 4K monitors out of the water have my interest.

Me too, but I worry about how sharp 1080p will be once you go larger for normal tasks.

For games and movies it's probably fine... sort of. But after jumping from 1080p to 1440p, I appreciate the fact that I can't so easily make out pixels at that size and viewing distance on a 27".


I really hoped DisplayPort would be able to do 4K/120 Hz/HDR, but it seems we have to pick two and stick with that.


Also, I wonder what the actual cutoff is. It seems those 3440x1440 panels can still get HDR at 144 Hz. Could they push a 4K display to, say, 75 Hz with HDR?
 
DP 1.3 at 4K 120 Hz; seems Santa did plan for next year's Christmas.
 
Are there any relevant articles that describe HDR as used for home entertainment? What it requires, what the difference is, etc.

Edit: Never mind: I RTFM'd
 
Nothing but hate in a thread about all good things. Still waiting for this list of broken promises?

There are actually many promises Nvidia has yet to deliver on. Want to use G-Sync, SLI, and DSR together? Tough shit, it doesn't work, but on AMD it does (VSR, CrossFire, FreeSync). Same thing with SLI and MFAA.
 
I remember thinking the Half-Life: Lost Coast HDR demo looked pretty damn good. I didn't realize that true HDR is going to require updated monitors. Does anybody know exactly what changes would be required to increase monitor luminance from 250 nits to 2,000? And the possible associated cost?
 
"true HDR" isn't about the monitor per say.

Flat screen monitors still can't compete with CRT's when it comes to how black or bright their pixels get. Although they have gotten quite a bit better. This is why you have the refresh times and luminescence ratings for flat screen monitors.

Flat Screen TV's still don't have the tech that monitors right now have and this is why when you watch movies on most flat screen monitors the lighting tends to look flat and it kinda of looks like you are watching a soap opera instead of a movie.

The HDR flat screen tech should help solve these problem.

Expect the price to 2-3k for a top end monitor and TV. Prices won't change too much at the top end just that older tech gets cheaper.
 
From the PCPER article:

"Chances are most of you remember a lot of commotion many years ago from game developers claiming to have included HDR support in their engines. Though that’s true, the images had to be tone mapped back down to the SDR color space to be displayed on current monitors, washing away much of the advantage the rendering feature provided.

We already have HDR support in both GPU's and software. But it appears our current monitors aren't currently capable of accurately rendering the HDR images being generated. There are so many questions that weren't answered by that article.

  1. What components need to be upgraded to accurately display HDR images on a monitor?
  2. What is the cost associated with this upgrade?
  3. Will certain criteria need to be met before a monitor can be labeled as HDR capable?
  4. Will the component changes in HDR monitors affect other aspects (decreased refresh rates, pixel response time, etc)?
  5. What will be a ballpark price difference between an HDR monitor and its regular SDR equivalent?

I'll take increased visual quality over increased number of pixels any day of the week.
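To make the tone-mapping point in that quote concrete, here is a minimal sketch (my own illustration, not code from the article or any particular engine) of the classic Reinhard operator, which squeezes unbounded HDR scene luminance into the 0-1 range an SDR display can actually show:

```python
def reinhard_tonemap(hdr_luminance: float) -> float:
    """Simple Reinhard operator: map HDR luminance (0..inf) into SDR's 0..1 range."""
    return hdr_luminance / (1.0 + hdr_luminance)

# Bright highlights get compressed hard toward 1.0: a 10x and a 50x highlight end up
# nearly indistinguishable in SDR, which is exactly the detail an HDR display would keep.
for value in (0.05, 1.0, 10.0, 50.0):
    print(f"scene luminance {value:6.2f} -> SDR {reinhard_tonemap(value):.3f}")
```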
 
From the PCPER article:

"Chances are most of you remember a lot of commotion many years ago from game developers claiming to have included HDR support in their engines. Though that’s true, the images had to be tone mapped back down to the SDR color space to be displayed on current monitors, washing away much of the advantage the rendering feature provided.

We already have HDR support in both GPU's and software. But it appears our current monitors aren't currently capable of accurately rendering the HDR images being generated. There are so many questions that weren't answered by that article.

  1. What components need to be upgraded to accurately display HDR images on a monitor?
  2. What is the cost associated with this upgrade?
  3. Will certain criteria need to be met before a monitor can be labeled as HDR capable?
  4. Will the component changes in HDR monitors affect other aspects (decreased refresh rates, pixel response time, etc)?
  5. What will be a ballpark price difference between an HDR monitor and its regular SDR equivalent?

I'll take increased visual quality over increased number of pixels any day of the week.


HDR is high dynamic range lighting. It has nothing to do with resolution; it has everything to do with how dark or how bright each pixel of the monitor can get.

With current LCD technology, the contrast ratio is limited by the backlight: how bright it can get, how quickly it can go back to black, and how dark its black can get. CRTs didn't have this limitation, since each phosphor emits its own light instead of filtering a backlight.

I don't think you need to upgrade any components other than the graphics card (because of the increased bandwidth needs) and, of course, the monitor.

The criteria are as above: contrast ratio and color range.

Think of HDR monitors as the new high end, with the next level down being last generation. Prices will not change, so top-end monitors will still be around $2K, and current monitors drop a level to $400-800 depending on features...
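As a quick illustration of why the backlight is the bottleneck (the panel numbers below are hypothetical, purely for illustration): static contrast ratio is just peak white luminance divided by black level, so a bright LCD gains little if its blacks can't drop.

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Static contrast ratio = peak white luminance / black level (both in cd/m^2)."""
    return peak_nits / black_nits

# Hypothetical panels: a typical desktop LCD, an HDR LCD with full-array local dimming,
# and an OLED whose pixels switch fully off (black level approaching zero).
print(f"{contrast_ratio(350, 0.35):>12,.0f}:1  typical IPS LCD")
print(f"{contrast_ratio(1000, 0.05):>12,.0f}:1  FALD LCD in a dimmed zone")
print(f"{contrast_ratio(700, 0.0005):>12,.0f}:1  OLED (effectively 'infinite')")
```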
 
HDR seems to mostly apply to maximum brightness of any given pixel. For something to be HDR "compatible" it would need a dramatic increase in the ability to brighten parts of the screen beyond current "normal" levels of brightness. And it would need the ability to read the extra data coming over the cable to define that extra level of brightness.

I don't believe this is ever going to make a huge dent in the consumer space. Core LCD technology isn't very good at allowing one part of the screen to be super bright while holding deep black levels in other parts of the screen (mostly because local dimming solutions aren't cheap). Also, this won't even be possible with projectors without a dramatic increase in power usage. And does anyone really want a screen that can get so bright it is difficult to look at areas of it? Are people going to go "Wow, that sky shot was SO BRIGHT, I don't even want a 4K TV now!" :rolleyes:
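For a sense of what that "extra data coming over the cable" actually is, here is a rough sketch (field names are my own paraphrase, and the example values are hypothetical) of the static metadata HDR10 sends alongside the picture, describing the mastering display and the content's peak brightness so the TV can tone-map sensibly:

```python
from dataclasses import dataclass

@dataclass
class HDRStaticMetadata:
    """Roughly what HDR10 static metadata (SMPTE ST 2086 + CTA-861.3) describes."""
    red_primary: tuple[float, float]    # CIE xy chromaticity of the mastering display's red
    green_primary: tuple[float, float]
    blue_primary: tuple[float, float]
    white_point: tuple[float, float]
    max_mastering_nits: float           # peak luminance the content was mastered at
    min_mastering_nits: float           # black level of the mastering display
    max_cll: float                      # Maximum Content Light Level (brightest pixel, nits)
    max_fall: float                     # Maximum Frame-Average Light Level (nits)

# Hypothetical example: content mastered on a 1,000-nit display with BT.2020 primaries.
example = HDRStaticMetadata(
    red_primary=(0.708, 0.292), green_primary=(0.170, 0.797), blue_primary=(0.131, 0.046),
    white_point=(0.3127, 0.3290),
    max_mastering_nits=1000.0, min_mastering_nits=0.005,
    max_cll=1000.0, max_fall=400.0,
)
```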
 
From FlatpanelsHD.com:

TVs obviously need to be able to output a higher brightness level but also a very low black level. Plasma TVs were not able to do that but LCDs and OLEDs are. On a LCD you will need a “local dimming” system to be able to control brightness locally in zones. The more zones the better. We have already seen edge LED based LCD TVs claim HDR support but in our opinion this is stretching it. Ideally you would want to be able to control light output from 0 nit to a maximum brightness level of 800-1000 nits (or much higher for Dolby Vision, typically 4,000-10,000) in every single pixel. Does that sound familiar? Yes that is how OLED displays work.

You will not be able to experience HDR on your current TV. You need a TV that is able to output a higher brightness level. The TV also needs to support the new PQ format, as discussed above.
HDR is layered on top of the signal as an extra package and signaled to the TV with metadata (requires HDMI 2.0a). If the TV does not support HDR it will simply ignore the extra package.

So it sounds as if OLED panels are a natural candidate for HDR capability rather than trying to adapt LCD.
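For reference, the "PQ format" mentioned in that quote is the SMPTE ST 2084 perceptual quantizer. A minimal sketch of its EOTF (nonlinear signal value in, absolute luminance in nits out), using the constants published in the standard:

```python
# SMPTE ST 2084 (PQ) EOTF: map a nonlinear code value in [0, 1] to absolute luminance in nits.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """ST 2084 EOTF: code value (0..1) -> display luminance in cd/m^2 (0..10,000)."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# Half of the code range sits below ~100 nits: PQ spends its precision where the eye
# is most sensitive, which is why 10-bit signals are considered enough for HDR10.
for code in (0.25, 0.50, 0.75, 1.00):
    print(f"signal {code:.2f} -> {pq_eotf(code):8.1f} nits")
```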
 
Until OLEDs are free from potential image retention issues, I suspect manufacturers are going to be super excited to allow parts of the screen to get super bright.
 
HDR seems to mostly apply to maximum brightness of any given pixel. For something to be HDR "compatible" it would need a dramatic increase in the ability to brighten parts of the screen beyond current "normal" levels of brightness. And it would need the ability to read the extra data coming over the cable to define that extra level of brightness.

I don't believe this is ever going to make a huge dent in the consumer space. Core LCD technology isn't very good at allowing one part of the screen to be super bright while holding deep black levels in other parts of the screen (mostly because local dimming solutions aren't cheap). Also, this won't even be possible with projectors without a dramatic increase in power usage. And does anyone really want a screen that can get so bright it is difficult to look at areas of it? Are people going to go "Wow, that sky shot was SO BRIGHT, I don't even want a 4K TV now!" :rolleyes:


Yeah, pretty much. Well, it's brightness and darkness; it had better have darkness too, lol.
 
From FlatpanelsHD.com:

So it sounds as if OLED panels are a natural candidate for HDR capability rather than trying to adapt LCD.


OLEDs are too expensive right now for larger panels. If mass production increases, possibly; I don't really have a clue about it, though.
 
Until OLEDs are free from potential image retention issues, I suspect manufacturers are going to be super excited to allow parts of the screen to get super bright.

Hmmm... From Wikipedia:

OLEDs also have a much faster response time than an LCD. Using response time compensation technologies, the fastest modern LCDs can reach as low as 1 ms response times for their fastest color transition and are capable of refresh frequencies as high as 144 Hz. OLED response times are up to 1,000 times faster than LCD according to LG, putting conservative estimates at under 10 μs (0.01 ms), which in theory could accommodate refresh frequencies approaching 100 kHz (100,000 Hz). Due to their extremely fast response time, OLED displays can also be easily designed to interpolate black frames, creating an effect similar to CRT flicker in order to avoid the sample-and-hold behavior used on both LCDs and some OLED displays that creates the perception of motion blur.
If the wiki is correct, image retention should not be an issue with OLED.
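Just to connect the numbers in that quote (simple arithmetic, and assuming pixel response time were the only limit, which it is not in practice): the theoretical refresh ceiling is the reciprocal of the response time.

```python
def max_refresh_hz(response_time_s: float) -> float:
    """Theoretical refresh ceiling if pixel response time were the only limiting factor."""
    return 1.0 / response_time_s

print(f"{max_refresh_hz(1e-3):>9,.0f} Hz ceiling for a 1 ms LCD transition")
print(f"{max_refresh_hz(10e-6):>9,.0f} Hz ceiling for a 10 us OLED transition (the ~100 kHz figure above)")
```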
 
I thought HDR also touched on color range as well? Maybe that will have more to do with native 10-bit monitors/TVs that support an increased color space.


I think the real benefits will come from HDR content designed for both an increased range of contrast AND color, and we do need different displays to achieve that. I remember HP had a DreamColor display that supported native 10-bit color, but we need 10-bit OLED screens.
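A quick bit of arithmetic on why native 10-bit matters once the contrast and color range grow (simple math, not from the thread's sources): going from 8-bit to 10-bit takes each channel from 256 to 1,024 levels, i.e. from roughly 16.7 million to over a billion total colors, which is what keeps wide-gamut, high-brightness gradients from banding.

```python
def color_counts(bits_per_channel: int) -> tuple[int, int]:
    """Levels per channel and total representable RGB colors for a given bit depth."""
    levels = 2 ** bits_per_channel
    return levels, levels ** 3

for bits in (8, 10, 12):
    levels, total = color_counts(bits)
    print(f"{bits:>2}-bit: {levels:5d} levels/channel, {total:,} colors")
# 8-bit:   256 levels/channel, 16,777,216 colors
# 10-bit: 1024 levels/channel, 1,073,741,824 colors
# 12-bit: 4096 levels/channel, 68,719,476,736 colors
```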
 
On OLED response times: they are supposed to be fast, but has anyone noticed a related issue of bad input lag on the current LG OLED TVs? Is that just LG being sloppy with the post-processing, or something related to the tech to worry about?
 
LOL, well, guess Terry is a bit of a sensitive guy; might have hurt his feelings :D

I think AMD just needs to sit Kyle in front of a 4K OLED HDR TV with a movie/game mastered for the format. That will melt the ice and the shade filling his heart.
 
If the wiki is correct, image retention should not be an issue with OLED.

I'll believe this when they are on the market for longer. It is going to take a lot more than a Wiki to sell me that the tech is fine for displaying static images over a long timespan.

I've had two AMOLED phones where people said, "Don't worry about burn-in, it won't happen." Both have notable screen burn now. I've had three or four plasma TVs since 2011, when everyone said "image retention is a thing of the past," and each TV had image retention (which led me to abandon the whole thing even though I love plasma picture quality). Now we are onto OLED TVs, and some users are experiencing image retention and burn-in while others go "I have zero retention, no one has to worry about that anymore!"
 
I think AMD just needs to sit Kyle in front of a 4K OLED HDR TV with a movie/game mastered for the format. That will melt the ice and the shade filling his heart.

And this has been the issue lately with AMD: a lot of talk and not a lot of demonstration. A paper launch with the Nano, then denying HardOCP a sample. A paper launch with Crimson, expecting write-ups about how great it will be without ever showing anything beyond a slide deck.
 
Burn-in is a very common issue with AMOLED phones, especially the early models. It is not a defect in the screen; it is more like a standard side effect. I would imagine the same will hold true for OLED panels, so change your wallpaper, icons, etc. every two weeks and you are good, lol.
 
"true HDR" isn't about the monitor per say.

Flat screen monitors still can't compete with CRT's when it comes to how black or bright their pixels get. Although they have gotten quite a bit better. This is why you have the refresh times and luminescence ratings for flat screen monitors.

This is the reason why "true HDR" is about the monitor. The only monitors/TVs that will support HDR are local dimming LED and OLED (and whichever else technology allows variable luminosity across the surface of the panel).
 
I've failed to see you calling out Nvidia on their delays...

Delays and paper launches are not the same thing. I am an AMD fan, and even I can see through their crap. I do like the Crimson control panel, but why bring people in just to show them a bunch of slides before it was released?

Oh well, good luck AMD, you will need it.
 
Delays and paper launches are not the same thing. I am an AMD fan, and even I can see through their crap. I do like the Crimson control panel, but why bring people in just to show them a bunch of slides before it was released?

AMD's PR department is trying very hard to release any information that keeps people talking about AMD. AMD is not in the best spot; however, being part of the discussion is preferable to people simply forgetting that AMD is releasing new products and features.

Heck I originally clicked on the "Radeon Technologies Group Update 2016" because I was hoping it had information about 2016 products >_>
 
AMD's PR department is trying very hard to release any information that keeps people talking about AMD. AMD is not in the best spot; however, being part of the discussion is preferable to people simply forgetting that AMD is releasing new products and features.

I say give up some information on the next generation of video cards and processors, then. That is what folks want to know, and it's one real reason I purchased a 980 Ti instead of waiting. (I also could not fit a Fury in my case.) I am gaming at 4K, so waiting was not something I wanted to do anymore.

The one problem for AMD, as I see it, is that they have been doing lots of paper launches over the last few years. I remember them hyping up being in the tablet space and even showing us working prototypes, but we cannot buy anything with AMD inside in the tablet space at all.
 
I've failed to see you calling out Nvidia on their delays... At least, with such a vitriolic post. You just sound butthurt because the other kids don't want to play with you.


I think he's saying that what AMD is presenting isn't really their technology making this happen, and that is the truth: it isn't their tech, and they are not spearheading this. It's all about the panel manufacturers, though there is a benefit for IHVs in the graphics card business from this new technology.

So AMD putting FreeSync and this into the same "press presentation" doesn't do anything for AMD's bottom line, at least not directly, because you still have another IHV that benefits from this too and has much more market share.

I'm 100% sure Kyle would have called that out; I know I would.
 
Delays and paper launches are not the same thing. I am an AMD fan, and even I can see through their crap. I do like the Crimson control panel, but why bring people in just to show them a bunch of slides before it was released?

Oh well, good luck AMD, you will need it.

Going over the features given, I don't see people's problem with this. It's not like there was a long gap between the Crimson talk and the release.

It was basically: here is what the new yearly mega-update brings, here are the new features, and then it was released.
 
I've failed to see you calling out Nvidia on their delays... At least, with such a vitriolic post. You just sound butthurt because the other kids don't want to play with you.

I would reply with a rebuttal, but I am very aware of folks like you. No matter what I say you are going to cling to your belief that you fully understand what is going on in my head and fully know my motivations better than I do myself. So water, a duck's back, etc. :D
 
Heck I originally clicked on the "Radeon Technologies Group Update 2016" because I was hoping it had information about 2016 products >_>

Yeah, me too. Now, the HDR bit is interesting. I would kill for a 3440x1440 34" curved HDR panel with a high refresh rate.

But instead, what I saw in half the thread was people attacking [H] because AMD has been shady as fuck lately. Not 3.5GB shady, but consistently shady nevertheless. To me, it sounded like [H] was more annoyed because AMD wants sites to regurgitate slides about something they can't verify, and they have gotten burned in the past. At least that's how I read it.
 
I've failed to see you calling out Nvidia on their delays... At least, with such a vitriolic post. You just sound butthurt because the other kids don't want to play with you.

Kyle was even part of a large media event celebrating the GPU from AMD/ATI.
The lack of substance is somewhat of a problem, because then you have to refer to a source and tell the folks on [H] it is all true because person X at AMD said so. It is rather weird to tell the people who take you seriously to rely on someone else's story when your website is about the [H] crew's own experience with the product.

AMD's hot-and-cold press relations for featured products are also not good for either party, especially when they try to tell you that your audience does not matter to them because they're targeting another.
 
Yes, yes, AMD's past failings are well known. Their driver improvements over the past few months by themselves meant nothing, as we've seen that before. HOWEVER, they have furthered this with the introduction of Crimson (it didn't come that long after the announcement), and their earnest interest in HDR and improving FreeSync is positive as well. While it makes sense for Kyle to be frustrated with them - [H]'s format requires something tangible to report on, not marketing slides/presentations - it hardly behooves the rest of us to constantly bicker amongst ourselves regarding AMD's failings and Nvidia's supposed virtues.

Essentially, it's more beneficial to treat this with cautious optimism rather than vitriolic anger and disbelief. The coming months may become very interesting for GPUs again if AMD keeps up this momentum.
 
Yes, yes, AMD's past failings are well known. Their driver improvements over the past few months by themselves meant nothing, as we've seen that before. HOWEVER, they have furthered this with the introduction of Crimson (it didn't come that long after the announcement), and their earnest interest in HDR and improving FreeSync is positive as well. While it makes sense for Kyle to be frustrated with them - [H]'s format requires something tangible to report on, not marketing slides/presentations - it hardly behooves the rest of us to constantly bicker amongst ourselves regarding AMD's failings and Nvidia's supposed virtues.

Essentially, it's more beneficial to treat this with cautious optimism rather than vitriolic anger and disbelief. The coming months may become very interesting for GPUs again if AMD keeps up this momentum.
I'm more disappointed with Nvidia doing/announcing nothing than I am with AMD for trumpeting their future accomplishments prematurely.
AMD might be blowing smoke, but that's better than Nvidia, who are apparently asleep at the wheel. The only thing Nvidia has announced in the last 6+ months is how they're going to force driver updates through GFE... Fantastic.
 
I'm more disappointed with Nvidia doing/announcing nothing than I am with AMD for trumpeting their future accomplishments prematurely.
AMD might be blowing smoke, but that's better than Nvidia, who are apparently asleep at the wheel. The only thing Nvidia has announced in the last 6+ months is how they're going to force driver updates through GFE... Fantastic.

They announced GameWorks VR also, but like with AMD, we haven't seen much from it. Well, at least I didn't see much other than a website.

I think AMD issuing press releases is great to let the PC crowd know what is in the pipeline. But it would be 1000x better if they had, say, a Fury X2 release party at the same event and sent reviewers the card with plenty of time to do a proper review. Then Kyle could tease the card on the front of [H]ardOCP instead of posting a press release with nothing to show.

That would let consumers know what AMD is doing now, with something they can physically touch and experience. By the same token, it would build up hype for what is coming in the future. I miss when they had All-in-Wonder cards with crazy ideas like CableCARD slots. Like ManofGod was saying about the tablets: we never heard a whisper about them after the announcement. I was hyped up to buy one of those. Never saw a damn thing on the market.

Why not a BRIX-style Steam box with Roku, Amazon Fire support, etc.? At least Nvidia tried with the Shield. It just seems like those little APUs from AMD could blow lots of the competition out of the water if someone invented a purpose for them. Where is the little AMD APU handheld that I keep getting emails about every so often? I would buy that, no questions asked, just to play my Steam library on it.

Maybe they have some stuff in the works, but money is tight, so they have to make hard choices about what they pursue. Whatever it is, I expect to see the best from them in the future with Vulkan, SteamOS, video cards, and Zen. The next consoles need to have FreeSync support over HDMI, and AMD needs to figure out how to get their logo on more consumer items.

That's my 2 cents on the topic.
 
Go look at Nvidia's Facebook page. They do neat stuff all the time. I can't wait for a game to use PhysX Flex... YouTube that. They have 300 engineers working just on GameWorks. They make some neat things, and it's only getting better.

AMD has their own set of features that are less useful for my goals. Except maybe VR.
 
Damn, and I just got a shiny new 4K monitor...

Still, better monitor technology can't hurt; I am all for better contrast. Hopefully this will help mitigate the glow/bleed issues so many monitors are having.
 