Not much testing has been done on Chill's performance impact or real gaming impact. Does it hurt the gaming experience? How does it work with VR? I'm not even sure why to bother with it in the first place; it seems more applicable to phone and mobile gaming when running on battery. Chill should also let you adjust the minimum frame rate value and the sensitivity to the amount of motion. As it stands, I just don't see myself ever using it.
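Just to make the idea concrete, here is a toy sketch of what a motion-aware frame cap with a user-set minimum and sensitivity could look like; the numbers and names are mine, not AMD's actual Chill implementation:

```python
# Toy illustration of a motion-aware frame cap (NOT AMD's actual Chill logic):
# when input/motion activity is low, drop toward a minimum FPS; when activity
# is high, raise the cap back toward the maximum.

MIN_FPS = 40          # the floor I'd want to be user-adjustable
MAX_FPS = 60          # ceiling = monitor refresh / FreeSync upper bound
SENSITIVITY = 2.0     # how aggressively motion pushes the cap back up (also user-adjustable)

def target_fps(motion_level):
    """motion_level: 0.0 (static scene) .. 1.0 (fast camera movement)."""
    scaled = min(1.0, motion_level * SENSITIVITY)
    return MIN_FPS + (MAX_FPS - MIN_FPS) * scaled

for motion in (0.0, 0.2, 0.5, 1.0):
    print(f"motion {motion:.1f} -> cap at {target_fps(motion):.0f} fps")
```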
Radeon 64 LC, Power Saver mode, 4K, Alien: Isolation at max settings with tweaks to increase shadow map size and something else I don't remember, frames limited to 60 fps (to stay in the FreeSync range). The benchmark never went below 59 fps while pulling around 235 W from the wall. I was very surprised by that result, and that was without Chill on, using 17.9.1 drivers. Rendering frames beyond your monitor's refresh rate or FreeSync range not only degrades the gaming experience with tearing, judder, etc., but also just wastes energy. The only time I can see rendering faster than the max sync rate of your monitor is if you use Fast Sync on Nvidia hardware and render at 2x+ your refresh rate to reduce input latency, and for me the improvement probably wouldn't be significant enough to bother.
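Rough sketch of why a frame cap saves power: any time left over in the frame budget is spent idle instead of rendering frames the display can never show. The 60 fps target matches my FreeSync cap; everything else here (function names, the fake 5 ms workload) is just for illustration:

```python
import time

TARGET_FPS = 60                 # cap chosen to stay inside the FreeSync range
FRAME_TIME = 1.0 / TARGET_FPS   # ~16.7 ms budget per frame

def run_capped(render_frame, num_frames=600):
    """Render frames, but never faster than TARGET_FPS.

    render_frame: a callable standing in for the game's per-frame work.
    Any time left over in the 16.7 ms budget is slept away instead of
    rendering extra frames the display can't show -- that idle time is
    where the power savings come from.
    """
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        leftover = FRAME_TIME - elapsed
        if leftover > 0:
            time.sleep(leftover)   # GPU/CPU sits idle instead of drawing unseen frames

# Example: a fake 5 ms "frame" would otherwise run at ~200 fps uncapped.
if __name__ == "__main__":
    t0 = time.perf_counter()
    run_capped(lambda: time.sleep(0.005), num_frames=120)
    print(f"120 frames in {time.perf_counter() - t0:.2f} s (~60 fps)")
```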
I am totally rethinking what makes for better gaming for me, and it is not just average FPS, which can be higher yet still come with stuttering, missed frames, etc. I'm finding the 1% low FPS (the value that 1% of frames fall below) to be a more accurate numerical indication of a GPU's gaming performance. The 0.1% data I've seen recorded on various sites looks very inaccurate due to the small sample size; I would say you need 15+ minutes of gameplay to get a good 0.1% data point.
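Here is roughly how I'd pull 1% and 0.1% lows out of a frametime log. I'm treating the "X% low" as the frame rate that X% of frames fall below; some sites instead average the slowest X% of frames, so numbers won't match exactly. The synthetic data is only there to show how a few hitches barely move the average but tank the 0.1% low:

```python
import random

def percent_low_fps(frame_times_ms, percent):
    """Return the FPS value that `percent` of frames fall below (a low percentile).

    Assumption: "X% low" = low percentile of per-frame FPS; some reviewers
    instead report the average FPS of the slowest X% of frames.
    """
    fps_values = sorted(1000.0 / t for t in frame_times_ms)   # per-frame FPS, ascending
    cutoff = max(1, int(len(fps_values) * percent / 100.0))   # frame count in the slowest X%
    return fps_values[cutoff - 1]

# Synthetic example: ~60 fps with a handful of stutter frames mixed in.
random.seed(0)
frame_times = [random.gauss(16.7, 1.0) for _ in range(10_000)]
frame_times += [40.0] * 20                                    # a few 25 fps hitches
print("avg FPS :", round(len(frame_times) / (sum(frame_times) / 1000.0), 1))
print("1% low  :", round(percent_low_fps(frame_times, 1.0), 1))
print("0.1% low:", round(percent_low_fps(frame_times, 0.1), 1))
```

At 60 fps you only collect about 3,600 frames per minute, so the 0.1% low is decided by roughly 3-4 frames per minute of data, which is why a short benchmark run gives such noisy 0.1% numbers.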