Now you do need 8 GB VRAM, folks

Ok, I wasn't aware of the difference. I think the choice of name is confusing at best and purposefully deceptive at worst.

In that case, why not just use triple buffering? And don't say it adds input lag; done properly it doesn't, and you don't have to worry about screen tearing.

Triple buffering still doesn't allow free-flowing framerates; it just gives you more framerates to cut down to when you cannot maintain the maximum. The framerates it cuts to are also not necessarily in sync with the screen refresh rate, meaning each frame is visible for a different length of time, which causes slight judder in the motion. This was a common problem with movies: the video is 24fps and the screens were 60Hz, and those do not divide evenly. This is what G-Sync/FreeSync try to fix: the monitor refresh rate changes together with the game's framerate. Even low framerates can look relatively smooth, since there is no visible judder between frames.
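The 24fps-on-60Hz arithmetic above can be sketched quickly. This is an illustrative example, not anything from the thread: on a fixed-refresh display each source frame must be held for a whole number of refresh cycles, so the ideal 2.5 cycles per frame becomes an alternating 2/3 pattern, while a variable-refresh display can hold every frame for exactly the same time.

```python
# Why 24 fps on a fixed 60 Hz display judders: 60/24 = 2.5 refresh
# cycles per frame, which is not an integer, so frames alternate
# between being shown for 2 cycles (~33 ms) and 3 cycles (~50 ms).

REFRESH_HZ = 60
SOURCE_FPS = 24
cycle_ms = 1000 / REFRESH_HZ                 # 16.67 ms per refresh
cycles_per_frame = REFRESH_HZ / SOURCE_FPS   # 2.5 -- not an integer

# Fixed-refresh display: each frame is held for whole cycles,
# carrying the fractional remainder over to the next frame.
display_times = []
error = 0.0
for _ in range(6):
    error += cycles_per_frame
    held = int(error)        # whole refresh cycles this frame occupies
    error -= held
    display_times.append(held * cycle_ms)

print([round(t, 1) for t in display_times])  # alternating ~33/~50 ms: judder

# Variable refresh (G-Sync/FreeSync): the panel refreshes when the
# frame is ready, so every frame is held for exactly 1000/24 ms.
vrr_times = [1000 / SOURCE_FPS] * 6
print([round(t, 1) for t in vrr_times])      # uniform ~41.7 ms: smooth
```

The same mismatch shows up whenever the game's framerate doesn't divide the refresh rate evenly, which is exactly the gap variable refresh closes.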
 

So the console game makers would rather have screen tearing instead of just setting the detail/resolution levels to appropriate settings in order to maintain a constant 60fps?

Dynamically adjusting the screen resolution and turning v-sync on and off is just an ugly hack.

Maybe it makes sense in the console world, and maybe it is acceptable to console gamers, but it is sad all the same.

At the same time, I don't see why people are at all surprised that the PC version uses more system resources than the console versions. If you want it to use about the same resources, just set it to the low-quality settings the consoles are using.
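The dynamic-resolution approach the posts above describe can be sketched as a simple feedback loop. This is a hypothetical illustration with invented names and thresholds, not how any particular engine does it: the internal render scale is nudged down when a frame blows the 60fps budget and back up when there is headroom.

```python
# Hypothetical sketch of dynamic resolution scaling: chase a 60 fps
# target by shrinking the internal render resolution when frames run
# long and growing it back when there is headroom. The step size and
# headroom threshold are made up for illustration.

TARGET_MS = 1000 / 60  # 16.67 ms frame budget at 60 fps

def adjust_scale(scale, frame_ms, step=0.05, lo=0.5, hi=1.0):
    """Return the next resolution scale given the last frame's time."""
    if frame_ms > TARGET_MS:            # over budget: render fewer pixels
        scale -= step
    elif frame_ms < TARGET_MS * 0.85:   # comfortable headroom: add pixels
        scale += step
    return max(lo, min(hi, scale))      # clamp to sane bounds

# Example: a GPU load spike pushes frame times to ~20 ms, then recovers.
scale = 1.0
history = []
for ms in [20.0, 19.0, 16.0, 13.0, 13.0]:
    scale = adjust_scale(scale, ms)
    history.append(round(scale, 2))
print(history)  # scale drops during the spike, then climbs back to 1.0
```

The appeal on consoles is that this keeps the frame time pinned to the budget without tearing; the complaint in the thread is about what it does to image quality while the scale is down.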
 
Consoles use TVs, which don't have an adaptive-sync option...

And it's fairly new for us monitor users as well.

Existing solutions (v-sync, triple buffering, etc.) are hardware-independent.
 
Lower settings in multiplayer? I thought it was about winning and gun collections/looks or something.
 
This is why I own 2 Titan X's. I knew for a fact that 4 and 6GB was never going to be enough.
 

I need to try multi-monitor with my R9 390 and see how high I can keep the settings and still have a steady 60fps with CODBOPS3.
 
ROFL at Black Ops needing 8GB. Same engine they have used for 10 years, just slightly modded. Pass.
 
60FPS sucks. It's not a good thing.

I used to play with v-sync disabled. I haven't even tried this game with v-sync disabled though. All my monitors are 60Hz and I really never replace them unless they die.
 