Look at the graph they posted, it looks enormously smoother even if the raw gain isn't massive. That's pretty nice.
Someone gif'd it
http://abload.de/img/test0gstb.gif
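For anyone wanting to put a number on "smoother": what that kind of graph shows is frame-time consistency, not just average FPS. A minimal sketch of the idea, using made-up frame times (not the actual benchmark data):

```python
from statistics import mean, pstdev

# Hypothetical frame times in milliseconds -- NOT the real benchmark numbers,
# just two runs with the same average but very different consistency.
dx_frames     = [14, 26, 12, 28, 13, 27, 12, 28]  # spiky frame delivery
mantle_frames = [19, 21, 20, 20, 19, 21, 20, 20]  # even frame delivery

for name, frames in (("spiky run", dx_frames), ("even run", mantle_frames)):
    fps = 1000 / mean(frames)   # average framerate implied by the frame times
    jitter = pstdev(frames)     # lower stddev = smoother frame pacing
    print(f"{name}: {fps:.1f} avg FPS, {jitter:.2f} ms frame-time stddev")
```

Both runs average exactly 50 FPS, but the second would feel far smoother because every frame lands roughly 20 ms apart.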
Part of me just thinks that's because AMD helped code the Mantle part of the game and probably helped out the inept EA developers. Nothing about BF4's release history says the developers know how to optimize and write well-done code.
I mean what's up with this
http://i.imgur.com/0FpJP0B.jpg
None of the other setups have the @ xx.xHz differ, but the last one, the CrossFire setup, shows 59.9Hz and 120.0Hz instead of 60.0Hz.
There's no scale for the horizontal axis of that graph; we don't know if it's relative or absolute.
Edit: Actually, if it were an absolute representation of framerate, DirectX would have a better average framerate than Mantle there...
Where do you see that? The graph isn't labeled... It's showing frame delivery, not FPS.
Sorry, meant vertical, but neither axis is actually labeled. The numbers below the graph also don't say whether they represent FPS or frame-time. Edit: oops, horizontal? How fast it plots is framerate-dependent; just look at the numbers themselves.
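For what it's worth, whichever unit those numbers turn out to be, FPS and frame-time are just reciprocals of each other, so converting between readings is trivial. A quick sketch (the function names are mine, purely illustrative):

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given framerate."""
    return 1000.0 / fps

def fps_from_frame_time(ms: float) -> float:
    """Framerate implied by a given frame time in milliseconds."""
    return 1000.0 / ms

# 60 FPS is ~16.67 ms/frame; 120 FPS is ~8.33 ms/frame.
for fps in (30, 60, 80, 120):
    print(f"{fps:>3} FPS = {frame_time_ms(fps):.2f} ms/frame")
```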
It's reporting the DXGI-enumerated refresh rate. A 59.9x Hz refresh rate isn't alarming at all.
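Some background on why a 59.9 Hz reading shows up: DXGI exposes refresh rates as a rational numerator/denominator pair (`DXGI_RATIONAL`), and many nominal 60 Hz display modes are actually the NTSC-derived 60000/1001 ≈ 59.94 Hz. A sketch of the arithmetic, using common mode values rather than anything read from that screenshot:

```python
from fractions import Fraction

# Refresh rates the way DXGI-style APIs expose them: numerator/denominator.
modes = [
    (60000, 1001),   # NTSC-derived "60 Hz" -> ~59.94 Hz
    (60000, 1000),   # exact 60 Hz
    (120000, 1000),  # exact 120 Hz
]

for num, den in modes:
    hz = Fraction(num, den)
    print(f"{num}/{den} = {float(hz):.3f} Hz")
```

So a "59.9 Hz" readout is just the 60000/1001 mode rounded down, not a misconfigured display.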
We need them to implement Mantle in DayZ, since that game runs like garbage on all CPUs.
And Arma for that matter, and anything Total War.
Without testing or experiencing it for myself, I can tell you this test doesn't really mean anything to most gamers. You won't be able to tell the difference between 80 FPS and 120 FPS when you're actually playing the game, so a 50% performance increase doesn't really buy you anything here.
I don't even... There's a reason why 120hz is a thing.
Then again, 99% of the people here with their $3k+ rigs don't game competitively/professionally so I shouldn't be surprised by that response.
I just want to know what AMD is still working on/tweaking, how much caffeine has been injected into their workers, and what timezone AMD will choose to release Mantle in. Maybe it will still be the 31st in American Samoa... please don't go that far, AMD.
I hear ya, Kyle. There was a nasty little issue we thought would be better to have fixed prior to giving you guys beta access.
That happens to me sometimes when I start games without logging into Steam first.
It was an installation bug
There should be a press update this afternoon
http://www.guru3d.com/news_story/update_on_amd_catalyst_14_1_mantle_driver_release.html
Well, I know I'm done; it's clear to me that AMD has no interest in supporting even last-generation cards, let alone properly supporting current generations.
Okay, then. I look forward to your sale thread.
In the meantime, I'll keep looking forward to the drivers...
And yet I'm the troll who called out this AMD marketing rep about his release date. AMD should stop making promises of properly functioning drivers that will be released when they say they will be released.
Yet again, this is the reason why I moved to NVIDIA after having every generation of AMD card, from 4870 > 5850 > 6970 > 7970. AMD makes a good hardware product; I just wish they could get their software side up to par, even if it costs a few dollars more per card to hire some competent help.
If they stopped promising the gift of great drivers in a timely manner, no one would expect anything. But then again, more and more people would buy NV, so I guess the marketing rep is doing his job :/
I do, and as long as you're at or above that 80 FPS threshold the differences are moot. Even 60 FPS for that matter, but I do feel 80 FPS is the breaking point for any real gain in an FPS game. Once you practice and get attuned to 80 FPS, you can be just as good as someone at 120 FPS, provided you have the skill. Mastery of what you have comes into play more than a raw gain. Playing below 60 FPS is too erratic IMO, but above that I haven't felt a difference. Then again, my game of choice is Counter-Strike, which isn't a very demanding game to begin with.
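To put rough numbers on the 80-vs-120 debate: each framerate jump buys a fixed number of milliseconds per frame, and whether those milliseconds are perceptible is exactly what's being argued. A quick sketch of the arithmetic:

```python
# Per-frame time (ms) at common framerates -- plain arithmetic, no claims
# about what any individual player can actually perceive.
rates = [60, 80, 120, 144]
times = {fps: 1000 / fps for fps in rates}

for lo, hi in zip(rates, rates[1:]):
    saved = times[lo] - times[hi]
    print(f"{lo:>3} -> {hi:>3} FPS: {times[lo]:.2f} ms -> {times[hi]:.2f} ms "
          f"(saves {saved:.2f} ms per frame)")
```

Going from 80 to 120 FPS shaves about 4.17 ms off each frame, the same absolute saving as going from 60 to 80.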
If you say it's going to rain every day, you will be right eventually.
Yeah, 'cause NOBODY ever wants or uses 120+ Hz monitors.