Mantle is here

Look at the graph they posted; it looks enormously smoother even if the raw gain isn't massive. That's pretty nice.

Someone gif'd it
http://abload.de/img/test0gstb.gif
Part of me just thinks that's because AMD helped code the Mantle part of the game and probably helped out the inept EA developers. Nothing about BF4's release history says the developers know how to optimize and write well-done code.
I mean, what's up with this:
http://i.imgur.com/0FpJP0B.jpg
None of the other setups have the "@ xx.xHz" value differ, but on the last one, the CrossFire setup, it's 59.9Hz and 120.0Hz instead of 60.0Hz.
 
Look at the graph they posted; it looks enormously smoother even if the raw gain isn't massive. That's pretty nice.

Someone gif'd it
http://abload.de/img/test0gstb.gif

There's no scale for the vertical axis of that graph, we don't know if it's relative or absolute :confused:

Edit: Actually, if it was an absolute representation of framerate, DirectX would have a better average framerate than Mantle there...
 
There's no scale for the horizontal axis of that graph, we don't know if it's relative or absolute :confused:

Edit: Actually, if it was an absolute representation of framerate, DirectX would have a better average framerate than Mantle there...

The topmost point is 33.33 ms (or 30 fps) if I remember right, and the midpoint 16.67 ms (or 60 fps). The actual line can go off the top.

edit: oops, horizontal?? How fast it plots is framerate-dependent; just look at the numbers themselves.
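
For anyone who wants to sanity-check that mapping, here's a throwaway C++ sketch (my own illustration, not code from the actual overlay):

[CODE]
// Frame time (ms) to framerate (fps): fps = 1000 / ms.
// Values match the gridlines discussed above.
#include <cstdio>

int main() {
    const double frame_times_ms[] = {33.33, 16.67, 8.33};
    for (double ms : frame_times_ms) {
        std::printf("%.2f ms per frame = %.1f fps\n", ms, 1000.0 / ms);
    }
    // Prints roughly 30.0, 60.0 and 120.0 fps.
    return 0;
}
[/CODE]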
 
Maybe this will allow me to hang onto my i7-930 a little longer, as it's paired with an R9 290. I guess I will need to test it at stock speed vs. overclocked, as that should be interesting...
 
It's showing frame-delivery, not FPS.
Where do you see that? The graph isn't labeled...

edit: oops, horizontal?? It's framerate dependent how fast it plots, just look at the numbers themselves.
Sorry, meant vertical, but neither axis is actually labeled. The numbers below the graph also don't say if they represent FPS or frame-time.
 
None of the other setups have the "@ xx.xHz" value differ, but on the last one, the CrossFire setup, it's 59.9Hz and 120.0Hz instead of 60.0Hz.
It's reporting the DXGI-enumerated refresh rate. A 59.9x Hz refresh rate isn't alarming at all.
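
If anyone's curious where a 59.9 can come from, here's a minimal C++ sketch of enumerating modes through DXGI (my own example, not the game's code; I'm only assuming BF4 queries something like this). Refresh rates are stored as rationals, so the NTSC-style 60000/1001 mode prints as 59.94 Hz:

[CODE]
// Minimal DXGI mode enumeration (Windows). Prints each mode's refresh
// rate, computed from the Numerator/Denominator rational DXGI reports.
#include <dxgi.h>
#include <vector>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    if (SUCCEEDED(factory->EnumAdapters(0, &adapter))) {
        IDXGIOutput* output = nullptr;
        if (SUCCEEDED(adapter->EnumOutputs(0, &output))) {
            UINT count = 0;
            output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, 0, &count, nullptr);
            std::vector<DXGI_MODE_DESC> modes(count);
            output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, 0, &count, modes.data());
            for (const DXGI_MODE_DESC& m : modes) {
                double hz = (double)m.RefreshRate.Numerator / m.RefreshRate.Denominator;
                std::printf("%ux%u @ %.2f Hz\n", m.Width, m.Height, hz);
            }
            output->Release();
        }
        adapter->Release();
    }
    factory->Release();
    return 0;
}
[/CODE]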
 
I'm really disappointed that they haven't posted the drivers already or even given us a concrete ETA.
 
The numbers below the graph also don't say if they represent FPS or frame-time.

Little bit of common sense mixed with deductive reasoning?

The numbers are a bit lower for Mantle than for DirectX, so they're obviously NOT showing FPS. The full screenshots show this, because the actual displayed framerate does not correlate with the numbers displayed under the graph.

Also, if it was showing FPS, what would the CPU line be?
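
To spell the arithmetic out, here's a tiny C++ sketch (the numbers are made up, purely to show the direction of the relationship):

[CODE]
// If the numbers under the graph were FPS, Mantle showing LOWER values
// would mean it's slower, which contradicts the screenshots. Read them
// as frame times instead: fewer ms per frame means more fps.
#include <cstdio>

int main() {
    const double dx_avg_ms     = 14.0;  // hypothetical DirectX average frame time
    const double mantle_avg_ms = 11.0;  // hypothetical Mantle average frame time

    std::printf("DirectX: %.1f ms -> %.1f fps\n", dx_avg_ms, 1000.0 / dx_avg_ms);
    std::printf("Mantle:  %.1f ms -> %.1f fps\n", mantle_avg_ms, 1000.0 / mantle_avg_ms);
    return 0;
}
[/CODE]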
 
Where do you see that? The graph isn't labeled...


Sorry, meant vertical, but neither axis is actually labeled. The numbers below the graph also don't say if they represent FPS or frame-time.

They're milliseconds; the fps counter is separate.

Also, I checked: the major lines are 30, 40, 60, 120... and a number greater than 578.

http://i.imgur.com/DzBjTJQ.gif
 
Without testing or experiencing it for myself, I can tell you this test doesn't really mean anything to most gamers. You won't be able to tell a difference between 80 FPS and 120 FPS when you're actually playing the game, so a 50% performance increase doesn't really buy you anything here.

:confused: I don't even... There's a reason why 120Hz is a thing.

Then again, 99% of the people here with their $3k+ rigs don't game competitively/professionally, so I shouldn't be surprised by that response.
 
:confused: I don't even... There's a reason why 120Hz is a thing.

Then again, 99% of the people here with their $3k+ rigs don't game competitively/professionally, so I shouldn't be surprised by that response.

True, because I don't think there is a single competitive game that needs more than a $1.5K rig to run extremely well. I mean, SC2, LoL, WoW, CS, COD, etc. can all run on 3-year-old hardware. I don't really care for the 50% performance boost per se, but for some reason I can't get a steady 60 fps in BF4 with CF 290s. So maybe I finally will.
 
I just want to know what AMD is still working on/tweaking, how much caffeine has been injected into their workers, and what timezone AMD will choose to release Mantle in. Maybe it will still be the 31st in American Samoa... please don't go that far, AMD.
 
It's reporting the DXGI-enumerated refresh rate. A 59.9x Hz refresh rate isn't alarming at all.

x2

It's not just an AMD thing either. Hooked up to my 120Hz ViewSonic, my GTX 295 used to show 59Hz, 60Hz and 120Hz as selectable refresh rates (and annoyingly, it would default back to 59Hz after a reboot).
 
I just want to know what AMD is still working on/tweaking, how much caffeine has been injected into their workers, and what timezone AMD will choose to release Mantle in. Maybe it will still be the 31st in American Samoa... please don't go that far, AMD.

Well, it's one of the biggest features AMD has done in years.

Would be a laugh if NVIDIA revived Glide.
 
From a buddy at AMD. We still do not have the driver, FYI.

I hear ya, Kyle; there was a nasty little issue we thought would be better to have fixed prior to giving you guys beta access.
 
Nice find there, mac.

I don't mind waiting. I already waited this long anyways, haha.
 
Did anyone have issues with Steam yesterday? I usually leave mine logged in, and when I went to download the SS demo, Steam would not connect ("no internet connection"). This lasted for a couple of hours. Was this too much traffic because of the SS demo, or what?
 
That happens to me sometimes when I start games without logging into Steam first.
 
That happens to me sometimes when I start games without logging into Steam first.

Same here, but I had been logged in since the night before, so Steam was up and running. It was weird. I couldn't even get on the Steam support page. So I watched a show on Netflix for a couple of hours, and then it was up and running again.
 
And yet I'm the troll who called out this AMD marketing rep about his release date. :rolleyes: AMD should stop making promises of properly functioning drivers that will be released when they say they will be released.

Yet again, this is the reason why I moved to NVIDIA after having every generation of AMD card from 4870 > 5850 > 6970 > 7970. AMD makes a good hardware product; I just wish they could get their software side up to par, even if it costs a few dollars more per card to hire some competent help.

If they stopped promising the gift of great drivers in a timely manner, no one would expect anything. But then again, more and more people would buy NV, so I guess the marketing rep is doing his job. :/
 
If you say it's going to rain every day, you will be right eventually.

Troll elsewhere, please; I gave at the office...
 
Well, I know I'm done; it's clear to me that AMD has no interest in supporting even last-generation cards, let alone properly supporting current generations.
 
Well, I know I'm done; it's clear to me that AMD has no interest in supporting even last-generation cards, let alone properly supporting current generations.

Okay, then. I look forward to your sale thread.

In the meantime, I'll keep looking forward to the drivers...
 
:confused: I don't even... There's a reason why 120Hz is a thing.

Then again, 99% of the people here with their $3k+ rigs don't game competitively/professionally, so I shouldn't be surprised by that response.

I do, and as long as you're at or above that 80 fps threshold, the differences are moot. Even 60 fps, for that matter, but I do feel 80 fps is the breaking point for any real gain in an FPS. Once you practice and get attuned to 80 fps, you can be just as good as someone with 120 fps, provided you have the skill. Mastery of what you have comes into play more than a gain. Playing below 60 fps is too erratic, IMO, but above that I haven't felt a difference. Then again, my game of choice is Counter-Strike, which isn't a very demanding game to begin with.
 
And yet I'm the troll who called out this AMD marketing rep about his release date. :rolleyes: AMD should stop making promises of properly functioning drivers that will be released when they say they will be released.

Yet again, this is the reason why I moved to NVIDIA after having every generation of AMD card from 4870 > 5850 > 6970 > 7970. AMD makes a good hardware product; I just wish they could get their software side up to par, even if it costs a few dollars more per card to hire some competent help.

If they stopped promising the gift of great drivers in a timely manner, no one would expect anything. But then again, more and more people would buy NV, so I guess the marketing rep is doing his job. :/

There was a post a few days ago about Mantle not being ready until February and the AMD people got it removed. And yet here we are.
 
I do, and as long as you're at or above that 80 fps threshold, the differences are moot. Even 60 fps, for that matter, but I do feel 80 fps is the breaking point for any real gain in an FPS. Once you practice and get attuned to 80 fps, you can be just as good as someone with 120 fps, provided you have the skill. Mastery of what you have comes into play more than a gain. Playing below 60 fps is too erratic, IMO, but above that I haven't felt a difference. Then again, my game of choice is Counter-Strike, which isn't a very demanding game to begin with.

Yeah, 'cause NOBODY ever wants or uses 120+Hz monitors.

:rolleyes:
 
Yeah, 'cause NOBODY ever wants or uses 120+Hz monitors.

:rolleyes:

They can use whatever they want. A lot of guys still use 1024x768 or smaller resolutions too. It's not for the FPS boost; it's because they're used to it. Once you get locked in with a certain setting, adjusting something can throw your game way off. If you have the coin to throw down on a damn good rig to run 120Hz+, go for it. Plenty of people have competed, and still will compete, at the professional level without it. It's skill, man.
 