Discussion in 'Video Cards' started by bigbeard86, Jan 30, 2014.
Upgrade already, the 6xxx series is so 2009.
Like I said all along, just give me a steady and smooth fps and I'll take that any day over the extra fps we get from normal driver updates that barely gets noticed.
Agreed, I'll take 45fps and smooth over 59.9fps and jerky any day...
Read my post, I just said I already upgraded to a GTX 780.
I've been waiting since 2009 for AMD to fix some of their more annoying driver-related bugs and issues. Didn't happen, didn't want to buy into a product with more of the same issues, so back to Nvidia I went.
And yes, some of the problems that drove me nuts on the HD 6970 are still present on the R9 290x, as well.
No Mantle for me, sadly, as well...
But I'm very curious to see how it works for everyone and what [H] has to say. I'm probably going to replace my 6870s in a year or two, and Mantle could be the factor in whether I go Nvidia or AMD.
Assuming you're playing mantle-optimized games, it's cool you can build a rig and save on the CPU costs, maybe dump them into a better GPU.
Yup, Mantle will be basically useless for you because it won't run on your nV 580 at all. AMD-only, GCN (7000-series and newer) only right now. While it may be ported to future nVidia hardware (don't hold your breath), there's basically no chance it'll ever be ported to the 580.
It might, if it supports dynamic shader grouping in hardware.
I don't know enough about NV hardware to answer that.
Unfortunately, I'm willing to bet that Mantle adoption will not be as fast and as widespread as anyone would like, so you'd be shooting yourself in the foot as far as other games go (especially CPU-intensive games like Civ, Total War, etc.).
They've already listed a dozen games that I already planned on purchasing before Mantle was announced, so adoption seems widespread enough to me.
Between DA3, ME4, Star Citizen, and Star Control: who isn't going to buy all of those, with or without Mantle?
I'd laugh if AMD has to make a major architectural hardware design change in the future, and then developers will have to support Mantle, Mantle 2.0, OpenGL, and DirectX APIs.
Welcome back to the 90's boys.
I'm sure they've covered that with extensibility.
Haven't looked into these preliminary results that are out, but from everyone's comments so far, it seems to me that one group that is going to benefit are the crossfire 79xx and 290 users with 2500ks and 3570ks, well, the ones with 120Hz monitors anyway. I know when I was trying to get 120fps with CF 7950s on BF3 and BF4, it was my 3570k that was dropping me into the 80s at times.
Anyway it will be interesting to see the results, though it won't impact my single r9 290/3770k too much I don't think. I'll drop my cpu back to stock and turn off HT to test the difference mantle makes one day though.
I was getting a bit worried that I was going to have to start looking at i7 5390k or 5960x's in the future if CPUs kept progressing as slowly as they are. If Mantle takes off and gets adopted by the majority of developers, I'll probably be able to keep my Ivy system for longer than I would have dreamed. Still remains to be seen how many will adopt it, though. I'm sure Nvidia will throw plenty of money at upcoming big title releases to ensure Mantle has a hard time taking off. It would be silly business not to.
Mantle fits right in with AMD's console development with their APUs.
What I find the most interesting is the console ports we're used to are mostly CPU bound because of the multi-platform engines used. Now, with games ported with Mantle, CPU performance may not be so important, but gpu performance will be.
However, with a DX port you may still be CPU bound and need much higher system specs overall to get the same or lower performance than Mantle-equipped games.
Sounds pretty awesome.
It'd be nice to see if mantle is expanded to laptops, as well.
It is, if you have a GCN GPU.
I'm still waiting for more price-versus-benchmark comparisons, more single-GPU results, and more variation in CPUs in that range.
You never know, you may be eating your own words.
Well, that's unfortunate. This use case seemed to me to be the main draw for Mantle.
Good to see that the gains per GPU with Mantle scale properly with crossfire.
I'm pretty sure it's been known for a while (or at least speculated) that Mantle would see its largest gains when a system was bottlenecked by the CPU. The Kaveri APUs are nice, but they aren't anything spectacular compared to discrete graphics cards. I have a feeling, though, that you'd see much larger gains by simply adding a low/mid-range AMD GPU to that system to run a crossfire setup with the APU.
Yes, but this is only BF4. Is it draw-call limited in DX11? Star Swarm, allegedly, has much bigger benefits from Mantle, since it's built to take advantage of other features than just the removal of the CPU overhead.
That should be normal, since, as others have said, SLI/Crossfire systems are more CPU bottlenecked than single GPU systems. Would be nice to see benchmarks from a quadfire setup in BF4.
Are these people so disorganized they can't release something at the same time, even after it was delayed an entire month?
Look at the graph they posted, it looks enormously smoother even if the raw gain isn't massive. That's pretty nice.
Someone gif'd it
I agree with him. We won't likely see Mantle fully taken advantage of (hopefully it grows) for about a year or two. I wouldn't rush out to buy an AMD setup just yet.
Really curious to know what type of performance gains are to be had on my hardware.
r9 290 stock
I really want to see the performance differences at higher than 1080p resolution. Since higher resolutions tend to be more GPU limited, doesn't that mean that Mantle will be less of a boost there?
What is the effect for given gpu setups while varying the cpus?
CF 290s with:
test at 1080p and also at 1440p
so many results I still want to see.
I wonder to what this could be attributed. Do you know what the scale is of that graph?
Pretty much, Mantle moves the bottleneck back to the GPU where it should be. If you were already tapping out the card under DX then you won't see more than a 10-15% gain which is the result of the lower level access to the GPU itself. Look at the 7850K APU results and you will get a pretty good idea of how that scenario will look.
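The bottleneck shift described above can be sketched with a toy frame-time model (all the millisecond figures are made up for illustration, not measured): the frame rate is set by whichever of the CPU and GPU takes longer per frame, so cutting CPU overhead only lifts FPS when the CPU was the limiter.

```python
# Toy frame-time model: FPS is limited by the slower of CPU and GPU per frame.
# All numbers below are illustrative assumptions, not benchmark results.

def fps(cpu_ms, gpu_ms):
    """Effective frame rate when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound case: heavy API overhead dominates the frame, GPU sits partly idle.
dx_cpu_bound = fps(cpu_ms=16.0, gpu_ms=10.0)      # 62.5 FPS
mantle_cpu_bound = fps(cpu_ms=8.0, gpu_ms=10.0)   # 100.0 FPS, now GPU-bound

# GPU-bound case: the card is already tapped out, so less CPU overhead barely helps.
dx_gpu_bound = fps(cpu_ms=6.0, gpu_ms=14.0)       # ~71.4 FPS
mantle_gpu_bound = fps(cpu_ms=3.0, gpu_ms=14.0)   # still ~71.4 FPS

print(dx_cpu_bound, mantle_cpu_bound)
print(dx_gpu_bound, mantle_gpu_bound)
```

This matches the 7850K APU point: once the GPU is the limiter, shrinking the CPU cost per frame changes almost nothing, which is why the single-GPU gains top out around the 10-15% from lower-level access.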
AMD is late at being late...
I know with the new drivers, they're supposed to move their frame pacing to "phase 2" (whatever that means). So it's possible that's what we're seeing (or maybe Mantle is just more stable?)
EDIT: I'm wrong, ignore me.
Erm looks right to me...
edit: you edited.
It's kind of interesting that you said this. I agree it's meaningless, but lately on this site reading all the bashing going on from NVIDIA fanboys and AMD fanboys it does actually mean something.
When the Ti came out, you had all those folks willing to pay a couple hundred dollars more for a card because it gets 'a few fps' more than the 290x. All the AMD fanboys came out in droves saying those few FPS are meaningless and they are paying way too much money. Now AMD has Mantle and gains some FPS from it, and one of the first things they hear on the thread is how it's meaningless because you can't tell the difference between 80fps and 120fps.
With the battle of words that goes on in this forum, I would say the difference between 80fps and 120fps is 'not' meaningless. haha!
I'm too old to buy goods and services based on marketing materials.
I still own an Xbox 360 with no plans on new hardware this year.
The difference between 80 and 120 FPS is huge. It's the difference between running full LightBoost without tearing and not being able to do it at all. What the heck is David talking about?
If that's the case, it would be useful to see how similarly-performing Intel and NVIDIA parts handle this scene.
I was thinking the same thing. I'm looking forward to trying out Mantle tomorrow. I'm hoping to lock-in 120fps at 3840x800.