Fable Legends DX12 benchmark

I agree, but at the same time that's about $90/year which could otherwise be put toward a GPU upgrade and is instead wasted powering a GPU that draws too much power relative to its performance. It still makes the 970 a better buy, especially for someone who plans to upgrade, is on a limited budget, and can't afford to burn $90/yr on electricity for a GPU.


If you're gaming 8 hours a day and can't afford the $90....other issues exist.
 
Can people on all sides please stop talking about power savings? If you have the free time to game for the hours on end it takes for this to make a difference, you have the disposable income to cover what is literally a Starbucks a month worth of cost difference.
 
If you're gaming 8 hours a day and can't afford the $90....other issues exist.

You must not know very many gamers. There are guys I know who do this all day long and stream on Twitch, even with jobs. I personally game maybe 4-5 hours a week if I'm lucky, but there are people who love gaming and make up a big chunk of the midrange market. And cageymaru, maybe you should put that $90 into upgrading your CPU.
 
PC: Corsair 650D || Asus X99 Deluxe || i7 5930k || 32GB GSKILL DDR4 || 980 Ti ||
I see that the NVIDIA DEFENSE SQUAD is finally showing up. Allow me to explain the results to you: since the Fury X is capable of delivering smoother gameplay at 4K Ultra, it means that, unlike with the 980 Ti, I can turn down a few settings and enjoy better gameplay.
One could say the same about you using the same evidence. Pot, meet kettle.
If you are going to start the childish accusations, be prepared to fall to your own hypocrisy.

Q9650 cooled by Thermaltake Water 2.0 Extreme + San Ace fans -- Crucial m4 256Gb -- R9 290x Lightning or GTX 580 <<< from back when NVIDIA WAS STILL GOOD
In your case, your sig comes complete with a negative judgment towards your previous product, making you look like the fanboys you are so critical of. People can own a product from a company, put it in their sig for troubleshooting help, and not necessarily be a blind fanboy. Correlation is NOT causation.

If this isn't the case with you, then don't bring it up to begin with because it makes you look just as bad as them. These ridiculous posts from both sides make this section of the forum unbearable to read sometimes.
 
Anandtech screwed up their labeling here. Is AMD the orange bars as the graph suggests or black bars as the paragraph says?


I think you are misreading the graph. This graph is about latency in completing a certain task, where lower is better; that's why.

When we do a direct comparison for AMD's Fury X and NVIDIA's GTX 980 Ti in the render sub-category results for 4K using a Core i7, both AMD and NVIDIA have their strong points in this benchmark. NVIDIA favors illumination (GI, which is heavy compute shader work; this is not being done asynchronously), compute shader work (this is async compute, with culling and physics for foliage) and GBuffer rendering (all of these correspond to the shorter black bars), where AMD favors post processing, transparency and dynamic lighting (shorter orange bars).
 
Umm there's no such thing as "AMD Defense Squad", just people spreading the truth! AMD is the scrappy underdog, didn't you hear?
 
So the good thing from this benchmark is that even without ASYNC, AMD cards look really good. But with ASYNC they really come alive. So there is still good competition in the DX12 realm. Now back to picking out this bread machine. Think I may have to get some expert advice from my sister. Hopefully she isn't biased against Brand A or B.
 
Can people on all sides please stop talking about power savings? If you have the free time to game for the hours on end it takes for this to make a difference, you have the disposable income to cover what is literally a Starbucks a month worth of cost difference.

If money is no object then get dual Titan X's. I'm just assuming that people who are looking at midrange cards like the 970 and 390 are looking to limit their costs. In that case the 970 will save on electricity and you won't need a more expensive power supply.

As for DX12, it will most likely be several years before pure DX12 games hit the market. Until there are actual games with actual drivers available, these discussions are for entertainment purposes only.
 
All I was saying is that DX12 is giving AMD a boost.

Still, I'm waiting on the [H] review.
 
If money is no object then get dual Titan X's. I'm just assuming that people who are looking at midrange cards like the 970 and 390 are looking to limit their costs. In that case the 970 will save on electricity and you won't need a more expensive power supply.

As for DX12, it will most likely be several years before pure DX12 games hit the market. Until there are actual games with actual drivers available, these discussions are for entertainment purposes only.

Keep in mind, for a lot of people, "their costs" = purchase price, not energy costs. When you get older, your costs = everything. But a 14-year-old kid? His parents are dealing with any differential energy costs. In college? The dorm he is overpaying for deals with those costs, etc.
 
The one thing I've learned about DX12, after seeing the AotS and Fable benchmarks, is that it runs like shit. You've got the 970 and 390 barely scraping 60fps in a cartoon game, and you have all GPUs running 30-40fps in Ashes. If we're using those to represent DX12 (as it seems we are), then can we please stay on DX11? :rolleyes:
 
The one thing I've learned about DX12, after seeing the AotS and Fable benchmarks, is that it runs like shit. You've got the 970 and 390 barely scraping 60fps in a cartoon game, and you have all GPUs running 30-40fps in Ashes. If we're using those to represent DX12 (as it seems we are), then can we please stay on DX11? :rolleyes:


How many high-end cards have run games on the newer version of DX (the next version, the one they were built for) well? There have only been two that I can remember: the 9700 Pro and the 8800 GTX.

The first iteration of cards for a new version of DX is usually just good enough for developers to start using the basic functionality of that DX version.
 
All I was saying is that DX12 is giving AMD a boost.

Still, I'm waiting on the [H] review.

Yes it is. No one is denying that. Some of us are just having fun at the expense of people who argued that NVIDIA cards will be useless in DX12 games due to no async :D
 
You must not know very many gamers. There are guys I know who do this all day long and stream on Twitch, even with jobs. I personally game maybe 4-5 hours a week if I'm lucky, but there are people who love gaming and make up a big chunk of the midrange market. And cageymaru, maybe you should put that $90 into upgrading your CPU.

I doubt they care about $7 a month then, and if they do, then like I said they have bigger problems, like a gaming addiction, rather than looking for a better job that would get them better income.

Also, upgrading the CPU just did away with the power savings and ended up costing more, so it doesn't quite work out for them: spending $90 on an upgrade and then an extra $40-$90 in power.

Yes it is. No one is denying that. Some of us are just having fun at the expense of people who argued that NVIDIA cards will be useless in DX12 games due to no async :D

The async benchmark is AotS. Fable, from everything read and implied, does not use much async compute; one review site said it uses it for culling only.
 
Let's see....

200W, 8 hrs a day

(200 x 8) / 1000 = 1.6 kWh per day

$0.16 per kWh (here in CT, generation plus delivery cost)

1.6 x 0.16 = $0.256

Wow. If you game 8 hrs a day, with the GPU pegged at 100% the whole time, it costs you 27 cents a day to run a 390 vs a 970.

Guess I'm not that worried about that $0.27 a day.
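If anyone wants to plug in their own electric rate or hours, here's a minimal Python sketch of that same arithmetic. The 200W delta, 8 hrs/day, and $0.16/kWh are just the assumptions from my post above, not measured numbers:

[CODE]
# Rough cost of a ~200W power-draw difference between two cards.
# All inputs are the assumptions from the post above, not measurements.

def extra_power_cost(extra_watts=200, hours_per_day=8, price_per_kwh=0.16):
    """Return (dollars per day, dollars per year) for the extra draw."""
    kwh_per_day = extra_watts * hours_per_day / 1000   # 200 * 8 / 1000 = 1.6 kWh
    per_day = kwh_per_day * price_per_kwh              # 1.6 * 0.16 = $0.256
    return per_day, per_day * 365

day, year = extra_power_cost()
print(f"~${day:.3f} per day, ~${year:.0f} per year")   # ~$0.256 per day, ~$93 per year
[/CODE]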


In this case, buying a 390 is like rolling coal, just because you can.
 
I doubt they care about $7 a month then, and if they do, then like I said they have bigger problems, like a gaming addiction, rather than looking for a better job that would get them better income.

Also, upgrading the CPU just did away with the power savings and ended up costing more, so it doesn't quite work out for them: spending $90 on an upgrade and then an extra $40-$90 in power.



The async benchmark is AotS. Fable, from everything read and implied, does not use much async compute; one review site said it uses it for culling only.


The type of culling they are doing uses raytracing, which can get expensive. I'm not sure exactly which algorithm they are using, but looking at their assets, it could be pretty heavy. Just because culling has been around since the T&L days doesn't mean it's the same culling devs are using now; it's much more accurate now.
 
I doubt they care about $7 a month then, and if they do, then like I said they have bigger problems, like a gaming addiction, rather than looking for a better job that would get them better income.

Hey, that would make for a great AMD marketing motto: "Get off your ass and get a better job, pleb!" Like I said, even casual use can rack up about $50/yr, which is a total waste; basically you're taking a lighter to money and burning it for no reason. Overall benchmarks don't show the 970 very far behind the 390X, so why burn money for no reason? If people were so inclined to burn money, they'd all own Titan X's and not midrange AMD parts. Fortunately it seems most of the market isn't inclined to burn money for no reason, which is why the 970 is one of the most popular cards ever made while AMD sales continue to sink.
 
Hey, that would make for a great AMD marketing motto: "Get off your ass and get a better job, pleb!" Like I said, even casual use can rack up about $50/yr, which is a total waste; basically you're taking a lighter to money and burning it for no reason. Overall benchmarks don't show the 970 very far behind the 390X, so why burn money for no reason? If people were so inclined to burn money, they'd all own Titan X's and not midrange AMD parts. Fortunately it seems most of the market isn't inclined to burn money for no reason, which is why the 970 is one of the most popular cards ever made while AMD sales continue to sink.

Ah, pure unadulterated hypocrisy. Mmmm mmm.
 
Why is this thread called Fable Legends DX12 benchmark?

Shouldn't it be renamed to something about the power bill, depending on what graphics card you have?

On topic: the 980 Ti does well on Anandtech vs the Fury, but consider a couple of things:

In fact, AMD sent us a note that there is a new driver available specifically for this benchmark which should improve the scores on the Fury X, although it arrived too late for this pre-release look at Fable Legends (Ryan did the testing but is covering Samsung's 950 Pro launch in Korea at this time).

And in some threads that actually are about this DX12 benchmark, some claim that Anandtech used a factory-overclocked 980 Ti, which should be okay since the 980 Ti clocks well. Whether that holds water, I don't know.
 
I agree, but at the same time that's about $90/year which could otherwise be put toward a GPU upgrade and is instead wasted powering a GPU that draws too much power relative to its performance. It still makes the 970 a better buy, especially for someone who plans to upgrade, is on a limited budget, and can't afford to burn $90/yr on electricity for a GPU.

Other people have replied with basically the same info, but I wanted to make myself clear.

I consider that to be a worst-case scenario. It's power numbers from a fairly heavily OC'd 390, which are up to 100W higher than recorded by other sites. It's also based on power rates here in CT, which are the third highest in the country. Plus, I assumed 8 hrs per day, every single day, at 100%.

I think that a more reasonable calculation (4 hrs a day, 5 days a week, at the 9.84 cent per kWh national average) would be about $21 a year.
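Same back-of-the-envelope math in Python with those more typical numbers (again, the ~200W delta, 4 hrs/day, 5 days/week, and the 9.84 cent national average are assumptions, not measurements):

[CODE]
# Annual cost of a ~200W draw difference at more typical usage.
# Assumptions: 4 hrs/day, 5 days/week, 9.84 cents/kWh national average.

extra_kw = 0.200                          # assumed kW difference between the cards
hours_per_year = 4 * 5 * 52               # 4 hrs/day, 5 days/week = 1040 hrs/year
kwh_per_year = extra_kw * hours_per_year  # 208 kWh
cost_per_year = kwh_per_year * 0.0984     # 208 * 0.0984 = $20.47, roughly $21
print(f"~${cost_per_year:.2f} per year")  # ~$20.47 per year
[/CODE]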
 
Some of the lengths you people go to to defend your favorite video card team are nuts. I hope they're paying you all well.
 
In this case, buying a 390 is like rolling coal, just because you can.

Hardly. It's an almost-worst-case scenario of $91 a year, and a more average estimate of $21 a year. That's the same as a pair of 100W incandescent light bulbs (have to specify, I love CCFLs and LEDs personally).

Leave on a porch light at night? Kitchen light? I'd hardly consider those to be the same as purposefully running your diesel super rich. If the 290(x) is faster to boot, then I fail to see the issue.
 
The one thing I've learned about DX12, after seeing the AotS and Fable benchmarks, is that it runs like shit. You've got the 970 and 390 barely scraping 60fps in a cartoon game, and you have all GPUs running 30-40fps in Ashes. If we're using those to represent DX12 (as it seems we are), then can we please stay on DX11? :rolleyes:

This is where we need more technical analysis from people in the know. Is it running meekly because DX12 is just not that useful, or because the game devs are starting to push the visuals further? If the latter, then that is EXACTLY what we ought to want.


It makes sense that a Fury X is not that far ahead of a 390X, because its biggest hardware advantage is in shaders, not ROPs; if a game made much heavier use of compute shaders, then it ought to pull ahead more. Scott speculated that the global illumination might be using conservative rasterization, which might give an even bigger performance boost to NVIDIA for that subsection of total performance. Or it could just be better there in general. The different mixes of effects, and how the hardware handles each type of effect, will determine overall performance.

The most interesting graphs to come out so far were not the raw framerate counts, but the performance levels for different effects broken down:

(Images: All-4K-render.png and All-4K-render-low.png, the 4K render sub-category breakdowns.)
 
The $90 I'm saving per year in power is more than enough compensation.
Kidding, of course...

:D I guess the bigger question, though, is this: will all of this even matter? In the next couple of years a lot can happen. New video cards come out. Vulkan may even trounce DX12 (I would love it to, personally - I'd rather give up PC gaming than get Windows 10). Hell, AMD may even be a thing of the past.

How these video cards do on these benchmarks now is just ammunition for the rabid fanbois who just want to whip out their e-peens and beat people into the ground with their endless arguments over which video card company is better.
 
Hardly. It's an almost-worst-case scenario of $91 a year, and a more average estimate of $21 a year. That's the same as a pair of 100W incandescent light bulbs (have to specify, I love CCFLs and LEDs personally).

Leave on a porch light at night? Kitchen light? I'd hardly consider those to be the same as purposefully running your diesel super rich. If the 290(x) is faster to boot, then I fail to see the issue.

Missing my point. Sure, in the end, you get to your destination... but you could have done it just as well without screwing the environment over. Too many of you have a "hurrr durr, it's only an extra 27 cents a day" attitude. Until you look at the scale of 100-300K cards doing the same damn thing, and all the energy wasted.
 
Missing my point. Sure, in the end, you get to your destination... but you could have done it just as well without screwing the environment over. Too many of you have a "hurrr durr, it's only an extra 27 cents a day" attitude. Until you look at the scale of 100-300K cards doing the same damn thing, and all the energy wasted.

AMD clearly should be fined for producing such wasteful products.
 
Missing my point. Sure, in the end, you get to your destination... but you could have done it just as well without screwing the environment over. Too many of you have a "hurrr durr, it's only an extra 27 cents a day" attitude. Until you look at the scale of 100-300K cards doing the same damn thing, and all the energy wasted.

After all this lecture, I just hope you were using AMD cards before the GTX 7xx release.
 
After all this lecture, I just hope you were using AMD cards before the GTX 7xx release.

Thinking back, I don't think I've ever owned an ATI card. I think I went 3DFX -> Nvidia. But, back then there wasn't much of a choice. Efficiency wasn't exactly a goal at the time.
 
Thinking back, I don't think I've ever owned an ATI card. I think I went 3DFX -> Nvidia. But, back then there wasn't much of a choice. Efficiency wasn't exactly a goal at the time.

Can you shut up about the energy savings then?
 
I think you are misreading the graph. This graph is about latency in completing a certain task, where lower is better; that's why.

You are right, my bad, I made a silly mistake. I saw the Dynamic Lighting performance and just WTF'ed for a moment because I thought that was NVIDIA's strong point.
 
Missing my point. Sure, in the end, you get to your destination... but you could have done it just as well without screwing the environment over. Too many of you have a "hurrr durr, it's only an extra 27 cents a day" attitude. Until you look at the scale of 100-300K cards doing the same damn thing, and all the energy wasted.

If a 390(x) is faster than a 970/980, but also uses more power, in your opinion is that bad? If so, you're saying that performance isn't the end goal, and that power is more important.

I'm pretty sure the 4670/980 combo you're running now doesn't sip power compared to a celeron/750ti. I bet you could save loads of power by playing at 720p.
 
[OT]

Missing my point. Sure, in the end, you get to your destination... but you could have done it just as well without screwing the environment over.
Funny that I am actually consuming less power overall with my Fury X compared to a 980 Ti. The reason is that the 980 Ti, or any other GPU except the Fiji lineup, cannot run my monitors at idle clocks. They have to up the clocks to the low-power 3D state or shit hits the fan (green screen :eek:).

So even if I am consuming slightly more compared to a 980 Ti when gaming, I consume way less when my PC is idling in Windows. And I usually game 2-4h per day.

The same seems to happen if you use a 144Hz 1440p monitor. It can be fixed if you use 120Hz on the desktop. I have personally tested this with a 7970 / 290X / 680 / 980 / Fury X.
[/OT]

Looking at these early DX12 benches, it is starting to feel like I made a slight mistake when I sold my 290X CF setup. Fiji is definitely bottlenecked in some areas. Why the hell didn't they increase the ROPs to 96 / 128 :confused:
 
If a 390(x) is faster than a 970/980, but also uses more power, in your opinion is that bad? If so, you're saying that performance isn't the end goal, and that power is more important.

I'm pretty sure the 4670/980 combo you're running now doesn't sip power compared to a celeron/750ti. I bet you could save loads of power by playing at 720p.

Well, I'm sure as hell not needing a 1kw PSU to keep it all running smoothly.
 
Sorry if I hit a sore spot. Sheesh.

I don't think you exactly hit a sore spot - you're just being ridiculous. Just like most other people on this particular sub-forum, you have stooped to bringing obnoxious and unrealistic arguments into the equation just to get the last word. Well, congrats - you win. Nvidia's GPUs use less power at load than AMD's. Somehow I really doubt that you're losing sleep over the jeopardy AMD no doubt puts our earth in more than you care about winning the argument. So to you, good sir, I give you the gold star. Winner of this argument. The trees will agree with you, and no doubt you will have drawn the connection that Nvidia is also green. Green for the environment, and for our children's future... with all the money we save on power for their college fund.
 