Oxide now says that NVIDIA does support Async Compute

While this currently doesn't make a difference in the games we are playing, I don't see why issues like this should be swept under the rug either, as some of you seem to want. As a hardware enthusiast, I'm pretty interested in this feature, since it seems to be worthwhile on dedicated consoles, and if it runs better on AMD, that might give some of the cards in my house a longer life. I'm happy with that. If it doesn't pan out, or if by the time it matters NVIDIA has a better compute shader engine, then I'll upgrade. But don't give NVIDIA a free pass; at least push them to clarify how this works.

How can you sweep something under the rug you can't test yet?

Show me the games, then we'll talk.
 
In a few months' time... you'll be apologizing. You can't see it now... because you still live in a DX11 world with a predominantly serial GPU architecture.

That's fine... I can wait.

Oh and... in case you weren't aware... most game developers are partnering with AMD throughout 2015/2016.

Just a coincidence ;)

It's a fact... go and verify it if you'd like.

Just a coincidence that the majority of currently announced DX12 games are coming from EA and Square Enix, both of which have put tons of titles into AMD's Gaming Evolved program. And Microsoft, who wants to push their Windows 10 store (as rumours suggest, Fable won't be available on Steam).

I'm surprised you missed that, being a vigilante journalist seeking truth everywhere.
 
How can you sweep something under the rug you can't test yet?

Show me the games, then we'll talk.
I meant the people who are basically saying it's a non-issue. It's not important now but it has the capability to make 2016 very interesting at least.
 
I meant the people who are basically saying it's a non-issue. It's not important now but it has the capability to make 2016 very interesting at least.

Some of us are just cynical from past years' experience.

How many times have you seen a feature announced with huge hype, and then when it gets released it turns out to have minimal impact?

Or it takes years to mature, and most people have changed their hardware since the first generation.
 
I would say the opposite; it's important now, while the games are in development. Thinking it's important later will once again delay progress by years.
 
Some of us are just cynical from past years' experience.

How many times have you seen a feature announced with huge hype, and then when it gets released it turns out to have minimal impact?

Or it takes years to mature, and most people have changed their hardware since the first generation.

Yes, I'm sure for you it's only that, not the fanboyism.
 
I would say the opposite; it's important now, while the games are in development. Thinking it's important later will once again delay progress by years.

It's important now because AMD is on the verge of bankruptcy. They are pulling every dirty trick they can to sell products. Spreading false info, cherry picking review sites, rebadging an entire line up, using their viral marketing division to attack anyone and everyone.

I think the recent market share numbers reflect that people just don't want their bullshit.
 
Can I just ask how much NVIDIA is paying you each month? Maybe I could use some extra money. :D
 
I would say the opposite; it's important now, while the games are in development. Thinking it's important later will once again delay progress by years.

I agree with this statement. If Nvidia can't support a feature that console games utilize, it means fewer console games will get ported, as developers will be reluctant to spend the money to rewrite the code. Nobody needs another Batman / Assassin's Creed Unity fiasco. So getting them to implement the features that the consoles have is very important before the games come out.
 
I would say the opposite; it's important now, while the games are in development. Thinking it's important later will once again delay progress by years.


It's not years; only a small portion of the game/engine has to be rewritten, and for testing they can do regression testing instead of full testing. Months, yeah, but not years.
 
Hey man it was a good run, creating all those new accounts on multiple forums on the same day and spamming the same FUD thread to all of them. You got, what, 10 days of mileage out of it? Not a bad pull. 'Grats

Now there's a nice long weekend to recoup and recover and plan that next brilliant attack.

I am with this guy ^^

+1
 
It's important now because AMD is on the verge of bankruptcy. They are pulling every dirty trick they can to sell products. Spreading false info, cherry picking review sites, rebadging an entire line up, using their viral marketing division to attack anyone and everyone.

I think the recent market share numbers reflect that people just don't want their bullshit.

An alpha-stage benchmark is sent to the media first, before anyone could even download it.

This just so happens to be the same benchmark AMD was using to showcase Mantle on a 290X six months ago: https://www.youtube.com/watch?v=t9UACXikdR0

Might there be a reason this benchmark does so well on the 290X?

The developer has an agreement with AMD; just download the benchmark and you will see the AMD logo. But you will see the people making the loudest noise ignoring this entirely.

AMD and Oxide have an agreement, this alpha-stage benchmark was rushed to the media, and you would have to be dense to not see why. Clearly this is an AMD strategy, don't let anyone tell you otherwise.
 
It's important now because AMD is on the verge of bankruptcy. They are pulling every dirty trick they can to sell products. Spreading false info, cherry picking review sites, rebadging an entire line up, using their viral marketing division to attack anyone and everyone.

I think the recent market share numbers reflect that people just don't want their bullshit.

Lol, what did AMD ever do to you to make you hate them so much? Genuinely curious.
 
Hey man it was a good run, creating all those new accounts on multiple forums on the same day and spamming the same FUD thread to all of them. You got, what, 10 days of mileage out of it? Not a bad pull. 'Grats

Now there's a nice long weekend to recoup and recover and plan that next brilliant attack.

Remind me again what good stuff you have added to these discussions? Oh, and welcome to the [Nv]ard forums.

It's important now because AMD is on the verge of bankruptcy. They are pulling every dirty trick they can to sell products. Spreading false info, cherry picking review sites, rebadging an entire line up, using their viral marketing division to attack anyone and everyone.

I think the recent market share numbers reflect that people just don't want their bullshit.

Lol, good one, cheering the loss of jobs on, bravo, bravo. :D Oh well, guess I will be sticking with AMD as long as they are around and making good stuff like they have been doing for a while.
 
It's important now because AMD is on the verge of bankruptcy. They are pulling every dirty trick they can to sell products. Spreading false info, cherry picking review sites, rebadging an entire line up, using their viral marketing division to attack anyone and everyone.

I think the recent market share numbers reflect that people just don't want their bullshit.

you're like some B-movie antagonist spouting his end-of-movie manifesto
 
yeah man I get it, AMD is moustache twirlingly evil and tis all but another nefarious plot to something something

not to worry when hard forums user PRIME1 is on the case to uppercut some educational elucidation square into my jaw
 
you're like some B-movie antagonist spouting his end-of-movie manifesto
It's in AMD's best interest to make async sound as important as possible over the next few months. It also helps that both Deus Ex and the new Tomb Raider are Squeenix games, and they also had a heavy hand in Ashes. Really bad luck for Nvidia that most (or all?) of the AAA DX12 games coming out leading up to Pascal are Gaming Evolved titles.

Conversely, if Ubisoft were leading the charge into DX12 instead, we probably wouldn't even be hearing about this feature whatsoever. It would probably still be causing a shitstorm in the VR community.

Sensationalism is completely unnecessary though. Don't take PRIME too seriously, nobody else does. :cool:
 
Sensationalism is completely unnecessary though. Don't take PRIME too seriously, nobody else does. :cool:


whoa man, sensationalism? spoken like someone who wasn't there

all i could think as the barrel chattered between my teeth was how well the color red fit her. knee deep in the bodies, once green with life, words manic told tales of asynchronicity. of compute.

the tug of a trigger, yet here i am.
 
I just don't see how the async compute hardware in AMD cards can give so much performance. A little add-on cannot overtake the primary function of the main GPU chip. If that were the case, then why would AMD release Fiji in the first place when their previous-gen cards can get 30% more performance with DX12?

If they can squeeze 30% more performance out of 290/290X cards, then Fiji is basically pointless. They would have waited until HBM2 was mature and released the next iteration of the Fiji architecture, which surpasses HBM1 by a ton in terms of capacity and bandwidth.

What I think is going on is that Oxide's game is deeply coded in Mantle, purely optimized for AMD cards, to make a statement perhaps.
 
I just don't see how the async compute hardware in AMD cards can give so much performance. A little add-on cannot overtake the primary function of the main GPU chip. If that were the case, then why would AMD release Fiji in the first place when their previous-gen cards can get 30% more performance with DX12?

If they can squeeze 30% more performance out of 290/290X cards, then Fiji is basically pointless. They would have waited until HBM2 was mature and released the next iteration of the Fiji architecture, which surpasses HBM1 by a ton in terms of capacity and bandwidth.

What I think is going on is that Oxide's game is deeply coded in Mantle, purely optimized for AMD cards, to make a statement perhaps.
PS4, Hawaii, and Fiji all have the same number of ACEs, so perhaps AMD anticipates a bottleneck from consoles, and the extra horsepower on the PC market would be wasted.

Theoretically, if the 290X goes up 30%, the Fury X will also get a substantial increase. We just haven't seen it yet. If you believe AMD's hype machine, it's going to happen with future DX12 games. In their eyes the entire graphics industry (PS4, Xbox One, and AMD hardware) is going to move 'the bar' up by 30%, or whatever async is truly capable of.

In other words, graphics processing is about to leap forward with new technology and leave Nvidia in the dust.

For a point of reference, the Xbox One has 2 ACEs (8 queues each) and the PS4 has 8 ACEs (once again, 8 queues each).
The Xbox One has less compute power than the PS4, so maybe it will act as the lowest common denominator.
With both the Xbox One and DX12 being Microsoft tech, I don't think MS will want Sony to smoke them on their own API.
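The idea behind those extra compute queues can be sketched with a toy scheduler model: graphics passes rarely keep every shader core busy, and an independent compute queue lets compute jobs soak up the idle capacity instead of waiting their turn. This Python sketch uses invented workload numbers purely to illustrate the arithmetic; nothing here is measured from any real GPU:

```python
# Toy model of one frame, serial vs. async execution.
# Each graphics pass: (name, time in ms, fraction of shader cores kept busy).
# All numbers are invented for illustration only.
graphics_passes = [
    ("shadow maps", 3.0, 0.4),   # raster-bound, shader cores mostly idle
    ("g-buffer",    5.0, 0.7),
    ("lighting",    4.0, 0.9),
]
compute_jobs = [
    ("SSAO",      2.0),
    ("particles", 1.5),
]

# Single queue (serial): everything runs back to back.
serial_ms = (sum(t for _, t, _ in graphics_passes)
             + sum(t for _, t in compute_jobs))

# Two queues (async): compute fills the shader capacity the graphics
# passes leave idle, up to the amount of compute work available.
idle_ms = sum(t * (1.0 - busy) for _, t, busy in graphics_passes)
compute_ms = sum(t for _, t in compute_jobs)
hidden_ms = min(compute_ms, idle_ms)
async_ms = serial_ms - hidden_ms

print(f"serial {serial_ms:.1f} ms -> async {async_ms:.1f} ms "
      f"({100 * hidden_ms / serial_ms:.0f}% of the frame hidden)")
```

With these made-up numbers the compute work hides entirely inside the graphics passes' idle capacity and the frame shrinks by roughly a fifth; whether real games see anything like the quoted 30% depends on how much idle capacity their graphics passes actually leave.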
 
If you believe AMD's hype machine, it's going to happen with future DX12 games. In their eyes the entire graphics industry (PS4, Xbox One, and AMD hardware) is going to move 'the bar' up by 30%, or whatever async is truly capable of.

In other words, graphics processing is about to leap forward with new technology and leave Nvidia in the dust.

But it's hard to believe that AMD would put out that kind of hype.



...ohh, wait...
 
Are you the same Tainted Squirrel on Reddit? Seriously, I've never seen such flip-flopping.
 
Are you the same Tainted Squirrel on Reddit? Seriously, I've never seen such flip-flopping.
You see flip flopping, I see playing the field. I've said enough bad things about both AMD and Nvidia over the past week, I can't even figure out which way you think I've flip flopped. :D
Brand loyalty is bad for everyone.

Oh wait, my mistake, uhhh... Nvidia BAD! AMD GOOD! There we go, I'm in the clear. Don't revoke my Reddit membership card please.
 
Sounds like the 980 GTX when it was released. Oh wait, 4GB on the 980 isn't enough!

Yeah, how crazy of them to release a card a year ahead of AMD. :D

Next year AMD will release a card to compete with the 980 Ti. They will tell us how awesome it is, but refuse to let anybody but SemiAccurate.com test it.

 
You see flip flopping, I see playing the field. I've said enough bad things about both AMD and Nvidia over the past week, I can't even figure out which way you think I've flip flopped. :D
Brand loyalty is bad for everyone.

Oh wait, my mistake, uhhh... Nvidia BAD! AMD GOOD! There we go, I'm in the clear. Don't revoke my Reddit membership card please.

The way the world is today, everyone flip-flops. It depends on what the media is promoting for the week.

One week it's bad AMD, next week it's bad Nvidia, and in three weeks Matrox will make a return and everyone will bash both AMD and Nvidia.

Once it's on Twitter/Facebook it's all downhill from there.

Thank god for the internet today!
 
You see flip flopping, I see playing the field. I've said enough bad things about both AMD and Nvidia over the past week, I can't even figure out which way you think I've flip flopped. :D
Brand loyalty is bad for everyone.

Oh wait, my mistake, uhhh... Nvidia BAD! AMD GOOD! There we go, I'm in the clear. Don't revoke my Reddit membership card please.

That was epic :D

Is there a problem with being neutral and still badmouthing or praising both companies?

Heck, I was hyping Fury X like nothing, then came the reviews and I started badmouthing them because it was a fail compared to the hype. In the end, I still bought it because it has a feature which I need and no other card on the market can do it (except the 285/380, and they're way too slow).

Oh well, back to topic: at least Fable Legends and the new Tomb Raider will use async compute. Let's see how Maxwell's software scheduler works when they hit the market.

This really isn't a problem now, but what if Pascal also has a software scheduler to do async timewarps? Isn't it taped out already?
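The timewarp worry boils down to preemption granularity: with coarse, software-driven scheduling, an urgent job (like a VR timewarp) has to wait for whatever long graphics pass is already running, while fine-grained preemption bounds the wait to one small time slice. A toy model with invented numbers (not measurements of Maxwell or any other GPU) makes the difference concrete:

```python
# Toy comparison of preemption granularity for a high-priority GPU job
# (e.g. a VR timewarp). All numbers are invented for illustration.
passes_ms = [3.0, 5.0, 4.0]   # long-running graphics passes, back to back
request_at_ms = 4.0           # when the urgent job arrives
slice_ms = 0.1                # fine-grained preemption quantum

def wait_coarse(passes, t):
    """Coarse preemption: the urgent job waits until the pass
    running at time t finishes before it can start."""
    end = 0.0
    for p in passes:
        end += p
        if t < end:
            return end - t
    return 0.0  # GPU already idle when the request arrives

wait_fine = slice_ms          # bounded by a single quantum

print(f"coarse wait: {wait_coarse(passes_ms, request_at_ms):.1f} ms, "
      f"fine wait: {wait_fine:.1f} ms")
```

Here the request lands in the middle of the 5 ms pass, so coarse preemption stalls the urgent job for several milliseconds, which is the kind of latency a 90 Hz VR deadline (about 11 ms per frame) cannot afford.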
 
Yeah how crazy of them to release a card a year ahead of AMD. :D

Next year AMD will release a card to compete with the 980ti. They will tell us how awesome it is, but refuse to let anybody but Semiaccurate.com test it.

You realize the 980 goes against the 290X? Er, I mean the 390X, which has 8GB and is cheaper, and which even [H] recommended, right?

Let's not even bring up the VR problems Nvidia are going to have.

But just remember, 4GB was enough when the 980 GTX was released (when the 390X already had 4GB), and now with the Fury X it isn't enough.

http://www.hardocp.com/article/2015...nt_gameplay_performance_review/9#.VetgohFVhBc

"Otherwise, as we mentioned above the GeForce GTX 980 seems overpriced for The Witcher 3: Wild Hunt compared to the 390X." ***The 390X was released before the 980 GTX, as the 290X***
 
You realize the 980 goes against the 290X? Er, I mean the 390X, which has 8GB and is cheaper, and which even [H] recommended, right?

Let's not even bring up the VR problems Nvidia are going to have.

But just remember, 4GB was enough when the 980 GTX was released (when the 390X already had 4GB), and now with the Fury X it isn't enough.

http://www.hardocp.com/article/2015...nt_gameplay_performance_review/9#.VetgohFVhBc

"Otherwise, as we mentioned above the GeForce GTX 980 seems overpriced for The Witcher 3: Wild Hunt compared to the 390X." ***The 390X was released before the 980 GTX, as the 290X***

The GTX 980 trades blows with the R9 Fury X (which doesn't OC); the 980 can easily overclock 20%+.
 
You realize the 980 goes against the 290X? Er, I mean the 390X, which has 8GB and is cheaper, and which even [H] recommended, right?

The 290X is just below the 780 Ti, which was released a long time ago. When will they take on the 980 Ti? 2016? 2017? Will they let anyone review it? Will they call it a console player's dream?

Yes, we all know AMD is cheaper. That's because nobody is buying their cards. Seriously, their market share is closer to Matrox or VIA than it is to NVIDIA.
 
The 290X is just below the 780 Ti, which was released a long time ago. When will they take on the 980 Ti? 2016? 2017? Will they let anyone review it? Will they call it a console player's dream?

Yes, we all know AMD is cheaper. That's because nobody is buying their cards. Seriously, their market share is closer to Matrox or VIA than it is to NVIDIA.

Actually, Kepler performs worse now than it did before, and the 290X is now equal to or faster than the 780 Ti.

So much for 780 Ti future-proofing, lol.

http://www.hardwarecanucks.com/foru...iews/70125-gtx-780-ti-vs-r9-290x-rematch.html (Didn't you post this thing, PRIME1, showing the 290X actually being equal to or faster than a 780 Ti?)
 
The GTX 980 trades blows with the R9 Fury X (which doesn't OC); the 980 can easily overclock 20%+.

http://www.techpowerup.com/reviews/Sapphire/R9_Fury_Tri-X_OC/35.html

Think you meant to say 980 Ti? Which it doesn't trade blows with?

Right now there is no way anyone should recommend a Fury/Fury X to anyone; they aren't worth the price.

Now the best price/performance to me is the 390/390X cards. Both have 8GB of memory and are cheaper than the 970/980 GTX.

If you want the best card, it's the 980 Ti, hands down. So to me there is no reason to bring up the Fury X. It is an overpriced card, but making up FUD about the 980 trading blows with a Fury is a bit much, unless you use the one review that shows this, and that's here at [H].
 
http://www.techpowerup.com/reviews/Sapphire/R9_Fury_Tri-X_OC/35.html

Think you meant to say 980 Ti? Which it doesn't trade blows with?

Right now there is no way anyone should recommend a Fury/Fury X to anyone; they aren't worth the price.

Now the best price/performance to me is the 390/390X cards. Both have 8GB of memory and are cheaper than the 970/980 GTX.

If you want the best card, it's the 980 Ti, hands down. So to me there is no reason to bring up the Fury X. It is an overpriced card, but making up FUD about the 980 trading blows with a Fury is a bit much, unless you use the one review that shows this, and that's here at [H].
The GTX 980 when overclocked will trade blows with a Fury X; the overclocked performance scales very well. The GTX 980 Ti stomps the Fury X when the Ti is overclocked.
 
The GTX 980 when overclocked will trade blows with a Fury X; the overclocked performance scales very well.

That's great... but when the normal average Joe goes to buy a video card, they don't care what overclocks a video card can get.

And besides, who would recommend a Fury at $570 when you can get a 980 Ti on sale for $629-$639? I mean, it's pointless.

But saying an overclocked card can be faster than a stock card is just stupid to even say.
 
That's great... but when the normal average Joe goes to buy a video card, they don't care what overclocks a video card can get.

Luckily there are factory-overclocked cards.

You must be new to this whole PC hardware thing. Please do a little research before posting.

As far as the Fury X goes:

http://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed/14

What's important to note in this context is that these scores aren't just numbers. They mean that you'll generally experience smoother gameplay in 4K with a $499 GeForce GTX 980 than with a $649 Fury X.
 