Confirmed: AMD's Big Navi launch to disrupt 4K gaming

drutman

Limp Gawd
Joined
Jan 4, 2016
Messages
192
Apple is a tough customer when it comes to pricing and their standards of retail. We had a MacBook with a recalled/bad battery and were overcharged for the swap. Guess what happened next? Apple came to the shop and shut them down for misrepresentation/fraud. I actually had Apple's fraud investigators call me to document what happened.
 

AVATARAT

n00b
Joined
Jun 16, 2020
Messages
30
I can get a 'good 4k experience' with an APU... in Counter-Strike.

The statement is meaningless.
Theoretically I can agree with you, but don't tell me you read the CEO's words as meaning an APU for CS :D
"There's a lot of excitement for Navi 2, or what our fans have dubbed as the Big Navi."
"Big Navi is a halo product."
"Enthusiasts love to buy the best, and we are certainly working on giving them the best."
"RDNA 2 architecture goes through the entire stack."
"It will go from mainstream GPUs all the way up to the enthusiasts, and then the architecture also goes into the game console products... as well as our integrated APU products."
"This allows us to leverage the larger ecosystem, accelerate the development of exciting features like ray tracing and more."
via AMD's CFO, Devinder Kumar
 

N4CR

Supreme [H]ardness
Joined
Oct 17, 2011
Messages
4,661
5700 XT = perf-per-watt (PPW) parity with current-gen Nvidia. Nvidia will get a process efficiency jump and a uarch jump, or maybe mostly process if more resources went into the migration.

So if Nvidia gets a generous 15% from process and 30% from uarch in PPW terms, they'll still be at parity in the midrange to high end, if not slightly behind (rough math sketched below).
If the rumored [NV] numbers are true, RDNA 2 forces Nvidia to go with more die area, a higher price, or lower margins just to keep pace with AMD in their main revenue segment.
This upcoming generation might finally be interesting from a competition POV. My question is whether AMD will be enough for 4K120. It'll definitely drive 10-bit though (I can't believe this is still a petty segmentation issue for NV in 2020).
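Just to make the compounding explicit, here is a quick back-of-the-envelope sketch using the guessed 15%/30% figures above and AMD's claimed +50%; all of the percentages are speculation from this post, not measured numbers:

```python
# Back-of-the-envelope compounding of the guessed gains above.
# Every input here is a hypothetical round number, not a measurement.
process_gain = 0.15   # assumed perf/watt gain from the node shrink
uarch_gain = 0.30     # assumed perf/watt gain from architectural changes

nvidia_ppw_gain = (1 + process_gain) * (1 + uarch_gain) - 1
amd_ppw_gain = 0.50   # AMD's claimed +50% perf/watt for RDNA 2

print(f"Nvidia combined PPW gain: {nvidia_ppw_gain:.1%}")  # ~49.5%
print(f"AMD claimed PPW gain:     {amd_ppw_gain:.1%}")     # 50.0%
```

Under those assumptions the two land within a point of each other, which is the "parity, if not slightly behind" scenario.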
 

N4CR

Supreme [H]ardness
Joined
Oct 17, 2011
Messages
4,661
I feel like that is jumping ahead a bit. We need to get to 4K60 solid (and I mean including ray tracing) or 1440p/21:9 high refresh before we start talking about 4K120.
Fair call, I guess it depends on the games you play. I'd be happy with 90 fps with RT; having run a CRT at up to 120 Hz in the past, I found anything beyond 100 Hz was diminishing returns. That said, with more experience today, maybe it's different.
 

IdiotInCharge

[H]F Junkie
Joined
Jun 13, 2003
Messages
14,323
Fair call, I guess it depends on the games you play. I'd be happy with 90 fps with RT; having run a CRT at up to 120 Hz in the past, I found anything beyond 100 Hz was diminishing returns. That said, with more experience today, maybe it's different.
>100Hz or thereabouts becomes less about 'smoothness' in terms of seeing a difference in framerates and more about 'responsiveness' and 'motion resolution'. I doubt I could tell the difference of anything above 100Hz with respect to smoothness, but in terms of responsiveness we could get to 1,000Hz and still be left wanting.
 

Ready4Dis

[H]ard|Gawd
Joined
Nov 4, 2015
Messages
1,403
>100Hz or thereabouts becomes less about 'smoothness' in terms of seeing a difference in framerates and more about 'responsiveness' and 'motion resolution'. I doubt I could tell the difference of anything above 100Hz with respect to smoothness, but in terms of responsiveness we could get to 1,000Hz and still be left wanting.
This is why, when I wrote my game engine, I decoupled input handling from the graphics (this was back around 2004, before 100 fps was common). That way, even with low-end hardware and poor frame rates, all inputs were still highly responsive. Of course higher frame rates still helped, but it didn't 'feel' less responsive even at 25 fps, because your inputs were still handled and processed much faster than the actual frame rate. With all the cores available nowadays you'd think this would be common. I guess the synchronization between the two is still difficult to handle, since you can't be moving objects in the middle of a scene render. It takes a bit of extra effort to handle inputs this way because you need to keep track of inputs/values between renders and then make a copy to use during rendering. I would imagine if someone did something similar today, you could easily process inputs at 1000 Hz and render at 100 Hz. The responsiveness would be there, and your eyes probably couldn't tell the difference.
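For what it's worth, a minimal sketch of that decoupling might look something like this; the thread structure, the rates, and the snapshot copy are illustrative assumptions, not the actual engine code:

```python
# Sketch: input handling decoupled from rendering via shared state
# plus a per-frame snapshot. Rates and state contents are illustrative.
import copy
import threading
import time

state = {"x": 0.0, "fire": False}   # authoritative game state
state_lock = threading.Lock()
running = True

def input_loop(hz=1000):
    """Poll and apply inputs at a fixed high rate, independent of rendering."""
    dt = 1.0 / hz
    while running:
        with state_lock:
            state["x"] += 0.01      # pretend we read a mouse/key delta here
        time.sleep(dt)

def render_loop(hz=100):
    """Render from a snapshot so inputs can keep landing mid-frame."""
    dt = 1.0 / hz
    while running:
        with state_lock:
            snapshot = copy.deepcopy(state)  # consistent copy for this frame
        # ... draw the frame from `snapshot` (stubbed out here) ...
        time.sleep(dt)

threads = [threading.Thread(target=input_loop),
           threading.Thread(target=render_loop)]
for t in threads:
    t.start()
time.sleep(0.5)                     # run briefly for demonstration
running = False
for t in threads:
    t.join()
```

The point of the snapshot is that inputs keep being applied at 1000 Hz while each frame renders from a consistent copy, so the two rates never have to match.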
 

IdiotInCharge

[H]F Junkie
Joined
Jun 13, 2003
Messages
14,323
Here's the problem: responsiveness includes output :). Literally the time between the user making an input (button press, mouse click, mouse movement, etc.), and seeing the results of that input on the screen. The speed and consistency of responses weighs heavily on how games 'feel'. id software is an example of a company that puts quite a bit of focus here, and whoever Bethesda has making Elder Scrolls and Fallout games is an example of the opposite.
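To put rough numbers on "input to screen": here is an illustrative latency budget where every figure is a made-up round number just to show how the stages add up, not a measurement of any particular game or display:

```python
# Illustrative input-to-display latency budget. All values are invented
# round numbers for the arithmetic, not measurements.
stages_ms = {
    "input polling (1000 Hz USB, average)": 0.5,
    "game logic / simulation tick":         5.0,
    "render + GPU queue":                   10.0,
    "display scanout + pixel response":     8.0,
}

total = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"  {stage}: {ms} ms")
print(f"End-to-end: {total:.1f} ms")   # 23.5 ms in this example
```

Even if the render stage alone were free, the rest of the chain still sits between the click and the photons, which is why responsiveness is about the whole pipeline and not just the frame rate.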
 

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
5,148
Not that you're wrong, but if they increase perf/watt by 50% and then keep power draw the same, shouldn't that result in 50% more performance or thereabouts?
Math-wise, yes, assuming the exact same test method (the same load in a particular game, tested the same way) and nothing else limiting performance, such as the CPU or PCIe bandwidth.

AMD appears more ambiguous than previously; all we know is 50% better performance per watt in The Division 2. Is that a 5700 XT (40 CU) running at 275 W compared to an 80 CU part running at 275 W? Navi 10's gains are very much non-linear from 225 W to 275 W. Or is it a 40 CU RDNA part at 225 W compared to an 80 CU part running at 225 W? Or a 40 CU 5700 XT at 225 W compared to an 80 CU part at 275 W (best case)? Or some other combination, of which, when one thinks about it, there can be many. In any case, AMD made the claim in general terms, so I think most would expect it to hold in general and not be some typical marketing ploy, which would go over like a lead brick. Probably best to evaluate once AMD and Nvidia release actual hardware and proper testing is done to see the advances and the problems. Both had serious issues with their last launch.
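To illustrate why the comparison point matters, here is a small sketch of how the same +50% perf/watt claim implies different absolute gains depending on which power levels are compared; the wattages mirror the hypotheticals above, and the math is just perf = (perf/watt) × watts:

```python
# Sketch of why "+50% perf/watt" is ambiguous: the implied absolute
# performance depends on which power points are compared.
# Wattages are the hypothetical scenarios from the post, not specs.
def relative_performance(ppw_gain, base_watts, new_watts):
    """perf = perf/watt * watts, so relative perf = (1 + gain) * (new / base)."""
    return (1 + ppw_gain) * (new_watts / base_watts)

scenarios = [
    ("40 CU @ 225 W vs new part @ 225 W", 225, 225),
    ("40 CU @ 225 W vs new part @ 275 W", 225, 275),
    ("40 CU @ 275 W vs new part @ 275 W", 275, 275),
]

for label, base_w, new_w in scenarios:
    rel = relative_performance(0.50, base_w, new_w)
    print(f"{label}: {rel:.2f}x baseline performance")
```

Under these illustrative numbers the claim could mean anything from 1.5x at matched power to roughly 1.8x in the best case, which is exactly why proper testing of the actual hardware is the only way to settle it.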
 

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
5,148
Here's the problem: responsiveness includes output :). Literally the time between the user making an input (button press, mouse click, mouse movement, etc.), and seeing the results of that input on the screen. The speed and consistency of responses weighs heavily on how games 'feel'. id software is an example of a company that puts quite a bit of focus here, and whoever Bethesda has making Elder Scrolls and Fallout games is an example of the opposite.
I find that so true. Doom Eternal is so damn responsive, like you are right there in real time, while in other games I feel like I am in the future controlling the past with a time delay. While it's only a fraction of a second, the mind seems to exaggerate that feeling that something is not right or off.
 

Ready4Dis

[H]ard|Gawd
Joined
Nov 4, 2015
Messages
1,403
Here's the problem: responsiveness includes output :). Literally the time between the user making an input (button press, mouse click, mouse movement, etc.), and seeing the results of that input on the screen. The speed and consistency of responses weighs heavily on how games 'feel'. id software is an example of a company that puts quite a bit of focus here, and whoever Bethesda has making Elder Scrolls and Fallout games is an example of the opposite.
I agree, and that's why I said higher frame rates still made it better, but it didn't feel super laggy, because from the time you started your movement until you stopped, it was handled much more precisely than what was displayed on screen. So your movements were consistent and repeatable. Frame times didn't lead to inconsistent or mistimed movements because they were irrelevant/separate to the input. It's hard to describe, but if you click to shoot at someone, it would process that request even if you were between rendered frames or in the middle of a frame. Our brains are really good at predicting things like where something will be, even if we can't actually see all the in-between steps. But if the timing of an action and where you predict it should happen don't line up, it throws you off (this is why people focus on frame times, so you don't get that little stutter and miss your target because your weapon fired late). Now, seeing the frames in between would help (to a point). This is why games at 240 Hz feel faster even though our eyes can't really benefit: more consistent input timing and less variance. The timing is closer to real time, without frame-time variations. That's not to say a game at 20 fps will play like one at 100 fps, but it does make it much easier to play at 20 fps when you move your mouse at a specific time and, when the screen finally updates, you are in the same place someone playing at 240 Hz would be. If it waited to handle the input until the next frame, your movement (jumping, shooting) would be delayed, which makes it much more difficult, because our ability to predict is thrown off when it's not consistent.
Anyways, sorry about the side-tracking. Still looking forward to RDNA 2 ;). Hopefully it'll bring some much-needed competition.
 