Possible AMD Radeon RX 490 Performance Numbers Show Up in DX12 AOTS Benchmark – On Par With High-End

I can't see how NVIDIA can be faster than AMD in new upcoming games.
How so? They already are even in DX12 and Vulkan games. The gains AMD cards receive so far are not enough to offset how well NVIDIA cards are already performing, plus very few games are being made with DX12 or Vulkan; DX11 is still the go-to for many games coming out.
 
Have you any detailed information on the upcoming Radeon Chill? It seems to be managing the GPU queue and latency while controlling FPS, all in terms of reduced power demand and improved dropped-frame latency.
This may impact the intent of your revised model. I haven't had a chance myself to look closely at what the latest Chill is doing, but it is pretty intrusive and so far game-dependent (verified by AMD), while also currently only pre-DX12 (I think).
Cheers
Only what I skimmed over that was publicly released. Looked like it was just enabling vsync and dialing up the clocks. Frames finish more quickly and TDP decreases as the card idles a greater percentage of the time. Then use frame pacing to kick off the frame just in time to hit the vblank. The WoW example they provided was cutting the FPS in half, likely along with power consumption, but then using that freed power to drive clocks higher. Sort of on par with overclocking your card and enabling vsync, but tricking the game into getting the timing right. Double the FPS but discard every other frame for better latency, or something like that. You don't want to render a frame really quickly and then wait a while for the vblank. That's likely where the game-dependent part comes from.

Doesn't necessarily affect my model; my clock scaling was more to maximize efficiency and somewhat arbitrary. Seems likely there is more to it, or it wouldn't be as game dependent. Unless by dependent they mean FPS to spare.
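As a rough sketch of the frame-pacing idea described above (all numbers hypothetical, not AMD's actual implementation): render the frame, then sleep until just before the next vblank instead of racing ahead, so the card idles, and power drops, between frames.

```python
import time

REFRESH_HZ = 60                      # hypothetical 60 Hz display
FRAME_BUDGET = 1.0 / REFRESH_HZ      # ~16.7 ms per vblank interval
PACING_MARGIN = 0.002                # wake ~2 ms early so we don't miss the vblank

def render_frame():
    """Stand-in for real GPU work; here ~4 ms of busy time."""
    time.sleep(0.004)

def paced_loop(num_frames):
    """Render, then sleep until just before the next vblank.

    The sleep is the 'idle' window where a real GPU could drop clocks
    and save power instead of finishing early and spinning.
    """
    next_vblank = time.monotonic() + FRAME_BUDGET
    for _ in range(num_frames):
        render_frame()
        idle = next_vblank - PACING_MARGIN - time.monotonic()
        if idle > 0:
            time.sleep(idle)         # idle window: lower TDP
        next_vblank += FRAME_BUDGET  # absolute schedule self-corrects drift

start = time.monotonic()
paced_loop(30)
elapsed = time.monotonic() - start
print(f"30 frames in {elapsed:.2f}s (~{30 / elapsed:.0f} FPS)")
```

Even though each frame only takes ~4 ms to render, the loop holds steady at the display rate, which is the "render quickly, then wait for the vblank" tradeoff being discussed.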
 
It seems strange that they mention GPU queue/latency management, though, if all they are doing is vsync and frame pacing, unless they are making it sound like more than it really is, or, more likely, some reports are reading too much into it.
If it is not done well, performance actually drops, as latency, dropped frames, and input lag would be worse; most will not be playing games in the ideal way for a basic design to work well, either efficiently or without issues.
Games they have done this for so far: WoW, Witcher 3, Paragon, RoTR, Tomb Raider, and, importantly, FPS titles such as Counter-Strike and Call of Duty: Infinite Warfare, along with another 15 or so games.
That said, I really doubt anyone is going to enable it anyway for the supported games :)

Edit:
See next post as think found the answer.
Cheers
 

OK, I think I have found the answer, and yeah, it is not necessarily integrated deeply with the hardware and should not affect your model.
The solution matches very closely to HiAlgo, which was purchased by AMD; in fact their technology was called Chill/Boost/etc.
Back then this was a plug-in, and it does pretty much what AMD describes, so I think some are reading too much into the GPU queue/latency management.
It does seem AMD has incorporated some aspects at the driver level, but this explains the level of application intrusiveness.
1. HOW TO USE HIALGO CHILL

WHAT IS HIALGO CHILL
HiAlgo CHILL is a plugin. It works with 3D games which use DirectX 9. When injected into the game, it monitors game activity (movement of the main character and turning of the camera). When there is no or little action, it smoothly lowers the framerate, thus decreasing the load on the computer's processors, both CPU and GPU. This allows the processors to cool down. Once the action starts, CHILL restores the full power of the processors.
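The activity-gated framerate idea the FAQ describes can be sketched roughly like this (function name, thresholds, and units are all made up for illustration): pick a low FPS cap when there is no input, and restore the full rate the moment action resumes.

```python
def target_fps(camera_delta, movement_delta,
               chill_min=30, chill_max=144, threshold=0.01):
    """Pick a framerate cap from recent player activity.

    camera_delta / movement_delta: magnitude of camera turn and
    character movement since the last frame (hypothetical units).
    With little or no action, drop to chill_min to cut CPU/GPU load;
    any real action restores the full chill_max rate.
    """
    activity = max(camera_delta, movement_delta)
    if activity < threshold:
        return chill_min   # scene is static: let the card cool down
    return chill_max       # action resumed: full power

print(target_fps(0.0, 0.0))   # idle -> 30 (low cap)
print(target_fps(0.5, 0.0))   # camera turning -> 144 (full rate)
```

A real implementation would presumably ramp the cap smoothly rather than switching in one step, as the FAQ's "smoothly lowers the framerate" wording suggests.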


HOW TO INSTALL IT ON WINDOWS 8, 10
Well, usually you download an EXE file and run it on your computer. However, Windows 8 and 10 try to make this more challenging for you - for your own sake, of course ☺. Here is what you should do.


WHY IT IS IMPORTANT FOR COMPUTER TO STAY COOL
What is important for gamers is to prevent overheating. When the temperature of a processor (CPU or GPU) reaches about 175 °F (80 °C), the processor decreases its performance (this is called underclocking). As a result, you lose framerate and your games start lagging.

With CHILL you also get lower framerate, but only when there is no activity in the game. And you get full speed back when you really need it.
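As a quick sanity check on the figure above: 80 °C works out to 176 °F, matching the FAQ's "about 175 °F". A minimal sketch of the conversion and the throttle check (the threshold and function names are illustrative, not HiAlgo's actual code):

```python
def c_to_f(celsius):
    """Convert Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32

THROTTLE_C = 80  # approximate underclock point cited above

def should_underclock(gpu_temp_c):
    """True once the GPU reaches the throttle temperature."""
    return gpu_temp_c >= THROTTLE_C

print(c_to_f(THROTTLE_C))     # 176.0 F, i.e. the "about 175 F" above
print(should_underclock(72))  # False: still cool, full clocks
print(should_underclock(83))  # True: hardware would drop clocks
```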

Another benefit of using CHILL: if you are playing on battery, it will last longer.
http://www.hialgo.com/ChillFAQ.htm
http://www.hialgo.com/TechnologyCHILL.html

And some fluff on the product tech:
HOW TO SEE EFFECT OF HIALGO CHILL
When HiAlgo CHILL starts, you will also see a scrolling Performance Monitor graph:

[Performance Monitor screenshot]


Each 1-pixel-wide vertical line here corresponds to one frame. The graph shows how much time, in milliseconds, is spent on each frame by the GPU (green) and the CPU (yellow). It also shows, in a whitish-bluish shade, the amount of idle time per frame, when the system was cooling down due to HiAlgo CHILL's intervention. The more whitish you see, the more your system is cooling.
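The idle shading can be thought of as the part of each frame interval not covered by CPU or GPU work. A toy calculation (all numbers made up; assuming CPU and GPU work overlap, so only the larger of the two counts against the frame):

```python
def idle_ms(frame_ms, gpu_ms, cpu_ms):
    """Idle ('whitish') time within one frame: total frame interval
    minus the larger of the GPU and CPU busy times. Assumes the two
    run concurrently, which is typical in a pipelined renderer."""
    return max(0.0, frame_ms - max(gpu_ms, cpu_ms))

# Hypothetical frame at a 33.3 ms cap with 8 ms GPU / 5 ms CPU work:
print(round(idle_ms(33.3, 8.0, 5.0), 1))  # 25.3 -> mostly idle: cooling
```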

To hide/unhide the monitor press the "HOME" key.


CAN I RUN CHILL ALONG WITH BOOST OR SWITCH, OR SHOULD I CHOOSE ONE?
Usually you only need one of them - and this is why: if you have a weak GPU, you need BOOST or SWITCH; if your GPU is OK and works fine for the first few minutes of gameplay, but then your game starts stuttering (because your computer overheats) - then you need CHILL.

In the rare case that your GPU is both weak and also overheats, you may try running BOOST (or SWITCH) and CHILL together -- this is an unsupported feature, but it seems to work :).
Cheers

Edit:
LOL never noticed the company was also mentioned in another thread.
 
I'm sure NVIDIA pays or sponsors games too. I just thought that AMD has the better architecture, with asynchronous compute, and would run upcoming DX12 and Vulkan games better. The RX 480's performance has steadily increased over the last couple of months with driver updates.
 
Anyone see a pattern?

  • Polaris 10XT2
  • Twice the value on Zauba.
  • Roughly twice the performance (assuming perfect benchmark)..
  • '2' in the nomenclature. They only ever use '2' when it's a dual something.. X2.. duo..
Recent AMD naming schemes indicated refreshes/updates would use XX5 nomenclature. More signs of this being a dual-GPU card, with the XT2, IMO.
 

Not really; the R9 295X2 has both a 5 and an X2, despite not being a refresh/upgrade :p , and the 390/390X were "refreshes" without the 5 as well.
 

390X was prior to the naming convention change; this has only been the case since just before Polaris.

Main thing here is the X2. You don't see 'X2' or '2' or anything similar used on any AMD products historically without it being a dual core/GPU.

Edit: XT of course has historical precedent too.. so XT2... e.g. X800, X800 GTO, X800 PRO, X800 XT, etc.
 

How do you have any indication of their new naming scheme? There have been no refreshes or updates since Fiji/Polaris dropped. XT and X2 go back a long time; not really sure where you're getting this.
 
How so? They already are even in DX12 and Vulkan games. The gains AMD cards receive so far are not enough to offset how well NVIDIA cards are already performing, plus very few games are being made with DX12 or Vulkan; DX11 is still the go-to for many games coming out.

Are you comparing the 480 to the 1070 here? Because what you said is just really wrong if the comparison was to the 1060.
 
Not at all; performance with the 1060 is pretty dead even in most games, plus if we look at only DX11 games, the 1060 trounces the 480. The fact is, there is this assumption that NVIDIA will somehow magically become uncompetitive under DX12 and Vulkan, and that's a ludicrous assumption.
 

Again, you're wrong, on the trounce part.
 
Frankly, I am not expecting Vega to be much better than a 1080 Ti, and it wouldn't surprise me if it performed a bit under.

The reason is that with HBM being expensive, and Vega the last in line of the old Arctic Islands, which was already in development when Raja took over, I think the technology is in some ways at an engineering cul-de-sac. I think GCN has a future, but as a whole the Arctic Islands just don't scale very well compared to what NVIDIA made - in fact, the 1080 was new engineering going up against the last iteration of an older generation of cards. If Vega can hang, that would be impressive in and of itself.

The first real piece of tech that AMD will provide on the graphics front under Raja's eye will be the one code-named Navi. There is early indication they will be moving on from HBM (likely to something new that costs less than HBM2 but works as fast or faster), and beyond that, we'll really see what AMD is capable of.

One thing I will say - I won't write AMD off yet. They keep making smart strategic deals and their stock is over $10, so investors believe in the company. I think even if Vega is just a "nice card", it can carry them if priced right.

Because let's be honest - if Matrox or some other company that's been shit and on death's door for a while came out with a killer gaming card, no one is going to say "well, Matrox was dead so I won't buy it". Bullshit - enthusiasts flock to the best. And even if AMD is putting up turds right now for the high end, if they keep making money, then all the shit we worry about won't mean anything because they will be able to buy as much time as they need.
 
How do you have any indication of their new naming scheme? There have been no refreshes or updates since Fiji/Polaris dropped. XT and X2 go back a long time; not really sure where you're getting this.

http://videocardz.com/61721/amd-radeon-rx-400-series-naming-scheme-explained

XT2 is an entirely new product, not a revision; otherwise it would be called 485 instead of 480.
On name, score, and Zauba price alone, I'd be almost willing to bet it's a dual card. Double the price, double the score, and a rebirthed moniker we have not seen since the 7970XT, with a '2' on the end. AMD fans know what the 2 has meant in the past. The only thing I can think of that could throw that off would be an obscure mobile (edit: or Apple 2017 refresh) variant model.

Apple did the shitty GPU in the MacBooks now, so people will be clamouring for updated new shiny in 2017. Gotta sell the same shit again, eh ;)
 


Where was it ever confirmed that XT2 is the 480? What you said really conflicts with your own argument. Polaris XT was the RX 480; if the GHz Edition history is any evidence, Polaris XT2 will probably be a newer revision of the 480. Now why would it not just be called 485? I am guessing you meant 490? But I am going by AMD's previous naming schemes. It seems XT2 is just a newer revision; it could possibly be higher clocked and maybe have another block of shaders. We have yet to see SKUs with 2560 and 2816 shaders. We might be seeing a 580, a 590, and then Vega at the Fury tier. AMD has too many brackets left to fill. It's pretty naive to think they will jump straight to a Fury replacement, leaving those slots unfilled.
 

I have never seen the 480 called the 480 XT; in fact I remember thinking it weird, and people being chastised, as the naming convention had changed. Polaris XT is an internal name, perhaps, but not the model designator.
Nowhere was it confirmed to be a 4XX product. Rumours are for both 4xx and 5xx. As there is no 485 leaked in drivers, plus it's named already, the XT2 cannot be a revision of this card...
What we have seen leaked is 490 on AMD's website, and only Polaris 10, 10XT2, 11, 12 and Vega 10, 11 in driver hex dumps.

We are seeing XT2 as an internal name, not a model name... so maybe we don't see this as an 'XT2' model but as something else. Again, though: double the price and score for a 'revision', with a 2 in the name; I think we are looking at a dual-GPU card as the 'revision'.

We don't know if this XT2 will be the '490' equivalent, or a 480XT2 placeholder, or otherwise, or if Vega 10 becomes the Fury replacement, cut Vega 10 the 490/590, and Vega 11 the Polaris replacement... no one said they will re-spin or revise Polaris, AFAIK.

You're right about too many brackets to fill. AMD has far too many different chips currently. They need to consolidate production; they're going to need quite a few cuts either way...

Edit: this would all fit with predictions from Kyle et al. AMD is a little behind the times and struggling to catch up. Making a dual-GPU card would prove this is the case.

Vega is *H1* from AMD themselves... let's be real, nothing will be on shelves till early Q2 if they're only having secret events now. Some say January; we'd be hearing much more about it already if so.
This makes AMD look even more shit in the high end over the holidays... hence the 'stick two together' approach to 'win back the upper mid-range value segment' with a dual-P10 'XT2' till Vega is ready for retail. It's all about appearances.
 

http://videocardz.com/61721/amd-radeon-rx-400-series-naming-scheme-explained

XT2 is an entirely new product, not a revision; otherwise it would be called 485 instead of 480.
On name, score, and Zauba price alone, I'd be almost willing to bet it's a dual card. Double the price, double the score, and a rebirthed moniker we have not seen since the 7970XT, with a '2' on the end. AMD fans know what the 2 has meant in the past. The only thing I can think of that could throw that off would be an obscure mobile (edit: or Apple 2017 refresh) variant model.

Apple did the shitty GPU in the MacBooks now, so people will be clamouring for updated new shiny in 2017. Gotta sell the same shit again, eh ;)


Right, so from that picture, where do you see any XT? Anyway, XT isn't new; it has been used for so long :p
 
Shintai will dig a grave for AMD every time, but he hasn't been able to put AMD in any one of them yet! Lol.
 
I talk positive and negative about both camps. I couldn't care less, but I do like to get more performance for my money, and NVIDIA just doesn't do that until sales happen. I absolutely loved my GTX 1080, but the G-Sync issues and adoption cost killed the experience. When I pay $1200 on hardware, I expect near perfection, nothing less.
 
Seeking Alpha? So did someone just give a date in that article? I doubt it. Could be the middle of 2017, but I doubt he gave an exact date. We will find out soon enough. But Seeking Alpha tends to always hate on AMD. On top of that, I've gotta download the app to read it? No thanks, lol.

Shintai will dig a grave for AMD every time, but he hasn't been able to put AMD in any one of them yet! Lol.

It's a transcript, and the information comes directly from AMD. But why check the basics first? :)
 
Seeking Alpha? So did someone just give a date in that article? I doubt it. Could be the middle of 2017, but I doubt he gave an exact date. We will find out soon enough. But Seeking Alpha tends to always hate on AMD. On top of that, I've gotta download the app to read it? No thanks, lol.


No, they don't; you have articles written by one side or the other. In any case, this is a transcript of AMD's own presentation, so no, this has no sides; it's AMD saying these things. If anything, it's extremely favorable to AMD (it's only AMD's best case they are talking about here).
 
Would read it, but I'm not going to sign up for it. I doubt Vega will only come after June of next year.
 

I also hope it launches before the middle of next year. If it's another 6 months away, I will grab a 1070 to hold me over.
 