AMD Responds to R9 290 Series Performance Variance

Could [H] do some gaming benchmarks after the card has heated up? Because what I read from PCPer was a bit concerning: http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-290X-Hawaii-Configurable-GPU/Cold-versus-Hot-R9-290X-Results

I play games around 3~4 hours a day, and I want to know if the card's performance deteriorates after hours of gaming. Also, I don't use headphones, btw :)

We do almost all of our testing after the card is "heated up." We actually use these cards to play games for long stretches of time, even if you only see 5 to 10 minutes of play on a graph. Real-world gameplay is the backbone of our testing and is what we use to form our opinions.
 
When I test, I do my run-through after having played the game for a while to figure out what's playable, so my run-throughs happen after the card has "heated up."

I've been testing this way since forever. One of the many benefits of testing in a real-world scenario like we do.
 
OK then, kudos for doing that. But is there any performance decrease with the stock fan setting after hours of playing, or must the fan be set above 50% to maintain gaming performance? I'm also a bit worried about thermal wear on the GPU, and whether it will affect surrounding components inside the case, like the PCIe slot, the motherboard PCB, etc.
 
AMD's response: "Screw it!! Just fuckin increase the fan profile again; 80% fan speed and a 100-degree operating temperature are perfectly fine!!"
 
There wouldn't be much point in 'Uber' mode if silent mode managed to keep 1GHz clocks all the time; there'd be no performance difference at all, since it just changes the maximum fan setting (to my knowledge).

So I think it's to be expected that silent mode will sometimes cripple the card's performance for the sake of acoustics.

As for the temps, I guess time will tell. I really doubt AMD's engineers would have set the operating temp that high if they didn't believe it was harmless to the card and the computer. They'd expose themselves to tons of bad publicity and possibly even lawsuits.
 
What case are you using for the test system?
 
[H] crew can never catch a break. Last week I was skimming the comments on an article about the GTX 650 Ti Boost; nearly every single one was about how [H] was being paid off to put AMD products in a poor light.
 
I look forward to finding out what the driver does in regards to this part of the statement: "In the meantime, we’ve identified areas where variability can be minimized and are working on a driver update which will minimize this variance."

Man, the amount of pure paranoia people exhibit these days over something this trivial is really something. I remember when Nvidia and ATI (flashback, kids: before they were called AMD) used to use all kinds of tricks for 16-bit and 32-bit color selections, which eventually landed them both in hot water for cheating the benchmarks of those times. I remember the screenshots everyone posted showing that ATI was often not doing anything funky, while Nvidia was doing all kinds of nasty things to make us think we were getting 32-bit color when we were really getting a dithered 16-bit color instead.

I don't see the point of cheating benchmarks now; so many people inspect these new cards with such a detailed eye that slipping inflated performance past everyone unnoticed would be a massive accomplishment. Some people can't seem to accept that sometimes an error is simply... an error.
 
BTW, this might be the issue described in the statement, one that could be fixed with a driver update: http://forum.beyond3d.com/showpost.php?p=1802697&postcount=1887

The current driver is controlling the fan by PWM set point. Electrical to mechanical can present some variance; we're in the process of changing the control to an RPM set point. -Dave Baumann

I know Dave Baumann; he works for AMD, and I trust his statement. I think this issue is being greatly exaggerated by many.

I also think people now getting their new cards are freaking out that the card is throttling at 40% fan (the Quiet Mode default setting) from 1GHz. Of course it is; you should expect it to. We made this very clear in our review. That's why there is an Uber Mode, and that's why you can also manually increase the fan beyond that cap to eliminate the throttling.

I hope this new driver, changing the control point of the fan, will stabilize variances between video cards. That seems to be the real issue, and a driver can apparently fix it. It is nothing to freak out over.
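
To illustrate what changing the control point from PWM to RPM buys, here's a minimal sketch with an invented fan model and gains (not AMD's actual firmware): under an open-loop PWM set point, motor-to-motor variance shows up as different airflow at the same duty cycle, while a closed loop around the tach reading converges on the same RPM for every card.

```python
# Toy sketch of the fix Dave Baumann describes; fan model and gains are
# invented. 'gain' stands in for electrical/mechanical variance between
# individual fan motors.

def rpm_from_pwm(duty, gain):
    # Crude fan model: RPM roughly proportional to duty cycle.
    return duty * gain

def pwm_set_point(duty, gain):
    # Open loop: every card gets the same 40% duty, airflow varies.
    return rpm_from_pwm(duty, gain)

def rpm_set_point(target_rpm, gain, steps=50):
    # Closed loop: adjust duty until the measured tach hits the target.
    duty, rpm = 0.0, 0.0
    for _ in range(steps):
        duty += 0.00005 * (target_rpm - rpm)  # proportional correction
        rpm = rpm_from_pwm(duty, gain)        # read the tach back
    return rpm

for gain in (4700, 5000, 5300):  # three fans with motor-to-motor variance
    print(f"PWM 40%: {pwm_set_point(0.40, gain):.0f} RPM, "
          f"RPM set point: {rpm_set_point(2000, gain):.0f} RPM")
```

The open-loop fans land at 1880, 2000, and 2120 RPM from the same 40% duty cycle, while the closed loop pulls all three to roughly 2000 RPM, which is the unit-to-unit variance the driver change is meant to absorb.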
 
Something else to add to this: this is also true for all NVIDIA GPUs in the 600 and 700 series. Yep.

Environment, case temp, ambient temp, electrical flow, voltages, all of it affects the actual real-time frequency obtained by GPU Boost between NVIDIA video cards. Retail GTX 780 "A" might achieve a different clock speed while gaming than retail GTX 780 "B."

No one seems to have complained about this, and it is also something we brought up in our reviews concerning GPU Boost and how it works. We made it clear that different cards can run at different clocks based on environment alone. In the case of NVIDIA, environment plays an even bigger factor, since it's trying to keep the GPU lower, at 80C, instead of the 95C on the 290X.

So BOTH GPUs are affected by environment, and BOTH GPUs can have variances between brands, or even between cards of the same brand. It's the nature of dynamic clocks based on power/temp/etc.

It seems AMD is getting knocked for it harder than NVIDIA was by some. Note that we brought all of these issues up at launch. Why some are so shocked is perhaps just a misunderstanding of, or lack of knowledge about, how these things work.
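
To make the point concrete, here's a toy model (made-up numbers and bins, not NVIDIA's or AMD's actual boost tables) of why the same SKU lands at different clocks in different environments: the boost logic simply picks the highest clock bin that fits the thermal budget, and the budget runs out sooner in a warm case or on a leakier die.

```python
def boost_clock(ambient_c, leakage, temp_target_c=80.0, cap_mhz=1006):
    # Walk down from the cap until the (crude) modeled die temp fits budget.
    clock = cap_mhz
    while clock > 500:
        die_temp = ambient_c + leakage * (clock / 100.0)  # toy thermal model
        if die_temp <= temp_target_c:
            return clock              # highest clock bin inside the budget
        clock -= 13                   # drop one (hypothetical) clock bin
    return clock

# Same "GTX 780" SKU, different environment and silicon:
print(boost_clock(22.0, 5.5))   # cool room, low-leakage die  -> 1006 MHz
print(boost_clock(30.0, 6.2))   # warm case, leakier die      ->  798 MHz
```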
 
There are two different issues on the table here, currently being muddied together.

Issue 1.) Speed variances between the same models of video card, potentially because of electrical and mechanical differences inherent to different boards. The current fan control point is by PWM; the new control point will be by RPM, and this will be fixed in a driver update.

Issue 2.) The 290X throttling its clock speed below 1GHz at default settings. This one is normal; this is correct. The card will throttle its performance below 1GHz while running in Quiet Mode at the 40% fan cap. The data is everywhere to show this, and users should know that it will happen. One way to combat it is to set Uber Mode, or set CCC manually to 100% so that there is no fan cap. This should help stabilize the clock speed closer to 1GHz. Everyone is hoping custom cards will fix this issue completely.
 
This might be WAY off, but Kyle / Brent: is it possible that AMD, to keep costs down, went with less-than-prime components for the cooling on their 290 / 290X cards? Also, have you guys heard from any AIBs about aftermarket coolers for these cards?
 
I can't speak for anyone else, but when GPU Boost was introduced I did have concerns regarding the nature of testing, dynamic performance rates, and reproducibility. I commented in at least one discussion about it, but I agree that not much attention was given to the matter.

I'm wondering if reviewers in general should now document more variables, such as the following (a logging sketch follows the list) -

ambient room temperature
case and configuration (or if open) in addition to system configuration
case temperature if applicable
temperature data (possibly graphs as well) for each individual benchmark (not just idle/max for separate stress tests)
clockspeed data during test runs
voltage information during test runs as well as stock voltages
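
As a sketch of what capturing those variables per run might look like (field names and sample values are invented for illustration, not any review site's actual tooling):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BenchmarkRun:
    game: str
    settings: str
    ambient_c: float              # room temperature at start of run
    case_model: str               # or "open bench"
    case_temp_c: float
    gpu_clock_mhz: List[int] = field(default_factory=list)   # sampled per second
    gpu_temp_c: List[float] = field(default_factory=list)
    gpu_voltage_v: List[float] = field(default_factory=list)

    def summary(self) -> str:
        avg = sum(self.gpu_clock_mhz) / len(self.gpu_clock_mhz)
        return (f"{self.game} ({self.settings}): avg clock {avg:.0f} MHz, "
                f"min {min(self.gpu_clock_mhz)} / max {max(self.gpu_clock_mhz)} MHz, "
                f"peak temp {max(self.gpu_temp_c):.0f} C")

run = BenchmarkRun("BF4", "Ultra, 2560x1600", ambient_c=23.5,
                   case_model="open bench", case_temp_c=23.5)
run.gpu_clock_mhz += [1000, 987, 947, 901, 901]   # heat soak visible in samples
run.gpu_temp_c += [71, 84, 93, 94, 94]
run.gpu_voltage_v += [1.20, 1.20, 1.18, 1.15, 1.15]
print(run.summary())
```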

Also, with performance in clearly thermally-limited conditions, such as the 290/290X or even the GTX 780 (to use an Nvidia example), as seen in the review here - http://www.hardocp.com/article/2013/05/23/nvidia_geforce_gtx_780_video_card_review/8#.Unoj3hDXt_E - perhaps reviewers can look into what happens if you slightly modify the environmental conditions, since the cards being tested are right at their thermal limits. For example, how much would a 5C higher ambient temperature (well within real-world parameters, I would think) affect performance? What if you put everything in a case (I assume HardOCP's testing is on an open bench)?

I'm also wondering if it would be interesting to look at how each GPU actually handles clock speed changes. Are they all identical in behavior? Or are some able to adjust clock speeds more smoothly, over more sample points, while others have much larger fluctuations over fewer points? Would this be a contributing factor in smoother or jerkier gameplay?

Issue 1.) Speed variances between the same models of video card, potentially because of electrical and mechanical differences inherent to different boards. The current fan control point is by PWM; the new control point will be by RPM, and this will be fixed in a driver update.

Would there be an issue of GPUs of the same model having different TDPs (affecting the thermal load on the cooling setup) at the same clock speed, due to variances in leakage and voltage, resulting in some running faster or slower even in identical environmental conditions?
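
For a rough sense of why it could: dynamic power scales roughly with C·V²·f, and leakage adds a chip-specific offset, so two dies at the same clock can present different thermal loads. A back-of-envelope sketch with invented constants, not Hawaii silicon data:

```python
def gpu_power_w(freq_mhz, volts, leak_w, c_eff=95e-9):
    # dynamic term: effective switched capacitance * V^2 * f (f in Hz)
    dynamic = c_eff * volts**2 * freq_mhz * 1e6
    return dynamic + leak_w

# Two "identical" boards at the same 1 GHz clock:
print(f"low-leakage die: {gpu_power_w(1000, 1.15, 25.0):.0f} W")   # ~151 W
print(f"leaky die:       {gpu_power_w(1000, 1.21, 45.0):.0f} W")   # ~184 W
```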
 
My random thoughts here at 6:40 AM.

We may be moving into a new era of video cards, one in which clock speed is rendered meaningless.

Perhaps the future is video cards that have no quoted base clock, or clock speed of any kind. Perhaps the GPU will be able to change its clock speed dynamically and internally (perhaps even with separate domains) from 0MHz to infinity. The factors that set the clock will be TDP/power/voltage/temperature, and those values will be set by NVIDIA or AMD. Perhaps the end user will no longer know what clock speed their card is running at, nor will they need to know. Perhaps overclocking video cards like this will be all about raising the power/TDP/voltage limits and lowering the temperature as much as possible, and by way of that, performance goes up because the clock speed is dynamically raising itself internally, but you still won't know what it is.

This could be where we are headed, and we shouldn't be afraid of this change.
 
Though I'm very pleased with the $399 price point of the 290, I really would've paid $450 for a better cooler right out of the gate. Maybe copper instead of aluminium for the reference heatsink, or bigger fans for less rotational noise. But third-party solutions should take care of that shortly, making this card really the one to own for the prospective gamer.
 
AMD has changed the game; all that remains is the magic drivers that will take care of power and heat.

1) The 280X is faster than the 770 4GB, with a price difference of $50.
2) The 290 is built to outclass the 780 and compete with the 780 Ti, with price differences of $50 and $250.
3) The 290X is built to compete with the Titan, with a price difference of $450.

AMD has taught NVIDIA a lesson. NVIDIA dropped the prices of the 780 and 770 out of fear, and with magic drivers on the way, nothing will stand against AMD.
 
AFAIK, Nvidia's turbo boost speeds exceed those given in the specifications when conditions are favourable (at least on my 770).
A subtle difference compared to AMD's best-case-scenario clocks.
 
Dude, what a load, honestly. A community event would not be centered around one company and its products. It was nothing but a marketing ploy.

Can this idiot just be shown the door? Get the fuck out, seriously.
 
AFAIK, Nvidia's turbo boost speeds exceed those given in the specifications when conditions are favourable (at least on my 770).
A subtle difference compared to AMD's best-case-scenario clocks.

Most people are not using the proper verbiage when reciting the clock speed of the 290X. I chose my words very carefully when I stated in our launch review that the 1GHz clock speed on the 290X is not a fixed clock rate, but a "cap." The 1GHz clock rate is an "Up To" 1GHz clock rate; I specifically used the words "Up To."

1GHz is not guaranteed, nor was it ever guaranteed. The clock speed has, from the get-go, been a dynamic clock speed that can clock itself up to 1GHz, and no higher, if conditions are right. If not, it will run under that. Many people do not understand that 1GHz is not the guaranteed clock; it is simply the cap, the ceiling the card can reach if conditions are right. It's not a hard lock, nor guaranteed. Now, if you want to stabilize your clock close to it, then thermal performance is key.

You have to evolve your understanding of how GPU clocks work in regard to the 290 series. It isn't like NVIDIA, it isn't like before. It's a different thing. Understand how it works, and people won't freak out. People are freaking out because they do not fully understand what that 1GHz cap means, and are used to the old way of doing things. It is time to change.
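
A toy heat-soak run makes the "up to" behavior visible (the thermal model and all numbers are invented, not measured 290X data): the card starts cold at the cap, and as the die approaches its temperature target under a fixed fan, the clock settles wherever thermals allow.

```python
AMBIENT = 25.0
TARGET, CAP, FLOOR = 94.0, 1000, 727   # invented target/cap/floor values

temp, clock = 40.0, CAP                # cold card, clock sitting at the cap
for minute in range(15):
    eq = AMBIENT + 0.08 * clock        # steady-state die temp for this clock
    temp += 0.35 * (eq - temp)         # die drifts toward equilibrium
    if temp >= TARGET:
        clock = max(FLOOR, clock - 13) # shave a clock bin below the cap
    elif clock < CAP:
        clock = min(CAP, clock + 13)   # thermal headroom: climb back up
    print(f"minute {minute:2d}: {temp:5.1f} C, {clock:4d} MHz")
```

The first few minutes run at the full 1GHz cap, and once the die heat-soaks, the clock steps down and hovers below it, which is exactly the cold-versus-hot benchmark gap discussed earlier in the thread.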
 
Also, to add: GPU Boost on NV GPUs can downclock the GPU below the base clock. All it takes is a power-demanding app, FurMark for example, and the clock speed will go below the base clock, again because the clock speed is dynamic there as well.

We have to start thinking of clock speed differently; the old ways no longer apply.
 
And this statement right here is why, before I buy any hardware, the first site I check for reviews is [H]. I remember years ago when you switched the way you review GPUs from apples-to-apples to "best playable settings"; at first I hated it, but I soon realized it is a much better way to review hardware.

This isn't me kissing up, just stating that [H] reviewers are much more forward-thinking in the way they review components.
 
Didn't we go through something similar with the 4870 when it came out?
 
And this statement right here is why, before I buy any hardware, the first site I check for reviews is [H]. I remember years ago when you switched the way you review GPUs from apples-to-apples to "best playable settings"; at first I hated it, but I soon realized it is a much better way to review hardware.

This isn't me kissing up, just stating that [H] reviewers are much more forward-thinking in the way they review components.

Same here. I used to dislike the [H] graphs and reviews until I realized their methodology closely follows how I use my video card in games: I choose the best compromise between visual quality and playability. I don't play benchmarks, and I don't even run them on my video cards for any reason. (Even running them just to see the demo scenes puts me to sleep.)

Not kissing up here either. Not that I would complain if they tossed one of those video card/SSD/case giveaways my way... ;)
 
I hope this new driver, changing the control point of the fan, will stabilize variances between video cards. That seems to be the real issue, and a driver can apparently fix it. It is nothing to freak out over.

If you add more variety/complexity, it takes time for the community at large to understand the technology.
 
Below is one of the first comments on Forbes' R9 290 review. Turns out he works for Nvidia:

Other hardware reviewers are pointing to the vast inconsistency of the card and its unsustainable performance. It's recently been revealed that the 290X and 290 both drop clocks due to thermal throttling after several minutes of gameplay - something that doesn't show in synthetic benchmarks, which don't last long enough for the card to heat up. Other reviewers are citing up to nearly 25% performance loss due to heat.

I'm interested to hear your follow-up on sound. To hear Forbes call this card disruptive while AnandTech and Tom's Hardware both issue "do not buy" reviews is an interesting inconsistency.
 
We may be moving into a new era of video cards, one in which clock speed is rendered meaningless. [...] This could be where we are headed, and we shouldn't be afraid of this change.

I agree this seems to be the direction we are heading. However, I hope there will be modified BIOSes out there that override this, to feed my need for full control in the future too. I'm currently using the Tech Inferno BIOS that disables Boost, in combination with Afterburner, on my GTX 780; I can raise my voltage all the way up to 1.3V and change the clock to whatever I want, and it will always stay there no matter what. No more, no less, and no worries about how my card will perform.
 
Interesting thread.

While I stopped reading THG a long time ago, it's interesting that they flew off the handle to sound the alarm, while AMD has a plan in place to tighten up variances on what seems to be a very programmable GPU.

The "up to xx clock" phrase is interesting. Ever since FurMark was accused of overworking cards, what really is the "safe" operating zone of a chip? Nvidia throttles FurMark, so whatever ceiling they calculated is apparently not meant for sustained absolute maximum use. Anybody who's run distributed computing projects, or especially mined litecoins (the most taxing, IMO) on GPUs, can see these applications really push the hardware to its maximum, way beyond even the hardest gaming sessions, going 24/7.

All that to say: having a variable maximum speed on a card can let you attain good speeds, but troubleshooting it can bring a new heap of questions and misinformation until it's better understood.
 
Thanks for the great review, Brent! The explanation of base clock speeds really helps in understanding what is going on with these new cards. Guess I'll stick with my two 7970s till I see what the aftermarket cards look like.
 
Launch a great card and everyone wants to find something wrong with it. Oh no, AMD made something that beats everything Nvidia has, so there has to be something wrong with it. Yeah, it's hot, and yes, it uses a lot of power, but wait: it's faster and cheaper. Sorry people, you can't have your CAKE and eat it too this round; maybe when the new fab goes to 22nm. You're buying this card knowing it gets hot and knowing it's going to downclock if it gets too hot. If you BUY something and bitch about it knowing what's coming, you're an idiot. If you just want to bitch and troll about something, please ask your mom to bring another Hot Pocket to your basement bedroom. If you don't like the heat, get out of the kitchen...
 
I don't think you understand the nature of the criticism. It was not about the variable speed as much as different speeds between retail and review cards. That is a very valid criticism. And not everyone found issues with it, not sure where you are getting that from. All in all, most of the reviews I have seen have been quite positive on the card.

And yes, we can have our cake and eat it too, AMD has shown us that before.
 
I for one am not sold on AMD as the clear winner here. I remember a couple of generations ago when AMD was praised for offering a slightly slower card that produced less heat and drew less power. Weird that the heat and power advantages Nvidia now holds are no longer advantages? Furthermore, I would like to see some overclocked comparisons; Nvidia's cards seem to have more headroom based on the information we've been given thus far. And I would also like to see how the AMD cards hold up in a long gaming session with the thermal challenges they're facing. To be honest, if Nvidia drops their pricing to equivalent or better levels, I'm not going to go with AMD, given the thermal concerns, just for a few more fps.
 
Why would you say any review site is doing this? If they are, it should be known, but making claims when you don't have all the facts, like what went on at AMD and how they chose what to send for review, probably accounts for a lot or all of this. Maybe NOTHING was done to alter this. Go buy some at the store and test them yourself. If they are 10-20% slower than review cards, make sure to SCREAM so everyone knows.

But you still don't know who or what is going on. Find out the truth before pointing fingers... then once you do, get a BIG-ass finger and point it so everyone knows.

On the throttling part, several sites said they had to run it at 50% fan to eliminate it, AND IT IS LOUD at that setting. Why would anyone say, "Oh, I like loud, I am [H]ard and want loud"? Hey, go strap some Deltas on the card and have a nice day.
 
My personal opinion is that AMD is pushing the thermal limits of this card to beat Nvidia, plain and simple. And they're putting a lot of pressure on their board partners to cool and support this card once the RMAs start coming in.
 
I remember a couple of generations ago when AMD was praised for offering a slightly slower card that produced less heat and drew less power. [...]

The difference is that back then, AMD's card produced less heat, drew less power, cost less, and performed roughly the same.

This time, AMD's card produces more heat, draws more power, costs less, and performs equal to or better.
 
Gibbo from OCuK (who have a warehouse full of cards) chimed in:
Gibbo said:
Absolute rubbish about press cards being golden samples.

We had a press sample card, it did OK, we then got an Asus card and a Sapphire card from our warehouse stock, they both beat the press card in all our benchmarks as they hit higher overclocks. None of the cards experienced any throttling issues.

So without a doubt, complete BS. :)
http://forums.overclockers.co.uk/showthread.php?p=25255073#post25255073
 
The difference is that back then, AMD's card produced less heat, drew less power, cost less, and performed roughly the same.

This time, AMD's card produces more heat, draws more power, costs less, and performs equal to or better.

I'd like to see if Nvidia changes the pricing of their cards before we draw a conclusion. Your points are valid though.
 
The 1GHz clock rate is an "Up To" 1GHz clock rate... It's not a hard lock, nor guaranteed. Now, if you want to stabilize your clock close to it, then thermal performance is key. [...]

I understand perfectly how it works.

Which is precisely why I have concerns about how those Radeons will perform in the more mainstream environment of less-than-perfectly-cooled cases.
 
That will be more of a concern with AIB coolers, as the stock cooler vents outside the case.
 