HD5870 & HD5850 Furmark Temps

SonDa5

Supreme [H]ardness
Joined
Aug 20, 2008
Messages
7,437
Post some screen shots and info of furmark temps along with system chipset temps using HD5000 series.
 
I'm interested in seeing scores too, to help assess whether or not gaming performance is being gimped by the 256-bit bus. With my old 4870 @ 850/1000 I got ~6000 at 1280x1024 with no AA, so a score around 12k (since FurMark is shader-limited) would suggest it is somewhat constrained. 9-10k, however, would put it in line with other benchmarks and indicate it is more likely just early drivers. It's only a theory, but I think it has merit.
 
I don't know anyone who uses FurMark as a performance benchmark, but they do use it to check whether overclocks are stable. As far as being bandwidth-limited, I've seen a forum post at XtremeSystems that showed a linear increase in performance when memory speeds were increased, but they only went up by 5%. Once more people get their hands on cards, and more tweaking software becomes compatible, I think we'll see that it is indeed bandwidth-limited. By how much is a wild guess, though.

As far as temps go, most reviews are seeing low-to-mid 90s C. AMD did implement a feature that clocks down the GPU once it nears 100C; once the temp drops back into the safe range, 90C-97C, it returns to full speed, repeating this as needed.
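The clock-down-and-recover behaviour described above is essentially a hysteresis loop. A minimal sketch in Python, assuming the ~100C trip point and 90C-97C recovery band quoted in reviews; the function and constant names are invented for illustration, not AMD's actual firmware logic:

```python
THROTTLE_TEMP_C = 100  # clock down when the GPU nears this temperature
RECOVER_TEMP_C = 97    # restore full clocks once back in the safe band

def adjust_clocks(temp_c: float, throttled: bool) -> bool:
    """Return the new throttled state for the current temperature."""
    if not throttled and temp_c >= THROTTLE_TEMP_C:
        return True   # drop GPU clocks to shed heat
    if throttled and temp_c <= RECOVER_TEMP_C:
        return False  # back in the safe range, return to full speed
    return throttled  # no change; the gap between the two thresholds
                      # prevents rapid flapping at the boundary
```

The two separate thresholds are the point: with a single cutoff the card would oscillate every frame right at the limit.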
 
Want any particular settings for the bench?

Old link was screwed up, seems my first bench was removed.
 
Last edited:
Want any particular settings for the bench?

Standard settings

Yes, can you please re-run the same settings but without AA? It should produce higher temps, plus on a personal level it will provide a useful comparison score. Also, could you try renaming the exe to see if they are still gimping their cards under FurMark :D
 
I don't know anyone who uses FurMark as a performance benchmark, but they do use it to check whether overclocks are stable.

Yeah, it is quite useless as it is a very specific benchmark. But for the purposes I outlined, the score could be useful. It's also one of the few programs able to leverage the 'theoretical' computational advantage RV770 has over G200 - my GTX260 @ 837/1674/1296 (I don't run those clocks for gaming) produces similar results to a stock 4850 :eek:

EDIT: I should probably add that I never got 3DMark '06 to complete at those clocks, even though Vantage, Crysis et al were all fine. Street Fighter 4 also bombs out, so furmark stability is no guarantee. I've also had bad memory pass Memtest86.
 
Yeah, it is quite useless as it is a very specific benchmark. But for the purposes I outlined, the score could be useful. It's also one of the few programs able to leverage the 'theoretical' computational advantage RV770 has over G200 - my GTX260 @ 837/1674/1296 (I don't run those clocks for gaming) produces similar results to a stock 4850 :eek:

EDIT: I should probably add that I never got 3DMark '06 to complete at those clocks, even though Vantage, Crysis et al were all fine. Street Fighter 4 also bombs out, so furmark stability is no guarantee. I've also had bad memory pass Memtest86.

Personally, I've never been one to judge cards on "theoretical" numbers because, as we all know, no program takes complete advantage of available performance. But I'm also the kind of person who doesn't even have any version of 3DMark installed because I'm not into synthetic performance numbers. I built this rig for gaming, and that's what I test with. But I know that not everyone feels the same, so I will never try to tell someone that their methods are wrong. As the old saying goes, to each their own. As far as testing for stability, I've never seen a program that stresses every single part of a card at the same time, so for testing, a wide selection should be used. That even goes for CPUs with IBT and Prime95. By the way, does FurMark even put a strain on the texture units, ROPs, or memory? I've been under the impression it was mainly for testing the shaders.
 
Personally, I've never been one to judge cards on "theoretical" numbers because, as we all know, no program takes complete advantage of available performance. But I'm also the kind of person who doesn't even have any version of 3DMark installed because I'm not into synthetic performance numbers. I built this rig for gaming, and that's what I test with. But I know that not everyone feels the same, so I will never try to tell someone that their methods are wrong. As the old saying goes, to each their own. As far as testing for stability, I've never seen a program that stresses every single part of a card at the same time, so for testing, a wide selection should be used. That even goes for CPUs with IBT and Prime95. By the way, does FurMark even put a strain on the texture units, ROPs, or memory? I've been under the impression it was mainly for testing the shaders.



Keep this in mind the next time you read any video card review.

There is always some type of benchmark being used to compare the performance of one video card to another. It might be a game showing frames per second or a "synthetic" benchmark like 3DMark06. In the big picture, the best video card will show the most performance over an array of different types of benchmarks.


Higher performance in these tests normally means much better gameplay. Of course, individual hardware drivers are important as well, but these tests help provide performance data.

For me, operating temperatures are important because I want my system temps as low as possible so that all of my hardware gets the highest-quality electrical power possible.
 
I'm interested in seeing scores too, to help assess whether or not gaming performance is being gimped by the 256-bit bus. With my old 4870 @ 850/1000 I got ~6000 at 1280x1024 with no AA, so a score around 12k (since FurMark is shader-limited) would suggest it is somewhat constrained. 9-10k, however, would put it in line with other benchmarks and indicate it is more likely just early drivers. It's only a theory, but I think it has merit.

I wouldn't worry too much about the 256-bit bus; it isn't the only factor in performance.
 
Yes, can you please re-run the same settings but without AA? It should produce higher temps, plus on a personal level it will provide a useful comparison score. Also, could you try renaming the exe to see if they are still gimping their cards under FurMark :D

This isn't necessary anymore, latest FurMark renames automatically.
 
By the way, does FurMark even put a strain on the texture units, ROPs, or memory? I've been under the impression it was mainly for testing the shaders.

This is a valid question, and I don't know the answer. However, when I run FurMark I do notice the GPU utilization is at 99-100% the entire time, and the power and heat generation are greater than in any game I've experienced. One thing we do in our testing, though, is enable antialiasing in FurMark, to go ahead and put strain on the ROPs and such.
 
Keep this in mind the next time you read any video card review.

There is always some type of benchmark being used to compare the performance of one video card to another. It might be a game showing frames per second or a "synthetic" benchmark like 3DMark06. In the big picture, the best video card will show the most performance over an array of different types of benchmarks.


Higher performance in these tests normally means much better gameplay. Of course, individual hardware drivers are important as well, but these tests help provide performance data.

For me, operating temperatures are important because I want my system temps as low as possible so that all of my hardware gets the highest-quality electrical power possible.

The thing is, if you're into comparing 3DMarks with other people, then it's a reasonable app to use. But if you're a person such as myself who buys hardware strictly to have better performance in games, then it's not needed. When trying to figure out which card to buy for gaming, the only thing that makes sense is to compare the actual gameplay performance for the games that you personally play. The biggest problem I have with 3DMark is that over the years, less-informed gamers have come to think of it as the end-all, be-all tool. Then they go spreading that in forums and such. People should be fully informed on the subject, especially when paying high dollar amounts for the hardware.
 
Done. With the exe renamed to FurMarker.exe

Blimey, that's lower than a stock 4870. I'm glad it didn't score 12k though :D

I wouldn't worry too much about the 256-bit bus; it isn't the only factor in performance.

I'm not saying that is the case, but it remains to be seen whether or not it is detrimental to performance. Based on the numbers put out by the launch drivers, something is amiss, imo. It could be any number of things; this just sticks out as the most obvious answer. I'd like to think that AMD wouldn't make such a huge miscalculation and it is the drivers and/or design changes, but we had R600 with an unnecessarily high amount of bandwidth... it's possible.
 
If bandwidth is an issue, and it may be in CrossFire situations, it is not because it has a 256-bit bus; all it means is that it needs to run at faster GDDR5 frequencies to increase memory bandwidth. We aren't near the end of what GDDR5 is capable of yet; I think we'll see 7GHz out of it in its lifetime. 7GHz would provide 224GB/sec of bandwidth, while the 5870 has 153GB/sec. So there is room left yet in a 256-bit bus/GDDR5 combo.
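The figures above follow directly from the bus width and the effective GDDR5 data rate: bandwidth in GB/s = effective rate (Gbps per pin) x bus width (bits) / 8. A quick sketch to check the numbers; the function name is made up:

```python
# Memory bandwidth for a GDDR5 card: effective per-pin data rate (Gbps)
# times bus width (bits), divided by 8 bits per byte, gives GB/s.
def gddr5_bandwidth_gbs(effective_gbps: float, bus_width_bits: int = 256) -> float:
    return effective_gbps * bus_width_bits / 8

# HD 5870: 1200 MHz memory clock -> 4.8 Gbps effective on a 256-bit bus
print(gddr5_bandwidth_gbs(4.8))   # 153.6 GB/s, matching the ~153 GB/s quoted
# Hypothetical 7 Gbps GDDR5 on the same 256-bit bus
print(gddr5_bandwidth_gbs(7.0))   # 224.0 GB/s
```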
 
I don't think he's saying that the 256-bit bus is what's wrong, but the total bandwidth. After doubling everything else, AMD only has a mild increase in that department. I thought they would be hitting somewhere around 170GB/s with this card. I'm trying to find someone who has overclocked just the memory, to see if it gives a major increase in performance.
 
By the way, does FurMark even put a strain on the texture units, ROPs, or memory? I've been under the impression it was mainly for testing the shaders.
AMD says that Furmark is a "power virus" because it strains everything and they really, really hate it. Whatever it does is unrealistic, since not even Crysis on maximum settings will hammer the GPU that much and provoke such high temperatures and power consumption.

Obligatory car analogy: running Furmark is like pushing the pedal to the metal in neutral gear. It helps you see if your engine will blow up or overheat at a certain RPM which you will never, ever reach while driving. (And of course, blaming the car manufacturer for introducing a rev limiter and demanding it removed or removing it yourself.)

Say no to Furmark!
 
The thing is, if you're into comparing 3DMarks with other people, then it's a reasonable app to use. But if you're a person such as myself who buys hardware strictly to have better performance in games, then it's not needed. When trying to figure out which card to buy for gaming, the only thing that makes sense is to compare the actual gameplay performance for the games that you personally play. The biggest problem I have with 3DMark is that over the years, less-informed gamers have come to think of it as the end-all, be-all tool. Then they go spreading that in forums and such. People should be fully informed on the subject, especially when paying high dollar amounts for the hardware.



Right, and that is why I stated that the next time you read any video card review, you should pay attention to all the different types of tests done, from frames per second in games to "synthetic benchmarks" like 3DMark06.

It all adds up to performance data, and the more the better in determining the performance of a video card.
 
AMD says that Furmark is a "power virus" because it strains everything and they really, really hate it. Whatever it does is unrealistic, since not even Crysis on maximum settings will hammer the GPU that much and provoke such high temperatures and power consumption.

Obligatory car analogy: running Furmark is like pushing the pedal to the metal in neutral gear. It helps you see if your engine will blow up or overheat at a certain RPM which you will never, ever reach while driving. (And of course, blaming the car manufacturer for introducing a rev limiter and demanding it removed or removing it yourself.)

Say no to Furmark!

I already knew about AMD's feelings about FurMark. In fact, because that program kept killing the VRMs on 4870s, AMD implemented a feature on the 5800 series that throttles the clock down when the card gets near an unsafe temp. I think they still detect the program with drivers and cut power to unnecessary parts to keep total temps down.
 
Right, and that is why I stated that the next time you read any video card review, you should pay attention to all the different types of tests done, from frames per second in games to "synthetic benchmarks" like 3DMark06.

It all adds up to performance data, and the more the better in determining the performance of a video card.

You seem to be not getting the point I'm trying to say. It very well could be my fault for not being clear enough. I, and people who follow my way of thinking, already avoid any test showing 3DMark scores because it's not an indication of real-world gameplay. The only thing I'm against when it comes to 3DMark is that a majority of people still put way too much stock into it, and it shouldn't happen. It's sad that only [H] has a review method that means anything to gamers.

Edit---- I'm sorry that I started to turn this thread into a rant against synthetic benchmarks. It really wasn't my intention.
 
Here's mine. Heck, I'd forgotten that I'd backed my C2Q OC back a bit from 3.5 GHz to 3.4 for my 24/7 settings with how rock-solid this machine's been. :D (I last played with those settings last June or so! :eek:) FWIW, while I'd previously said that my 5870 is remarkably quiet at idle (and it is, at least in my environment), it definitely gets annoyingly loud running Furmark. :p Sure it may not be a 'real' load and to this point I haven't really noticed the fan too much in the little bit of gaming I've done (mainly some Batman:AA), but I'm still definitely looking forward to WC'ing this beast. :D With any luck a decent mem & VRM HS will show up in short order to couple with the MCW60 I've got sitting here... :)
 
I am aware of the power-saving features, overvolt protection, and thermal throttling; I was asking for proof that they detect the program with drivers and cut power based on the driver profiling the application.
 
From the Anandtech article
"Regardless of what AMD wants to call these stress testers, there was a real problem when they were run on RV770. The overcurrent situation they created was too much for the VRMs on many cards, and as a failsafe these cards would shut down to protect the VRMs. At a user level shutting down like this isn’t a very helpful failsafe mode. At a hardware level shutting down like this isn’t enough to protect the VRMs in all situations. Ultimately these programs were capable of permanently damaging RV770 cards, and AMD needed to do something about it. For RV770 they could use the drivers to throttle these programs; until Catalyst 9.8 they detected the program by name, and since 9.8 they detect the ratio of texture to ALU instructions (Ed: We’re told NVIDIA throttles similarly, but we don’t have a good control for testing this statement). This keeps RV770 safe, but it wasn’t good enough. It’s a hardware problem, the solution needs to be in hardware, particularly if anyone really did write a power virus in the future that the drivers couldn’t stop, in an attempt to break cards on a wide scale."

It clearly states that AMD put protection into their drivers to reduce the chance of hardware damage due to these kind of stress programs. As to how exactly they do that, you'll probably have to contact someone on the AMD driver team.
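The texture-to-ALU instruction-ratio check the article mentions can be pictured as a simple heuristic. This is purely illustrative, with invented names and an arbitrary threshold; the real driver logic is not public:

```python
def looks_like_stress_test(alu_instrs: int, tex_instrs: int,
                           min_ratio: float = 50.0) -> bool:
    """Illustrative only: flag a shader whose ALU-to-texture instruction
    ratio is extreme, the pattern a FurMark-style stress loop produces.
    Real games mix plenty of texture fetches in with their math."""
    if tex_instrs == 0:
        return alu_instrs > 0  # pure-ALU shader: maximally suspicious
    return alu_instrs / tex_instrs >= min_ratio
```

A check like this is more robust than matching the exe name, which is exactly why renaming FurMark.exe stopped working after Catalyst 9.8.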

Edit- I don't know where I got the idea that they turned off parts of the hardware. Guess I was thinking about how Phenoms can turn off individual parts and inferred that AMD would do something similar with the Radeons. Sorry for any confusion. But they still cut power usage under certain programs to protect hardware, so the general statement I was trying to make stands.
 
You seem to be not getting the point I'm trying to say.
:rolleyes:

I understand you perfectly well.

You don't understand what I posted.

Once again, you will not find a video card review that doesn't use frames per second in games or synthetic benchmarks like 3DMark06.

The best reviews include a large array of game frame-per-second results or synthetic benchmarks combined.

I understand you like to play games. I do as well. The best video cards for playing games perform better. The best video cards have high frames per second in games and high synthetic benchmark scores.

The best video cards have low temps as well.


Think of synthetic benchmarks as a hunter taking a new weapon to a firing range to see how true it shoots. Once he knows the weapon performs well he takes the weapon hunting with him.


Bottom line is that video card performance can be measured.
 
"Once again, you will not find a video card review that doesn't use frames per second in games or synthetic benchmarks like 3DMark06."


I'm arguing that only games need to be used when reviewing cards for gamers. As far as not being able to find a review that doesn't use synthetics, did you forget about [H]'s policy on the issue? And even though other reviewers use them, that doesn't automatically make them relevant. Like I've said before, if you like to use it, then fine. I'm just saying that it should be made clear to everyone, especially people new to this hobby, that they shouldn't pay any more attention to it than games. Just because a card scores higher in 3DMark and other synthetics doesn't mean it's a better card. The R600 and RV770 both score higher than their respective Nvidia counterparts, but real games show the exact opposite.
 
"Once again, you will not find a video card review that doesn't use frames per second in games or synthetic benchmarks like 3DMark06."


I'm arguing that only games need to be used when reviewing cards for gamers. As far as not being able to find a review that doesn't use synthetics, did you forget about [H]'s policy on the issue? And even though other reviewers use them, that doesn't automatically make them relevant. Like I've said before, if you like to use it, then fine. I'm just saying that it should be made clear to everyone, especially people new to this hobby, that they shouldn't pay any more attention to it than games. Just because a card scores higher in 3DMark and other synthetics doesn't mean it's a better card. The R600 and RV770 both score higher than their respective Nvidia counterparts, but real games show the exact opposite.


Again you didn't understand me.

Pay attention to the word "or".

I never stated that all reviews have to include synthetic benchmark data. I stated that all reviews contain information about "frames per second in games" or "synthetic benchmarks"; it's one or the other. Also, I don't think FPS and synthetic benchmarks are "apples and oranges"; IMO both are important to measure the performance of a video card.

I prefer all types of performance information, including performance in games, synthetic benchmarks, system power used, and GPU temperatures.

I'm not trying to argue with you, but you haven't understood me. I hope my statements are clear now.

I understand you. :eek:
 
I'm not going to go into anything else but this. Can you tell me where exactly I said that people shouldn't pay attention to FPS? I've been saying the whole time that that's exactly what everyone should focus on. Hell, the Vicodin could be getting to me for all I know and I'm just seeing random words that aren't really there. Heh....

Edit: SonDa5, you don't need to answer. Continuing this further will just waste both of our time.
 
Here is a screen shot from a member over at TPU.

[Image: 5870Fur.jpg]



Looks like a hot one.
 
I thought these cards would be cooler.

Any new tests done?
 