AMD Radeon HD 4800 Series @ [H]

Damn, these cards look really tempting right about now. Too bad I'm so broke haha
 
So they have an audio input? I looked at pics and didn't see it.

They just route the bit stream from the drive to the HDMI cable... most drives can pass audio data right over the PATA bus, and from what I understand all SATA drives do this as well (you no longer need the audio cable from the drive to the sound card).
 
They don't - they just route it via the PCI-E interface I believe. You do have to enable it via your OS but it's very simple to enable and disable.
Actually, you need to connect a 2-wire cable to route an S/PDIF signal from your motherboard to the card. This is a convenience if your MB has an S/PDIF pin header for digital audio and you want to connect your vid card to an HDMI A/V receiver (or use the speakers in your HDTV) - the main difference between DVI and HDMI is that HDMI carries S/PDIF audio along with the digital video, all in one cable. Several of the newer vid cards offer the option. Keep in mind this is just a pass-through for a signal that your MB has to generate - if you don't have S/PDIF, or if your S/PDIF is optical, you'll just have to run your audio down a separate cable.

If your MB only has S/PDIF on the I/O shield, then routing it back inside the case is kind of silly - just crack open a DVI-to-HDMI adapter and solder an RCA plug onto the appropriate pins. I should sac a DVI-to-HDMI adapter and post a mod for that.

In practice though, even folks using these cards in DVR rigs are probably connecting the audio to the receiver and the HDMI directly to the TV, right? I'm thinking HDMI-switching A/V receivers are still in the minority, and most people still have component-switching A/V receivers. Besides, with the HDCP encryption mess, it takes forever to switch over to the input from a PC when connected directly to an HDTV... about 8-10 seconds with my Aquos. I couldn't imagine waiting for HDCP to negotiate through the A/V switch and then the HDTV :eek:

/hijack
 
OK, what you said above is totally wrong re: the 4870. The 4870 has its own sound processor on it.

What you said above applies to some of the nvidia cards with audio pass-through.
 
So what's taking the power calculations so long? I'd like to see how much heat two 4870s in CrossFire dump, so I can see whether they'd be a better fit for my new cooling loop than two GTX 260s in SLI - everywhere else I've looked shows only total system idle and load, which doesn't tell me how much to subtract for the system alone the way [H] does.
 
I see. This works with a sound card or just onboard sound?

It's not a matter of whether you have a sound card or not. The OS passes the sound data through the PCI-Express slot to the graphics card (provided the right drivers are installed and you TELL the OS to send the sound), and then the card sends it along with the video over HDMI.

Whether a sound card is present doesn't matter, because it never has to decode or otherwise touch that audio stream.
 
OK, what you said above is totally wrong re: the 4870. The 4870 has its own sound processor on it.

What you said above applies to some of the nvidia cards with audio pass-through.
DOH! OK, I'm a dumbass - I checked the card specs and it has an 8-channel decoder on it - pretty spiffy. So yeah, that looks like a PCIe-addressable device as far as the host computer is concerned.
 
That's right. Windows will recognize an "HDMI Audio device".
 
The 3870 and 2600XT use PCI-E to pass the audio through the card, then out through ATI's proprietary DVI-HDMI adapter to a TV that ATI recognizes.



ALSO, does anyone know what the idle temps are on the PowerColor 4850s? Will they be mid-60s (like the Visiontek) or mid-80s (like the Asus)?
 
Does anyone know how many amps the 4870 draws at load? I want to step up to a 750W (60A single-rail) PSU and want it to be able to power two of these cards, or at least a 4870X2 when it comes out.
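(For a rough sanity check on the rail side - this is just basic electrical arithmetic, not a measurement of the card: power = volts x amps, so a single 60A 12V rail can deliver 12V x 60A = 720W by itself. On a rail that size, total PSU wattage will almost certainly become the limit before amperage does.)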
 
I would like to suggest not using 2560x1600 resolution on midrange graphics cards, or at least trying 8x AA settings first. That resolution is kind of pointless on these cards. IMO, a person with a 30" monitor is not going to buy a $200 video card for gaming. Plus, I would rather use my 37" or 50" 1080p HDTV instead (so 1920x1080).

So please try some higher AA settings before jumping above 1920x1200. Yeah, sure it's cool to know the video card can handle that ultra-high resolution, but I can't see many using it. It's just not a practical gaming resolution.

BTW, I love [H] reviews, so this is just nitpicking criticism.

Actually this makes a lot of sense to me. We will discuss it.

So for a $200 video card, should 1600x1200 be our top-end res? (Doing widescreen at 1620x1080 renders fewer pixels, so 16x12 would be more demanding and cover all the 1600-wide users, IMO.)
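To put numbers on that, here is a quick back-of-the-envelope sketch (plain Python; the resolutions are the ones mentioned in this thread):

    # Raw pixel counts for the resolutions being debated - more pixels
    # rendered per frame means more GPU work, all else being equal.
    resolutions = [
        ("2560x1600", 2560 * 1600),  # 4,096,000 - 30" panels
        ("1920x1200", 1920 * 1200),  # 2,304,000
        ("1920x1080", 1920 * 1080),  # 2,073,600 - 1080p HDTV
        ("1600x1200", 1600 * 1200),  # 1,920,000 - proposed top-end res
        ("1680x1050", 1680 * 1050),  # 1,764,000
        ("1620x1080", 1620 * 1080),  # 1,749,600
        ("1440x900",  1440 * 900),   # 1,296,000
    ]
    for name, pixels in resolutions:
        print(f"{name}: {pixels:,} pixels")

So 1600x1200 does push more pixels per frame than either 1680x1050 or 1620x1080, which is the point above: testing at 16x12 covers the widescreen crowd with a little margin to spare.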
 
I'll throw my support behind that as well: 1600x1200, and show me more AA and higher game settings.
 
So what's taking the power calculations so long? I'd like to see how much heat two 4870s in CrossFire dump, so I can see whether they'd be a better fit for my new cooling loop than two GTX 260s in SLI - everywhere else I've looked shows only total system idle and load, which doesn't tell me how much to subtract for the system alone the way [H] does.

Sorry, been busy with month-end/beginning things.

This should give you a good overall picture. CPU loads and the no-card load are noted as well, to give you some gauge as to what exactly is going on. (Still trying to find a better way of doing this, but AMD and NVIDIA have not been much help beyond what we can figure out ourselves.)

http://enthusiast.hardocp.com/article.html?art=MTUyNCw4LCxoZW50aHVzaWFzdA==
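For the cooling-loop question specifically, here is a minimal sketch (plain Python) of the subtraction being described. The input numbers are the CrossFire figures quoted later in this thread, and it assumes the PSU's efficiency is roughly the same at both load points, which is only approximately true:

    # Estimate the DC power the cards themselves dissipate (roughly the
    # heat a water loop would have to absorb) from two at-the-wall
    # Kill-A-Watt-style readings.
    wall_loaded   = 458.0  # W at the wall: system + 4870 CF under GPU load
    wall_baseline = 109.0  # W at the wall: the same system alone
    efficiency    = 0.80   # assumed PSU efficiency at these draws

    card_heat = (wall_loaded - wall_baseline) * efficiency
    print(f"~{card_heat:.0f} W dissipated by the cards")  # ~279 W

Note this is a card-only heat estimate for loop sizing; for PSU sizing, work from the total at-the-wall number instead, as discussed further down.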
 
I know this is overkill, and I do understand why you tested the way you did, but I would like to have seen some of these:

COD4
UT3
BioShock
World at War

You know, games that were somewhat hard on last-gen hardware.

And a lot of us are gamers stuck at 1680x1050 or 1440x900, so those resolutions would have been nice to see with everything maxed out IQ-wise.
 

UT3 - No challenge to a modern GPU at all. $125 cards play it at 1920 with great eye candy. Honestly, after all the graphics hype, this was the biggest joke we have seen in a while. And I won't get into gameplay...

COD4 - We just dropped it this round because we are seeing sub-$150 cards play it great at max res.

BioShock stopped being challenging a while back, same issues. The game is almost a year old.

Call of Duty: World at War - "World at War is slated to arrive on PC, PlayStation 3, PlayStation 2, Wii, Xbox 360 and Nintendo DS this fall" and "Using the same engine as Call of Duty 4..." Did we miss something here??? That has happened before, but we can't test with what is not available.

We are not going to spend the resources on using games as gameplay evaluation tools that basically tell us nothing. Remember, we are trying to find out the overall gaming value of a video card in these articles, not point out specific settings for specific games, although we do articles on that as well, and they are quite popular.

So it comes down to this from a business perspective: why should I pay editors more money to cover games that will have no impact on our conclusions? If we can show you that a $200 video card played the very recently released and GPU-heavy Assassin's Creed at 2560 resolution, what are the above games going to tell us? At some point we have to give the power of conclusion back to our readers to make the decision that is right for them. In fact, that is all we have ever tried to do. While this sounds smartass, hopefully it will spell out my point: "If it plays Conan at 1600x1200, it will play Quake 3 just fine." That is extreme, but the analogy stands.

The last thing that comes into this is game popularity. While not the most hard-and-fast rule, we do try to stick to games that are popular and selling well, so that our coverage stays pertinent to our readers. Yes, you can find exceptions to this in our past, and I can give you the reasons why if you need to know.

And let me follow up by saying, we LOVE doing the game specific articles like we have done in the past year. We will do more. But they are almost "fill-in" content for when the review schedule dies down a bit.
 
Hey Kyle,

Any chance you can downgrade Assassin's Creed and run a DX10.1 comparison? I'd like to see whether DX10.1 hardware has matured any since the 3870.
 
It seems off to me as well. AMD was just in-house giving a huge presentation on their upcoming product about a month ago, and they kept saying how much more efficient this card is compared to NVIDIA's.
 
Seems neither the 4800 nor the GTX 200 series could be called energy misers, so I'd call it a wash there. Guess there's only so much they can do to conserve power and still deliver performance.
 
So those power consumption numbers were with FurMark and something like Prime/Orthos loading the CPU? Or was it just with the video card loaded?
 
It seems off to me as well. AMD was just in-house giving a huge presentation on their upcoming product about a month ago, and they kept saying how much more efficient this card is compared to NVIDIA's.

I'm with you guys. Something isn't right when everybody else is getting much lower power consumption numbers. Also, if that's what the 4870 actually consumes, then that's ridiculous compared to the GTX 280. And most other reviews show higher temps for the 4800 series. IMO, the 4800 series fan issues are enough to turn me off for the time being.
 

I can't answer for any other sites testing. I have explained in detail what we did and that we think power testing needs to be done again. I shared the numbers with AMD before we published and AMD had no issue with them.

You might ask those other sites if they took a day to do nothing but power-test cards on the exact same system. We just report our experience, and quite frankly I don't read other reviews.
 
I have a question about "Idle Power" from the review.

Are the values taken in the BIOS, or after the computer has booted and the drivers have loaded?

Reason I ask is that currently, if I take idle while in the BIOS, my ATI HD 3870 is on 3D clocks and my Kill-A-Watt reports ~160w. If I take idle after Vista64 has loaded, the card is in 2D clock mode and idle consumption is ~120w. (No CPU clock changes, i.e. SpeedStep?)

Thanks,
 
Alright, cool man...

lol, not COD: WaW - I misspoke, I meant World in Conflict.

Here is our write-up on WIC from about 9-10 months ago.

http://enthusiast.hardocp.com/article.html?art=MTM5OSwxLCxoZW50aHVzaWFzdA==

Gameplay and using WIC to stress the GPU are two very different things.

That doesn't mean that those of you with a 512 MB NVIDIA GeForce 7900 GTX or ATI Radeon X1950 XTX are in the clear. Not by a long shot. You will need a GPU that can perform at least on par with the GeForce 8800 GTS GPU, coupled with at least 512 MB of memory if you want to enjoy World in Conflict with "Very High" settings. If you want maximum settings, you have two options right now: a GeForce 8800 GTX or a GeForce 8800 Ultra.
 
Those new power draw numbers have me worried. I've just bought an 850w Antec modular PSU rated for SLI/Xfire, and it's powering the quad-core system I'm putting two 4870s into when they arrive today.

With power draws of around 350w each, that's 700w just for the cards... that sounds a bit close to me. I was under the impression from many other reviews that power draw on these cards was a bit lower than what you're reporting.
 
Oh, I only had a chance to glance over the chart this morning before work; I didn't read it that carefully, just picked out the top two rows for 4870 power draw.

Ooops :eek:
 
My Q6600 OC'd to 3.4GHz, 4GB of DDR2, and two 4870s draw about 470W max under full load.
 
So the CF 4870 wattage alone would be 458w - 109w (system) = 349w x .8 (620HX 80% efficient) = 279.2w
That right?
 
From my experience (and this may be a bit forward-looking as well), if I were buying a PSU today to last me the next two years and was looking at CF (even up to 4x 4870), I would go with a GOOD 800w PSU. Let's say hypothetically that I had 4x 4870 and was stress-testing them overnight with the above setup. I would guess you would not see spikes above 725w (at the wall). So 725 x .8 = 580w of PSU power draw, and 580 x 1.25 = 725w (my PSU full load + 25% is my personal formula). An 800w unit should give you some solid room, if it will actually do the rated 800w. Shitty PSUs need not apply.
 
Oh, and you can poo-poo my power numbers all you want as being high compared to Site X, but I KNOW what I saw happening over a series of tests lasting more than a day. Maybe I got some cards that had higher leakage than others. But rest assured that we are going to lead you to have a bit too much power rather than not enough. Introducing you to problems is not our business. We have done this for a decade to help folks out. Take it for what you want, but that little red X is always in the corner of your browser and will cure whatever [H] ails you.
 
So the CF 4870 wattage alone would be 458w - 109w (system) = 349w x .8 (620HX 80% efficient) = 279.2w
That right?


Think about it in terms of total system wattage, like we have shown you at the wall. With an 80% efficient PSU, that makes it 458w x .8 = 366w that the PSU is actually outputting. If you look at the actual efficiency numbers HERE in the ES Review, you will see that the ES-800's efficiency is actually a little higher at that draw. So there is a bit of a safety zone built in there too.

NOW, keep in mind that we are not fully loading the CPU's 4 cores when we show you the 458w GPU load number. I would suggest a fully loaded stock CPU is going to pull about 40w more (worst-case scenario, and keep in mind that OCing will increase this dramatically). So let's take a 500w at-the-wall number for you. 500w x .8 (keeping in mind you buy a quality power supply shown to actually do 80% under load) = 400w of load on your PSU. My personal rule is actual power x 1.25, IF you are not planning to add any load to the PSU, which is likely not going to happen knowing you guys. So anyway, you end up back at a 500w rating for your PSU with this system under full CPU and GPU load.

So really, the way I see it, the at-the-wall numbers show you what you NEED in terms of PSU rating (again, for a GOOD PSU). Add 25% to that number for future expansion and you have 625w.

Get you a 600w PSU that we have reviewed and awarded and you are likely going to be golden for a while, and not running the thing under stress or "red-lined" for extended periods of time. Then you will almost assuredly KNOW that when you have an issue, it is not because of a lack of good, clean power. And keep in mind we did not spend thousands and thousands of dollars on PSU test equipment to give you useless information. You can read the conclusion of any of those reviews, even if you have no knowledge of electricity, and come away knowing what you are buying. IMO, anyway. ;)
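As a minimal sketch of the rule of thumb above (plain Python; the 0.80 efficiency and 1.25 headroom factors are the ones from these posts, everything else is arithmetic):

    # PSU sizing rule of thumb, per the posts above:
    #   DC load    = at-the-wall watts x PSU efficiency
    #   PSU rating = DC load x 1.25 (25% headroom)
    def recommended_psu_rating(wall_watts, efficiency=0.80, headroom=1.25):
        dc_load = wall_watts * efficiency  # what the PSU actually outputs
        return dc_load * headroom          # margin for spikes/expansion

    print(recommended_psu_rating(500))  # 500.0 - the CF 4870 system above
    print(recommended_psu_rating(725))  # 725.0 - the hypothetical 4x 4870 case

The more conservative reading in the same post - treating the at-the-wall number itself as the required rating and adding 25% - lands at 625w for the 500w case; either way, a good 600-800w unit covers these builds.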
 
At the rate power connectors are changing, too - Molex, to multiple Molex, then to 6-pin PCIe and now 8-pin PCIe - chances are you'll need a new PSU for the new connector types alone before the power values become a problem. My 8800GTX takes 2x 6-pin, and one generation later the 280 takes 1x 8-pin and 1x 6-pin :/
 
Looks interesting. But now I'm torn between the GTX 260 and the 4870. I wanted to buy the ATI card (mainly for use in Age of Conan), but now I'm not so sure.
Any advice?
 
Support AMD! They need you! :p

Also, you may want to consider how big your case is - I know the GTX 260 is bigger than the 4870 by a good bit.
 