So they have an audio input? I looked at pics and didn't see it.
Actually, you need to connect a 2-wire cable to route an S/PDIF signal from your motherboard to the card. This is a convenience if your MB has an S/PDIF pin header for digital audio and you want to connect your vid card to an HDMI A/V receiver (or use the speakers in your HDTV). The main difference between Dual-Link DVI and HDMI is that HDMI carries S/PDIF audio along with the digital video, all in one cable. Several of the newer vid cards offer the option. Keep in mind this is just a pass-through for a signal that your MB has to generate; if you don't have S/PDIF, or if your S/PDIF is optical, you'll just have to run your audio down a separate cable.

They don't - they just route it via the PCI-E interface, I believe. You do have to enable it via your OS, but it's very simple to enable and disable.
OK, what you said above is totally wrong re: the 4870. The 4870 has its own sound processor on it.
If your MB only has S/PDIF on the I/O shield, then routing it back inside the case is kind of silly - just crack open a DVI-to-HDMI adapter and solder an RCA plug onto the appropriate pins. I should sacrifice a DVI-to-HDMI adapter and post a mod for that.
In practice though, even folks using these cards in DVR rigs are probably connecting the audio to the receiver and the HDMI directly to the TV, right? I'm thinking HDMI-switching A/V receivers are still in the minority, and most people still have component-switching A/V receivers. Besides, with the HDCP encryption mess, it takes forever to switch over to the input from a PC when connected directly to an HDTV... about 8-10 seconds with my Aquos. I couldn't imagine waiting for HDCP to negotiate through the A/V switch and then the HDTV.
/hijack
I see. This works with a sound card or just onboard sound?
DOH! OK, I'm a dumbass - I checked the card specs and it has an 8-channel decoder on it - pretty spiffy. So yeah, that looks like a PCIe-addressable device as far as the host computer is concerned.
What you said above applies to some of the nvidia cards with audio pass-through.
That's right. Windows will recognize an "HDMI Audio device".
I would like to suggest not using 2560x1600 resolution on midrange graphics cards, or at least trying 8x AA settings first. That resolution is kind of pointless on these cards. IMO, a person with a 30" monitor is not going to buy a $200 video card for gaming. Plus, I would rather use my 37" or 50" 1080p HDTV instead (so 1920x1080).
So please try some higher AA settings before jumping above 1920x1200. Yeah, sure it's cool to know the video card can handle that ultra-high resolution, but I can't see many using it. It's just not a practical gaming resolution.
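For what it's worth, the raw pixel counts back this up. A quick sketch (plain arithmetic, nothing assumed beyond the resolutions named above):

```python
# Pixel counts for the resolutions discussed above.
resolutions = {
    "2560x1600": 2560 * 1600,
    "1920x1200": 1920 * 1200,
    "1920x1080": 1920 * 1080,
}

for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")

# 2560x1600 pushes roughly 78% more pixels than 1920x1200,
# and nearly double 1080p - a big jump for a midrange card.
ratio = resolutions["2560x1600"] / resolutions["1920x1200"]
print(f"2560x1600 vs 1920x1200: {ratio:.2f}x the pixels")
```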
BTW, I loved [H] reviews, so this is just nitpicking criticism.
So what's taking the power calculations so long? I'd like to see how much heat two 4870s in CrossFire dump, so I can see if they would be a better fit for my new cooling loop than two GTX 260s in SLI. Everywhere else I've looked, other sites show the total system idle and load, which doesn't tell me how much to subtract for the system alone like [H] does.
Nice review overall, but I would have liked to see more games tested.
I know this is overkill, and I do understand why you tested the way you did, but I would like to have seen some of these:
COD4
UT3
BioShock
World at War
You know, games that were somewhat hard on last-gen hardware.
And a lot of us are gamers stuck at 1680x1050 or 1440x900, so those resolutions would have been nice to see with everything maxed out IQ-wise.
It seems off to me as well. AMD was just in house giving a huge presentation on their upcoming product about a month ago and they kept saying how much more efficient this card is compared to Nvidia.
I'm with you guys - something isn't right when everybody else is getting much lower power consumption numbers. Also, if that's what the 4870 actually consumes, then that is ridiculous compared to the GTX 280. And most other reviews show higher temps for the 4800 series. IMO the 4800 series fan issues are enough to turn me off for the time being.

Something is off.
http://enthusiast.hardocp.com/article.html?art=MTUyNCw4LCxoZW50aHVzaWFzdA==
versus
http://techreport.com/articles.x/14990/15
versus
http://www.legitreviews.com/article/734/18/
versus
http://arstechnica.com/reviews/hardware/ati-4800-series-review.ars/7
versus
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=22
versus
http://www.pcper.com/article.php?aid=581&type=expert&pid=12
versus
http://www.tweaktown.com/reviews/1481/13/sapphire_radeon_hd_4870_in_crossfire/index.html
versus
http://hothardware.com/Articles/ATI_Radeon_HD_4850_and_4870_RV770_Has_Arrived/?page=12
In every case I've seen but [H]'s, the 4870 uses less power than the GTX 280 under load, by approx 20-30W. What gives?
Alright, cool, man.
lol, not COD: WaW - I misspoke, I meant World in Conflict.
That doesn't mean that those of you with a 512 MB NVIDIA GeForce 7900 GTX or ATI Radeon X1950 XTX are in the clear. Not by a long shot. You will need a GPU that can perform at least on par with the GeForce 8800 GTS GPU, coupled with at least 512 MB of memory if you want to enjoy World in Conflict with "Very High" settings. If you want maximum settings, you have two options right now: a GeForce 8800 GTX or a GeForce 8800 Ultra.
So the CF 4870 wattage alone would be 458W - 109W (system) = 349W at the wall, x 0.8 (620HX, ~80% efficient) = 279.2W.
That right?
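A quick sanity check of that arithmetic (a sketch only; the 458W and 109W figures and the 80% efficiency for the 620HX are the assumptions from the post above, not measured values):

```python
# Estimate power drawn by the CrossFire 4870 pair alone.
# Assumed inputs (from the post above): total load at the wall,
# system-only draw, and ~80% PSU efficiency for the Corsair 620HX.
total_wall_w = 458.0   # whole system under load, measured at the wall
system_wall_w = 109.0  # system draw without the GPUs loaded
psu_efficiency = 0.80  # assumed efficiency of the 620HX

gpu_wall_w = total_wall_w - system_wall_w   # draw attributable to the cards, at the wall
gpu_dc_w = gpu_wall_w * psu_efficiency      # after backing out PSU conversion losses

print(f"GPU draw at the wall: {gpu_wall_w:.1f} W")   # 349.0 W
print(f"GPU draw after PSU losses: {gpu_dc_w:.1f} W")  # 279.2 W
```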
Looks interesting. But now I'm torn between the GTX 260 and the 4870. I wanted to buy the ATI card (for use in Age of Conan, mainly), but now I'm not so sure.
Any advice?