ATI Radeon HD 3800 Series @ [H]

^ I'd like to know also... I'm not sure if I'll have to upgrade my PSU when I get a 3800.
 
Watts = volts*amps, but I'm pretty sure that video cards draw from both the 5V and 12V rails of the PSU, so there'd be no way to tell unless you knew how much wattage each line drew. Also, that 450-550W recommendation covers the video card AND whatever AMD/ATI decide constitutes a standard mobo, CPU, disk drives, etc. At 55nm, I would hope it wouldn't draw too much, but then again, it is still the son of the beast that was the 2900 XT.
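Just to put that watts = volts*amps point in concrete terms, here's a minimal sketch; the per-rail current figures are made-up placeholders, not measurements of any 3800-series card:

# Sum watts = volts * amps across the rails a card might pull from.
# The current figures below are illustrative guesses only.
rails = {"+12V": (12.0, 4.5), "+5V": (5.0, 1.0), "+3.3V": (3.3, 0.5)}  # (volts, amps)
total_watts = sum(volts * amps for volts, amps in rails.values())
print(total_watts)  # 12*4.5 + 5*1.0 + 3.3*0.5 = 60.65 W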

I'm sure someone out there with a 38x0 and a multimeter would love to tell us :)
 
Keep in mind also that the video card will be drawing power through the PCIe slot as well as through the separate power connector. The power it gets from the PCIe slot is pretty close to impossible to measure, unfortunately.

Probably the easiest way to measure the total power draw is to test it on a motherboard with integrated graphics. Measure the power draw (at the wall) while idle without the graphics card, then while doing something CPU-heavy (again, without the graphics card), then measure idle with the graphics card, and finally measure with a GPU-stressing load. That'll give you rough idle and load power figures for the video card.
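To make that procedure concrete, here's the arithmetic as a rough sketch; the wall-meter readings are hypothetical numbers (and deltas taken at the wall still include PSU efficiency losses), so treat the results as estimates:

# Four hypothetical at-the-wall readings, in watts.
idle_no_card   = 95    # integrated graphics only, desktop idle
load_no_card   = 160   # integrated graphics only, CPU-heavy load
idle_with_card = 120   # discrete card installed, desktop idle
load_with_card = 250   # discrete card installed, GPU-stressing load

card_idle_draw = idle_with_card - idle_no_card   # ~25 W in this made-up case
card_load_draw = load_with_card - load_no_card   # ~90 W in this made-up case
print(card_idle_draw, card_load_draw)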
 
'Just got some interesting info from The Stilt at MuroBBS.
He's been testing an HD 3870 and, like pretty much all reviewers who have OC'd the card, reached ~860MHz.
He said that ATI has already released a new BIOS to fix an issue with the current/review cards - they have the PLL VIO divider set wrong, which prevents the cards from reaching over 862MHz on the core. Luckily, most of the retail cards should carry the new BIOS.
He promised to report back tomorrow on OC results with the new BIOS.'

http://forum.beyond3d.com/showpost.p...postcount=1555
 
I can strip down to one optical drive and one HD if I have to, so what I have might be enough. Chieftec 400W, 29 amps on the +12V rails, and the PSU has a six-pin PCIe connector as well, which is supposed to deliver up to 75W. (400AA on http://www.chieftec.com/smart-power.html )
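Just as a quick sanity check on those numbers (the 29 A figure comes from the spec above; 75 W for the x16 slot and 75 W for a single 6-pin connector are the PCIe limits):

twelve_volt_watts = 29 * 12   # 348 W theoretically available across the +12V rails
pcie_slot_limit   = 75        # W, maximum a PCIe x16 slot supplies per spec
pcie_6pin_limit   = 75        # W, maximum a single 6-pin PCIe connector is rated for
print(twelve_volt_watts, pcie_slot_limit + pcie_6pin_limit)  # 348 W vs. 150 W max to the card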

Either I pick up a PSU or a 3870 by the end of the month. I'm leaning towards a 3870 so I can rest on the graphical front until I've upgraded everything else and then simply pick up another 3870, and then much later a third, a fourth. <3


I also found this via that B3D forum link: http://www.computerbase.de/artikel/hardware/grafikkarten/2007/test_ati_radeon_hd_3870/28/

These charts show Nvidia parts strongest in 4xAA/16xAF mode, and ATI parts beating them very nicely in 8xAA/16xAF mode, even at the larger resolutions. Does that make the ATI cards officially the stronger, better-performing cards, since AA/AF scaling is all it's about according to the ATI haters? I'd like to see some tests done by people with access to both the 8800 GT and the 3870 to see if this is correct. Did everybody miss out on this because tests never went as high as 8xAA? Is this why Nvidia specifically asked for tests to be run at 4xAA and nothing higher?
 
These charts show Nvidia parts strongest in 4xAA/16xAF mode, and ATI parts beating them very nicely in 8xAA/16xAF mode, even at the larger resolutions. Does that make the ATI cards officially the stronger, better-performing cards, since AA/AF scaling is all it's about according to the ATI haters? I'd like to see some tests done by people with access to both the 8800 GT and the 3870 to see if this is correct. Did everybody miss out on this because tests never went as high as 8xAA? Is this why Nvidia specifically asked for tests to be run at 4xAA and nothing higher?
I'm not convinced that the visual difference between 4xAA and 8xAA is great enough to carry much weight in this discussion, though. [H] have shown some image comparisons, and I'm hard pressed to see much of a difference when they're side-by-side. I can only imagine that in the context of a game it would be even more subtle. Especially as you scale beyond 1600x1200.
 
Hmm, even though the card does get a higher FPS count at 8xAA than at 4xAA, the difference is very small; could that just come down to experimental/measurement error? (whatever the right term for that is)
 
Those numbers are just in relation to each other... hovering over different video cards gives you a percentage value and not much else. The 3870 might be a step in front at 8/16 at that insane resolution, which is certainly possible given how it might handle AA and AF duties, but it's probably something like 20 FPS versus 18 FPS and not really playable either way.
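To illustrate that point with made-up numbers (these are not measured results, just an example of why a percentage lead can be meaningless at unplayable framerates):

# Hypothetical framerates at some very high resolution with 8xAA/16xAF.
fps_3870, fps_8800gt = 20.0, 18.0
lead_percent = (fps_3870 / fps_8800gt - 1) * 100
print(f"{lead_percent:.0f}% ahead, but neither is playable")  # ~11% ahead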
 
lolwut?

That chart shows the 3870 getting MORE FPS at 8xAA than it did at 4xAA. Instantly discredited.

Actually, if you run 3DMark06 with the 3870 vs. the 8800 GT, you'll notice that the 3870 actually beats the 8800 GT starting at 8xAA, which is weird to say the least, so this isn't pulled out of nowhere: 8xAA seems to be the magic spot for the 3870.
 
3dmark doesn't count

It does, but it's only a benchmarking tool, i.e. one of many programs for comparing different graphics cards. I agree, though, that it's only a benchmark and performance in that program doesn't necessarily reflect performance in games.
 
I know it's cool for someone who joined the forums two whole weeks ago to smugly attack all things [H]ard when their previous posts demonstrate that they haven't yet climbed the learning curve and become [H]ard themselves, but it's only confusing to people who haven't learned better yet and/or aren't paying sufficient attention.

[H] didn't change from benchmarks and bar graphs just to offer something different and differentiate themselves in the marketplace. They did it because they recognized that apples-to-apples doesn't really tell the potential buyer what they need to know. If it doesn't tell you what you need to know, it is of no value. If it is of no value, why should they waste any time (and the money they pay their staff) doing it? It's a paradigm shift. Like all paradigm shifts, it's slow to catch on and hard to wrap your head around. Like all paradigm shifts, it's going to be "obvious" in hindsight as the necessary course of action one day. Welcome to the future.
Hmm, I checked, you've only been here 3 years, you don't know what [H]ard is :rolleyes:

I am a potential customer. Apples-to-apples reviews tell me exactly what I need to know. Who are you to say they are of no value? As stated, 'real-world' in this case is real-world for someone who has that particular setup? So someone is telling us what real-world is? The way HardOCP reviews cards is fine, but it is not the be-all and end-all of reviews. Ten readers with ten different processors. Apples-to-apples might not tell him what each would get in-game, but it'll tell him Card 1 may be better than Card 2. A [H] review will tell him real-world results... if he happens to have a similar setup.

I read [H]'s review but skip the results. That's just me. Don't knock either format; they are both of value.
 
Hmm, I checked, you've only been here 3 years, you don't know what [H]ard is :rolleyes:

I am a potential customer. Apples-to-apples reviews tell me exactly what I need to know. Who are you to say they are of no value? As stated, 'real-world' in this case is real-world for someone who has that particular setup? So someone is telling us what real-world is? The way HardOCP reviews cards is fine, but it is not the be-all and end-all of reviews. Ten readers with ten different processors. Apples-to-apples might not tell him what each would get in-game, but it'll tell him Card 1 may be better than Card 2. A [H] review will tell him real-world results... if he happens to have a similar setup.

I read [H]'s review but skip the results. That's just me. Don't knock either format; they are both of value.

Sigh. Okay, sporto, let's just try to look at that little rant with our brains turned on. The crux of it is your statement, "Apples-to-apples might not tell him what each would get in-game, but it'll tell him Card 1 may be better than Card 2." If the test doesn't tell you what you'll get in-game, how can it tell you which card is better?

Except in large chunks, FPS are invisible to the gamer while gaming. The only visible measure of a card's value is the quality of the image it provides. Anything else is just "keeping score." I don't want to pay real money for imaginary "wins." I want to pay it for results I can see.

So, you need to change your statement, "Apples-to-apples tells me what I need to know" to "Apples-to-apples tells me what I want to know." As a fan of technology, you want to see a competition, read some numbers, and tally up the score. As a consumer, you need to know what real value is and where you can buy it for how much.

The reference to length of membership is relevant to me when someone is brand new and hasn't yet grasped the philosophy of this site. I wouldn't consider it as relevant for someone who had been active here for a year or more. Presumably they've been around enough to "get it." How do you think it speaks well of you to have been here twice as long as me and still not "get it," leaving aside the fact that I was reading [H] since 1999 even though I didn't sign up until later?
 
Single cards are working fine and have always worked fine. I don't get why anyone has a problem with a company making it so their platform can use multiples of the same card; just because the multi-card setup gives the best playable settings doesn't mean the single-card solution won't be a HUGE upgrade on its own, because that isn't the case at all.

I don't remember this argument coming up with multi-CPU platforms.
 
How much power can you get out of a regular four-pin Molex cable?


Reading Ars Technica's 3800 series article:
ATI has also confirmed that the HD 3870 doesn't actually need a dual-slot cooler; Asus has plans to launch a single-slot card later in the year.
 
Single cards are working fine and have always worked fine. I don't get why anyone has a problem with a company making it so their platform can use multiples of the same card; just because the multi-card setup gives the best playable settings doesn't mean the single-card solution won't be a HUGE upgrade on its own, because that isn't the case at all.

I don't remember this argument coming up with multi-CPU platforms.

I have a cousin who once literally cried because she got a large ICEE when she just wanted a small.
 
I don't get why anyone has a problem with a company making it so their platform can use multiples of the same card; just because the multi-card setup gives the best playable settings doesn't mean the single-card solution won't be a HUGE upgrade on its own, because that isn't the case at all.

I don't remember this argument coming up with multi-CPU platforms.

Multi-CPU platforms were hardly even discussed. If anyone had tried to push a quad-CPU system at home users, they would have been lampooned. Multi-core is a different story since there are essentially no downsides.

Back to graphics. SLI/Crossfire is a minority pursuit even among gamers, but I guess I can see why some would go for dual cards sometimes. I wouldn't touch it with a ten-foot pole, but I can see some going for it and it actually making some sense in a few obscure cases.

Triples and quads are just plain idiocy. You need a huge power supply to run them, with the associated increase in the electric bill. You have to remove all the extra heat they create, which means high noise. Then you get to the expense: the motherboard and power supply both cost more to support this, and buying four cards is a huge expense in itself, so much so that buyers probably won't even get top-end cards; even so, they'll be spending $800 to $1000 on video cards that still likely only have 512MB of RAM each. Also, unless they continually upgrade their quad setup, it will likely fall behind a single high-end card within a few months.

IMO you would really have to be a moron to buy into a quad-card setup. We will have to wait and see how large the market is for geek-morons with buckets of cash. I am betting it is pretty insignificant.
 
IMO you would really have to be a moron to buy into a quad-card setup. We will have to wait and see how large the market is for geek-morons with buckets of cash. I am betting it is pretty insignificant.

I dunno, how many PS3 owners are out there still? /jab
 
I dunno, how many PS3 owners are out there still? /jab

More every day, I assume.

$400 to play games on a PS3.
vs.
$2000+ for a quad-graphics-card monster that will significantly impact your power bill and heat up your room, to play games.

Seems like they're oceans apart on the moron scale. Now, if someone were to duct-tape together 4 PS3s in an attempt to make games run faster, we might have an idiot contest, but PS3 vs. quad graphics cards is no contest.
 
$2000+ for a quad-graphics-card monster that will significantly impact your power bill and heat up your room, to play games.

Or save on your bill 'cause you won't need to use your house's heating system during the winter. :p
 
Snowdog said:
Multi-CPU platforms were hardly even discussed. If anyone had tried to push a quad-CPU system at home users, they would have been lampooned. Multi-core is a different story since there are essentially no downsides.

I think you're forgetting the early gamer PCs released by eager companies, especially when Alienware got into the game and started selling dual-CPU P3 systems, which yielded zero gains in anything the PC was actually intended for; they also cut back on the GPU to fit two CPUs in the case.

It was a smaller niche than multi-GPU is, but it was still out there, and for a while people did want to run multi-CPU platforms as their "ultimate" gaming solution, whereas it wasn't until just recently that games started being released that could actually take advantage of that.

I agree with you that multi card solutions are lame to purchase, but it doesn't bother me in the slightest that they support it.
 
Actually, if you run 3DMark06 with the 3870 vs. the 8800 GT, you'll notice that the 3870 actually beats the 8800 GT starting at 8xAA, which is weird to say the least, so this isn't pulled out of nowhere: 8xAA seems to be the magic spot for the 3870.
Sounds to me like, once again, ATI spent too much time getting the 2900 XT its memory bandwidth, which ends up being wasted; they halve everything for the 3800 series and it still has too much bandwidth. I guess this is forward-looking, but only time will tell whether or not the additional memory bandwidth will ever be of any use.
 
More every day, I assume.

$400 to play games on a PS3.
vs.
$2000+ for a quad-graphics-card monster that will significantly impact your power bill and heat up your room, to play games.

Seems like they're oceans apart on the moron scale. Now, if someone were to duct-tape together 4 PS3s in an attempt to make games run faster, we might have an idiot contest, but PS3 vs. quad graphics cards is no contest.

Touché...
 
ATI Radeon HD 3800 Series - The ATI Radeon HD 3800 series graphics cards are here. We will explore performance in Crysis, Unreal Tournament 3, NFS: Pro Street, and TimeShift. We can now recommend two ATI graphics cards for your gaming needs.



Please Digg to share! Thanks.


These "best playable framerate" [H] reviews continue to confuse end users. If equal comparisons of GPU's were made AND "best playable" it would be fine, but as it stands now.....

http://www.incrysis.com/forums/viewtopic.php?id=13293
 
Why are people still arguing about this? There are 1290871324 other sites out there that do direct, apples to apples comparisons. Read those reviews.
 
What [H] needs to do is run an analysis on the forums and see if, in each review discussion thread, it's the same people bitching or not (maybe based on posting IP address) and then ban the dumbasses, so that those of us who understand the methodology can actually have a meaningful conversation about the product. :D
 
What [H] needs to do is run an analysis on the forums and see if, in each review discussion thread, it's the same people bitching or not (maybe based on posting IP address) and then ban the dumbasses, so that those of us who understand the methodology can actually have a meaningful conversation about the product. :D

A post worth celebrating on this Cyber Monday. God bench us, every one... :cool:
 
I had a 7950 GT 256. I just bought a 3870, got it local for $240; it was like $260 after the card and a couple cans of Bawls. I personally was expecting better frames in CoD4 at 1440x900. I mean, I can play everything on high and get 37 FPS at the lowest, but I was kinda hoping for a constant 60. The card for sure outperforms my old card. My friends say I'm crazy because the human eye can't see higher than 30 FPS. I'm pleased with the card but was just expecting a little more. I'm hoping they release more drivers. I don't have Crysis and can't afford it now due to my recent card purchase, but I'd like to see how this card performs in it in person. I'm downloading 3DMark right now to see what I can get out of it.
 