ATI Radeon HD 3800 Series @ [H]

You could always download the Crysis demo to test out your rig... and if you have a CrossFire motherboard, you can buy another 3870 in a month or so and double your speed, adding some longevity to your current rig... which is what I plan on doing :cool:
 
Not this debate again...some trained eyes can spot differences at 200fps, so there's no real max number that everyone sees.

Also, in games, you can always tell the difference between 30 and 60fps if you simply swing your view around or if objects spin/move in front of you... therefore, the more fps, the better.
 
Thanks, yeah, I'm downloading the demo now; in about an hour I'll be able to install it. I swear I had this demo on one of my hard drives... oh well, it's lost somewhere. I can only tell the difference between 30 and 40 fps; anything beyond 40 I can't tell apart. I just spent an hour testing this in CoD4.
 
The center of your eye cannot see movement well.

The outer section of your eye, the perimeter of your retina, cannot see detail and cannot see colour well at all, but it can see high-speed motion superbly. That is why you haven't been crushed by a car yet, and why you can actually play a sport like boxing.

The 75Hz thing is for images focused on and viewed in detail mode. The rest of your vision is pissed off at that refresh rate.
 
Remember that movies are projected at 24fps and they don't look all that bad.

My personal lowest acceptable frame rate is around 25 fps or so. I honestly can't tell the difference between frame rates >40fps with an analog monitor set to 120Hz refresh rate.

And no, I don't play games looking from the side of my field of view, I tend to stare straight at the monitor.
 
Not to beat a dead horse to death again for the 50th time, but obviously this is a highly subjective topic and seems to differ from person to person. There are definitely diminishing returns on framerates above 50-60 fps, but I seem to remember articles stating that humans could in fact notice a *difference* (not necessarily better or worse) between much higher framerates. Whether this is peripheral vision or not, your brain could tell the difference, period.

As has been stated countless times before, movies != games. Films are shot on cameras with shutters, meaning they record images onto film by exposing it to light for a small fraction of a second. If objects move during that short exposure, the motion is recorded too, i.e., motion blur. This gives moving subjects much smoother transitions between frames, and conversely, if the camera moves, still objects get blurred, which makes panning shots look fluid to the viewer.

On the other hand, many computer games (especially ones made before the past year or so) render objects on a frame-by-frame basis. The motion an object went through to get from frame A to frame B is not recorded at all, only its final positions in frame A and frame B. We perceive this as motion because our brain connects the dots for us.
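To make that concrete, here's a tiny toy sketch in Python (purely illustrative, not how CoD4 or any real engine does it): one function samples a made-up moving object once per frame the way a game traditionally does, the other averages several sub-frame samples across an imaginary shutter interval the way a film camera effectively does. The object, its speed, and all the constants are invented for the example.

```python
# Toy comparison: instantaneous per-frame sampling (classic game rendering)
# vs. averaging over an open "shutter" (film-style motion blur).
# All numbers here are invented for illustration.

FRAME_TIME = 1.0 / 30.0   # one 30fps frame, in seconds
SHUTTER = 0.5             # fraction of the frame the shutter stays open
SUBSAMPLES = 8            # sub-frame samples blended together

def position(t):
    """Hypothetical object moving at a constant 300 units per second."""
    return 300.0 * t

def crisp_sample(frame):
    # Game-style: one instantaneous position per frame; any motion
    # between frame A and frame B is simply never recorded.
    return position(frame * FRAME_TIME)

def blurred_sample(frame):
    # Film-style: average the positions the object occupied while the
    # shutter was open, smearing the in-between motion into the frame.
    t0 = frame * FRAME_TIME
    step = SHUTTER * FRAME_TIME / (SUBSAMPLES - 1)
    samples = [position(t0 + i * step) for i in range(SUBSAMPLES)]
    return sum(samples) / len(samples)

for f in range(3):
    print(f"frame {f}: crisp={crisp_sample(f):.2f}  blurred={blurred_sample(f):.2f}")
```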

A simple example: go watch a live-action film and then a stop-motion film (I've always been a fan of the Wallace & Gromit movies). If you don't have any claymation around, just Google some flipbook animations; same concept. There's an obvious difference, and that difference is blurring.

New games are starting to use motion blur (including CoD4), and the result is one of the most pronounced changes to happen to graphics in a while. I would even go so far as to put it on par with anti-aliasing: games now appear smooth even at framerates that would be considered unacceptable in older games without motion blur, the same way lower resolutions are now more acceptable thanks to AA.

-.02
 
A well-written discussion of the issues.
 
A simple example: go watch a live-action film and then a stop-motion film (I've always been a fan of the Wallace & Gromit movies). If you don't have any claymation around, just Google some flipbook animations; same concept. There's an obvious difference, and that difference is blurring.
That is incorrect. The jerkiness in "claymation" or hand-drawn animation is due to inaccuracies in positioning the clay or redrawing perfectly uniformly from frame to frame, not the lack of "motion blur".

Ideally one should use a computer-animated movie, like any of the Pixar or DreamWorks Animation movies, as an example of how acceptable the smoothness of 24fps rendering is... but since they look smooth and non-jerky, you probably won't, because they don't support your hypothesis.

Note I'm not saying that 24fps is acceptable; I have already said that I can personally notice differences up to around 40fps and do not doubt that some golden-eyes can notice up to much higher rates. I'm just disagreeing with the example used to discredit using 24fps movies as a reference.
 
Ideally one should use a computer-animated movie, like any of the Pixar or DreamWorks Animation movies, as an example of how acceptable the smoothness of 24fps rendering is... but since they look smooth and non-jerky, you probably won't, because they don't support your hypothesis.
Actually, Pixar adds motion blur to their movies. Do a freeze-frame on an action scene (Sulley tobogganing down the hill in Monsters, Inc. comes to mind) and you'll see it. Silent.Sin has it right.
 
Been a long freaking wait, but I finally got hold of my 3870 in Sweden. It was larger and heavier than expected compared to my puny 2600XT, and it was so much cheaper than I had expected; I see 2600XT models for sale for the price of my 3870!

Performance is absolutely nuts for the price asked, the fan seems quieter than that of my 2600XT, and the auto-downclocking makes me feel like I'm saving the environment and my power bill, but most importantly:

400W Chieftec PSU, 29A on the +12V rails = IT RUNS PERFECTLY!
 
I noticed a lot of discussion about people having monster power supplies when running a 3870, so I thought I would toss this out:

InWin mATX case with 350W PSU
Foxconn G33M mATX mainboard
4GB 800MHz DDR2 RAM
Intel Core 2 Duo E6550 running at 3GHz
ATI 3870 card
Lite-On SATA DVD burner
Lite-On DVD reader
WDC Caviar RE2 500GB HDD
Audigy 2

So I decided to stress test this machine, and it seems to work just fine with no crashes; I can play games like UT3 for hours on end.

So why is there such a steep PSU recommendation? Is this just CYA or something?
 
So why is there such a steep PSU recommendation? Is this just CYA or something?
Pretty much. The problem is that most PSUs out there are not honestly rated. For example, you may be able to get 30A on the 5V rail (for 150W) and 21A on the 12V rail (for 252W), but not at the same time. The PSU manufacturer may nevertheless slap a "400W" sticker on it, regardless of the fact that the PSU can't actually deliver that much power at once.
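Just to put rough numbers on that example (a quick sketch; real units also have combined-rail limits that rarely make it onto the label):

```python
# Naive per-rail wattage from the example above: volts * amps per rail.
rails = {"5V": (5.0, 30.0), "12V": (12.0, 21.0)}  # (volts, amps)

per_rail = {name: v * a for name, (v, a) in rails.items()}
print(per_rail)                 # {'5V': 150.0, '12V': 252.0}
print(sum(per_rail.values()))   # 402.0 -- hence the "400W" sticker

# The catch: a cheap unit often can't hit both maximums at the same time,
# so its real continuous capacity is lower than the sticker implies.
```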
 
Pretty much. The problem is that most PSUs out there are not honestly rated. For example, you may be able to get 30A on the 5V rail (for 150W) and 21A on the 12V rail (for 252W), but not at the same time. The PSU manufacturer may nevertheless slap a "400W" sticker on it, regardless of the fact that the PSU can't actually deliver that much power at once.

That's why you buy properly reviewed PSUs or look into user experiences. ;)
 
Yeah, X1900XTX cards were pretty much the worst cards for stressing a PSU, with only one PCI-E power connector. The draw was very, very close to the limit of what a single PCI-E connector could handle, so the rest had to be pulled through the motherboard's PCI-E slot. I do not think there is another card that puts as much of a draw through a single slot or connector as that XTX! (Mind you, an 8800 or a 2900XT has a more distributed power draw thanks to an additional PCI-E power connector.) Just gotta remember when the XTXes first came out... only a handful of PSUs could handle those cards in Crossfire.
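For a rough sense of the numbers (back-of-envelope only: the 75W figures are the PCI-E spec limits for the slot and a 6-pin connector, and the ~120W card draw is an approximation, not a measurement):

```python
# Back-of-envelope power budget for a single-connector card like the X1900XTX.
SLOT_LIMIT_W   = 75.0    # max the PCI-E x16 slot is specified to supply
SIXPIN_LIMIT_W = 75.0    # max a 6-pin PCI-E connector is specified to supply
CARD_DRAW_W    = 120.0   # rough full-load draw; approximate, not measured

budget = SLOT_LIMIT_W + SIXPIN_LIMIT_W
print(f"budget ~{budget:.0f}W, draw ~{CARD_DRAW_W:.0f}W, "
      f"headroom ~{budget - CARD_DRAW_W:.0f}W")
# Split roughly evenly, each path carries ~60W of a 75W limit -- little
# margin left, which is why two of these in Crossfire strained weaker PSUs.
```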
 
The center of your eye cannot see movement well.

The outer section of your eye, the perimeter of your retina, cannot see detail and cannot see colour well at all, but it can see high-speed motion superbly. That is why you haven't been crushed by a car yet, and why you can actually play a sport like boxing.

The 75Hz thing is for images focused on and viewed in detail mode. The rest of your vision is pissed off at that refresh rate.

Is this why I can see my CRT flickering if I, uh, look at it in my peripheral vision?
 
Yeah, X1900XTX cards were pretty much the worst cards for stressing a PSU, with only one PCI-E power connector. The draw was very, very close to the limit of what a single PCI-E connector could handle, so the rest had to be pulled through the motherboard's PCI-E slot. I do not think there is another card that puts as much of a draw through a single slot or connector as that XTX! (Mind you, an 8800 or a 2900XT has a more distributed power draw thanks to an additional PCI-E power connector.) Just gotta remember when the XTXes first came out... only a handful of PSUs could handle those cards in Crossfire.

Odd, because I ran two X1900s, both flashed to XTX speeds and voltages in the BIOS, running with a 4400+ X2 OC'd to 2.8GHz, two 500GB and one 300GB SATA drives, 2 gigs of RAM, 7 fans plus two Accelero V2 coolers, 2 DVD drives, a sound card, and a bunch of USB devices plugged in.

Powered by a 535W Enermax Liberty. I ran that setup for over a year and a half.

No stability issues related to power, ever. Heat, on the other hand, could be a problem.

Granted, the 4400+ and RD670 chipset weren't power hogs like my current Q6600/X38 combo.
 
Has anyone done any folding with this GPU?

I have the X1900XT and was wondering if this would be an upgrade for that.
 
Has anyone done any folding with this GPU?

I have the X1900XT and was wondering if this would be an upgrade for that.

I don't think folding works on the 2k/3k series... well, at least it didn't a few months ago when I was still folding. Check the folding forums here for more info (DC forums).
 
Odd, because I ran two X1900s, both flashed to XTX speeds and voltages in the BIOS, running with a 4400+ X2 OC'd to 2.8GHz, two 500GB and one 300GB SATA drives, 2 gigs of RAM, 7 fans plus two Accelero V2 coolers, 2 DVD drives, a sound card, and a bunch of USB devices plugged in.

Powered by a 535W Enermax Liberty. I ran that setup for over a year and a half.

No stability issues related to power, ever. Heat, on the other hand, could be a problem.

Granted, the 4400+ and RD670 chipset weren't power hogs like my current Q6600/X38 combo.

Cool... no wonder Enermax PSUs are considered one of the best brands, right? As for me, I've had many PSUs fail with my XTX!!! Four, to be exact! I've even bought the most expensive brands from CompUSA and returned them due to unsatisfactory 12V droops (as low as 11.30V, even on a BFG SLI-certified 650W). Right now, I'm using a 300W PSU booster from FSP to supplement my PSU for worry-free, rock-stable voltages.
 
I picked up a HIS 3850 IceQ 3 512MB a couple of weeks ago, upgrading my Shuttle's video solution from the X1950 XT it had originally. This card is a step forward in many significant ways, most of all that the cooling is whisper-quiet. I never realized how dog-loud my older card was until I slapped in this little puppy.

It weighs probably half of what my X1950 XT does, but the cooling is definitely superior. My X1950 XT used to peak at 100C under full GPU load, but this card tops out at 70C. And it is a lot faster, too... with the system specs below I got 10568 in 3DMark06 (http://service.futuremark.com/compare?3dm06=4432612), up from a best run of around 6050.

And I paid $230 CDN from ncix.com for it, with express shipping such that it was on my doorstep literally the very next day after I ordered it. This latest series from DAAMIT is good stuff.
 
If only I still had a CRT, I would be trying this. What happens?

N/m, I just tried it and it didn't work. Maybe it only shows up at lower refresh rates or something. But I remember doing it under some circumstances where it totally broke up the illusion of a continuous image; everything flickered like mad.
 
How much power would one need to run two 3870s in CrossFire? I'm about to buy a new PSU for a build I'm doing, but I've never run a multi-GPU system, so about how much power do you think I need for two 3870s, an Asus Maximus Formula, a Q6600, 4x1GB of RAM, and 2 500GB hard drives?
 
Thanks, it takes a lot less to run a pair of these in CrossFire than I thought it would. Good to know.
Yeah, I too was pleasantly surprised by this. I'm currently running a single 1900XT with a Seasonic S12-550 Energy+ PSU. I'll definitely consider a 3870 now, and will maybe add another one in the future.
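If it helps anyone sizing a PSU for that kind of build, here's a rough, hedged budget; the per-part figures are approximate TDP/board-power numbers, not measurements, and actual draw at the wall is usually lower:

```python
# Rough worst-case power budget for a dual-3870 CrossFire build.
# Figures are approximate TDP/board-power values, not measured draw.
parts_w = {
    "HD 3870 x2 (CrossFire)": 2 * 105,  # ~105W board power each
    "Q6600 (95W TDP)":        95,
    "Motherboard + 4GB RAM":  50,
    "2x 3.5in hard drives":   20,
    "Fans, optical, misc":    30,
}
total = sum(parts_w.values())
print(f"worst case ~{total}W")                     # ~405W
print(f"with ~20% headroom ~{total / 0.8:.0f}W")   # ~506W
# A quality 550-600W unit with solid 12V amperage covers this comfortably,
# which matches the "less than I expected" reaction above.
```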
 
Have you guys noticed that the VisionTek HD 3870 has a new stock cooler on it? I just got my 3870 and I love it, and the new cooler is different from what *egg posted on their site.
 
Wanna really freak out? Try staring at it while you chew something really crunchy, like crushed ice...

Yeah, with the shitty cheap CRTs at my part-time job, if I eat my dinner in front of them the flicker is visible.

It could even happen on my 2070SB at 100Hz.

100Hz is not smooth; you need over 250 in my opinion to not notice flickering on a CRT. People are just lemmings and believe the "75Hz is more than your eyes can see" thing.

The very center of your retina can see a bit more than 75Hz at times, and the outer ring of your retina can see WAY more than 75Hz of movement. Out there your eye is like a frog's: action, black and white, twitch. Meant for combat, getting out of the way of things, etc.

tikara wins!
 
Does anyone know the best way to overclock these HD 3xxx's?

Auto-Tune is worthless for my card: since my overclocking ceiling is somehow capped at only 720/880, Auto-Tune just bounces off the limit even though I know I can go far, far higher. I've overclocked this card to 800/1050 with RivaTuner and not seen a single artifact, but the ATI drivers do not seem to agree with that program in some games, even when RivaTuner sets my clocks to stock.

I would flash my BIOS, but I can't find any appropriate ROMs for my card.
 