ATI Radeon HD 3800 Series @ [H]

Did he say TAKE? No, he said MAKE; there is a difference.

I do think he should delete his post and re-phrase it a bit more courteously.

I didn't mention it, but I also don't like to see the game dip below 60 FPS if I'm playing online, and 90% of the time that I'm playing a game it's multiplayer. Since a lot of people are running flat panel monitors these days, they won't be able to SEE more than 60 FPS anyway, because that's the refresh rate.

his essential argument is that [H] is making money by writing favorable reviews for gpu makers. whether he said take or make is academic, because his point comes across loud and clear.
 
When you review video cards, you have a lot of data. Understand? All HardOCP does is change the reference point of the data. You still draw your own conclusions, just differently. If this still baffles you, then.... I dunno.
Okay, no need for attacks. While I understand the concept of determining which card can play at higher settings while keeping decent framerates, it misleads others, as well as myself when I first saw it, into drawing false conclusions, because they look at the settings and the FPS and don't see the huge difference in the resolution.
 
Okay, no need for attacks. While I understand the concept of determining which card can play at higher settings while keeping decent framerates, it misleads others, as well as myself when I first saw it, into drawing false conclusions, because they look at the settings and the FPS and don't see the huge difference in the resolution.

If you call that a personal attack....may God have mercy on your soul.

Why is everyone so bent on apples to apples? The world isn't apples to apples. It never has been and never will be. Everyone and everything has strengths and weaknesses. The job of any good reviewer is to try to put the products at hand in the best possible light and then let the results speak. However, I guess it makes reviewing a lot easier when you don't have to think when you publish results. I guess thinking is a bad thing these days.

What I find completely ironic is that people keep posting/linking to review sites that claim apples to apples but never really show you what the whole apple is. They don't tell you the settings, which benchmarks, etc. Places like AnandTech that have well-crafted and pretty reviews do so at the penalty of limiting information.

If people who keep visiting sites keep wanting all reviews to be exactly the same and done exactly the same way...why have more than one review?
 
Okay, no need for attacks. While I understand the concept of determining which card can play at higher settings while keeping decent framerates, it misleads others, as well as myself when I first saw it, into drawing false conclusions, because they look at the settings and the FPS and don't see the huge difference in the resolution.


RTFA ;)

Please be aware we test our video cards a bit differently from what is the norm. We concentrate on examining the real-world gameplay that each video card provides. The Highest Playable section shows the best Image Quality delivered at a playable frame rate.

So all in all, the only people that get misled are the folks that don't read the information. There are plenty of other sites that will give you the cookie-cutter canned benchmarks you are wanting. We simply decided YEARS ago to evaluate video cards by the gaming experience they supply to the gamer rather than by declaring our opinion based on a timedemo that in many cases in no way represents gameplay.
 
i love how your review says avg. price of 8800GT is $293.00... that is utter BS! Was this review sponsored by ATI? You can go on EVGA's site right now and buy an in-stock, stock-clocked 8800GT for $269.00.

Also, did anyone miss the fact that the 8800GT is running at 1600x1200 with most settings on high? Drop the 8800GT to 1280x1024 and, from my personal testing, everything including shadows will be on high with 8x AF. Why don't you have an option for those numbers? (And I am referring just to Crysis; in the other games the 8800GT dominates the charts.)

I do have to say that the price points on the ATI cards are nice, but who really plays at 1280x1024 or 1024x768? If this is a next-gen card (DX10.1), why can't it play at 1680x1050 or 1600x1200 with decent settings and framerates? Making their "Mainstream" card $219.99 is a cop-out, and they did that because they know that they can't compete at all with the 8800GT.
 
The funny thing is, since before ATi was bought, people were saying that the R600 wasn't their saving grace, but the R700. Then AMD bought ATi, and now suddenly people are expecting ATi to "win" with the 38xx series using the R600 process. This is all fireworks until AMD releases their real competitor, the R700. Don't lose track so quickly, people. Kyle and Steve keep hinting not to count AMD/ATi out, and for good reason.

The 38xx are good cards and they definitely made a HIT, not a home run though. A home run brings in everyone on base to score. A hit just puts you in position to score. As semantic as this sounds, a home run is what the 8800 series is. Everyone is coming home to score: 8600GTS, 8800GTS, 8800GT, 8800GTX.

i love how your review says avg. price of 8800GT is $293.00... that is utter BS! Was this review sponsored by ATI? You can go on EVGA's site right now and buy an in-stock, stock-clocked 8800GT for $269.00.

I do agree with you partly, but with some cards going for as high as $309+, it really ruins the average, pulling it higher. But then again, I am one of those happy 8800GT owners who bought one for lower than MSRP when we had the chance. All I did was click the Fry's link a fellow Hardocp'er posted after reading H's awesome article on it. But the way they handled pricing in general in their latest article was not like them...

One request though, I would love to see the highest playable settings that a game can do at 1280x1024 resolution.
 
Uh, on1yalad, he wrote "average", not "cheapest". Average means the median between the lowest and highest of two values, see.
 
About what Kyle said earlier, does RTFA = read the f***ing article? Just want some clarification on that :p
 
i love how your review says avg. price of 8800GT is $293.00... that is utter BS! Was this review sponsored by ATI? You can go on EVGA's site right now and buy an in-stock, stock-clocked 8800GT for $269.00.

No, sorry:

"*Currently on Backorder*"
 
About what Kyle said earlier, does RTFA = read the f***ing article? Just want some clarification on that :p


QFT... and that one made me laugh!

Edit: Maybe I should say, quoted for what I believe is true... not sure anything else fits.
 
Uh, on1yalad, he wrote "average", not "cheapest". Average means the median between the lowest and highest of two values, see.

Reading comprehension 101 "i love how your review says avg. price of 8800GT is $293.00"

@ XamediX I understand that companies are selling it $75-150 over MSRP, but that isn't the card's fault; the price is skewed because of how "popular" it is. So why do they use the 38xx's MSRP?

Looking at Price/Performance the 8800GT is unchallenged @ 269 MSRP
 
Reading comprehension 101 "i love how your review says avg. price of 8800GT is $293.00"

@ XamediX I understand that companies are selling it $75-150 over MSRP, but that isn't the card's fault; the price is skewed because of how "popular" it is. So why do they use the 38xx's MSRP?

Looking at Price/Performance the 8800GT is unchallenged @ 269 MSRP

The difference is the MSRP on the 8800GT is NOT $269. It's $199-249. It's unchallenged at $249 and even at what I paid for it, $229. Don't confuse the manufacturer's suggested retail price with the retailers' prices. If the market puts the 3870 at $199, then that is an awesome deal. Naturally the 3870 should arrive there sooner or later, but we'll have to see.
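For anyone who wants to check the price/performance claim themselves, the arithmetic is just frames per dollar. The numbers below are placeholders, not benchmark results; plug in real street prices and the framerates from whichever review you trust:

```python
# Frames-per-dollar as a crude price/performance metric.
# Prices and FPS here are placeholders, not measurements.
cards = {
    "card_a": {"price": 250.0, "avg_fps": 60.0},
    "card_b": {"price": 200.0, "avg_fps": 45.0},
}

for name, c in cards.items():
    print(f"{name}: {c['avg_fps'] / c['price']:.3f} FPS per dollar")
```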
 
While change can be good, change just for the sake of change isn't always good. I don't mind the format, but a small section that shows the other card running at the same settings would be neat.

Like where you have the 3870 and all the settings, show the FPS of the 8800GT at those settings.

Then when you have the 8800GT and its best playable settings you can show the 3870 at the same settings.

Kind of like a combo of the [H]ard style and traditional.
 
While change can be good, change just for the sake of change isn't always good. I don't mind the format, but a small section that shows the other card running at the same settings would be neat.

Like where you have the 3870 and all the settings, show the FPS of the 8800GT at those settings.

Then when you have the 8800GT and its best playable settings you can show the 3870 at the same settings.

Kind of like a combo of the [H]ard style and traditional.

i agree
 
Uh, on1yalad, he wrote "average", not "cheapest". Average means the median between the lowest and highest of two values, see.

Median and average are two different things. The median is the datapoint in the middle of the sorted range, and the average (mean) is the sum of all the datapoints divided by how many there are.
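To put numbers on the difference (the listing prices below are made up, purely for illustration): a couple of overpriced listings pull the mean - the figure a review would quote as the "average price" - well above the median of the same set.

```python
import statistics

# Hypothetical street-price listings, not real data.
prices = [249, 259, 269, 279, 309, 369]

print(statistics.mean(prices))    # 289   -> a few gouged listings drag the mean up
print(statistics.median(prices))  # 274.0 -> the middle of the sorted list sits lower
```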
 
I seem to remember when the HD 2900 XT was released there was a slew of articles describing the specifications. One of the things that bothered me then was that they were quoting this "320 Stream Processors" figure, but really these are in ganged sets of five. The ALUs in a set can all work at the same time as long as they are doing different things; that's why the architecture is called "superscalar". And the problem is that it's hard to keep all of those "stream processors" working all the time.

What it really amounts to is 64 multi-function/parallel shader units. And that's why, even after a die shrink and a clock bump - but with less memory bandwidth - the HD 3800 series is better competition for the GTS line (with 96 stream processors) than anything else. I mean seriously, if AMD/ATI meant the same thing when they said "stream processors", then these cards (the HD 3800 series) would clearly outperform anything from Nvidia.

Can't anyone but me recognize a pile of "Male Bovine Excrement" before they step in it? I just catch a whiff, recognize it for what it is, and go around it...

And please don't jump to the conclusion that I'm an Nvidia Fanboi! I have an AMD system that isn't listed in my signature. Anyway that's my 2¢ worth, I'll stand by it, flame on if you want to. :p
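If it helps, here is that utilization point as back-of-the-envelope arithmetic. This is only a sketch with an invented fill rate, not measured data, but it shows how the paper spec shrinks when the five-wide units can't be kept full:

```python
# Toy model of a VLIW/"superscalar" shader array: each unit has several ALU
# slots, but only the slots the compiler can fill with independent work do
# anything useful on a given clock. The fill rate below is made up.
def effective_alus(units: int, slots_per_unit: int, avg_filled_slots: float) -> float:
    """ALUs doing useful work per clock."""
    return units * min(avg_filled_slots, slots_per_unit)

# "320 stream processors" = 64 units x 5 slots. If shaders only keep ~3 of the
# 5 slots busy on average (an assumed figure), the effective count drops a lot:
print(effective_alus(units=64, slots_per_unit=5, avg_filled_slots=3.0))  # 192.0
```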
 
While change can be good, change just for the sake of change isn't always good. I don't mind the format, but a small section that shows the other card running at the same settings would be neat.

Like where you have the 3870 and all the settings, show the FPS of the 8800GT at those settings.

Then when you have the 8800GT and its best playable settings you can show the 3870 at the same settings.

Kind of like a combo of the [H]ard style and traditional.


i dunno, after work i usually enjoy a beer. when i'm workin overtime, generally i don't get to have a beer. the apples to apples benchmarks are being done by the masses already, so why work overtime and miss beer o'clock, doing redundancy testing for other websites?

whenever an apples to apples test can say something that hasn't already been said within the format of [H]'s tests, there you will find an apples to apples test on a hardocp webpage.
 
i dunno, after work i usually enjoy a beer. when i'm workin overtime, generally i don't get to have a beer. the apples to apples benchmarks are being done by the masses already, so why work overtime and miss beer o'clock, doing redundancy testing for other websites?

whenever an apples to apples test can say something that hasn't already been said within the format of [H]'s tests, there you will find an apples to apples test on a hardocp webpage.

I know it's cool to blindly defend all things [H]ard, but the format could use some work. It's very confusing as seen here. If someone was to look at it they would infer that the 3870 beats the 8800GT in some tests simply because it appears to have higher FPS. One simple extra step to show which card is better at what settings isn't going to change the whole apples to oranges comparison that people think is so great. It's more like comparing an apple to an orange AND an apple.
 
I know it's cool to blindly defend all things [H]ard, but the format could use some work. It's very confusing as seen here. If someone was to look at it they would infer that the 3870 beats the 8800GT in some tests simply because it appears to have higher FPS. One simple extra step to show which card is better at what settings isn't going to change the whole apples to oranges comparison that people think is so great. It's more like comparing an apple to an orange AND an apple.

I know it's cool for someone who joined the forums two whole weeks ago to smugly attack all things [H]ard when their previous posts demonstrate that they haven't yet climbed the learning curve and become [H]ard themselves, but it's only confusing to people who haven't learned better yet and/or aren't paying sufficient attention.

[H] didn't change from benchmarks and bar graphs just to offer something different and differentiate themselves in the marketplace. They did it because they recognized that apples-to-apples doesn't really tell the potential buyer what they need to know. If it doesn't tell you what you need to know, it is of no value. If it is of no value, why should they waste any time (and the money they pay their staff) doing it? It's a paradigm shift. Like all paradigm shifts, it's slow to catch on and hard to wrap your head around. Like all paradigm shifts, it's going to be "obvious" in hindsight as the necessary course of action one day. Welcome to the future.
 
I know it's cool to blindly defend all things [H]ard, but the format could use some work. It's very confusing as seen here. If someone was to look at it they would infer that the 3870 beats the 8800GT in some tests simply because it appears to have higher FPS. One simple extra step to show which card is better at what settings isn't going to change the whole apples to oranges comparison that people think is so great. It's more like comparing an apple to an orange AND an apple.



nah, the resolution used is clearly displayed. i'm not blindly defending everything [H] either, i'm defending a videocard review method that i honestly prefer. one that i realize takes an assload of work to execute. not to suggest that traditional benchmarking isn't work, of course.

all one must do is look at the maximum playable settings chart to see what resolution was in play. what could be construed as confusing if one weren't used to the format, i clearly comprehend. so do many others who've learned how it's done.

there are plenty of other sites to compare [H] results to but i wouldn't have [H]'s present formula changed for anything. it's gold.
 
I know it's cool for someone who joined the forums two whole weeks ago to smugly attack all things [H]ard when their previous posts demonstrate that they haven't yet climbed the learning curve and become [H]ard themselves, but it's only confusing to people who haven't learned better yet and/or aren't paying sufficient attention.

Thank you for demonstrating exactly what I was talking about. Remarking on how the current method isn't perfect != attacking all things.
 
don't take it personally, it's just that you're not the first and won't be the last to question [H]'s testing methods.

it's a matter of taste, really.


like i mentioned, when an apples to apples test can demonstrate something which regular [H] testing doesn't cover, that's where you'll see an apples to apples test. it does happen.
 
Thank you for demonstrating exactly what I was talking about. Remarking on how the current method isn't perfect != attacking all things.

Your suggestions for improvement are not what I was referring to. Your slam on veteran forum members disagreeing with you for substantive reasons is what I was referring to.

Snaggletooth's point was relevant and accurate; your reply was ignorant and defensive. Like this one, so thanks for demonstrating exactly what I was talking about.
 
RE: the evaluation method, if anyone has any real suggestions for improving the presentation of the data to make it easier to understand, by all means send me an email about it. I will take each suggestion into consideration and make changes as needed to improve the format.
 
RE: the evaluation method, if anyone has any real suggestions for improving the presentation of the data to make it easier to understand, by all means send me an email about it. I will take each suggestion into consideration and make changes as needed to improve the format.

I think just a large "Settings can be very different for each card, please pay close attention to detail settings / resolutions not just fps."

on each graph would help users who don't read much in reviews... hell, I know I don't read all the text in other reviews :p
 
#1 Don't compare the 3850 to a GTS 320MB. In no way are they priced similarly. In no way are they aimed at the same market. You are wrong to make any comparison between these two cards. The 3850 is aimed at the MID RANGE market, meaning Nvidia's 8600 GTS, and it is by far a better card FOR A VERY SIMILAR PRICE! There is no way to compare the 3850 and the GTS 320MB; the comparison is invalid.

#2 So what if the 3870 doesn't perform quite as well? Good for you if you got your 8800 GT at MSRP; everyone else has little chance of getting that deal until after Christmas, since we all know the demand is too high for the time being.

#3 Just get whatever suits you: if you want to save some money, go with the 3870 and accept a slight performance drop. If you want Nvidia and slightly better performance, then shell out the cash and pay above MSRP.

On a side note, I've never been much of a two-card guy, but any of these cards sounds enticing for a two-card setup considering the cost.
 
RE this comment by rhagz:

"I know it's cool to blindly defend all things [H]ard, but the format could use some work. It's very confusing as seen here. If someone was to look at it they would infer that the 3870 beats the 8800GT in some tests simply because it appears to have higher FPS. One simple extra step to show which card is better at what settings isn't going to change the whole apples to oranges comparison that people think is so great. It's more like comparing an apple to an orange AND an apple."...please see the following:

"Please be aware we test our video cards a bit differently from what is the norm. We concentrate on examining the real-world gameplay that each video card provides. The Highest Playable section shows the best Image Quality delivered at a playable frame rate. "

That's been at the beginning of every vid card review done here since the switch in testing procedures. Nobody's defending [H] blindly, you're just defending the fact that you didn't read page 3.
 
I know it's cool for someone who joined the forums two whole weeks ago to smugly attack all things [H]ard when their previous posts demonstrate that they haven't yet climbed the learning curve and become [H]ard themselves, but it's only confusing to people who haven't learned better yet and/or aren't paying sufficient attention.

[H] didn't change from benchmarks and bar graphs just to offer something different and differentiate themselves in the marketplace. They did it because they recognized that apples-to-apples doesn't really tell the potential buyer what they need to know. If it doesn't tell you what you need to know, it is of no value. If it is of no value, why should they waste any time (and the money they pay their staff) doing it? It's a paradigm shift. Like all paradigm shifts, it's slow to catch on and hard to wrap your head around. Like all paradigm shifts, it's going to be "obvious" in hindsight as the necessary course of action one day. Welcome to the future.

Almost none of the potential buyers have that CPU, RAM, and board setup, so it's a complete mystery what the graphics cards will do. I am positive it is because of NDA agreements that they only show it with super fast CPU setups.

It is, however, the best review of them, so I am thankful for that :)

One has to triangulate things, unfortunately. I have to cross-reference a review that shows how little effect the CPU has on framerates at 1920x1200 with reviews like this one that explain minimum framerates. Really, minimum framerates are the only thing that matters for certain games like RPGs, Hellgate, and Oblivion. They aren't rigorous competitive FPSes that require a 40-50 FPS minimum for competitive play.
 
Clearly I'm missing something here - the 3870 seems to be superior to the 2900XT in all respects other than the potential for a (largely useless) 1GB variant. Why are 2900s still priced at double the price of the 3870, and why are these cards being treated as ATi's new "mid-range" cards? Are these not the best ATi cards available right now?
 
They're mid-range by pricing, and mid-range because, within this generation of ATI cards, they're positioned in the middle of the pack... even though ATI hasn't released the low-end or high-end 55nm parts yet.

The 2X00 cards are priced high because the retailers paid more for them and want to recoup their costs. There's always a chance ATI or the AIBs will offer rebates to the retailers or the end users to help clear out the old stock, and if the cards sit long enough the retailers might write them down and take a loss just to get rid of them, but that's a few months off if it happens at all.

This happens with every generation/refresh: the 7 series cards didn't get instantly cheaper when the 8 series was released, and the 80nm 8 series hasn't dropped in price with the release of the 65nm 8800GT either.
 
Almost none of the potential buyers have that CPU, RAM, and board setup, so it's a complete mystery what the graphics cards will do. I am positive it is because of NDA agreements that they only show it with super fast CPU setups.

It is, however, the best review of them, so I am thankful for that :)

One has to triangulate things, unfortunately. I have to cross-reference a review that shows how little effect the CPU has on framerates at 1920x1200 with reviews like this one that explain minimum framerates. Really, minimum framerates are the only thing that matters for certain games like RPGs, Hellgate, and Oblivion. They aren't rigorous competitive FPSes that require a 40-50 FPS minimum for competitive play.

You're right, it is a best-case scenario in terms of the testing platform and you have to mentally adjust for the particulars of your own system. But running all of these tests on multiple configs would take way too much time. So they stick to a system that allows the cards to show the best they are capable of. Which is useful too, since it tells you what you can expect if and when you upgrade other components.
 
It's not worth it when you can get 8800GT 512's for like $245+. I would rather pay the $25+ more for the 8800GT
 
It's not worth it when you can get 8800GT 512's for like $245+. I would rather pay the $25+ more for the 8800GT

it'd cost me about $300CDN to have one shipped here... in about 2.5, 3 weeks.

or i can go to futureshop and buy one now for $319 or so. i'm not gonna link prices because they seemingly change by the hour.

that said, the 3870 is backordered everywhere too, and isn't in stock at futureshop at all.

crazy ol' canada.
 
Almost none of the potential buyers have that CPU, RAM, and board setup, so it's a complete mystery what the graphics cards will do. I am positive it is because of NDA agreements that they only show it with super fast CPU setups.

It is, however, the best review of them, so I am thankful for that :)

One has to triangulate things, unfortunately. I have to cross-reference a review that shows how little effect the CPU has on framerates at 1920x1200 with reviews like this one that explain minimum framerates. Really, minimum framerates are the only thing that matters for certain games like RPGs, Hellgate, and Oblivion. They aren't rigorous competitive FPSes that require a 40-50 FPS minimum for competitive play.

I agree with this 100%.

I would imagine a person who went out and bought a $250 GPU wouldn't spend $700+ on the CPU as well. It would be nice for the mid-range ($300 and less) and low-end cards to be tested on an X2 5k, or an E6600 and below from Intel.

AND holy crap, I just saw this...

http://www.newegg.com/Product/Product.aspx?Item=N82E16819103191
 
ATI's page says the 3800 series needs a 450-550W PSU, but does anyone have any clue how many amps a 3870 or 3850 might draw? I forget if there's a way to work that out from the power consumption numbers we were given in watts.
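For what it's worth, amps are just watts divided by volts, and nearly all of a modern card's power comes off the +12V rail (the PCIe slot plus the 6-pin connector), so you can get a rough figure from the board power number alone. A minimal sketch, assuming a ballpark ~105W board power for a 3870-class card (an assumption, not a measured value):

```python
# Rough estimate only: assumes essentially the whole board power is drawn
# from the +12V rail, which is close to true but not exact.
def amps_on_12v(card_watts: float, rail_volts: float = 12.0) -> float:
    """Current pulled from the 12V rail for a given board power in watts."""
    return card_watts / rail_volts

print(round(amps_on_12v(105), 1))  # ~8.8 A with the assumed 105W figure
```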
 