HD4850 Pictures & Performance from Hong Kong

Nice, thanks! Here are a few of them:

[four photos of the HD4850 card]
 
That's great, but let's start seeing real game performance and not benchies.
 
1699 HKD = 217 USD

Promising. If this card comes in at around $200 and uses a reasonable amount of power, I will be all over it :)
 
I'm looking at upgrading my 3850 256 MB, and I was looking at an 8800GTS G92. Maybe I should wait and see how this performs. How is it rumored to perform in actual games?
 
lol, it's only got a 256-bit memory bandwidth? haha, that's gonna suck balls, it's gonna be like trying to fit a golf ball through a water hose.
 
No, 256-bit ISN'T going to suck balls; a wider bus isn't needed, especially if it's using GDDR5. God, I wish I would stop seeing that "this card sucks if it has a 256-bit bus!!" blah blah blah.
 
lol, it's only got a 256-bit memory bandwidth? haha, that's gonna suck balls, it's gonna be like trying to fit a golf ball through a water hose.

Now explain why 256-bit should "suck balls". Do the 8800GT, GTS, and 9800GTX suck balls? Don't think so, yet they use a 256-bit interface too. Why should it then limit the 4850/4870, which have comparable performance? Just because the GTX260/280 use 512-bit, and thus a 512-bit interface is your new god?

PS: try to use correct terms. There is no such thing as 256-bit memory bandwidth. There is a 256-bit memory interface and there is memory bandwidth. The two are separate things. The memory interface can limit memory bandwidth, but not always.
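
To put numbers on that distinction, here's a rough sketch of how bus width and memory type combine into peak bandwidth. The clocks below are illustrative guesses, not confirmed 4850/4870 specs.

```python
# Rough peak-bandwidth estimate: GB/s = (bus width in bytes) x (effective transfer rate).
# The memory clocks here are illustrative guesses, not confirmed card specs.

def bandwidth_gbs(bus_bits: int, mem_clock_mhz: float, transfers_per_clock: int) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    bytes_per_transfer = bus_bits / 8
    effective_mts = mem_clock_mhz * transfers_per_clock   # million transfers per second
    return bytes_per_transfer * effective_mts / 1000      # MB/s -> GB/s

# GDDR3 moves 2 transfers per clock, GDDR5 effectively 4.
print(bandwidth_gbs(256, 1000, 2))   # ~64 GB/s  : 256-bit GDDR3 @ ~1 GHz
print(bandwidth_gbs(256, 900, 4))    # ~115 GB/s : 256-bit GDDR5 @ 900 MHz
print(bandwidth_gbs(512, 1000, 2))   # ~128 GB/s : 512-bit GDDR3 @ ~1 GHz
```

Which is why a 256-bit bus with GDDR5 can land in the same ballpark as a 512-bit GDDR3 bus: the bus width is only half of the equation.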
 
Now explain why 256-bit should "suck balls". Do the 8800GT, GTS, and 9800GTX suck balls? Don't think so, yet they use a 256-bit interface too. Why should it then limit the 4850/4870, which have comparable performance? Just because the GTX260/280 use 512-bit, and thus a 512-bit interface is your new god?

PS: try to use correct terms. There is no such thing as 256-bit memory bandwidth. There is a 256-bit memory interface and there is memory bandwidth. The two are separate things. The memory interface can limit memory bandwidth, but not always.

owned
 
Think I just saw 15k on 3DMark06 from the website posted.

3DM06 is not really important, guys. Stop using it for comparison of new cards :). Compare games (and not only Crysis, please), not old benchmarks. Or do you guys play 3DM06? It must be boring.
 
Hrm.

4850, better than an 8800GTX? Single card? The 4870 and the 4870X2 are going to be damn impressive, if that bungholio mark score equates to in-game performance for this card.
 
Yeah, the single 4850 is obviously better than a single 8800 GTX, judging by that score.
Interesting results; can't wait to see these things hit NewEgg and some official benchmarks, plus vs the GTX280. :)
 
lol, it's only got a 256-bit memory bandwidth? haha, that's gonna suck balls, it's gonna be like trying to fit a golf ball through a water hose.

Maybe he's talking about a fire hose, and you can fit more than one golf ball :D
 
Think I just saw 15k on 3DMark06 from the website posted.

If that guy were running a C2D at 3.0 GHz instead of a quad, his score would be around 13K, but his real gaming performance would be nearly identical in most games.

3DMark06 is worthless; you may as well use a random number generator, or at least make reference to the individual sub-scores.
 
I remember some guy posting on the forums here 3 months ago who was saying "an engineer from ATI told me they're gonna pwn NVIDIA".

I guess everyone who said he was talking nonsense is about to be proven wrong in a few more hours/days.
 
Well, ATI doesn't need to compete with them at the ultra-high end. It's enough for them to compete with slightly worse performance and much better pricing. Yeah, NV will be the performance leader, but who cares if ATI sells many more cards with huge profit, even with this aggressive pricing.
 
I still wonder how many stream processors it has.
Oh, and not for nothing, but I think his overclock (GPU and CPU) made the score seem a lot higher than it should be.
Score with a--
QX9770 stock: 14935
Q6600 stock: 13624
E8400 stock: 12858
E6600 stock: 11777
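
To illustrate the CPU effect behind those numbers, here is a toy composite score showing how a total that folds in a CPU sub-score rises with a faster CPU even when the GPU sub-scores never change. The weights and sub-scores are invented for illustration, not Futuremark's actual 3DMark06 formula.

```python
# Toy composite score: weights and sub-scores are invented for illustration only,
# not Futuremark's actual 3DMark06 formula. The point: because a CPU sub-score
# feeds into the total, a faster CPU lifts the composite even though the GPU
# sub-scores (and real game FPS) stay the same.
def composite(sm2: float, sm3: float, cpu: float) -> float:
    weights = {"sm2": 1.7, "sm3": 1.3, "cpu": 0.3}   # illustrative weights
    # Weighted harmonic mean, so each component pulls on the total.
    return sum(weights.values()) / (weights["sm2"] / sm2 +
                                    weights["sm3"] / sm3 +
                                    weights["cpu"] / cpu)

gpu_scores = (6000, 6500)   # same hypothetical card in every run
for cpu_name, cpu_score in [("E6600", 2500), ("Q6600", 3500), ("QX9770", 5000)]:
    print(cpu_name, round(composite(*gpu_scores, cpu_score)))
```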
 
Of course, this is assuming he ran 3DMark06 with the O/C applied. I think it looks somewhat right. Isn't this what the 3870X2 scored?

OBR (I know he can't be trusted too much) said he got over 12k with his E8400 and a non-overclocked 4850. I think your estimated score is a bit low, then.
 
Yeah, I looked over the score and it did seem low. I guess the 3DMark06 score was at stock settings.
 
Oh man, this card is looking more and more promising to me. Great performance for about the same price as an 8800GTS right now. But that picture on the card is disturbing; I'm not sure if it's supposed to be a dude or a chick, hmm?
 
No, 256-bit ISN'T going to suck balls; a wider bus isn't needed, especially if it's using GDDR5. God, I wish I would stop seeing that "this card sucks if it has a 256-bit bus!!" blah blah blah.

It isn't using GDDR5; it's a 4850, i.e. GDDR3 (rumors have surfaced about GDDR4, but ATI themselves stated this card will use GDDR3).
 
It isn't using GDDR5; it's a 4850, i.e. GDDR3 (rumors have surfaced about GDDR4, but ATI themselves stated this card will use GDDR3).

Still, having a 512-bit interface is not the ultimate solution for all problems, as noted before (see the Radeon 2900).
 
Hrm.

4850, better than an 8800GTX? Single card? The 4870 and the 4870X2 are going to be damn impressive, if that bungholio mark score equates to in-game performance for this card.


Lol, you were expecting it to still be slower than a card launched in November 2006? I think people have gotten so used to G80 being on top that they expected it to last forever.
 
Lol, you were expecting it to still be slower than a card launched in November 2006? I think people have gotten so used to G80 being on top that they expected it to last forever.
Nvidia expected that, too, apparently.
 
Still, having a 512-bit interface is not the ultimate solution for all problems, as noted before (see the Radeon 2900).

GDDR3 + 256-bit = problems at high resolutions with AA & AF, most likely, at least if previous generations of 256-bit + GDDR3 cards have been any indication (G92, RV670).
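
As a rough back-of-envelope for why AA at high resolution leans so hard on memory, assuming uncompressed 32-bit color plus 32-bit depth/stencil (real GPUs compress both, so treat this as an upper bound):

```python
# Back-of-envelope framebuffer footprint at 1920x1200 with 4x MSAA.
# Assumes uncompressed RGBA8 color and D24S8 depth/stencil per sample;
# real GPUs compress these, so this is an upper bound, not a measurement.
width, height, samples = 1920, 1200, 4
bytes_per_sample = 4 + 4          # 4 bytes color + 4 bytes depth/stencil
framebuffer_bytes = width * height * samples * bytes_per_sample
print(f"{framebuffer_bytes / 2**20:.0f} MB")   # ~70 MB before textures or extra buffers
```

And since rendering and resolving touch several of those samples per pixel, the pressure shows up as bandwidth, not just capacity.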
 
GDDR3 + 256-bit = problems at high resolutions with AA & AF, most likely, at least if previous generations of 256-bit + GDDR3 cards have been any indication (G92, RV670).

Now tell me, why do you want to use a cheap card (the 4850 is a cheap one) at 1920x1200 with AA & AF? Use the card for what it is designed for, not for what you expect from it.
 
...meet the hard-core demands of your alternate reality...
Hahah, what? WoW? AoC?

Also, the shroud's picture design is so off the wall, as they always are. Like, where do they come up with this?
 
Wow, the HK guys always get the latest stuff.

Don't we rock...blame us hongkies :D

Although, wouldn't AMD be pissed as hell since this guy got hold of one of their 'upcoming not-so-secret weapons'?

And the 3DMark thing... still seems very heavily influenced by the CPU :(.
Need some real game tests.
 
And the 3DMark thing... still seems very heavily influenced by the CPU :(.
Need some real game tests.

Yes, and something other than Crysis, which is not helping us much either. Some game with the U3 engine or the Source engine would be nice.
 
Yes, and something other than Crysis, which is not helping us much either. Some game with the U3 engine or the Source engine would be nice.

Very true, but I also recommend including Crysis because it is known to stress the majority of current cards. Although this game is heavily disputed as a good benchmark guide, we should still include it in the set of games to keep testing even.
 
Also, the shroud's picture design is so off the wall, as they always are. Like, where do they come up with this?

I've always noticed that too... but what would you slap on there? Pac-Man? "The Pac-Man Jones ATI 4850 simply paralyzes the competition"
 
Lol, you were expecting it to still be slower than a card launched in November 2006? I think people have gotten so used to G80 being on top that they expected it to last forever.

I've been yearning to replace this card for a year, but there hasn't been any option worth a damn. I'm just surprised there finally is, or so it appears.
 