lol it's only got a 256-bit memory bandwidth? haha that's gonna suck balls, it's gonna be like trying to fit a golf ball through a water hose.
Now explain why 256-bit should "suck balls"? Do the 8800 GT, GTS, and 9800 GTX suck balls? I don't think so. Yet they use a 256-bit interface too. Why should it then limit the 4850/4870, which have comparable performance? Just because the GTX 260/280 use 512-bit, the 512-bit interface is your new god?
PS: Try to use correct terms. There is no such thing as "256-bit memory bandwidth." There is a 256-bit memory interface, and there is memory bandwidth. The two are separate things. The memory interface can limit memory bandwidth, but not always.
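To put numbers on that distinction: bandwidth is bus width times effective transfer rate, so a narrower bus with faster memory can keep pace with a wider one. A minimal sketch in Python (the clock figures are approximate reference specs quoted from memory, so treat them as illustrative):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
# Clock figures are approximate reference specs, for illustration only.

def bandwidth_gbs(bus_width_bits: int, effective_mtps: float) -> float:
    """Peak bandwidth in GB/s from bus width (bits) and effective rate (MT/s)."""
    return (bus_width_bits / 8) * effective_mtps / 1000

cards = {
    "8800 GT (256-bit GDDR3, ~1800 MT/s)": (256, 1800),
    "GTX 280 (512-bit GDDR3, ~2214 MT/s)": (512, 2214),
}

for name, (width, rate) in cards.items():
    print(f"{name}: ~{bandwidth_gbs(width, rate):.1f} GB/s")
# 8800 GT: ~57.6 GB/s; GTX 280: ~141.7 GB/s. The interface width is only one
# factor in the product; it matters through the bandwidth it yields.
```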
I think I just saw 15k on 3DMark06 from the website posted.
Of course, this is assuming he ran 3DMark06 with the OC applied. It looks about right to me. Isn't that what the 3870 X2 scored?
Did I see 1GB VRAM?
No, you're looking at the system requirements. This card has 512MB of GDDR3.
No, 256-bit ISN'T going to suck balls, a wider bus isn't needed, especially if it is using GDDR5. God, I wish I would stop seeing "this card sucks if it has a 256-bit bus!" blah blah blah.
It isn't using GDDR5; it's a 4850, i.e. GDDR3. (Rumors have surfaced about GDDR4, but ATI themselves stated this card will use GDDR3.)
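That correction is the crux, since the memory type sets how many transfers happen per clock. A rough comparison on the same 256-bit bus, assuming the commonly cited reference clocks (~993 MHz GDDR3 on the 4850, ~900 MHz GDDR5 on the 4870, both unconfirmed at this point):

```python
# Same 256-bit bus, different memory: GDDR3 moves 2 transfers per clock,
# GDDR5 moves 4. Clocks below are rumored reference specs, not confirmed.
BUS_BYTES = 256 // 8  # 32 bytes per transfer

hd4850 = BUS_BYTES * (993e6 * 2) / 1e9   # GDDR3, double data rate
hd4870 = BUS_BYTES * (900e6 * 4) / 1e9   # GDDR5, quad data rate

print(f"HD 4850 (GDDR3): ~{hd4850:.1f} GB/s")  # ~63.6 GB/s
print(f"HD 4870 (GDDR5): ~{hd4870:.1f} GB/s")  # ~115.2 GB/s
```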
Hrm.
A 4850, better than an 8800 GTX? A single card? The 4870 and the 4870 X2 are going to be damn impressive, if that bungholio-mark score equates to in-game performance for this card.
Lol, you were expecting it to still be slower than a card launched in November 2006? I think people have gotten so used to G80 being on top that they expected it to last forever. Nvidia expected that, too, apparently.
Still, having a 512-bit interface is not the ultimate solution to all problems, as noted before (see the Radeon 2900).
GDDR3 + 256-bit = problems at high resolutions with AA & AF, most likely, at least if previous generations of 256-bit + GDDR3 cards have been any indication (G92, RV670).
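A back-of-envelope way to see why AA at high resolution leans on bandwidth: each extra sample per pixel multiplies the framebuffer traffic. A very rough sketch (it ignores compression, caching, overdraw, and texture fetches, so it's a lower bound, purely to show the scaling):

```python
# Crude framebuffer traffic estimate: 4 B color + 4 B depth per sample,
# each written and read once per frame. Ignores compression, overdraw,
# and texture traffic, so real demand is considerably higher.
def fb_traffic_gbs(width: int, height: int, aa_samples: int, fps: int) -> float:
    bytes_per_sample = 8  # 32-bit color + 32-bit Z/stencil
    per_frame = width * height * aa_samples * bytes_per_sample * 2  # read + write
    return per_frame * fps / 1e9

for (w, h), aa in [((1280, 1024), 1), ((2560, 1600), 1), ((2560, 1600), 4)]:
    print(f"{w}x{h} {aa}xAA @ 60 fps: ~{fb_traffic_gbs(w, h, aa, 60):.1f} GB/s")
# 4xAA at 2560x1600 quadruples the traffic versus no AA at the same resolution,
# which is where a narrower, slower memory subsystem starts to hurt.
```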
Hahah, what, WoW? AoC? ... "meet the hard-core demands of your alternate reality" ...
Wow, the HK guys always get the latest stuff.
And the 3DMark thing... still seems very heavily influenced by the CPU.
Need some real game tests.
Yes, and something other than Crysis, which is not helping us much either. Some game on the Unreal Engine 3 or Source engine would be nice.
Also, the shroud's picture design is so off the wall, as they always are. Like, where do they come up with this?