Heard any solid rumors of a 2GB 5850?
One article noted that a 5850 idles at 15W with the new driver. I'm going to look into it, so don't quote me.
I guarantee you that 3D will not penetrate the consumer market this year, and I seriously doubt it will do squat next year either.
ATI really did engineer a killer card with the 5000 series. Awesome performance, features, and low wattage. Kudos to them.
I think my money's still going to go to SLI GTX 470/480 once prices drop. What i5 chipset do I need to support both CrossFire and SLI?
Not the first time they've claimed that, but unlike others, I'm no fanboy, so I hope they do it so I can have another choice for video cards. In the meantime, I'll believe it when I see it.
Oh, and it would be fun to see how some red-team boys suddenly change their views on 3D Vision now that ATI is supposed to be getting it!
In any case, back to Fermi: if it beats or matches my 295 as a single card so that I can SLI them, I would be sooooooooo happy.
Take this review from Xbit Labs, a site known for their excellent power consumption measurements.
http://www.xbitlabs.com/articles/video/display/radeon-hd5870_7.html#sect0
Regular load = 107w average
Furmark load = 161w
Then compare to this TechPowerUp review, where they use the exact same methodology to capture GPU-only power consumption:
http://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/28.html
Regular load = 122w average, 144w instantaneous peak
Furmark load = 212w
There are obvious variations from one card to another, and although the 5870s tested don't exceed the stated TDP (outside of Furmark), some will consume significantly more or less than others. Given that, 118W at load sounds like a perfectly reasonable number for the 5870.
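To put the two reviews side by side, here's a quick back-of-the-envelope comparison. The wattage figures are the ones quoted above; everything else is just arithmetic:

```python
# GPU-only power figures for the HD 5870, as quoted from the two linked reviews.
xbit = {"regular": 107, "furmark": 161}  # Xbit Labs, watts
tpu  = {"regular": 122, "furmark": 212}  # TechPowerUp, watts

for load in ("regular", "furmark"):
    diff = tpu[load] - xbit[load]
    pct = 100 * diff / xbit[load]
    print(f"{load}: {xbit[load]}W vs {tpu[load]}W -> {diff}W higher ({pct:.0f}%)")
```

The regular-load spread is modest (about 14%), but the Furmark spread is roughly a third higher, which supports the point about significant card-to-card variation under worst-case load.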
I'm most likely getting two 470s for SLI on day one.
I have 3 PCIe x16 slots, so I'm going to wait on the 3rd one, lol. I'm going to need to get a new PSU for the tri-SLI.
Even in a highly rated game like World of Warcraft, reactions to 3D Vision are highly mixed. I've tried it, and it looks nice, but it's sometimes hard to focus on menus. Personally, I think Eyefinity gives a much more impressive - and especially immersive - gameplay experience.

3D Vision is a proprietary gimmick that requires those dumb shutter glasses. Thanks but no thanks.
AMD is providing hooks into their drivers to allow ANY 3rd party to create glasses and allow for 3D meaning we could get any number of 3D solutions. This is the way it should be done, not the proprietary path NV keeps taking.
[X]eltic;1035472089 said:
Even in a highly rated game like World of Warcraft, reactions to 3D Vision are highly mixed. I've tried it, and it looks nice, but it's sometimes hard to focus on menus. Personally, I think Eyefinity gives a much more impressive - and especially immersive - gameplay experience.

Well, if you really feel that a 2D representation of the 3D worlds we play in is more immersive than actually experiencing the game in 3D, then I have nothing else to say. I'm off this one; this just goes beyond the limits of comprehension.
Please don't tell me you're one of those people who call the glasses dumb as a lame excuse to put down the technology, but don't feel silly playing a ridiculous plastic guitar in Guitar Hero!
If I decided to buy one, what's the cost of a used reactor these days? Plus EPA cleanup.
You're assuming that 3D Vision has no flaws and that it's comparable to the three dimensions we humans experience in everyday life. But 3D Vision is nothing like that. Sure, at times it looks nice, but it has many issues: out-of-focus menus that I've experienced myself, and you're still playing your game on a small screen.
Regards
[X]eltic;1035471905 said:
Some developers even took advantage of obscure things like X-RAM, so it's not unlikely at all. In fact, in this case it's more likely. Don't forget that tessellation is not a new feature; it's already being coded for. Nvidia is merely better at handling higher amounts of tessellation. What we'll probably see is in-game sliders with options like 'tessellation: low - normal - high - extra high' or something similar, with AMD not being able to handle the higher amounts of tessellation without a huge performance hit.
Note, I'm not saying that the GTX 480 is a winner, but it has some undeniable strengths:
-Improved tessellation performance
-PhysX support
But then again, it has some undeniable weaknesses too:
-Very high TDP of 295W
-Lackluster power/performance ratio
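If games do end up exposing a slider like the one described above, a settings menu might map it to tessellation factors along these lines. Everything here is purely illustrative; the level names and factor values are not taken from any actual game or driver:

```python
# Hypothetical mapping from a quality slider to a tessellation factor.
# These names/values are illustrative only, not from a real title.
TESS_LEVELS = {"low": 4, "normal": 8, "high": 16, "extra high": 32}

def tessellation_factor(setting: str) -> int:
    # Fall back to "normal" if the setting string is unrecognized.
    return TESS_LEVELS.get(setting.lower(), TESS_LEVELS["normal"])

print(tessellation_factor("extra high"))  # -> 32
```

The point of such a slider is exactly what the post describes: a card that chokes on high tessellation factors can still run the game by picking a lower level, at the cost of geometric detail.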
Oh, and yes, AMD can talk all they want like they've done before, but I have yet to see ANY 3D from them, so talk is cheap.
Regards
Why so sore?
Let the price wars begin!
What's your reasoning here? Are you being sarcastic? Price wars? With Nvidia selling the GTX 480 at that price there's no need for ATI to even think about lowering its prices.
If anything they'll raise them.
Suddenly the 5970 becomes even MORE attractive than it already is.
I'm a bit surprised and alarmed by how low these prices are. This could indicate a truly sour showing from Fermi, which would be shitty. Guess we'll see soon.
The 5970 is still selling for $750. If AMD drops it to its regular price when Fermi comes out, then it might look more attractive.
$399 5870 vs $350 GTX 470... Not a bad showing by Nvidia, to be honest.
Instead of having an intelligent and rational conversation, we're stuck with a shitload of "my team is better than your team." No objectivity, no open-minded perspective, just a bunch of idiotic out-group bias.
The prices are fine, but I'm sure people expected a hell of a lot more for launching 6 months later.
Agreed. How someone can be a fanboy of a computer hardware company is beyond me.
Yes such a great showing!!
A card that's 5% slower for $50 cheaper!
5% slower? I thought it was 10-15%...
And that's not mentioning that the GTX 480 costs $100 more for a 5-10% improvement.
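For what it's worth, the dollars-per-performance math behind this back-and-forth can be sketched with the prices quoted in the thread. Note the 5-15% performance gap is a forum estimate, not a benchmark result:

```python
# Price/performance using the thread's quoted prices and estimated gaps.
price_5870, price_470 = 399, 350  # USD, as quoted above

for gap in (0.05, 0.10, 0.15):    # GTX 470 slower than the 5870 by this fraction
    perf_470 = 1.0 - gap          # 5870 performance normalized to 1.0
    cost_per_perf_5870 = price_5870 / 1.0
    cost_per_perf_470 = price_470 / perf_470
    print(f"{gap:.0%} slower: 5870 = ${cost_per_perf_5870:.0f}/perf, "
          f"470 = ${cost_per_perf_470:.0f}/perf")
```

At a 5% gap the 470 is the clearly better value; at a 15% gap the positions flip, which is why the size of the gap matters so much to this argument.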
Hey, ever heard of Mac-heads? Those guys kill people's babies for dissing Macs.
They'd rather lynch you, though.
Wow, nVidia is really losing traction in terms of power consumption, performance, and price... basically everything. Looks like AMD is going to dominate the market for a while.