We will see. I have a feeling Charlie is going to be very wrong... like always.
Clever, but sadly no. Otherwise you'd need one for 2x 5970s as well, and well, now that isn't true either.
That's funny, I haven't seen one review of the cards yet, have you? No one knows what the TDP or power draw is for the card except for the reviewers, who aren't leaking it, as I'm sure there is at least 1 site out there that would love to spill/leak that kind of info. Least of all Charlie and his GDC source. GTX480 is 512SPs. Again, the ONLY thing he has been accurate about is the timing, and even then not for the reasons he said.
The GTX480 requirements just look ridiculous to me. 15% faster than a 5870 and power requirements almost on par with a 5970? Wow. Better to just buy a 5970 or two 5850s.
What are the rumored power requirements for the GTX470? I think this would be the card most of us would be interested in (for instance, my wife would never let me buy a $500-600 video card).
Not sure on the power specs, but the RUMORED price for the 470 (rumors ftl) is $499, which your wife would not approve of lol.
Holy crap. $500 for a card that is roughly equal to a 5870? That's not good. Not good at all. I thought GTX470 was going to be priced competitively at $400 (no idea where I got that from) and the GTX480 at $500-600. I suppose I should pay more attention to the rumors.
Well, like I said, it's a rumor.
The RUMORED price of the GTX 480 is $599.
March 26th can't come soon enough.
It's funny as hell how you try so hard to prove Nvidia is GREAT AND ALL MIGHTY whenever any negative posts/news articles come along.
I'm not biased in any way, but man, you are one of the worst fanbois, along with Dualown, I've seen in a while.
Nvidia's 480 is going to be a power hog, and power-wise it will be compared to the DUAL GPU 5970.
We all know this coming into Fermi. But yet you tend to think otherwise?...
Either way, when are you going to give up your Nvidia crusade and just wait till March 26th like the rest of us?
MSRP for the GTX470 is not 400 bucks, it's closer to 325-350, which will give the 5870 a run for its cash here. Better perf/$.
PhysX is also game changing, Metro 2033 will show this. Developer support for such game changing things as Eyefinity.
I don't try to prove anything other than correct the bullshit flowing from the fanATIcs' mouths in this place.
There is a reason why Nvidia certifies the boards, it's called QA.
i wonder if any of the video card companies are going to offer lifetime warranty with fermi...
Of course they are going to offer limited lifetime warranty* on Fermi cards.
Since when is power consumption a major deal and worth crying about? What is this [G]reen|Forum ?
I see Fermi doing decent, I really have doubts it will be a really, really bad card as many are predicting. We will see March 26th. I am in the market for a new video card and I'm not gonna let unofficial jackasses cloud my judgement.
The amount of fanboyism here.. I applaud and laugh at you for being so dedicated to a brand.
It's not [G]reen|forum, but some of us live in places where power is quite expensive and it's warm, so you get the double whammy of having a card use a ton of power and then having to use that much MORE power to cool down your house with AC. So just having a 275W GTX480 would mean 550W of cooling. It adds up.
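Purely for scale, here's a rough back-of-envelope sketch of what that kind of draw could cost over a year. The 275W figure and the 1:1 AC overhead are taken from the post above; the gaming hours and electricity rate are made-up placeholders, so swap in your own numbers.

[CODE]
// Back-of-envelope yearly cost of a ~275 W card plus AC overhead.
// Only the 275 W and the 1:1 cooling assumption come from the thread;
// hours/day and $/kWh are hypothetical placeholders.
#include <iostream>

int main() {
    const double card_watts      = 275.0;  // rumored GTX 480 board power (from the thread)
    const double ac_overhead     = 1.0;    // poster's assumption: 1 W of AC per 1 W of heat
    const double hours_per_day   = 4.0;    // hypothetical gaming time
    const double dollars_per_kwh = 0.20;   // hypothetical electricity rate

    const double total_watts  = card_watts * (1.0 + ac_overhead);             // ~550 W
    const double kwh_per_year = total_watts / 1000.0 * hours_per_day * 365.0; // ~800 kWh
    std::cout << kwh_per_year << " kWh/year, about $"
              << kwh_per_year * dollars_per_kwh << "/year\n";
}
[/CODE]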
I actually started laughing, right in the middle of a meeting.
Good show. I love satire.
I see Xman didn't get my last memo. Metro 2033 & PhysX certainly isn't "game changing" unless they somehow figured out a way to convince nVidia to allow fine-grained multithreaded CPU physics on PC. Otherwise it's going to be less "game changing" than Batman was. I'm fine with GPU physics, but I refuse to support a company and product that artificially castrates previously supported features from the company they bought the tech from (both ATI + PhysX and CPU + PhysX) in order to promote their rebadged solution.
Also, as far as I know, PhysX doesn't have a lot of support on the consoles, where the tools are free & CPU fine-grained multithreading is enabled. Far more titles use the Havok package, and since the biggest differentiating factor between physics packages is speed, most likely PhysX is less efficient than Havok is (in fact devs prefer Havok despite PhysX being somewhat more flexible... able to do deformations and some basic fluid dynamics; afaik that's the major difference).
This isn't a physics discussion. On performance: it is likely that by the time the 480 is in volume production, ATI will have a refresh of its 5870 that will outperform it in a significantly (say 50W, for the sake of a reasonable guess) smaller thermal package.
PhysX being a competitive advantage is a stumbling block for ATI, but luckily everyone hates PhysX and wants to move toward platform agnostic physics.
The developer himself said Metro WILL make use of multicore/multithread-capable CPUs as well as GPUs that support it. He clearly stated they put in the TIME and EFFORT in making sure it worked as they programmed it. Blame the bullshit devs for half-assing PhysX's ability to use multicore/multithread-capable systems.
No, it's just that a lot of people, including yourself, are more apt to go for half-baked physics solutions because the dev figured pre-programmed, rail-induced nausea was better than doing physics right.
Again, it isn't a dev thing. It means that Nvidia changed this policy for this title, or changed it recently regardless. For everything Batman & prior, Nvidia has restricted CPU PhysX to a single core.
Right. Dude, you really need to stop. The SDK for the PhysX library is the same for Xbox 360, PS3, Wii, and iPhone as it is for the PC. If it isn't using multiple threads on the PC, it's because the developer didn't bother to set it up to do that. You really need to stop with your strawhat theories.
MSRP for the GTX470 is not 400 bucks, it's closer to 325-350, which will give the 5870 a run for its cash here.
I GUARANTEE that the GTX 470 will be sold for no less than $399, and more likely $499. Yields are far too shitty for anything less. You have absolutely no data to indicate the MSRPs of which you speak.
Wrong, the SDK is not the same. Each platform uses its own highly tuned version. The PC SDK has a good GPU library, fully supported by Nvidia, and a poor CPU portion that has apparently been left unchanged since the unoptimized heap that Ageia released 6 years ago. I'm not sure about this, but I assume that certain GPU libraries are not available in multi-threaded CPU versions (this was at least the situation with NovodeX). That's not hearsay; there's been plenty of discussion and evidence of PhysX's awful CPU performance/utilization.

The NovodeX SDK was free for commercial use, and that's probably why they chose it. It certainly wasn't because Ageia's CPU code was better than Havok's. Nvidia approaches devs to code the extra physics effects for their GPU in exchange for some money. It has been demonstrated that the CPU fallback to the GPU libraries is single core, and in general doesn't seem to fully utilize (never mind efficiently) even that one core. The developer can spend a few months trying to implement a multi-threaded version of the GPU libraries, or just use the fallback and save themselves a lot of money. That's why we have such a 'leap' in performance vs the CPU.
If Nvidia was shooting straight, their PC SDK libraries would be highly optimized for CPUs as well as GPUs, and then they could point to their GPU performance and demonstrate how much can really be gained by going to GPU-side physics. Instead we have the following:
http://techreport.com/articles.x/17618/13
http://www.hardocp.com/article/2009/10/19/batman_arkham_asylum_physx_gameplay_review/11
Interestingly enough, when you enable PhysX in Batman, your CPU usage goes DOWN http://forum.beyond3d.com/showthread.php?t=54786&page=24
There are at least a half dozen examples. The only multi-threaded CPU PhysX title that I'm aware of is UT3 (very basic physics effects), and I do not believe the bonus Ageia "PhysX" levels (the only GPU-physics-enabled portions of the game) utilize beyond 1 core (and poorly at best). Note in the HardOCP article that the change in CPU usage with PhysX enabled on the CPU vs PhysX enabled on the GPU is a whopping 4%. You'd be naive to think this isn't intentional. Nvidia loves grey marketing tactics like this.
The argument that it is somehow a selling point for Nvidia comes up empty unless GPU physics demonstrates substantial improvements over fully threaded CPU physics. I should see nearly 100% utilization on my Nehalem in scenes where there is a major performance boost when using the GPU to calculate. If that doesn't happen, then I have to say it's a selling point for their marketing department, not their hardware. It sounds like in Metro 2033's case an easy comparison can be made, so we'll lay that argument to rest soon. I'm relatively confident that what we've seen with PhysX so far can be run at a fairly good clip on top-end CPUs, and that Nvidia is engaging in their usual deceitful marketing practices.
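For what "fully threaded CPU physics" would even look like in practice, here's a minimal generic sketch. This is not the PhysX SDK or any real engine's API; it just shows the sort of data-parallel per-body work a multi-threaded CPU solver could spread across cores, where a single-core fallback would run the same loop on one thread.

[CODE]
// Generic illustration of splitting a physics update across CPU cores.
// NOT the PhysX API; all names here are hypothetical.
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

struct Body { float pos[3]; float vel[3]; };

// Integrate one slice of the bodies for a single timestep.
void integrate(std::vector<Body>& bodies, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        for (int axis = 0; axis < 3; ++axis) {
            bodies[i].vel[axis] += (axis == 1 ? -9.81f : 0.0f) * dt; // gravity on Y
            bodies[i].pos[axis] += bodies[i].vel[axis] * dt;
        }
    }
}

// Split the body array into chunks and integrate each chunk on its own thread.
void step(std::vector<Body>& bodies, float dt, unsigned threads) {
    std::vector<std::thread> pool;
    const std::size_t chunk = (bodies.size() + threads - 1) / threads;
    for (unsigned t = 0; t < threads; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end   = std::min(bodies.size(), begin + chunk);
        if (begin < end)
            pool.emplace_back(integrate, std::ref(bodies), begin, end, dt);
    }
    for (auto& th : pool) th.join(); // every core busy, not just one
}

int main() {
    std::vector<Body> bodies(100000, Body{});
    const unsigned n = std::max(1u, std::thread::hardware_concurrency());
    step(bodies, 1.0f / 60.0f, n);
}
[/CODE]

Whether real GPU-physics effects would scale this cleanly on a CPU is exactly the open question in this thread.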
And FYI I'm not using any logical fallacies in this discussion. It's not a straw man argument (what's strawhat lol), well because that's not what straw man means. I'm not misrepresenting your position.
You are still full of it. If one dev can do it right, why can't the others? I say because they are lazy and nothing more. And the SDK for PhysX is the same across the board; if it wasn't, this bloody well wouldn't be true coming from a DEV THEMSELVES.
"PCGH: You announced that you game will be developed for PC, Xbox 360 and PS3 and that the PC version will come with DX11 and GPU Physx support. So we assume that the Game won't be developed as a pure cross platform title. Is that right? Do you develop the game for each platform separately?
Oles Shishkovstov: Why? I don't understand your logic. The same code executes on all of the mentioned platforms."
From the link I provided, or did you not bother to read it? If the SDK WAS NOT the same, there is no way in hell it would be the SAME CODE for all platforms.
So explain to us why some PhysX games, with the PhysX card disabled, only use 1 core when there are 3 more cores to use?
Do not come and tell me they cannot use those extra 3 cores for PhysX... they can... they just aren't, for a reason.
And you still ignored a Dev flat out saying IT WORKS. They've had support from Nvidia since they bought Ageia, but yet somehow, mysteriously, they let the devs of Metro 2033 utilize multicore/multithreaded CPUs for CPU PhysX and NO FUCKING OTHER DEV; dude, you seriously need a wake up call. It's the devs being lazy and nothing more.
Dude, are you supposed to be smart? Cause you come off sounding like a total ass with no references to back you up, other than one developer who has to speak well of the product he is using, cause if he didn't, he'd get screwed.