The man has a history of being wrong about anything Nvidia, and now, because he called two, maybe three things right, he's to be given massive credit? That would be akin to giving Fudo credit even though he's wrong most of the time, just because he gets it right a few times.
QFT.
Console ports are getting lamer and more ubiquitous by the day, so these big circle-jerks by ATI and Nvidia about the "potential" of their latest GPUs are nauseating.
I'm not plopping down major $$$ for a GPU upgrade just to run some arbitrary benchmarks trying to impress people. We need games designed to use this stuff from the ground up.
This. It's been alright for the past few years, since we could just keep cranking the resolution up, but now it's gotten to the point where a single card at 1080p can push 99% of games at 100+ fps and Crysis in the 50s and 60s.
Come the next gen of video cards, I believe NV and ATI will be in for a big surprise when sales are less than stellar. Then again, I don't know how much of their business comes from consumer cards, so they might be alright.
I'm not sure I get the Charlie hate. He's been pretty accurate these last several months. I've seen posts of his from May '09 talking about Fermi blatantly missing the holiday cycle. He's probably wrong about Fermi not having a dedicated tessellator, but considering that he's given us something to talk about these last 6 months, and that he's been on the money about a few things, that's good enough for me. Or do you have some personal grudge? Did Charlie steal your girlfriend?
He's wrong 95% of the time; 4% of the time he steals info and spins it into outright rumors while pretending he came up with it himself; and 1% of the time he has something real, but it's full of bias and brutally poorly written, making it annoying to read. Even a broken watch is right twice a day. He's been around for many years and has always had the same patterns.
Nvidia invests in gaming and will continue to do so. They know it's what sells the bulk of their graphic cards.
Oh, it's only the high end that is capable of gaming now?
It's completely wrong. Even the 448SP product will not be the majority of their GPU sales. They're probably hoping that Fermi will open massive new markets. They'll keep their market share in this one, and that will be good enough. I don't think the GTX 260 and 280 constituted even a major fraction of their consumer GPU sales last generation.
I see this all the time. Show me this 5% accuracy you're talking about. Because I've read most of his posts, and besides an obvious anti-nvidia slant, I don't see a lot of misinformation. I do see a lot of sensationalized stuff, but little that is outright wrong. And of course he "steals" all of it. The information isn't originating with him, nor does he claim it is. But I follow B3D threads very closely, and his information is usually a few steps ahead of what is posted on those boards. And yes, of course some of it is off. But it isn't coming from nowhere. Sources aren't always going to be correct, especially with highly secretive projects like this.
I'd like you to find me 5 of his posts from September to today, and show me that all of that information is incorrect. Because that's what you're claiming.
Looks like that "GF100 = 5970 on a single core" turned out to be complete and utter BS.
Nothing extremely impressive about the card. Maybe after 28nm.
The big issue is: what developer is going to support this new hardware, or will we simply be able to play old games like Crysis and Far Cry faster?
Nv and ATI better start sinking $$ into game studios to push PC gaming or this is all for nothing...
Did they give away a free GF100 like they said on Twitter?
GTX280<HD5850<GTX360<HD5870<GTX380<HD5970
Kinda like
9800GTX<HD4850<GTX260<HD4870<GTX280<HD4870x2
You know the GTX 360 specs? I didn't see any real in-game benchmarks. How about showing us their awesome tessellation power in real games like DiRT and Stalker?
Not quite. Someone needs to read the leaked info and the articles about GF100's graphics bits.
Given what we know now, it's more like:
HD 5850 < HD 5870 < GTX 360 < GTX 380 <= HD 5970
Geometry performance in Fermi-based GeForces is off the chart, which is funny since tessellation was supposed to be AMD's strong point.
You know the GTX 360 specs? I didn't see any real in-game benchmarks. How about showing us their awesome tessellation power in real games like DiRT and Stalker?
Impressive demos indeed, except that's all they are: demos. The raytracing is nowhere near the point where it can be used in a game, and the hair looked great, but it alone dropped the performance to 26 fps. What happened to the days when nVidia would show off tech demos for new GPUs that actually ran smoothly?
When I see actual DX11 games on PC that run great on one of these, then I'll be intrigued.
I don't think you realize how taxing it is to simulate physics in hair... Nothing could do that, with that amount of hair, in real time before.
Plus, the hair demo is using tessellation.
You wrongly assumed I was only referring to their top end.
They have not reduced their investment in TWIMTBP. They know the halo effect still sells cards with their name on it.
Any idea about fermi GPU which will be priced close to $180? GTS 350?
Where did you get those specs? Can you please post a link to back up your findings?
448 SPs, 320-bit memory interface, 1280 MB of GDDR5.
And you can see the tessellation power in the hair demo.
I agree that Fermi is going to be a folding monster, but that's about it. It has zero to offer the under-$300 crowd. Worse, they're cramming transistors onto a huge die in order to beat the 5870. The result is going to be a card that was built for 28nm sitting on 40nm, AKA hot.
ATI is about to let loose the 5830 at $230. That's insanely high, but they're sending Nvidia a message that they simply don't care and can drop the price at the slightest sign of competition.
Nvidia needs 28nm more than ever, as that's the ONLY way they can get Fermi into smaller versions without killing performance.
Not quite so simple. What NVIDIA has done is separate the Raster Engine from the pipeline and move it down into the GPCs in four parts, and they have created a new engine they call the "PolyMorph Engine," which is integrated into the SMs. First, a little breakdown of the hierarchy: the GF100 is made up of 4 GPCs (Graphics Processing Clusters), each of which breaks down into 4 SMs (Streaming Multiprocessors), each of which breaks down into 32 CUDA cores, 4 Texture Units, and some other stuff. So 32 CUDA cores plus 4 Texture Units plus the PolyMorph Engine make up an SM, and 4 SMs make up a GPC. With this kind of parallelism you can see how the GPU can be sliced and diced to create less expensive parts.
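Purely to illustrate that hierarchy (the function and names below are mine, not NVIDIA's), here's a quick back-of-the-envelope sketch. A full die is 4 GPCs x 4 SMs x 32 cores = 512 CUDA cores, and salvage parts like the 448SP product mentioned earlier fall out of disabling SMs:

```python
# Assumed counts from the post: 4 GPCs, 4 SMs per GPC,
# 32 CUDA cores and 4 Texture Units per SM, 1 PolyMorph Engine per SM.
GPCS = 4
SMS_PER_GPC = 4
CORES_PER_SM = 32
TEX_UNITS_PER_SM = 4

def config(active_sms: int) -> dict:
    """Derive shader/texture counts for a part with `active_sms` enabled SMs."""
    return {
        "cuda_cores": active_sms * CORES_PER_SM,
        "texture_units": active_sms * TEX_UNITS_PER_SM,
        "polymorph_engines": active_sms,  # one per SM, per the post
    }

full = config(GPCS * SMS_PER_GPC)  # all 16 SMs enabled -> 512 cores
salvage = config(14)               # two SMs disabled -> the 448SP part
print(full["cuda_cores"], salvage["cuda_cores"])  # 512 448
```

This is just arithmetic over the counts in the post, but it shows why the salvage bins line up the way they do: every disabled SM costs exactly 32 cores and 4 texture units at once.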