GT300 leaked internal benchies? Seriously?

Dude, it does not get any more fake, come on now. I won't say anything about the GT300 speed-wise, but that video is way too cheap. If NVIDIA wanted to run benches, I am sure they wouldn't be running the GT300 at 1680x1050, and why the hell would they make a video so cheap-looking? They don't make videos, they make slides. If it was performance slides then it would be more believable; too cheesy of a video to be official.
 
Well, someone with an HD 5870 and a high-end system should run 3DMark Vantage and see what numbers you get. That video loses its validity just for the pure fact that the game tests were run at 1680x1050. Who the bloody hell would run either of those cards at that resolution? Oh wait, people that want to make one card look way better than the other, because at a higher resolution the difference is half what it is at the lower res.

I won't doubt that the GT300 will be faster than the HD 5870, but I highly doubt the difference will be that much.
 
Lol, that thing is almost twice as fast as the HD 5870 if it is true, and that makes it almost twice as fast as a GTX 295 and over three times faster than a GTX 285. Now that just kills it right there; no way in hell it is going to be three times the speed of a GTX 285. I really don't care if it beats the HD 5870, I am pretty sure it will, but I don't think it's gonna be 90% faster than the HD 5870 in Crysis. It just doesn't add up.
 
I blame unemployment for this.

Hahaha, lol, someone had nothing better to do. Hope NVIDIA pays him. The first part of the video is a copy of a History Channel show; I forgot what it was called.
 
Looks like someone's art project. The way people argue about ATI and NVIDIA reminds me of 360 and PS3 owners. Please keep the nonsense on the console side of gaming.

If you want a 5870, buy it. If NVIDIA comes out with a better card, buy that one. Simple! The only thing that holds people back from buying the stuff they want is money.
 
If there were any real good results, the NVIDIA CEO wouldn't hold up a fake card and become a laughingstock afterward.
 
Did they fold the paper in half to get it to stay in the slot?
I bet they used cardboard.

They don't have anything; they're just blowing smoke about what they will have some time.

I have seen paper launches before, but nothing like the BS that NVIDIA is throwing out this time.
 
Did they fold the paper in half to get it to stay in the slot?
I bet they used cardboard.

They don't have anything; they're just blowing smoke about what they will have some time.

I have seen paper launches before, but nothing like the BS that NVIDIA is throwing out this time.

Lol
Easy there, cowboy. There's no evidence that NVIDIA is responsible for this most likely spoofed chart. You guys really get that much satisfaction at a chance to yell "gotcha" at NVIDIA? Really? Heh.
 
Bah, a video of slides? I was hoping this was a video of it actually running.

I love the comments; they are more entertaining than the video.

Stanboy said:
Wrong, I have no reputation to protect. See what happens when it's the other way around? That's why I was so pissed when I saw flawed info put out against ATI. Guess it's OK for them to do it and not me, right?
 
Obviously fake. Nothing but a graph that anybody can make, with numbers that make no sense at all.
 
So in the comments the guy admits he is an angry NVIDIA fanboy who made up a fake video. It doesn't get much more fake than that. :D

But if the other leaks are correct, Fermi has 50% more transistors and a 50% wider memory bus, so I would guess we will see it about 50% faster, depending on other tweaks and the game you are playing.

Hemlock will likely be about 30% faster than Fermi.

A Fermi dual-GPU card will likely need to be cut back due to power requirements and will likely have maybe 15% over Hemlock.

I think this is what we can expect in the new year.
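The guesses above chain together multiplicatively; here is a minimal sketch of that arithmetic (all the percentages are the post's speculation, not measured numbers):

```python
# Back-of-the-envelope scaling from the speculation above.
# Every factor here is a guess from the post, not a benchmark result.
hd5870 = 1.00                  # baseline: single-GPU HD 5870
fermi = hd5870 * 1.50          # ~50% more transistors / bus width -> guessed ~50% faster
hemlock = fermi * 1.30         # dual-GPU Hemlock guessed ~30% over Fermi
dual_fermi = hemlock * 1.15    # power-limited dual Fermi guessed ~15% over Hemlock

print(f"Fermi:      {fermi:.2f}x HD 5870")   # 1.50x
print(f"Hemlock:    {hemlock:.2f}x HD 5870") # 1.95x
print(f"Dual Fermi: {dual_fermi:.2f}x HD 5870")
```

Chained like this, the speculated dual-Fermi card would land at roughly 2.2x an HD 5870, which shows how quickly stacked percentage guesses compound.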
 
HOLY SHIT

Not only is Fermi >>>>>>>>>>>>>>>>>>>>>>>>> Cypress, it also needs less power because of the lone 6-pin PCI-E!!!


OMG ATI in deep shitZZZ!!!!!

:eek::eek::eek::eek::eek::eek:


:rolleyes:
 
I don't believe paper launches; I wait for [H] to review them before I even buy a card. I like to see what ATI and NVIDIA have to offer and which has the best price/performance.
 
Probably fake but also quite feasible. That's actually about what I would expect from the card.
 

I can only imagine that this card will be priced insanely out of my budget, so it matters not if it comes out this year.

I am interested to know if it will have 1 GPU or 2 GPUs.
 
I can only imagine that this card will be priced insanely out of my budget, so it matters not if it comes out this year.

I am interested to know if it will have 1 GPU or 2 GPUs.

It's a single GPU.
 
It is possible that Fermi will be that fast; after all, the chip is going to be 60% bigger than the 5870. But I will call shenanigans on the video, because NV is nowhere near a final product with the card, and yet they dare display what appears to be a complete product with a cooler at the start of the video.
 
If any of those stats were to be believed, then it would be a $600 card at launch.
 
Hey guys, look, even more official benchmarks!

bullshit.JPG
 
Isn't Tesla geared towards GPGPU applications, not gaming? I thought Fermi was something different.
 
Hey guys, look, even more official benchmarks!

bullshit.JPG

Is that the pricing chart?
Power requirements? :)

Isn't Tesla geared towards GPGPU applications, not gaming? I thought Fermi was something different.
Tesla is the brand of GPGPU cards; Fermi is the code name of a chip. My guess would be that Tesla and GeForce cards will both use Fermi, just as current GeForce and Tesla cards both use GT200b chips.
 
Well, someone with an HD 5870 and a high-end system should run 3DMark Vantage and see what numbers you get. That video loses its validity just for the pure fact that the game tests were run at 1680x1050. Who the bloody hell would run either of those cards at that resolution? Oh wait, people that want to make one card look way better than the other, because at a higher resolution the difference is half what it is at the lower res.

I won't doubt that the GT300 will be faster than the HD 5870, but I highly doubt the difference will be that much.


But then again, look at the specs of the GT300 in comparison to the 5870.
I've also noticed that NVIDIA is much more efficient, because a 4870 with 800 processing cores is about the same as a GTX 260. Plus, they used GDDR5 and NVIDIA only used GDDR3, and NVIDIA's clock speeds weren't as fast as the 4870's, but somehow the 260 was able to keep up.
 
I've also noticed that NVIDIA is much more efficient, because a 4870 with 800 processing cores is about the same as a GTX 260. Plus, they used GDDR5 and NVIDIA only used GDDR3, and NVIDIA's clock speeds weren't as fast as the 4870's, but somehow the 260 was able to keep up.

Not quite true. The stream processors on the NVIDIA cards are clocked a lot higher than the core, almost twice as high.
 
I've also noticed that NVIDIA is much more efficient, because a 4870 with 800 processing cores is about the same as a GTX 260. Plus, they used GDDR5 and NVIDIA only used GDDR3, and NVIDIA's clock speeds weren't as fast as the 4870's, but somehow the 260 was able to keep up.

Say what? Everything about the designs is counted differently, so you really can't compare those core counts.

Memory bandwidth is about the same, because the faster memory is offset by the giant memory bus on the GTX (448 bits vs. 256 bits on ATI).

You want to know the efficiency count that really matters, because it impacts costs: transistor count.

The 4870 has 950 million transistors.
The GTX 260 has 1.4 billion transistors, but the GTX 260 only has 90% of the units active (0.90 x 1.4 = 1.26), so effectively 1.26 billion transistors.

Bottom line: ATI is using roughly 300 million fewer transistors and matching performance, so that really makes their design look more efficient where it really counts, because in the end it is about who can do the most with the transistor budget.
 