G80 specs revealed

Lord of Shadows said:
Soo..... since when did the shaders run at a different clock speed from everything else?


Since G70 (the 7800 series), as far as I know, the geometry clock has run at a different speed from the rest of the GPU... I forget the technical names, something like shader domain and geometry domain.
 
Actually, since NV40 NVIDIA has been using different clock domains in their GPUs. It seems the G80 might be expanding upon this further, a pretty smart move considering Vista will be using the GPU 24/7.

When "core" speed is referred to it usually means the rop/pixel shader speed. Vertex runs at a different clock sometimes called the "geometry clock speed". It could be that in G80 shader units and rops run at different speeds.
 
The lower-end unit looks interesting to me, but the issue is with the power requirements. This is getting nuts. It costs a lot of money to run 100 extra watts for, say, 8 hours a day. That, and the cards are quite long, making fit an issue again.

What I'm getting at here is we could be looking at the 5900 all over again. Perhaps skip the launch-gen cards and wait for the 9000 series, which will likely be refined with lower power needs?
 
Advil said:
The lower-end unit looks interesting to me, but the issue is with the power requirements. This is getting nuts. It costs a lot of money to run 100 extra watts for, say, 8 hours a day. That, and the cards are quite long, making fit an issue again.

What I'm getting at here is we could be looking at the 5900 all over again. Perhaps skip the launch-gen cards and wait for the 9000 series, which will likely be refined with lower power needs?

Not really.. 100 extra watts for 8 hrs is 0.8 kWh extra per day, or 24 kWh a month. Depending on where you live, that's like an extra $2-5 on your electricity bill on top of what it is right now. That's not that much..
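Quick sanity check of that math (the electricity rate below is just an assumed example; plug in your local $/kWh):

```python
# Rough electricity-cost estimate for an extra 100 W running 8 hours a day.
# The rate below is an assumed example; actual rates vary a lot by region.
extra_watts = 100
hours_per_day = 8
rate_per_kwh = 0.10                                # assumed $/kWh

kwh_per_day = extra_watts / 1000 * hours_per_day   # 0.8 kWh
kwh_per_month = kwh_per_day * 30                   # 24 kWh
cost_per_month = kwh_per_month * rate_per_kwh      # ~$2.40 at $0.10/kWh

print(f"{kwh_per_day:.1f} kWh/day, {kwh_per_month:.0f} kWh/month, ~${cost_per_month:.2f}/month")
```

At rates between roughly $0.10 and $0.20 per kWh, that lands in the $2-5/month range quoted above.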
 
Advil said:
The lower-end unit looks interesting to me, but the issue is with the power requirements. This is getting nuts. It costs a lot of money to run 100 extra watts for, say, 8 hours a day. That, and the cards are quite long, making fit an issue again.

What I'm getting at here is we could be looking at the 5900 all over again. Perhaps skip the launch-gen cards and wait for the 9000 series, which will likely be refined with lower power needs?

You have to consider the power required per transistor.

Just for discussion's sake, meaningless numbers here: if they increased transistor count by 50% over the previous-gen GPU but only needed a 20% increase in power, that's a move in the right direction.

I'd like to see a compare-and-contrast of the power-requirement-to-transistor-count ratio of GPUs from years ago; it would be very interesting indeed.
 
Commander Suzdal said:
The reasoning behind the seemingly odd RAM amounts...
I'm curious. Why did you add dozens of links that point to literally nothing? IntelliTXT already does this for you (zing).
 
nobody_here said:
You have to consider the power required per transistor.

Just for discussion's sake, meaningless numbers here: if they increased transistor count by 50% over the previous-gen GPU but only needed a 20% increase in power, that's a move in the right direction.

I'd like to see a compare-and-contrast of the power-requirement-to-transistor-count ratio of GPUs from years ago; it would be very interesting indeed.
20%?? ...Your math sucks. The 8800GTX will use twice the power, or 100% more, than a 7900GTX. The 7900GTX is around 90 watts and this card will be around 180 watts.
 
Just in case I'm missing something here...
He did say meaningless numbers....
Or am I missing something?
 
trek554 said:
20%?? ...Your math sucks. The 8800GTX will use twice the power, or 100% more, than a 7900GTX. The 7900GTX is around 90 watts and this card will be around 180 watts.


{NG}Fidel said:
Just in case I'm missing something here...
He did say meaningless numbers....
Or am I missing something?


:D Yes indeed, I did; it was purely to make the point, not to be mathematically correct ;)

trek554, feel free to do the math for us:

How many more transistors will the G80 have than the G70, and how much additional power will it require?
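A rough stab at that comparison, using the wattages quoted above and approximate published transistor counts (roughly 278M for G71 and 681M for G80); treat these strictly as ballpark figures:

```python
# Ballpark power-per-transistor comparison, G71 (7900GTX) vs. G80 (8800GTX).
# Wattages are the ones quoted in this thread; transistor counts are
# approximate published figures, so the output is only a rough estimate.
gpus = {
    "G71 (7900GTX)": {"watts": 90,  "mtransistors": 278},
    "G80 (8800GTX)": {"watts": 180, "mtransistors": 681},
}

for name, d in gpus.items():
    print(f"{name}: {d['watts'] / d['mtransistors']:.3f} W per million transistors")

t_increase = (681 / 278 - 1) * 100   # ~145% more transistors
p_increase = (180 / 90 - 1) * 100    # 100% more power
print(f"~{t_increase:.0f}% more transistors for ~{p_increase:.0f}% more power")
```

If those figures are in the right neighborhood, power per transistor actually goes down even though total board power doubles, which is roughly the point nobody_here was making.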
 
The 450W PSU requirement is the same as my 7950GX2's.

My Antec True Control 550 powers my 3200 Venice @ 2.6 / Raptor + 320 HDD, and it has powered my SLI'd 7800GTXs just fine.
 
I'm skipping this card, going to wait and see what ATi can bring. The power requirements are bullshit.
 
dR.Jester said:
I'm skipping this card, going to wait and see what ATi can bring. The power requirements are bullshit.

ATI's power requirements are supposedly pretty similar, if not more.
 
firewolf said:
Yes, but with 2 hard drives and an overclocked CPU, that 450-watt requirement looks pretty low; with a 7900GT it's fine.

Two relatively modern HDD's don't use *that* much power. People tend to view them as a huge drain of power.

I'm fairly certain that nVidia likely tested a system with a GTX that is more power-demanding than yours.
 
Unknown-One said:
Pshhh, I only spent $5 on my PSU (yay $50 rebate!), and I'm pretty sure it will handle at least one GTX, maybe two GTS's. It's an "SLI Certified" Ultra xFinity 500W PSU.

If 34A on the dual 12v rails isn't enough I would be shocked :eek:

I *highly* doubt you would be able to run two GTS' on that PSU.

Also, it'd be interesting to see what the deviation for voltage and amperage is on your +12v rails.
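For what it's worth, here's a rough 12V budget under loudly labelled assumptions (~150 W per GTS is a guess, and the CPU/board/drives figure is a placeholder, not a measurement):

```python
# Rough 12V-rail headroom check for the "34A on the dual 12V rails" claim.
# All per-component draws below are assumptions for illustration only.
rail_amps = 34
rail_volts = 12
available_12v = rail_amps * rail_volts   # 408 W nominal across the 12V rails

assumed_gts_watts = 150                  # guessed draw per 8800GTS
assumed_cpu_board_drives = 120           # placeholder for CPU, board, fans, drives

for cards in (1, 2):
    load = cards * assumed_gts_watts + assumed_cpu_board_drives
    print(f"{cards}x GTS: ~{load} W of ~{available_12v} W available on the 12V rails")
```

Under those assumptions a single GTS has comfortable headroom, but two of them would already exceed the rails' nominal capacity, which is why the skepticism above seems fair.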
 
aznx said:
ATI's power requirements are supposedly pretty similar, if not more.

All the more reason to wait for second-gen stuff then. My current card handles everything I throw at it at my 1280x1024 native LCD resolution. So I'll put my rant hat on :p

Huge frigging cards, power hungry as he**, no Vista or DX10 games at launch, Vista going to be a real di** when it comes to authenticating your copy. Those are just a few of the very many reasons to skip the first DX10-gen cards. Then again, it doesn't stop rich people from burning money :D
 
adonn78 said:
ATi is using an early sample of GDDR4 based off DDR2 technology. By next year the speed of GDDR4 should be faster and in full production. Secondly, these are all rumors and speculation. And the uneven RAM amounts make everything sound fake. Does anyone know how many nanometers this new 8800 chip will be? In addition, I thought chip makers were trying to cut the amount of power needed for these CPUs/GPUs. :confused:

False on the rumors count:

"DailyTech received its first looks at a GeForce 8800 production sample today"

http://dailytech.com/article.aspx?newsid=4442
http://dailytech.com/article.aspx?newsid=4441
http://www.dailytech.com/article.aspx?newsid=4450
 
Xenozx said:
I'm impressed with these specs. I still want to see it officially announced, and will wait to see what ATI has to offer. I have no brand loyalty; whatever is faster and has more features. Don't count ATI out yet. What's in the X360 is over a year old now, so I'm sure they have moved on to something bigger and better :D

I have no doubt that ATI has something better than what's in the 360. They'd better, anyway...
But when looking at these G80 specs, they're simply insane. I can't imagine the R600 being better than this, but of course, we'll need to wait and see.
 
Ajax9000 said:
Pity about all those 853/2x480 and 1367/6/5x768 plasmas & LCDs that cause grief to so many video cards.

Nvidia has just added a 1366x768 mode to ForceWare, so maybe the G80 will have the grunt to bludgeon those TVs into submission. :)

Adrian
Whoever first invented those fucking resolutions should burn in hell! You can imagine how well those resolutions match anything broadcast in PAL-land. Typically idiotic foresight by the electronics industry. Bring on (cheap) 1080p screens, potentially the end of all this.
Sorry, just venting.
 
Advil said:
The lower-end unit looks interesting to me, but the issue is with the power requirements. This is getting nuts. It costs a lot of money to run 100 extra watts for, say, 8 hours a day. That, and the cards are quite long, making fit an issue again.

What I'm getting at here is we could be looking at the 5900 all over again. Perhaps skip the launch-gen cards and wait for the 9000 series, which will likely be refined with lower power needs?

You keep insisting on this. NVIDIA already proved that they have learned their lesson. They won't be making the same mistake. You can quote me on that :)
Of course the refresh will be refined. That's why they exist anyway. But this card will be a monster as soon as it's out. We'll need to wait for the maturity of drivers, as usual, but in DX9, nothing will even come close.
 
sam0t said:
...no Vista or DX10 games at launch...

Flight Sim X on both counts :p

Company of Heroes is fully Vista integrated, as is Lego Star Wars 2. Neither is DX10, admittedly.
 
Hmm, no they won't. We just started porting some stuff to DX10 and it's a nightmare; we scrapped it and decided to start from scratch and build everything from the ground up.
 
Being unified was the biggest surprise for me... I remember Kirk saying something to the effect that "NV was not going down the unified path yet as they did not think it was time"...

Anyway, if those specs are true, it looks like a nice card.
 
phide said:
I'm curious. Why did you add dozens of links that point to literally nothing? IntelliTXT already does this for you (zing).
Ummm, perhaps because it WAS done by IntelliTXT, not me? Or are you zinging [H] about using IntelliTXT, instead of zinging me?

(Hint: they REALLY, REALLY don't like that.)

Edit: Never mind, looks like IntelliTXT is broken on [H] right now for some reason. But that's still where the links came from.

Edit2: Or at least it was when I submitted my post. Some pop-ups work in this thread, some don't.
 
Known "bug" see link below
http://www.hardforum.com/showthread.php?t=1102494
Commander Suzdal said:
Ummm, perhaps because it WAS done by IntelliTXT, not me? Or are you zinging [H] about using IntelliTXT, instead of zinging me?

(Hint: they REALLY, REALLY don't like that.)

Edit: Never mind, looks like IntelliTXT is broken on [H] right now for some reason. But that's still where the links came from.
 
trek554 said:
20%?? ...Your math sucks. The 8800GTX will use twice the power, or 100% more, than a 7900GTX. The 7900GTX is around 90 watts and this card will be around 180 watts.

My problem isn't so much with the amount of power used as the amount of heat produced (or power leaked). Yeah, I know they go hand in hand, but I think video cards have reached the Prescott era at around 80/90nm. Ever felt the heat a 200W light bulb puts off? Well, my case ventilation already needs to be reworked considering the temps (need to change to 120mm fans!).

A nice BTX is starting to look more attractive.
 
I don't know who you are, Razor.. but I am pretty damn sure there will be many games that will receive facelifts when DX10 upgrades are released. I know I read interviews stating this.
And just because you couldn't port it doesn't mean it won't happen. :)
 
BTX is dead. As for the heat produced, NVIDIA will be using a hybrid water/air cooler. You can't remove that kind of heat in a DUAL-slot space unless you're using very high-speed fans and shrouding of some sort.
 
wow... just got finished reading all 8 pages...

The "odd" memory size doesn't seem too weird. I remember my voodoo2 had 12 megs... before everything went to 16...
I imagine that they didn't go from 512MB to 1 GB for the reason of... well, you probably won't see that much improvement yet....
Also... if you look at the 7950 gx2, and quad-sli, you really need a 24"-30" monitor to get the most out of it; most of the guys I LAN with are still gaming on 17"-19"... Hell, I still got my 17" CRT... still works... Might get a 21" soon... anyway... Is this going to be a similar issue with the 8800?

exciting stuff about the G80... can't wait to see it in action!
 
n3g471v3 d3c1b3l said:
wow... just got finished reading all 8 pages...

The "odd" memory size doesn't seem too weird. I remember my voodoo2 had 12 megs... before everything went to 16...
I imagine that they didn't go from 512MB to 1 GB for the reason of... well, you probably won't see that much improvement yet....
Also... if you look at the 7950 gx2, and quad-sli, you really need a 24"-30" monitor to get the most out of it; most of the guys I LAN with are still gaming on 17"-19"... Hell, I still got my 17" CRT... still works... Might get a 21" soon... anyway... Is this going to be a similar issue with the 8800?

exciting stuff about the G80... can't wait to see it in action!

Well, I know I'm looking at upgrading mid-2007 after the 45nm quad/oct CPUs hit. That is when I'll spring for a quad G80, or if they have the G90 by then (doubt it), I'll go that route. This will power a 52-inch 1080p LCD. Mmmm, pretty!

The odd memory size is probably due to the card being truly 512MB, with 128MB allocated to the unified shaders. Just a guess, but it seems logical; that way they can function somewhat independently of each other.

Sunin
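Another way the "odd" amounts could fall out, if the rumored bus widths (384-bit GTX, 320-bit GTS) are right: each 32-bit memory channel gets its own 512Mbit (64MB) GDDR3 chip. The per-chip width and density here are assumptions, not anything from the leak:

```python
# Sketch: "odd" framebuffer sizes from the rumored bus widths, assuming one
# 512 Mbit (64 MB) GDDR3 chip with a 32-bit interface per memory channel.
def framebuffer(bus_bits, bits_per_chip=32, mb_per_chip=64):
    chips = bus_bits // bits_per_chip
    return chips, chips * mb_per_chip

for name, bus in (("8800GTX", 384), ("8800GTS", 320)):
    chips, total = framebuffer(bus)
    print(f"{name}: {bus}-bit bus -> {chips} chips x 64 MB = {total} MB")
```

That would give 768MB and 640MB without needing any memory walled off for the shaders, but it's just another guess until the cards are out.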
 
ReubenRosa said:
I don't know who you are, Razor.. but I am pretty damn sure there will be many games that will receive facelifts when DX10 upgrades are released. I know I read interviews stating this.
And just because you couldn't port it doesn't mean it won't happen. :)


If they are doing concurrent development of what they have now, yes, but not older games. I might have misunderstood what you were saying ;)
 
Look dammit.......

I need some speculation done on this, and I don't want to hear "wait till DX10 comes out." But do you guys think the GTX version of the G80 card would be able to handle Crysis at 2560x1600 with some or all of the eye candy on at acceptable frame rates? I mean, according to the specs at the top of this thread.
 
StalkerZER0 said:
Look dammit.......

I need some speculation done on this, and I don't want to hear "wait till DX10 comes out." But do you guys think the GTX version of the G80 card would be able to handle Crysis at 2560x1600 with some or all of the eye candy on at acceptable frame rates? I mean, according to the specs at the top of this thread.

One? No.
SLI, I'd say it's a fair bet it can.
 
StalkerZER0 said:
Look dammit.......

I need some speculation done on this, and I don't want to hear "wait till DX10 comes out." But do you guys think the GTX version of the G80 card would be able to handle Crysis at 2560x1600 with some or all of the eye candy on at acceptable frame rates? I mean, according to the specs at the top of this thread.


Hmm, at that res probably not; two of them, yeah. Crysis is going to give a whole new meaning to pain on graphics cards, but in a good way, not like FEAR or Oblivion. It might be able to handle that res with no eye candy, though.
 
StalkerZER0 said:
Look dammit.......

I need some speculation done on this, and I don't want to hear "wait till DX10 comes out." But do you guys think the GTX version of the G80 card would be able to handle Crysis at 2560x1600 with some or all of the eye candy on at acceptable frame rates? I mean, according to the specs at the top of this thread.

May I ask what display you have that will run 2560x1600?

Thanks,

Sunin
 
Sunin said:
May I ask what display you have that will run 2560x1600?

Thanks,

Sunin

Why, my soon-to-be-purchased 30-inch LCD, of course. I'm still deciding on which manufacturer to buy it from. It may very well be the Dell 3007WFP, but there are competitors I'm waiting (patiently) for that might have something better.
So you are all saying that one GTX G80 would not be able to handle Crysis at 2560x1600 with even SOME of the eye candy at acceptable frame rates using DX10?
:(
 
StalkerZER0 said:
Why, my soon-to-be-purchased 30-inch LCD, of course. I'm still deciding on which manufacturer to buy it from. It may very well be the Dell 3007WFP, but there are competitors I'm waiting (patiently) for that might have something better.
So you are all saying that one GTX G80 would not be able to handle Crysis at 2560x1600 with even SOME of the eye candy at acceptable frame rates using DX10?
:(

It's soooo hard to speculate on that.

Sure, the G80 is a beast, but this is an entirely new game with really hardcore graphics, and nobody knows how well it will play on current hardware or on the G80.

I would wager that the 8800GTX would be able to play that game at 2560x1600 with some settings on medium/high at acceptable framerates. We won't know until the cards are released.
 