G80 specs revealed

http://dailytech.com/article.aspx?newsid=4450

more g80 features

"As if we mere mortals needed more reasons to be excited about G80, here are a couple more tidbits: 128-bit high dynamic-range and antialiasing with 16X sampling.

The high dynamic-range (HDR) engine found in GeForce 7950 and Radeon series graphics cards is technically a 64-bit rendering. This new HDR approach comes from a file format developed by Industrial Light and Magic (the LucasFilm guys). In a nutshell, we will have 128-bit floating point HDR as soon as applications adopt code to use it..."
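For what it's worth, the bit counts in that blurb are just per-pixel storage math: four channels times the float width of each channel. The ILM format they're referring to is OpenEXR, which uses 16-bit half floats. A quick sketch (RGBA layout assumed):
------------------------------------------------------
/* Per-pixel storage behind the "64-bit" vs "128-bit" HDR labels.
   Assumes an RGBA layout, i.e. 4 channels per pixel. */
#include <stdio.h>

int main(void) {
    int channels = 4;                    /* R, G, B, A */
    int half_bits = 16, full_bits = 32;  /* FP16 vs FP32 per channel */
    printf("FP16 HDR: %d bits/pixel\n", channels * half_bits);  /* 64  */
    printf("FP32 HDR: %d bits/pixel\n", channels * full_bits);  /* 128 */
    return 0;
}
------------------------------------------------------
So going to 128-bit means FP32 render targets at four times the footprint of plain 8-bit color (32 bits/pixel), which is why adoption depends on the applications.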
 
chinesepiratefood said:
Read the DailyTech comments; the author posts that they're not true unified shaders, but NVIDIA is calling them that, or something.

So is it safe to assume (pending real reviews and benchmarks of both cards) that 128 unified shaders won't mean this card is twice as powerful as the R600 with its 64 unified shaders?
 
It's a good bet, but until both cards are in Brent's hands I wouldn't jump to too many conclusions. I.e., we don't know how powerful these 'not true unified' shaders are, so we can't really compare them. If each is half as powerful as a 'true unified shader' (128 x 0.5 = 64), it's possible the cards will perform similarly.
 
Silus said:
No one knows at this point of course, but my guess is that it will in DX9. In DX10, however, I won't even make a comment yet.

Dang nabbit, you're right!!! The missing piece of the puzzle is Microshaft's late-to-the-party DX10. So late, in fact, that I hear they're actually leaving DX10 out of Vista and adding it separately as a patch.
What the hell would be the point of buying Vista when it comes out, then? Especially since they keep saying DX10 won't be for XP.
 
zg75 said:
http://dailytech.com/article.aspx?newsid=4450

more g80 features

"As if we mere mortals needed more reasons to be excited about G80, here are a couple more tidbits: 128-bit high dynamic-range and antialiasing with 16X sampling.

The high dynamic-range (HDR) engine found in GeForce 7950 and Radeon series graphics cards is technically a 64-bit rendering. This new HDR approach comes from a file format developed by Industrial Light and Magic (the LucasFilm guys). In a nutshell, we will have 128-bit floating point HDR as soon as applications adopt code to use it..."
That's fucking awesome!!! Thanks for the post... that might have just made my day...
 
chinesepiratefood said:
Read the DailyTech comments; the author posts that they're not true unified shaders, but NVIDIA is calling them that, or something.
Not exactly--he alludes to the fact that the 128 part may depend on your method of counting, but he does NOT say they aren't really unified. In fact, in another post under one of today's other G80 stories, he says:

"It is unified. We'll post the details of the architecture later today when the DT writers recover :-P "

So, yes Virginia, it's really unified, but whether there are 128 of them depends on your point of view.
 
Kubicki confirms HDR+AA, doesn't know yet about HQ AF. Hopefully the IQ bugaboo can now die a long-deserved death...
 
Hornet said:
Holy cow :D
128 unified shaders, hahaha... compared to the X360's 48 unified shaders... who'd have thought the new-gen consoles would get beaten so fast :p


They finally increased the memory bus to 384-bit, so I guess for now that'll be the means of improving memory bandwidth. At such high frequencies, those extra bits could actually translate into a significant increase in bandwidth...

Heh, the consoles really are already beaten.


The X1950 has 48 shaders (OK, they're straight pixel shaders), runs 150MHz faster, and has 512MB of GDDR4 (1GHz) all to itself!! And I'd be willing to bet the 7900GTX is also better than what's in the PS3, and if it's not, then the 7950GX2 is.
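To put rough numbers on the 384-bit bus point in the quote above: bandwidth is just bus width times the effective memory data rate. The clock below is my assumption, not a confirmed spec, so treat it as a sketch:
------------------------------------------------------
/* Rough memory-bandwidth math for a 384-bit bus.
   The 1.8 GT/s effective data rate (900 MHz GDDR3) is an
   assumption for illustration, not a confirmed G80 spec. */
#include <stdio.h>

int main(void) {
    double data_rate = 1.8;  /* GT/s, assumed */
    printf("384-bit: %5.1f GB/s\n", 384.0 / 8.0 * data_rate);
    printf("256-bit: %5.1f GB/s (same clock, for comparison)\n",
           256.0 / 8.0 * data_rate);
    return 0;
}
------------------------------------------------------
At the same memory clock, the wider bus alone is a 50% bandwidth bump, so those extra bits really could matter more than a GDDR4 speed grade.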
 
Warrior - welcome! I am so excited about this card and glad to see a comment that (sort of...) clears up the unified shader architecture question. HDR+AA makes me grin too :)
 
Man, whenever the specs are revealed is when I get the most excited and think about cranking up my games...

I'm on a lowly X800GTO right now (hate ati drivers) so I am looking forward to this upgrade like there's no tomorrow.

16xAA and 128-bit HDR are going to be sweeeeet. Somehow I'll probably end up buying one the day it comes out; I want to support a good company :D
 
It's hard to say what "128 unified shaders" really means.

The difference between fast and Real Fast is the ALU configuration, not the number of shader processors. I have a feeling that "shader processor" is going to be the new "pipeline". Very ambiguous stuff.

I'm thinking I may wait until January/February to pull this trigger. We'll see what happens.
 
There's a cryptic comment on the original DailyTech specs-revealed thread that says:
------------------------------------------------------
quote:
speaking of a DX10 ASIC such as G80: with its numerous arrays of MIMD 1D ALUs, it could be seen as

128 x 1 (MIMD 1D) x 2 (double-clocked) = 256

R600:

(4+1)D: 5 x 64 = 320

Besides, R600 has better raw performance, whilst G80 has better-organized ALU arrays, allowing better utilization and hence a more optimal future.
------------------------------------------------------
Sounds like Team Red is going for brute force and Team Green is investing in a little elegance/flexibility?
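If I'm reading it right, the arithmetic is just this (the per-clock figures come straight from that quote, not from confirmed specs):
------------------------------------------------------
/* The DT commenter's ALU counting, taken at face value.
   Assumption: G80 = 128 scalar (1D) ALUs on a double-clocked
   shader core; R600 = 64 five-wide (4+1D) vector ALUs. */
#include <stdio.h>

int main(void) {
    int g80  = 128 * 1 * 2;  /* scalar ops per base clock = 256 */
    int r600 = 64 * 5;       /* scalar ops per clock = 320 */
    printf("G80:  %d\n", g80);
    printf("R600: %d\n", r600);
    return 0;
}
------------------------------------------------------
Which matches the brute-force vs. flexibility read: R600 wins on raw ops, but scalar units should be easier to keep busy when shader code doesn't vectorize into neat groups of five.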
 
A couple of thoughts..........

1) I have a 7950GX2 and will probably stay with it for a couple of gens, unless I just get to a point where I wanna upgrade, or nothing will run on it anymore. Whichever of the three happens first. Now don't get me wrong, it looks hella nice, but after dropping $1200 on cards in 3 years, I'm sticking with this one for a while.

2) Pffff, 360 graphics sucked compared to my 7800GTX. The PS3 has a lot of hype, so I wasn't really surprised that they were beaten by computer GPUs so quickly. It was all about timely progression.
 
Somebody tell NVIDIA that they are gonna be sued by Sega for the use of "High Definition gaming." (Hah, saw that the other day on one of those old Genesis consoles. What did they mean by high definition anyway? Not exactly what NVIDIA is trying to represent.)
 
we will have 128-bit floating point HDR as soon as applications adopt code to use it..."

So this will likely end up being another one of those items where people are like "buy NVIDIA, it has 128-bit HDR, ATI doesn't, so ATI sucks ass for not having it."

And yet no games will actually use it until years later anyway. SM3.0, anyone? All over again!
 
Those Sega systems were high definition for their time. Today, not so much.

High definition is just a buzz-word, so expect it to be used to describe nearly every product in the next five years. The next Coke, I presume, will be advertised as having high definition flavour.
 
StalkerZER0 said:
Dang nabbit, you're right!!! The missing piece of the puzzle is Microshaft's late-to-the-party DX10. So late, in fact, that I hear they're actually leaving DX10 out of Vista and adding it separately as a patch.
What the hell would be the point of buying Vista when it comes out, then? Especially since they keep saying DX10 won't be for XP.

And yet another person who thinks Vista is nothing more than a GUI upgrade - try reading up on Vista and what is REALLY behind that new GUI.....
 
Truly separate user and kernel spaces, being able to stop and restart entire subsystems (such as video) without a reboot, better I/O handling and memory addressing, and more flexible and efficient use of multiple threads. Personally, I am looking forward to Vista (and yes, I've used over half a dozen builds of it at this point).
 
zg75 said:
Truly separate user and kernel spaces, being able to stop and restart entire subsystems (such as video) without a reboot, better I/O handling and memory addressing, and more flexible and efficient use of multiple threads. Personally, I am looking forward to Vista (and yes, I've used over half a dozen builds of it at this point).


"Truly seperate"? Really? Wow, I've never known Microsoft to truly seperate anything. It would be nice if they got a few things right with Vista. A guy can hope I guess.

Anyway, that's off topic; back to the G80. For those of us not up to date on our graphics lingo: what is the definition of a unified shader? Not what NVIDIA says it is or ATI says it is, but a textbook definition.
 
This comment from a DT reader deserves re-posting here. I personally don't have any problem with the alleged heat output of these cards as long as I can replay Call of Duty 2 in all its glory. But I thought this was the funniest public comment I'd read in a long time:

This is great news. For $300, I can buy a great 1300-watt PSU and then spend the amount of money I would on a new heating system on these cards in SLI mode. I'm hoping they have a quad configuration as well, because I need to exhaust the entire side of the case directly into my AC ducts and I need the extra fan power to push it upstairs, and I want it to be as warm and toasty as possible. I may just have to put all silent heat sinks on and set up a warning monitoring system to generate EVEN MORE HEAT, and then use an external exhaust fan to whiff it up into the house.

I am very happy that such technology can provide true comfort for the family room upstairs, where we will stream video for family visits.
In the basement, where this will be, I may have to wear shorts to keep from sweating too much, but it will be worth it.

The Pentium 4 Extreme Edition doesn't, by itself, generate the amount of heat to warm more than an entire basement, but Intel Technology together with Nvidia push the Thermal Envelope to new heights, and I thank them, and my cat thanks them.

". . . and my cat thanks them." LOL

:D
 
metallicafan said:
"Truly seperate"? Really? Wow, I've never known Microsoft to truly seperate anything. It would be nice if they got a few things right with Vista. A guy can hope I guess.

Anyway, that's off topic; back to the G80. For those of us not up to date on our graphics lingo: what is the definition of a unified shader? Not what NVIDIA says it is or ATI says it is, but a textbook definition.


A shader unit that can be used to execute either vertex shader instructions or pixel shader instructions (a general-purpose ALU).
 
Hurin said:
This comment from a DT reader deserves re-posting here. I personally don't have any problem with the alleged heat output of these cards as long as I can replay Call of Duty 2 in all its glory. But I thought this was the funniest public comment I'd read in a long time:



". . . and my cat thanks them." LOL

:D
Absolutely hilarious. Haven't laughed so hard in a while........
 
solobaricsrock said:
A couple of thoughts..........

1) I have a 7950GX2 and will probably stay with it for a couple of gens, unless I just get to a point where I wanna upgrade, or nothing will run on it anymore. Whichever of the three happens first. Now don't get me wrong, it looks hella nice, but after dropping $1200 on cards in 3 years, I'm sticking with this one for a while.

2) Pffff, 360 graphics sucked compared to my 7800GTX. The PS3 has a lot of hype, so I wasn't really surprised that they were beaten by computer GPUs so quickly. It was all about timely progression.
While I'm sure it's quite likely the PS3 will not be that impressive compared to the upcoming generation of GFX cards, perhaps we should pause to reflect that the PS3 has not yet been released--and neither have the G80 or R600. Perhaps declaring "Pwned" is a little premature?

We need to have a forum moderator with a user name of Alan Greenspan who pops in once in a while to growl, "Irrational exuberance!"

That said, my "happy area" is feeling a lot of irrational exuberance about the g80 right now, too!
 
Hey people..

It's safe to assume that current video cards are more powerful than consoles...
But it's a moot point if PC games don't fully use that power, which is why people complain.

I can see a 7950GX2 owning an X360 power-wise (or an X1900XTX, if you will), but developers have to bear many configs in mind... so it's kind of normal to see consoles ahead in terms of GRAPHICS for at least a year...
until more people adopt the hardware needed to play a more demanding game.
 
$650 for a video card is ridiculous. I like my 7600GT; they should be no more than a CPU's price, $200.
 
annaconda said:
$650 for a video card is ridiculous. I like my 7600GT; they should be no more than a CPU's price, $200.

No, *that's* ridiculous; a CPU doesn't need a separate PCB, power conditioning, RAM, etc. I'm sure the chip itself is probably close in price to a CPU, but don't forget there's almost a gig of high-speed RAM and a custom PCB on top of that.
 
Well, I broke the street date by 3 weeks with my two 7900GTX 512 cards, which practically shut down HardOCP with all the bandwidth they soaked up, hehe. When push comes to shove, I will have one, if not two, of these beasts a little early again. ;) Will post it here on HardOCP first.
 
razor1 said:
A shader unit that can be used to execute either vertex shader instructions or pixel shader instructions (a general-purpose ALU).

And with DX10, geometry shader instructions as well.
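To make that definition concrete, here's a toy sketch of the scheduling idea; the names and pool size are mine, not anything from NVIDIA's or ATI's actual designs:
------------------------------------------------------
/* Toy model of a unified shader core: one pool of general-purpose
   ALUs that accepts vertex, geometry, or pixel work, instead of
   fixed, dedicated pools of each. All numbers are made up. */
#include <stdio.h>

typedef enum { VERTEX, GEOMETRY, PIXEL } Work;

static const char *kind(Work w) {
    return w == VERTEX ? "vertex" : w == GEOMETRY ? "geometry" : "pixel";
}

int main(void) {
    Work queue[] = { VERTEX, VERTEX, PIXEL, GEOMETRY, PIXEL, PIXEL };
    int n = (int)(sizeof queue / sizeof queue[0]);
    int alus = 4;  /* pretend pool size */
    for (int i = 0; i < n; i++)  /* any free ALU takes any job */
        printf("ALU %d picks up a %s job\n", i % alus, kind(queue[i]));
    return 0;
}
------------------------------------------------------
The payoff is utilization: a vertex-heavy pass and a pixel-heavy pass both keep the whole pool busy, where fixed pools would leave one side idle.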
 
We should get some clarity on the truth, as NVIDIA is holding a media launch press conference in Santa Clara on October 18th.
 
Lepke said:
We should get some clarity on the truth, as NVIDIA is holding a media launch press conference in Santa Clara on October 18th.

Of which everything will be under embargo until a certain time.
 
Hilarious, I like the pussy reference; it will keep all kinds of pussies warm ;). Note I am talking about the cat.
 
Wow... this seriously went from not hearing anything about it to "it's coming out in a month"!!! I'm way unprepared (unfunded)... lol. I have a 6800GT and planned on skipping a generation and getting this, so I'm glad to hear it's gonna be a nice big jump. I think I'll have to throw a new PSU into my upgrade too. I'm still on AGP, so it's gonna be a big transition to pay for: AM2 dual core, 2GB DDR2, 8800GTS, PSU, oh my... Prices should drop a bit by the time I get money for all that. It will be glorious though. :D:D:D

 
Brent_Justice said:
Of which everything will be under embargo until a certain time.


Embargoes which are always broken by somebody. The news always makes it to us. It's inevitable.
 
Possibly by the Inq, unless they get the real scoop and decide to make the news up anyway. One never knows.
 
Russell said:
Embargoes which are always broken by somebody. The news always makes it to us. It's inevitable.

And then that somebody gets a lawsuit, yay! I'm still in shock that DailyTech isn't getting fallout for this yet; they are definitely under NDA with their prerelease boards...
 