G80 specs revealed

Russell said:
Embargoes are always broken by somebody. The news always makes it to us. It's inevitable.

But you won't know what's real news and what isn't; I notice there are always facts floating around before a launch that turn out to be wrong, so take everything you see as rumor until it is officially announced.
 
eXzite said:
And then that somebody gets sued, yay! I'm still in shock that DailyTech isn't getting fallout for this yet; they are definitely under NDA with their prerelease boards...


I don't think they got the board from nV, so they might not be under NDA, at least not yet.
 
What are the chances that a single 8800GTX will outpace a 7950GX2 at really high resolutions like 1920x1200 and 2560x1600?

1000
 
Well, I want to know if it is really coming out in November and how well it performs. Currently I'm saving money for a Core 2 rig and a 1920x1200 LCD and was going to get an X1950 XTX. I'm hoping that this 8800 GTX can play Oblivion at 1920x1200 with everything maxed, or close enough, and that it does in fact come out in November. If not, I doubt I could stand to wait much longer (haven't had a PC capable of playing games Far Cry and beyond, and I WANT TO PLAY THEM ;_; )
 
It looks like the G80 / 8800GTX is on the way to fulfilling the predictions of the G90 with TWO power connectors on one card!!!

Sounds exciting, but the power requirements look like things are getting out of control again!

Anyway.... Remember that G90 thread? LOL
 
Physics on the GPU. At no additional cost. nVidia just earned my money, regardless of how well this thing performs.
 
chrisf6969 said:
It looks like the G80 / 8800GTX is on the way to fulfilling the predictions of the G90 with TWO power connectors on one card!!!

Sounds exciting, but the power requirements look like things are getting out of control again!

Anyway.... Remember that G90 thread? LOL


The power requirements aren't that bad; under load the GTX should take up to 160 watts by itself, not much more.

The GTS variety will only have one power connector, so that gives us a good idea it won't take up more than 150 watts no matter what. So what's left? 128 MB more VRAM, 200 MHz more on the RAM, and 75 MHz more on the core, which doesn't add up to much more wattage on the GTX.
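
Just to put that reasoning into numbers, here's a quick back-of-envelope sketch in Python; every figure in it is taken from the rumors above or is a plain guess, nothing official:

# Back-of-envelope power estimate -- all figures are rumored or assumed, not official.
# Idea: if the single-connector GTS tops out around 150 W, the GTX's extras
# (more VRAM, a higher RAM clock, a higher core clock) shouldn't add that much.

gts_ceiling_w = 150      # assumed ceiling for the single-connector GTS

# rough guesses for what the GTX's extras could cost, in watts
extra_vram_w  = 5        # ~128 MB more VRAM
ram_clock_w   = 5        # ~200 MHz more on the memory clock
core_clock_w  = 10       # ~75 MHz more on the core clock

gtx_estimate_w = gts_ceiling_w + extra_vram_w + ram_clock_w + core_clock_w
print(f"Rough GTX estimate: ~{gtx_estimate_w} W")  # ~170 W, same ballpark as the ~160 W rumor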
 
I call it judgment day for the GFX industry... a dawning of a new era :eek:
 
randsom said:
So Brent,

Are the specs that have been published more or less what you expected out of the G80, or are you a little shocked and suspicious?

You're the resident [H] GFX card guru, what have you been hearing?
Uh, I doubt he can say. Hence the: "We cannot confirm or deny. . ."

I suspect, though, that NVIDIA is happy about these leaks considering there was all sorts of erroneous assumptions going on (1KW power supplies required, water cooling required, etc.).

H
 
chrisf6969 said:
It looks like the G80 / 8800GTX is on the way to fulfilling the predictions of the G90 with TWO power connectors on one card!!!

Sounds exciting, but the power requirements look like things are getting out of control again!

Anyway.... Remember that G90 thread? LOL

Your computer will BSOD in 11 sec............ All too funny to those who have never experienced it :D Priceless nonetheless
 
Hawk said:
450W for one. Wonder what it is for 2. Don't think my 535W can cut it :D They should be kickers :D


450w TOTAL SYSTEM POWER...not a dedicated 450w supply....
:p
 
I'm impressed w/ these specs. I still want to see it officially announced, and will wait to see what ATI has to offer. I have no brand loyalty; whatever is faster and has more features. Don't count ATI out yet. What's in the X360 is over a year old now, so I'm sure they have moved on to something bigger and better :D
 
Brent_Justice said:
This next generation excites me greatly.


I got wood thinking about the 8800GTX....seriously.
:eek: :D



Taking into account that I am staying with my base rig of a dual core X2 @ 2.6GHz, 2GB RAM, and a 21" display at 1680x1050, I am thinking this card will run all of my games at that res with ALL the eye candy ON and maintain 55+ fps in anything... plus it should still whip along pretty well in Flight Simulator X and Crysis...
 
Hopefully I'll be able to step up my 7900GT, ordered on August 10. Do you think I have a chance?
 
banned_user said:
I would have thought they would have moved on to GDDR4 for this, but I haven't kept up with G80 progress. The rest of the specs look positively insane, but does that memory controller have the ability to support GDDR4?

ATi is using an early sample of GDDR4 based on DDR2 technology. By next year the speed of GDDR4 should be faster and it should be in full production. Secondly, these are all rumors and speculation. And the uneven RAM amounts make everything sound fake. Does anyone know how many nanometers this new 8800 chip will be? In addition, I thought chip makers were trying to cut the amount of power needed for these CPUs/GPUs. :confused:
 
What I don't like about the direction that video cards are going is that in a few years you will need a direct line from the power pole just to power your video card. Right now it's 450W; next year it will be 700W. Can they not go for energy efficiency to reduce power consumption?
 
BoogerBomb said:
What I don't like about the direction that video cards are going is that in a few years you will need a direct line from the power pole just to power your video card. Right now it's 450W; next year it will be 700W. Can they not go for energy efficiency to reduce power consumption?

The recommended power supply for the "overall system" is 450W. For the past couple of years, it's been 350-400W, so 50W is not that big of an increase.
 
ToastMaster said:
The recommended power supply for the "overall system" is 450W. For the past couple of years, it's been 350-400W, so 50W is not that big of an increase.

It is when only 100W of the total (350-400W) before was for the video card (the other 250-300W covers all of the other components: drives, motherboard, RAM, etc.). So 50+ more watts for a 100W part = a 50% increase. That's A LOT.
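
To spell that math out (using the same assumed numbers from this thread, nothing official):

# The 100 W "old card" share and the ~50 W bump are assumptions from the posts above.
old_card_w = 100                  # assumed video card share of a 350-400 W system budget
new_card_w = old_card_w + 50      # roughly 50 W more for the new card

increase_pct = (new_card_w - old_card_w) / old_card_w * 100
print(f"Video card power budget up ~{increase_pct:.0f}%")  # ~50%, even though the PSU rec only rose ~50 W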
 
I just HOPE nVidia uses the same normal 4-hole mounting pattern on this card as they have for other cards... I hear the GTO has 2 holes that are "closer to the core".

I really don't want to buy a new waterblock, or use their crappy restrictive version :)
 
All these nice things, but did they fix their AF?

On a serious note (not that the above isn't), this seems like way too much marketing at this point. I'm more interested in seeing hard data, like results from reviewers.
 
chrisf6969 said:
It is when only 100W of the total (350-400W) before was for the video card (the other 250-300W covers all of the other components: drives, motherboard, RAM, etc.). So 50+ more watts for a 100W part = a 50% increase. That's A LOT.

If nVidia truly does only suggest a 450W PSU, then I trust that that's all that's required, as long as it's a good, reliable unit.

Besides, if you're willing to shell out $600-700 on a top-line graphics card, I'm hoping that you'd at least be willing to drop $150 on a decent power supply. Any reputable 450-500W unit should be able to power this card.

Don't skimp in one core area to be able to get something better in the other.
 
On the subject of power supplies, I've got an Antec NeoPower 480 with 36 amps on the dual 12V rails. Do you think this would be enough to power the G80 even if it is overclocked?
 
ToastMaster said:
If nVidia truly does only suggest a 450W PSU, then I trust that that's all that's required, as long as it's a good, reliable unit.

Besides, if you're willing to shell out $600-700 on a top-line graphics card, I'm hoping that you'd at least be willing to drop $150 on a decent power supply. Any reputable 450-500W unit should be able to power this card.

Don't skimp in one core area to be able to get something better in the other.

Yes, but with 2 hard drives and an overclocked CPU, that 450 watt requirement looks pretty low. With a 7900GT it's fine.
 
ToastMaster said:
Besides, if you're willing to shell out $600-700 on a top-line graphics card, I'm hoping that you'd at least be willing to drop $150 on a decent power supply.
Pshhh, I only spent $5 on my PSU (Yay $50 rebate!), and I'm pretty sure it will handle at least one GTX, maybe two GTS's. It's an "SLI Certified" Ultra xFinity 500W PSU.

If 34A on the dual 12V rails isn't enough, I would be shocked :eek:
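
For what it's worth, here's a quick sanity check on those 12V numbers in Python; it treats the quoted amperages as combined 12V figures and uses the rumored ~160 W card draw, both of which are assumptions:

def rail_watts(amps_12v):
    """Total wattage available across the 12 V rails (treating the quoted amps as a combined figure)."""
    return 12 * amps_12v

psus = [("Antec NeoPower 480, 36 A", 36),
        ("Ultra xFinity 500W, 34 A", 34)]

for label, amps in psus:
    print(f"{label}: ~{rail_watts(amps)} W on 12 V")
# Antec NeoPower 480, 36 A: ~432 W on 12 V
# Ultra xFinity 500W, 34 A: ~408 W on 12 V
# Even a rumored ~160 W card plus an overclocked CPU and a couple of drives
# leaves plenty of 12 V headroom on paper.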
 
firewolf said:
Yes, but with 2 hard drives and an overclocked CPU, that 450 watt requirement looks pretty low, with a 7900GT it's fine.
Even most systems with a dual-core CPU, a 7900GTX, and a couple of hard drives rarely go above 325-350 watts under full system load. Anandtech even ran a 7800GTX in an HP machine with a 250 watt PSU and had no issues putting it through benchmarks.
 
adonn78 said:
And the uneven RAM amounts make everything sound fake.
The reasoning behind the seemingly odd RAM amounts and channel widths has been addressed before; look around the other G80 threads. But seriously, the computing world stabilized around an 8-bit foundation back in the 1960s-70s, and since then all memory sizes/bus widths/etc. have been built in multiples of 8.

Look at color depths--8-bit, 16-bit, 24, now 32. Look at the RAM address space on the original IBM PC: 640K, which seems weird, but it's just 512+128K. We tend to double things each generation now, but not in every area, and it's certainly not some kind of engineering requirement for everything to be a power of 2. But it IS (usually) a requirement for it to be a multiple of 8.

The only reason total system RAM has tended to be a power of 2 lately is because of dual-channel memory controllers requiring pairs of identical modules in order to work properly. Before that, it was no big deal to add a cheap spare 128 MB stick to your 256 MB system if memory was getting tight, giving you 384 MB total, which looked strange but worked just fine. Notice I'm choosing examples that match some of the numbers on the leaked G80 specs, just to show that numbers like this crop up all the time in the computer world.

Here it is in slogan form: "If it divides by eight, it must be great!" :D
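
If anyone wants to see how those "odd" totals fall out of a multiple-of-8 bus, here's a tiny Python sketch; the 12/10-channel split, the 32-bit channel width, and the 64 MB chip size are all assumptions based on the leaked numbers, not confirmed specs:

CHANNEL_WIDTH_BITS = 32    # assumed: one memory chip per 32-bit channel
CHIP_SIZE_MB = 64          # assumed density of each chip

for channels in (12, 10):  # rumored GTX / GTS channel counts
    bus_bits = channels * CHANNEL_WIDTH_BITS
    total_mb = channels * CHIP_SIZE_MB
    print(f"{channels} channels -> {bus_bits}-bit bus, {total_mb} MB")
# 12 channels -> 384-bit bus, 768 MB
# 10 channels -> 320-bit bus, 640 MB   (both divide by eight, neither is a power of two)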
 
Commander Suzdal said:
...
Here it is in slogan form: "If it divides by eight, it must be great!" :D
Pity about all those 853/852x480 and 1367/1366/1365x768 plasmas & LCDs that cause grief to so many video cards.

Nvidia has just added a 1366x768 mode to ForceWare, so maybe the G80 will have the grunt to bludgeon those TVs into submission. :)

Adrian
 
LunchboX3904 said:
wow....this seriously went from not hearing anything about it to it's coming out in a month!!! I'm way un-prepared (un-funded)...lol. I have a 6800GT and planned on skipping a generation and getting this. I'm so glad to hear that it's gonna be a nice big jump. I think I'll have to throw a new PSU into my upgrade. I'm still on AGP, so it's gonna be a big transition to pay for. AM2 dual core, 2GB DDR2, 8800GTS, PSU, oh my.... Prices should drop a bit by the time I get money for all that. It will be glorious though. :D:D:D


You and me both, brother. No more 5500 for me!
 