Leaked GeForce GTX 880 Pics?

HardOCP News

The Chinese-language website GamerSky has posted a handful of pics of what they claim is the upcoming GeForce GTX 880. You'll need a translator to read the post, but the pics are universal.
 
Is that an 8-pin and 2x 6-pin power setup? Is this common on preproduction samples, or is it really going to be that much of a hog?
 
That could just be a test board config; this way they could test with various power inputs (wattage amounts).
 
I was under the impression that early samples like this often have extra power connectors.
 
The article indicates that the same power could be delivered by 2 8-pin connectors, so the three power connectors are probably only due to it being an engineering sample.
 
I'm already building a new system for X99 and Haswell-E this year. And I definitely need a new video card cuz I skipped my regularly scheduled upgrade last year. C'mon nVidia, release this shit already. And I need to know what AMD/ATI has up their sleeves. When all the new 2014 GPUs are out, then I can make my decision of who to go with. Liking the sound of Maxwell so far, though. Power efficiency, runs cooler, and I heard the series is gonna be cheaper on the whole, though I have no idea how true that is. I really hope so. GPU prices are absolutely insane.
 
Is that an 8-pin and 2x 6-pin power setup? Is this common on preproduction samples, or is it really going to be that much of a hog?

Engineering sample. I would assume 8+6 pin will be the final result.
 
I'm already building a new system for X99 and Haswell-E this year. And I definitely need a new video card cuz I skipped my regularly scheduled upgrade last year. C'mon nVidia, release this shit already. And I need to know what AMD/ATI has up their sleeves. When all the new 2014 GPUs are out, then I can make my decision of who to go with. Liking the sound of Maxwell so far, though. Power efficiency, runs cooler, and I heard the series is gonna be cheaper on the whole, though I have no idea how true that is. I really hope so. GPU prices are absolutely insane.

Good for you

I skipped last year as well. Bitcoin miners destroyed the pricing on all video cards: it was impossible to get 290Xs, which drove up their price, and the price on 780 Tis was through the roof because you couldn't buy a 290X, etc., etc.

I want bitcoiners to get off video cards and onto dedicated chips like the ASIC miners I'm seeing all over. That way they can leave the rest of us alone and the market can recover.

It's like how 290Xs are finally coming onto the market because bitcoin miners stopped buying them in anticipation of the next video card generation; in Canada alone, prices have dropped nearly $200 on both 290Xs AND 780 Tis.
 
Anybody know what size those memory chips are? Looking like either 4GB or 8GB total.
 
Hopefully we see this by EOY

I know yield has been their biggest headache for the new die.
 
Anybody know what size those memory chips are? Looking like either 4GB or 8GB total.

Nvidia has a tendency to be on the low end of VRAM for anything that isn't a professional card (e.g. Titan and Quadro), then either they or their board partners release a second model with double the VRAM. They like to release cards that have just enough: GTX480 1.5GB, GTX580 1.5GB, GTX680 2GB, GTX780 3GB, etc.

Since the 780s came stock with 3GB, and 2-3GB seems to be the current sweet spot for 1080p single-screen gaming with only a few outliers pushing 3GB usage, I'm going to assume that's 4GB.
 
4GB is what I'm thinking too. Then again, with some games being VRAM hogs (cough, Watch Dogs, cough), maybe Nvidia decided to be generous this time around.
 
Since March 2005, the Molex pins have been required to be "HCS" rather than "Std"; HCS pins each carry a max of 11 amps. So a properly made 8-pin PCIe connector can supply 12V × 11A × 3 lines = 396 watts to the graphics card.
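The per-pin math above can be sketched like this; it's a rough sanity check, not a spec reference, and the `connector_capacity_watts` helper is just illustrative, using the pin ratings quoted in this post.

```python
# Rough sanity check of the per-pin math above: an 8-pin PCIe
# connector has three live 12 V lines, and HCS-rated Molex pins
# are quoted here at 11 A each.

def connector_capacity_watts(live_lines: int, amps_per_pin: float,
                             volts: float = 12.0) -> float:
    """Max power the connector's pins can physically carry."""
    return volts * amps_per_pin * live_lines

# 12 V x 11 A x 3 lines
print(connector_capacity_watts(3, 11.0))  # 396.0
```

Note that this is the physical limit of the pins; the PCIe spec only rates an 8-pin connector at 150 W, which is why cards can safely pull well above spec through a single connector.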
 
I find it funny that Nvidia would release the 880M before the desktop 800 series
 
Nvidia 800 Series! 880 specs leak! Maxwell! 28nm or 20nm! TSMC delays!


Mooooo. Don't care anymore until it's released.
 
I find it funny that Nvidia would release the 880M before the desktop 800 series
[IMG]http://gpuz.techpowerup.com/14/07/03/hk5.png[/IMG]

It's a rebrand with higher clocks ;)

GTA 5 will run maxed out on a 580, much less an 880.

Unless there have been benchmarks, there's no reason to believe that. GTA 4 was a horribly unoptimized POS, and I personally don't have faith in Rockstar to improve it this time around.
 
I want bitcoiners to get off video cards and onto dedicated chips like the ASIC miners I'm seeing all over. That way they can leave the rest of us alone and the market can recover.

It's like how 290Xs are finally coming onto the market because bitcoin miners stopped buying them in anticipation of the next video card generation; in Canada alone, prices have dropped nearly $200 on both 290Xs AND 780 Tis.

Being in Canada, I know exactly how you feel. It was a ridiculous market out there. But don't tell that to all the miners here on [H] :p

I just hope everyone can put it behind them; so many people got greedy snatching up all the R9 series cards just to resell them NIB. I guess that's the free market for ya, though.
 
GTA 4 was a horribly unoptimized POS, and I personally don't have faith in Rockstar to improve it this time around.

Guess you didn't play Max Payne 3. It ran great, and on a much newer revision of the RAGE engine than was in GTA4 - which by extension will only have improved in GTA5.
 
I want bitcoiners to get off video cards and onto dedicated chips like the ASIC miners I'm seeing all over. That way they can leave the rest of us alone and the market can recover.

If you are bitcoin mining on GPUs still, I've got a bridge to sell ya. :D

Besides, nVidia cards aren't that good for mining (though great for gaming).
Those mining altcoins will be using ATI's offerings at least until some good ASICs come out.

That said...."GTX 880"....man...one digit shy of nostalgia
I still have my 8800 somewhere.
 
Good for you

I skipped last year as well. Bitcoin miners destroyed the pricing on all video cards: it was impossible to get 290Xs, which drove up their price, and the price on 780 Tis was through the roof because you couldn't buy a 290X, etc., etc.

I want bitcoiners to get off video cards and onto dedicated chips like the ASIC miners I'm seeing all over. That way they can leave the rest of us alone and the market can recover.

It's like how 290Xs are finally coming onto the market because bitcoin miners stopped buying them in anticipation of the next video card generation; in Canada alone, prices have dropped nearly $200 on both 290Xs AND 780 Tis.

You really just have no clue what the hell you're talking about. Nobody mines bitcoin with GPUs. Nvidia cards are shit for mining and it had absolutely nothing to do with their prices. Lastly, miners aren't waiting for anything. The altcoin market collapsed, nobody gives a shit what cards are coming out. Even if they did, miners have every right to buy GPUs for whatever the hell they want. Get over yourself.
 
I find it funny that Nvidia would release the 880M before the desktop 800 series



As you can see, it's mostly a re-branded 770; still powerful, but meh. Nothing about it, including the 8GB of RAM that was literally thrown in there for no reason by the manufacturer, will have any bearing on Maxwell.

Nvidia seems to like releasing mobile parts before desktop now, despite them being a generation old. Many believe that strategy is actually what led to the cancellation of the 300 series, as the naming game they were playing confused the market.
 
Is that an 8-pin and 2x 6-pin power setup? Is this common on preproduction samples, or is it really going to be that much of a hog?
I don't get why it's an 8-pin + 6-pin + 6-pin. An 8-pin provides 150 watts and a 6-pin provides 75 watts, so that's 150+75+75 = 300 watts; they could just use two 8-pins (150+150 = 300). And don't forget the slot provides another 75 watts, if I'm correct, so the card can draw 375 watts if it uses it all.
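The spec-budget arithmetic above can be sketched like this; the `board_power_budget` helper is just illustrative, using the standard PCIe figures quoted in this thread (75 W slot, 75 W per 6-pin, 150 W per 8-pin).

```python
# Spec-rated power available to a graphics card, per the PCIe
# figures quoted above: slot = 75 W, 6-pin = 75 W, 8-pin = 150 W.
SPEC_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_budget(connectors: list[str], include_slot: bool = True) -> int:
    """Total spec-rated wattage: slot (optional) plus aux connectors."""
    total = SPEC_WATTS["slot"] if include_slot else 0
    return total + sum(SPEC_WATTS[c] for c in connectors)

print(board_power_budget(["8-pin", "6-pin", "6-pin"]))  # 375
print(board_power_budget(["8-pin", "8-pin"]))           # 375
```

Either layout reaches the same 375 W ceiling, which is why the 8+6+6 arrangement here reads more like an engineering-sample convenience than a hint about final power draw.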
 
I don't get why it's an 8-pin + 6-pin + 6-pin. An 8-pin provides 150 watts and a 6-pin provides 75 watts, so that's 150+75+75 = 300 watts; they could just use two 8-pins (150+150 = 300). And don't forget the slot provides another 75 watts, if I'm correct, so the card can draw 375 watts if it uses it all.
It's probably just for development purposes.

They might have it wired to not draw any power from the slot (to make sure unproven hardware doesn't blow out their test bench). That would explain the extra 6-pin power connector.
 
If you are bitcoin mining on GPUs still, I've got a bridge to sell ya. :D

Besides, nVidia cards aren't that good for mining (though great for gaming).
Those mining altcoins will be using ATI's offerings at least until some good ASICs come out.

That said...."GTX 880"....man...one digit shy of nostalgia
I still have my 8800 somewhere.

Oh man. I only retired my 8800 GTX last year, thinking "surely it'd get smashed by modern cards now", only to buy as cheap a 670 as I could and find out that it's the same or slower in some applications. Talk about a letdown. At the time I put it in a second backup system to play with Win8, and one day, out of the blue, it shat itself. Really disappointed!
 
Oh man. I only retired my 8800 GTX last year, thinking "surely it'd get smashed by modern cards now", only to buy as cheap a 670 as I could and find out that it's the same or slower in some applications. Talk about a letdown. At the time I put it in a second backup system to play with Win8, and one day, out of the blue, it shat itself. Really disappointed!
Uh...what? A GTX 670 should have kicked an 8800GTX's teeth in :confused:
 
Uh...what? A GTX 670 should have kicked an 8800GTX's teeth in :confused:

Pretty sure it's a PEBKAC issue, because yeah, a 670 is going to blow the doors off a circa-2007 8800 GTX. I'm going to take a wild guess that the motherboard hasn't been upgraded since 2006 either.
 
Oh man. I only retired my 8800 GTX last year, thinking "surely it'd get smashed by modern cards now", only to buy as cheap a 670 as I could and find out that it's the same or slower in some applications. Talk about a letdown. At the time I put it in a second backup system to play with Win8, and one day, out of the blue, it shat itself. Really disappointed!



You need to upgrade from a Pentium 4 and have a resolution higher than 1280x720 to see a major difference.
 