Kepler Lineup Released

wicked_chicken

I have no idea how solid this information is, but I haven't seen it posted here yet...

http://www.itproportal.com/2012/02/06/nvidia-kepler-lineup-revealed/

If we begin this whistlestop tour of Nvidia's next generation graphics cards at the top, there's the twin-chip GTX 690. This won't be released until sometime in Q3 this year, but it will pack a whopping 2x1024 stream processors, a shader clock of around 1.5GHz and a core clock of 750MHz. In terms of memory there's a bandwidth of 2x252GB/s with an effective memory clock of 4.5GHz. Overall it will feature 2x1.75GB of GDDR5. Considering AMD's top-end single-chip card already has 3GB, will we see Nvidia packing 3.5GB vs its opposite number packing 6GB?
Bumping down the list we have the GTX 680 and 670, both making use of the same GK110 core as the twin-chip variant. They both feature the same 850MHz core clock and 1.7GHz shader clock; however, the bigger of the two has more stream processors, with 1024 and 896 respectively. While both are set to debut at the same time in early April, the 680 will come with a memory clock of 5.5GHz (effective) and a memory bandwidth of 352GB/s. Comparatively, the 670 is 5GHz and 280GB/s.
A quick bit of pricing info: the GTX 680 is set to come in at $650, which puts it at $100 more than the competitor Radeon card, the AMD HD 7970. It'll be interesting to see how AMD responds to the launch of these GPUs. Perhaps it will lower prices further to remain competitive, or keep the current pricing. Chances are it will depend on real-world GeForce 600 series performance.
The mid-range GK104 core is found within the GTX 660 Ti and GTX 660. For core clocks, they are 900MHz/850MHz respectively, with only a 100MHz shader clock difference: the GTX 660 Ti hits 1.8GHz, while the GTX 660 reaches just 1.7GHz. Moving on to memory, the clock speeds will be 5.8GHz/5.5GHz with memory bandwidths of 186GB/s/154GB/s respectively.
Down at the budget end there are the GTX 650 and GTX 640, both making use of the GK106 core. The core clock speeds are the same as for the 660 Ti/660 cards, though with slightly fewer stream processors. Effective memory clock for both is 5.5GHz, with memory bandwidth at 132GB/s/88GB/s respectively.
Most of these cards will be released in early April/May, with the rest coming towards the end of Q2 and near the start of Q3.
Along with the leak over at EXPreview, there are claims of how these new cards will compare to previous generations and current AMD hardware. The GTX 680 will apparently perform almost 50 per cent better than the HD 7970. This is a bold claim, as AMD's new 7000 series cards already made big performance gains over past generations. There's also a 1GB difference in memory at the top end, suggesting the Radeon cards will have an advantage in some benchmarks.


http://en.expreview.com/2012/02/06/entire-nvidia-kepler-line-up-unearthed/20836.html

[Image: NVIDIA-600-1.jpg - leaked GeForce 600 series spec table from EXPreview]
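
For what it's worth, the leaked memory numbers are internally consistent: GDDR5 peak bandwidth is just the effective memory clock times the bus width, divided by 8. Here's a minimal sketch that backs out the bus widths the rumored figures imply (the widths are inferred from the leak, not confirmed specs); notably, 252GB/s per GPU at 4.5GHz works out to a 448-bit bus, the same as the 670's, which speaks to the bus-width question raised below.

```python
# GDDR5 peak bandwidth (GB/s) = effective clock (GT/s) x bus width (bits) / 8.
# The bus widths below are inferred from the leaked clock/bandwidth pairs,
# not confirmed specs.

def bandwidth_gb_s(effective_clock_ghz, bus_width_bits):
    """Peak memory bandwidth in GB/s."""
    return effective_clock_ghz * bus_width_bits / 8

rumored = {
    # card: (effective memory clock in GHz, inferred bus width in bits)
    "GTX 690 (per GPU)": (4.5, 448),
    "GTX 680": (5.5, 512),
    "GTX 670": (5.0, 448),
    "GTX 660 Ti": (5.8, 256),
    "GTX 660": (5.5, 224),
    "GTX 650": (5.5, 192),
    "GTX 640": (5.5, 128),
}

for card, (clock_ghz, width_bits) in rumored.items():
    # Prints 252, 352, 280, 186, 154, 132 and 88 GB/s, matching the leak
    print(f"{card}: {bandwidth_gb_s(clock_ghz, width_bits):.0f} GB/s")
```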
 
Why does the GTX 690 part have the same stream processors (x2) as the 680, but the bus width of the 670? Kinda doesn't make sense.
 
Why does the GTX 690 part have the same stream processors (x2) as the 680, but the bus width of the 670? Kinda doesn't make sense.

Because it's a dual-GPU, single-card solution. They never have the full specs of the single-GPU flagship. The GTX 295 was like this. They typically do it to reduce heat so they can keep it as a two-slot card.

GTX 280
GT200
CUDA Cores: 240
Graphics Clock: 602MHz
Processor Clock: 1296MHz
Memory Clock: 1107MHz
Memory Amount: 1024MB
Memory Interface Width: 512-bit

GTX 295
GT200 x2
CUDA Cores: 480
Graphics Clock: 576MHz
Processor Clock: 1242MHz
Memory Clock: 999MHz
Memory Amount: 1792MB
Memory Interface Width: 896-bit
 
Ahh, yeah. So basically it was like two GTX 260s, except with more CUDA cores.
 
Everyone just keeps reposting the same information pulled from Lenzfire. There's nothing new yet.
 
Everyone just keeps reposting the same information pulled from Lenzfire. There's nothing new yet.

Nope, once it's on the second site it is as good as confirmed. Doesn't matter that it's all based on the same dumb rumor - it's been independently confirmed by a second site! Such is the logic of video card release rumors.
 
Don't hate me :( This is tongue-in-cheek, I love you all and I still love Nvidia

[Image: pdvk0.jpg]
 
Because it's a dual-GPU, single-card solution. They never have the full specs of the single-GPU flagship. The GTX 295 was like this. They typically do it to reduce heat so they can keep it as a two-slot card.

GTX 280
GT200
CUDA Cores: 240
Graphics Clock: 602MHz
Processor Clock: 1296MHz
Memory Clock: 1107MHz
Memory Amount: 1024MB
Memory Interface Width: 512-bit

GTX 295
GT200 x2
CUDA Cores: 480
Graphics Clock: 576MHz
Processor Clock: 1242MHz
Memory Clock: 999MHz
Memory Amount: 1792MB
Memory Interface Width: 896-bit

Wasn't the GTX 295 essentially two GTX 275s?
 
Wasn't the GTX 295 essentially two GTX 275s?

It was basically two GTX 260s with some units enabled to give it features of two GTX 280s: the full 240 shaders per GPU, but the 260's narrower 448-bit memory bus. The GTX 690 is likely two GTX 670s with some units enabled to give it features of two GTX 680s.
 
http://www.tomshardware.com/news/Nvidia-Kepler-GPU-GeForce-600-Series,14642.html#xtor=RSS-998

This is from Tom's. According to them, Nvidia's top single-GPU solution, the 680, will be 45% faster than the 7970. HAHA, I seriously doubt this. The 7970 can be overclocked like mad and the non-reference designs haven't even come out yet. Not to mention ZeroCore, a huge, huge advantage for AMD.

I'd like to see Nvidia do well with Kepler, but honestly I see these things suffering from the same problems the 480 suffered from: heat and power consumption. Makes you wonder, after AMD's CEO spoke about a focus on leading the graphics, low-power, portable, and server markets. This sort of thinking might lead to a 7975 from AMD if the 45% performance figures from Tom's are anywhere near correct. Even if the 7970 is slower than the 680, the 7990 will probably beat the 690.
 
Not to mention ZeroCore, a huge, huge advantage for AMD.

Is anyone who is splashing out $550 for a video card really caring about a 50W idle power reduction (or whatever it is)? I seriously doubt it. Guys buying Ferraris don't give a crap about the gas mileage while sitting at a stop light.

And this is just further rehashing of the same Lenzfire article from a few days ago.
 
It was basically two GTX 260s with some units enabled to give it features of two GTX 280s: the full 240 shaders per GPU, but the 260's narrower 448-bit memory bus. The GTX 690 is likely two GTX 670s with some units enabled to give it features of two GTX 680s.

As I understood it, the GTX 275 was the same thing. The 275 probably was the 295 split in half, with higher clock speeds.
 
As I understood it, the GTX 275 was the same thing. The 275 probably was the 295 split in half, with higher clock speeds.

The GTX 295 was discontinued before the GTX 275 was out. It may have been half of a GTX 295, but the GTX 295 wasn't two GTX 275s. I'm sure the process was different, as well as many other things between them.
 
Is anyone who is splashing out $550 for a video card really caring about a 50W idle power reduction (or whatever it is)? I seriously doubt it. Guys buying Ferraris don't give a crap about the gas mileage while sitting at a stop light.

And this is just further rehashing of the same Lenzfire article from a few days ago.

Exactly... I know I sure as heck don't.
 
Is anyone who is splashing out $550 for a video card really caring about a 50W idle power reduction (or whatever it is)? I seriously doubt it. Guys buying Ferraris don't give a crap about the gas mileage while sitting at a stop light.

And this is just further rehashing of the same Lenzfire article from a few days ago.

I know some of those guys and they do care about their car sitting at a stoplight and catching fire though.
 
Is anyone who is splashing out $550 for a video card really caring about a 50W idle power reduction (or whatever it is)? I seriously doubt it. Guys buying Ferraris don't give a crap about the gas mileage while sitting at a stop light.

And this is just further rehashing of the same Lenzfire article from a few days ago.

The difference between a $200k car and a $500-1000 video card is far too extreme. ZeroCore turns off the second GPU, and the third and fourth. The wattage adds up, monthly and yearly. For those who live in high-cost electricity areas it is definitely something to consider.


That made me LOL!

Ya, I definitely smiled when I saw that pic. I'm surprised we haven't seen any Kepler cookies yet.
 
Is anyone who is splashing out $550 for a video card really caring about a 50W idle power reduction (or whatever it is)? I seriously doubt it. Guys buying Ferraris don't give a crap about the gas mileage while sitting at a stop light.

And this is just further rehashing of the same Lenzfire article from a few days ago.

In the area I am in right now, the monthly cost of electricity is extremely high.

Heck, I had a single GTX 480 @ 880MHz that replaced my 5970. My bill went from $40 a month to $90 a month at minimum. That was with almost all the electronics turned off; only the freezer and a couple of room lights were still on...

If I leave my PC on 24/7, I don't even dare to look at my bill.....
 
In the area I am in right now, the monthly cost of electricity is extremely high.

Heck, I had a single GTX 480 @ 880MHz that replaced my 5970. My bill went from $40 a month to $90 a month at minimum. That was with almost all the electronics turned off; only the freezer and a couple of room lights were still on...

If I leave my PC on 24/7, I don't even dare to look at my bill.....

I can't believe it was that big a difference unless you are running the card flat out all the time. Idle power use between those two cards is only 10W and load is less than 50W when stock. Where do you live, in one of Gingrich's moon bases?

http://www.anandtech.com/show/2977/...x-470-6-months-late-was-it-worth-the-wait-/19
 
I really don't see any damage on the card, it's hard to see...

Plus, his case is full of dust... look at the backside of the case, HOLY SMOKE....

No evidence there of fire... could be a dust bunny that caught fire and did a quick flash/burn
 
In the area I am in right now, the monthly cost of electricity is extremely high.

Heck, I had a single GTX 480 @ 880MHz that replaced my 5970. My bill went from $40 a month to $90 a month at minimum. That was with almost all the electronics turned off; only the freezer and a couple of room lights were still on...

If I leave my PC on 24/7, I don't even dare to look at my bill.....

This just doesn't add up as the cause of your bill increase. Electricity is billed per kilowatt-hour (kWh): 1000 watts for one hour. Even in expensive areas, it only runs about 15 cents or less per kWh. Say your 480 used 300 watts at full blast (keep in mind that [H]'s power usage charts count the entire system draw, not just the card). Even if you ran the card full tilt 24/7, and let's round the rate up to 20 cents and the wattage up to 333, that's 20 cents every 3 hours, or $1.60 a day.

So IF you ran Furmark 24/7, and IF your 480 drew more power than anyone else's, and IF your electricity rate was among the highest in the country, you might see that kind of bill increase.

But wait: you had a video card before the 480, and it didn't draw 0 watts, so now we have to subtract what it drew from your total. Make it 100 watts at full load. Now your worst-case bill increase is $1.20 per day, or enough to boost you to $75, not $90, per month.

Assuming you didn't run Furmark 24/7, or your rate was less than 20 cents per kWh, etc., I could see the 480 adding more like $5-10 per month.

If the rest of the system was high-end, OCed to the max, etc., and was new along with the 480, that's a different matter. But that's not the 480's fault either.
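
Putting that back-of-the-envelope math in runnable form (the wattages, hours, and 20-cent rate are the assumptions from the post above, not measurements):

```python
# Monthly cost of a continuous electrical load at a flat rate.

def monthly_cost_usd(watts, hours_per_day, rate_usd_per_kwh, days=30):
    """kWh consumed over the billing period, times the rate."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * rate_usd_per_kwh

RATE = 0.20  # assumed worst-case electricity rate, $/kWh

# Whole system with a GTX 480 drawing 333 W flat out, 24/7
print(monthly_cost_usd(333, 24, RATE))        # ~$48/month

# Net increase after subtracting the old card's ~100 W load
print(monthly_cost_usd(333 - 100, 24, RATE))  # ~$34/month, about $1.12/day

# And the ~50 W ZeroCore-style idle saving debated above, idling 24/7
print(monthly_cost_usd(50, 24, RATE))         # ~$7/month
```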
 

233 kWh - $56 USD

That is the recent bill, and for half of the month there was no one there..

10 cents? 20 cents? Where in the world do you live? :p

PS: the electricity rate tends to increase a lot after a certain amount of usage.
 
I am currently in the Davis area...

Try testing an overclocked 480 @ 880MHz against a 5970 at 5870 speeds ;)

I have some serious doubts about this. I used to live in Davis (CA I assume) and went to UCD. When I went from normal PC use to 24x7 bitcoin mining for a month my electric bill didn't go up as much as yours did. This was with my two 6950s with shaders unlocked at 1015MHz at 1.3V. Those definitely used more power than your single OC'd GTX480. My housemate also had dual GTX 470s and was constantly playing games. My house's avg monthly electricity bill was about $120 under normal use and went up by about $20-$30 when I started bitcoin mining IIRC (which was fine because I was making like $200-$250/mo from mining).
 
I have some serious doubts about this. I used to live in Davis (CA I assume) and went to UCD. When I went from normal PC use to 24x7 bitcoin mining for a month my electric bill didn't go up as much as yours did. This was with my two 6950s with shaders unlocked at 1015MHz at 1.3V. Those definitely used more power than your single OC'd GTX480. My housemate also had dual GTX 470s and was constantly playing games. My house's avg monthly electricity bill was about $120 under normal use and went up by about $20-$30 when I started bitcoin mining IIRC (which was fine because I was making like $200-$250/mo from mining).

No clue why you're paying less. I've moved to two different places, and both of them cost this much.

And no, I don't do bitcoin mining...
 
233 kWh - $56 USD

That is the recent bill, and for half of the month there was no one there..

10 cents? 20 cents? Where in the world do you live? :p

PS: the electricity rate tends to increase a lot after a certain amount of usage.

Oh, you're gonna hate me: quick math tells me your rate is about 24 cents per kWh. As of my latest bill, mine is about 7.4 cents per kWh. I live in Missouri. See, I doubled my rate and rounded it up to cover potential "outrageous rates" where you live, but obviously not outrageous enough. :D

On the flip side, it's cold here in the winter and my total bill for the month (electric, gas, water, sewer) was over $370 for a 2800 square foot house. 7.4 cents per kWh is cheap, but when you burn 3330 kWh in a month..... :eek:
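
A quick check of both implied rates, using the figures from the bills quoted above:

```python
# Effective electricity rate = bill / energy used
print(56 / 233)      # Davis-area bill: ~$0.24 per kWh
print(0.074 * 3330)  # Missouri: 3330 kWh at 7.4 cents is ~$246 of that combined bill
```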
 