NVIDIA GK104 Coming Soon?

Status
Not open for further replies.
More Chinese rumors for the fire.

[Image: 2012011011030732.png (rumored GK104 spec chart)]

I love the background of that image. The sausages tell me that this chart is legit. :D
 
That GK104 looks interesting, but 1.5GB of VRAM is disappointing. Even if there are 3GB versions, they'll be higher than $399.
 
That GK104 looks interesting, but 1.5GB of VRAM is disappointing. Even if there are 3GB versions, they'll be higher than $399.

Yeah, but it's the mid-range (560 Ti) kind of part, so the real high-end could easily have 3GB.
 
I don't understand; NVIDIA didn't present any new cards at CES.
Does this mean we won't see any GK104 soon, or am I wrong?
 
When high-end is $550 it is.


No, it is not. :p

There's no official spec for midrange, but it used to mean $100~200 cards. Not sure who upped it first, but I recall Nvidia coming out with the GTX 460 at two price points, $199 and $229. I don't remember if ATI did anything similar.
 
No, it is not. :p

There's no official spec for midrange, but it used to mean $100~200 cards. Not sure who upped it first, but I recall Nvidia coming out with the GTX 460 at two price points, $199 and $229. I don't remember if ATI did anything similar.

And a cup of coffee used to be 25 cents. If NVIDIA has an entire line of cards above the GK104, then by definition the GK104 isn't high-end. Just like the 7850/7870 will be mid-range cards, even if they outperform 6970s and cost $300. It's product positioning, not absolute price.

I'll allow that you can call them mid-high-end, if you want to call something like the 6770 mid-range, but that's hardly the [H] target market.
 
When high-end is $550 it is.

I would consider that an enthusiast card. I don't see how you can call a $400 video card a mid-range card. That's $80 more than a GTX 570. That's $100 more than I paid for my old GTX 280 NIB when it was still the top single GPU on the market. $400 is more than I paid for the two 2GB GTX 560 cards that I'm using now, and those will probably match GK104 if those specs are accurate. Sounds like I won't be upgrading for a while.

Mainstream = midrange in my opinion. I don't see how you can call a $400 video card mainstream.
 
They invented a new segment a few years ago - performance. Midrange is still $150-$200. Performance is $200-$300. High is $300+.
 
And a cup of coffee used to be 25 cents.


Yeah, but how long ago was that the average?

What I'm talking about happened in the middle of 2010.



If NVIDIA has an entire line of cards above the GK104, then by definition the GK104 isn't high-end. Just like the 7850/7870 will be mid-range cards, even if they outperform 6970s and cost $300. It's product positioning, not absolute price.


Well I'm not even sure if any of this stuff is true. But if so, why couldn't there be two lines both considered high-end? Just like with mid-range there is no official definition of high-end. And yeah, there will be multiple ways to consider it and even more important than positioning and price is sales volume. I know that Nvidia and AMD would not consider $400 cards to be high volume (in comparison to the rest of the products). The popular cards in this category are probably in the $125~150 range. Nvidia pushed the boundaries taking it up to $229.



I'll allow that you can call them mid-high-end, if you want to call something like the 6770 mid-range, but that's hardly the [H] target market.


and I'll allow you to keep posting here but I'm keeping an eye on you. :)
 
Yeah, but how long ago was that the average?

What I'm talking about happened in the middle of 2010.

Well I'm not even sure if any of this stuff is true. But if so, why couldn't there be two lines both considered high-end? Just like with mid-range there is no official definition of high-end. And yeah, there will be multiple ways to consider it and even more important than positioning and price is sales volume. I know that Nvidia and AMD would not consider $400 cards to be high volume (in comparison to the rest of the products). The popular cards in this category are probably in the $125~150 range. Nvidia pushed the boundaries taking it up to $229.

and I'll allow you to keep posting here but I'm keeping an eye on you. :)

All good points. As several people have pointed out, it depends on your definition of mid-range - and my definition of mid-range may be skewed from spending too much time here. I'll recalibrate it to mid-range performance, as opposed to "true" mid-range.
 
BTW, I just measured the percent improvement for the GTX 780 vs GTX 580 in the new chart and compared it with this old chart:

0908157p020sn1srmmmkln.png


Here's a summary of my calculations:

Code:
Chart                Model                Percent Difference
New                  GTX 580              100.00%
New                  GTX 780              230.16%
Old                  GTX 580              100.00%
Old                  GTX 780              231.06%

Regardless of whether or not the charts are indeed authentic, I think that it's safe to say that they were made by the same entity.
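The comparison above boils down to normalizing each chart's bars to the GTX 580 and checking whether the ratios agree. A minimal sketch of that arithmetic (the bar lengths below are illustrative values I picked so the ratios reproduce the percentages in the table; they are not measurements from the actual images):

```python
# Hypothetical bar lengths measured off each chart, in pixels.
# Assumed values, chosen only so the ratios match the table above.
new_chart = {"GTX 580": 252.0, "GTX 780": 580.0}
old_chart = {"GTX 580": 264.0, "GTX 780": 610.0}

def relative_perf(chart, baseline="GTX 580"):
    """Normalize every bar to the baseline model (baseline = 100%)."""
    base = chart[baseline]
    return {model: round(100.0 * length / base, 2)
            for model, length in chart.items()}

print(relative_perf(new_chart))  # GTX 780 lands near 230%
print(relative_perf(old_chart))  # GTX 780 lands near 231%
```

If two independent charts normalize to nearly identical ratios like this, it does suggest a common origin, fake or not.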

Ohh man. Metro Last Night should be such a great game. Metro meets The Hangover! AMIRIGHT?!?!

(Both charts just scream fake, bee tee dubs.) Though I wouldn't mind that kind of performance...
 
Honestly, I don't know where people got that graph, but history teaches us.
Have you ever seen a single-chip card with a 230% performance improvement over the previous top tier?

No, I have never seen it.
 
Honestly, I don't know where people got that graph, but history teaches us.
Have you ever seen a single-chip card with a 230% performance improvement over the previous top tier?

No, I have never seen it.

8800 GTX almost did it :p
 
Honestly, I don't know where people got that graph, but history teaches us.
Have you ever seen a single-chip card with a 230% performance improvement over the previous top tier?

No, I have never seen it.

How much was the 9700 Pro? That was about the biggest jump I can remember.
 
The 8800 GTX never went over a 180% improvement compared to the 7900 GTX, and not often at that.

That kind of improvement over a one generation span is pretty fucking impressive. If results happened like that every time a new GPU was released, we'd already be playing games on a holodeck, not some shitty and antiquated CRT/LCD monitors. :p
 
That kind of improvement over a one generation span is pretty fucking impressive. If results happened like that every time a new GPU was released, we'd already be playing games on a holodeck, not some shitty and antiquated CRT/LCD monitors. :p

I think I need to sell my GTX580 SLI soon :D
 
I hear you there. I might just be ditching my GTX 570 for this newer model. Would be the first time in quite a while that I haven't skipped a generation when upgrading my GPU.
Um, you have an A64 6400 X2 with a GTX 570? That CPU is already a MASSIVE limitation for your current GTX 570.
 
So higher core clock but lower shader clock; I wonder how that will change things :p



That chart doesn't say that. Supposedly Kepler no longer runs individual clocks on Core/Shader. That would explain why the Shader clock is N/A on the slide.
 
Not with a 256-bit memory bus it's not... that would have to be OC'd cherry-red hot.

AMD has gotten away with 256-bit memory since the 9700 Pro.

I don't think every model will be using 256-bit anyway; the high-end model is sure to use 384-bit, or maybe even 512-bit.
 
AMD has gotten away with 256-bit memory since the 9700 Pro.

I don't think every model will be using 256-bit anyway; the high-end model is sure to use 384-bit, or maybe even 512-bit.

Is he not talking about their high end? I don't think it says either way.

PS: All the AMD high-end 256-bit cards were slower than NV's 4xx/5xx 384-bit and wider cards.
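Whether a 256-bit bus is "enough" is just arithmetic: bandwidth is the bus width in bytes times the effective memory clock. A rough sketch (the GTX 580's 384-bit bus and 4008 MHz effective memory clock are its published specs; the 6000 MHz GDDR5 figure is an assumption for illustration):

```python
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    # Bytes per transfer (bus width / 8) times transfers per second.
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# GTX 580: 384-bit bus at 4008 MHz effective -> ~192 GB/s
print(bandwidth_gb_s(384, 4008))
# A 256-bit bus needs ~6 GHz effective GDDR5 just to match that,
# which is the kind of "cherry red hot" clock mentioned above.
print(bandwidth_gb_s(256, 6000))
```

That's why the bus-width argument matters: on 256-bit, a GK104 could only out-bandwidth the previous 384-bit parts by running the memory much faster.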
 