[TPU] Did NVIDIA Originally Intend to Call GTX 680 as GTX 670 Ti?

WorldExclusive


Although it doesn't matter anymore, there are several bits of evidence supporting the theory that NVIDIA originally intended for its GK104-based performance graphics card to be named "GeForce GTX 670 Ti" before deciding, towards the end, to go with "GeForce GTX 680". Since the start of 2012, our industry sources had been referring to the part as "GTX 670 Ti". The very first picture of the GeForce GTX 680 disclosed to the public, early this month, revealed a slightly older qualification sample that had one thing different from the card we have with us today: the model name "GTX 670 Ti" was etched onto the cooler shroud. Our industry sources also disclosed pictures of early samples having 6+8 pin power connectors.

[Image: dPa5J.png - screenshot showing the drivers identifying the card as "GTX 670 Ti"]

http://www.techpowerup.com/162901/Did-NVIDIA-Originally-Intend-to-Call-GTX-680-as-GTX-670-Ti-.html

Just because something is priced as an enthusiast card, labeled as an enthusiast card, and performs better than the competition's enthusiast card doesn't mean it is one.
Kepler is a complete 180 from Fermi. It allowed them to sandbag the GK110 and release it just before the AMD 8000 series.
nV is still up to their tricks, but if the product is great and cheaper, there shouldn't really be any complaints. The 256-bit bus, 2GB of memory, small die size, and 6+6 pin power were dead giveaways.
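
On the power connectors specifically, the budget math alone tells the story. A quick sketch (my own back-of-the-envelope numbers from the PCI-SIG spec limits, not anything from the TPU article):

Code:
# Back-of-the-envelope PCIe power budget, using the PCI-SIG spec limits.
# This is why 6+6 pin power reads as a ~200W-class card, not a 300W flagship.
PCIE_SLOT_W = 75   # max draw through the x16 slot itself
SIX_PIN_W   = 75   # max per 6-pin PEG connector
EIGHT_PIN_W = 150  # max per 8-pin PEG connector

def power_ceiling(six=0, eight=0):
    """Maximum spec-compliant board power for a given connector loadout."""
    return PCIE_SLOT_W + six * SIX_PIN_W + eight * EIGHT_PIN_W

print(power_ceiling(six=2))          # 225 -- 6+6, the GTX 680 as shipped
print(power_ceiling(six=1, eight=1)) # 300 -- the 6+8 early samples from the article

6+6 caps the board well below where a big-die flagship would need to sit, which is why the early 6+8 samples were telling.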
 
Welcome to the world of business. This is why competition is healthy and useful to the consumer.

If it wasn't for AMD, where would we be with Intel right now? (Take it for what it's worth, which is just hearsay, but I once read online that if it wasn't for AMD, it may have taken Intel around another 4-5 years to even break the gigahertz barrier.)

What purpose does NVidia have to innovate if AMD can't catch up? This isn't NVidia's fault. They're in the business to make money, and if they already have a better solution in existence, it's their prerogative to keep it to themselves until AMD can catch up or surpass them.
 
Nv was trying to make AMD think they only had a mid-range card coming out first, so that AMD would sit back and relax without setting the 7970's default clock too aggressively.
 
nvidia wasn't kidding when they said that they weren't impressed with the 7970's performance.
 
Nv was trying to make AMD think they only had a mid-range card coming out first, so that AMD would sit back and relax without setting the 7970's default clock too aggressively.

A stock 1.1GHz 7970 from AMD would have forced nV to release a hot and power-hungry GK110. They wouldn't have released the GK104 in that position.
After nV finishes respinning that GK110, oh boy, watch out.

The original clock on the GK104 was 702MHz, if some of you remember. Raise the clocks and bam, 1GHz GTX 680.
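
Quick sanity check on that, assuming (naively) that performance scales linearly with core clock when nothing else about the chip changes:

Code:
# Naive first-order estimate: performance ~ core clock, all else equal.
# (Real scaling is a bit below linear, but close for a clock-only change.)
early_sample_mhz = 702    # rumored early GK104 qualification clock
shipping_mhz     = 1006   # GTX 680 base clock at launch

uplift = shipping_mhz / early_sample_mhz - 1
print(f"clock bump alone: +{uplift:.0%}")  # ~+43%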
 
A stock 1.1GHz 7970 from AMD would have forced nV to release a hot and power-hungry GK110. They wouldn't have released the GK104 in that position.
After nV finishes respinning that GK110, oh boy, watch out.

The original clock on the GK104 was 702MHz, if some of you remember. Raise the clocks and bam, 1GHz GTX 680.

Or they would have just released GK104 as the GTX 660 or 670 at a lower price point (which is what it seems like it was originally intended for anyway) and just given AMD the top-end crown until they could get GK110 out the door. A GK104-based GTX 660 Ti at $399 would still be a compelling value, even against a higher-clocked 7970.
 
Other than speculation, there is nothing to prove that this part was going to be anything other than what Nvidia released it as. It doesn't matter what they have done in the past or what the rumors say; in the end, the only thing that matters is what Nvidia decides to call it and how they plan to market it.
 
Other than speculation, there is nothing to prove that this part was going to be anything other than what Nvidia released it as. It doesn't matter what they have done in the past or what the rumors say; in the end, the only thing that matters is what Nvidia decides to call it and how they plan to market it.

How about the fact that the screenshot he posted has the drivers identifying the card as a GTX 670 Ti and not a GTX 680??? Is that not proof enough?
 
Just because something is priced as an enthusiast card, labeled as an enthusiast card, and performs better than the competition's enthusiast card doesn't mean it is one.

What is it then? And who are you to judge it as such?

I can guarantee you that AMD is currently working on the HD 8900 series - and probably has been for a while. Does that mean that the 7900 series shouldn't be "enthusiast" cards because AMD will be coming out with something better not too long from now? All you are basing this off of is a press event in which Nvidia said, yes, we have two GPUs we are working on, one of which will be released later.

IMO - if it looks like a duck and quacks like a duck..
 
What is it then? And who are you to judge it as such?

I can guarantee you that AMD is currently working on the HD 8900 series - and probably has been for a while. Does that mean that the 7900 series shouldn't be "enthusiast" cards because AMD will be coming out with something better not too long from now? All you are basing this off of is a press event in which Nvidia said, yes, we have two GPUs we are working on, one of which will be released later.

IMO - if it looks like a duck and quacks like a duck..

You just compared two different AMD series, the 7900 and the 8900. We're talking about the 670 Ti vs. the 680, which are in the same series.

Try again
 
How about the fact that the screenshot he posted has the drivers identifying the card as a GTX 670 Ti and not a GTX 680??? Is that not proof enough?

Could just as easily be a typo, so no, this isn't proof unless Nvidia comes out and specifically says it.

Also, why would they scrap something that cost them MORE R&D just to release another product for more money? They may have a product that they want to release as the 680 instead, but if it's unmanufacturable right now, then that product doesn't exist.

Like another poster already said, "smells like a duck, quacks like a duck and... tastes like a duck..."

Trust me, if nvidia had a product that was more high-end and ready to ship, they would ship it.
 
Could just as easily be a typo, so no, this isn't proof unless Nvidia comes out and specifically says it.

Also, why would they scrap something that cost them MORE R&D just to release another product for more money? They may have a product that they want to release as the 680 instead, but if it's unmanufacturable right now, then that product doesn't exist.

Like another poster already said, "smells like a duck, quacks like a duck and... tastes like a duck..."

Trust me, if nvidia had a product that was more high-end and ready to ship, they would ship it.

I doubt it. AMD will release something faster, and then they will drop the true 680.
 
I doubt it. AMD will release something faster, and then they will drop the true 680.

That's a horrible business move, holding back a product you spent a good portion of your bank account on just because you want to keep one-upping your competition. If the "true" 680 was ready to release, Nvidia would have done it. It's just a hardware geek's wet dream to envision these super amazing products just waiting in the wings. If they are holding out on a product, then as a consumer I would be pissed at the green team.
 
nvidia wasn't kidding when they said that they weren't impressed with the 7970's performance.

I can imagine them cheering when the 7970 launched.

"It's slower than our chip"
"Yeah we expected it to be slower than the 110, so what?"
"Actually it's slower than the 104"
":eek:"
 
Trust me, if nvidia had a product that was more high-end and ready to ship, they would ship it.

The GK100/110 wasn't ready to ship. Too hot and too power-hungry. It needs more respins.
GK104, on the other hand, was ready to go.

Nvidia will not admit anything until the expected GK110 part is released, if then.
 
I can imagine them cheering when the 7970 launched.

"It's slower than our chip"
"Yeah we expected it to be slower than the 110, so what?"
"Actually it's slower than the 104"
":eek:"

More along the lines of: "If we build in auto-clocking, it will compete and win some games."

"what if they sell stock OC'ed cards?"

"SOB's!"

And from the looks of it, SLI scaling is terrible vs. CF... 2.5 months late to the party as well... tsk tsk.
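
For what it's worth, the auto-clocking is basically a power-limited control loop. A toy sketch of the general idea only (not NVIDIA's actual GPU Boost algorithm), built from the card's 195W board power rating and the ~13MHz boost bins mentioned in early reviews; the 1110MHz cap is a made-up number for the sketch:

Code:
# Toy sketch of power-limited auto-clocking. NOT NVIDIA's real GPU Boost,
# just the general shape: nudge the clock up one bin while measured board
# power is under the target, and step back down when it is over.
BASE_MHZ = 1006  # GTX 680 base clock
MAX_MHZ  = 1110  # hypothetical top boost bin, for this sketch only
BIN_MHZ  = 13    # boost step size reported in early reviews
TARGET_W = 195   # rated board power

def next_clock(clock_mhz, measured_w):
    if measured_w < TARGET_W and clock_mhz + BIN_MHZ <= MAX_MHZ:
        return clock_mhz + BIN_MHZ  # headroom available: boost one bin
    if measured_w > TARGET_W and clock_mhz - BIN_MHZ >= BASE_MHZ:
        return clock_mhz - BIN_MHZ  # over budget: back off one bin
    return clock_mhz                # at a limit or right on target

print(next_clock(1006, 150))  # 1019 -- light load, boosts one bin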
 
That's a horrible business move, holding back a product you spent a good portion of your bank account on just because you want to keep one-upping your competition. If the "true" 680 was ready to release, Nvidia would have done it. It's just a hardware geek's wet dream to envision these super amazing products just waiting in the wings. If they are holding out on a product, then as a consumer I would be pissed at the green team.

Actually, it makes perfect business sense. While GK110 is likely not ready today, they could have waited another couple of months and done a Fermi-style launch if they had to. Now, however, the relative strength of GK104 allows them to buy some time working out yield issues before they have to launch.

The other factor is that they don't want to show their whole hand and decimate AMD in one round when they can still eke out the win with their midrange part. If they launched the GK110, their lead would be greater, but if AMD came back with a viable competitor, Nvidia would have nothing in the wings to counter with. This is the same reason, for instance, that they launched the GTX 460 with some disabled shaders despite yield issues largely being resolved at that point; it buys them a reserve portion of performance to pull out as needed.

This happens elsewhere too. If AMD was actually threatening Intel's CPUs, do you think Intel would have delayed Ivy Bridge from late last year to June of this year to make their yields nicer? No fucking way. If they don't need the performance at the moment, it makes more sense for them to milk the R&D for all it's worth and have two 'one-ups' rather than one. This also lowers their manufacturing costs on GK110, improving their return on investment in that area as well.

I'm sure there were plenty of Nvidia executives bemoaning the money they left on the table when they priced the 8800GT so low when it trounced AMD; they are working to avoid a repeat of that scenario.

I can imagine them cheering when the 7970 launched.

"It's slower than our chip"
"Yeah we expected it to be slower than the 110, so what?"
"Actually it's slower than the 104"
":eek:"

Kepler engineers have probably been feeling like Super Bowl champs for the last several months. :p
 
It's good business sense to trickle out technology. It cuts R&D and manufacturing costs.

So if the 104 chip was fast enough to beat AMD's top chip, then it's a logical business decision to release that as the flagship and leave the 110, etc. in reserve.

Whether or not it's good for the consumer, I don't know; it's not so clear-cut.
 
More like the 660; then it started to turn out pretty good, so they called it a 670. Then they realised the 7970s were about the same and followed suit (but still undercut them).

I would like to say I'm sitting this out in protest, but I did that last time and I want a new GPU...
 
I don't care what they call it. It doubles the performance of my unlocked 6950, uses less power, makes less heat, is shorter, and costs $300 USD after I sell the old card. I consider that a steal.
 
I don't care what they call it. It doubles the performance of my unlocked 6950, uses less power, makes less heat, is shorter, and costs $300 USD after I sell the old card. I consider that a steal.

This.

It's not important what Nvidia intended this card to be. What matters is what this card offers against the competition, and this card does an absolutely fantastic job at that.

I'm very interested to see GTX 660 and 670 versions of this; I'm already excited at the idea of having a GTX 660 as the successor to my 560 (which still works perfectly, though).
 
IMO, taking a mid-range part and slapping the top-tier name on it reeks of "crap".

I don't know about you guys, but I value consistency - and the only thing I can see nvidia doing that will make any sort of sense is that if they roll out the high-range chip, the only thing they could call it would be the 685, as I'm assuming the 690 is reserved for their eventual dual-GPU solution.

I tip my hat to them if their mid-range chip (now with a top-tier name) can (barely) stick it to the 7970. However, I think it's generally confusing to consumers that the "x80" name, which used to symbolize cream of the crop, is now going to play second fiddle to whatever chip comes down the pipe in 6 months or a year.

Personally - I think it would have put nvidia in a much brighter light had they named the card the 670 Ti. Imagine the reviews showing that the mid-range card from nvidia beats the top-range card from AMD?
 
IMO, taking a mid-range part and slapping the top-tier name on it reeks of "crap".

I don't know about you guys, but I value consistency - and the only thing I can see nvidia doing that will make any sort of sense is that if they roll out the high-range chip, the only thing they could call it would be the 685, as I'm assuming the 690 is reserved for their eventual dual-GPU solution.

I tip my hat to them if their mid-range chip (now with a top-tier name) can (barely) stick it to the 7970. However, I think it's generally confusing to consumers that the "x80" name, which used to symbolize cream of the crop, is now going to play second fiddle to whatever chip comes down the pipe in 6 months or a year.

Personally - I think it would have put nvidia in a much brighter light had they named the card the 670 Ti. Imagine the reviews showing that the mid-range card from nvidia beats the top-range card from AMD?

They can always call the top-tier one the 780.
 
They can always call the top-tier one the 780.

From what I've gathered from the rumors, that seems to be the plan.
Remember the early rumor about the mid-range being the 600s and the top end being the 700s?

The dual GK104 is coming; the 195W rating will make it easy for nV to produce, with no crippling downclock this time.
 
IMO, taking a mid-range part and slapping the top-tier name on it reeks of "crap".

I don't know about you guys, but I value consistency - and the only thing I can see nvidia doing that will make any sort of sense is that if they roll out the high-range chip, the only thing they could call it would be the 685, as I'm assuming the 690 is reserved for their eventual dual-GPU solution.

I tip my hat to them if their mid-range chip (now with a top-tier name) can (barely) stick it to the 7970. However, I think it's generally confusing to consumers that the "x80" name, which used to symbolize cream of the crop, is now going to play second fiddle to whatever chip comes down the pipe in 6 months or a year.

Personally - I think it would have put nvidia in a much brighter light had they named the card the 670 Ti. Imagine the reviews showing that the mid-range card from nvidia beats the top-range card from AMD?

Again, this is what happened with G92 and the 8800 GT: Nvidia's midrange card decimated AMD. It was very good for market share, but they also only got midrange pricing for it. They are using this tactic to forestall the race to the bottom on pricing. Good for them, not so great for us.

Let's hope AMD pulls a 1.2GHz 7980 out of their hat to threaten the pricing structure soon and force Nvidia's hand. If they can't, I worry GK110 will see a return to Nvidia's favorite $650-$700 price point that we got with the 8800 GTX and GTX 280.
 
Let's hope AMD pulls a 1.2GHz 7980 out of their hat to threaten the pricing structure soon and force Nvidia's hand. If they can't, I worry GK110 will see a return to Nvidia's favorite $650-$700 price point that we got with the 8800 GTX and GTX 280.

I hope AMD does respond with a higher-clocked 7970; then we will see the GTX 690/GTX 780 much sooner.
 
If this was meant to be the 670 Ti, then isn't it reasonable to assume that the card *meant* to be the 680, the GK110, is/was about 15-20% faster? That's usually how they design that stuff.
 
If this was meant to be the 670 Ti, then isn't it reasonable to assume that the card *meant* to be the 680, the GK110, is/was about 15-20% faster? That's usually how they design that stuff.

Possibly; however, I personally expect it to be a much different beast. NVidia has invested heavily in GPGPU and will not cede that market. This means the static scheduler and pared-down FP64 we see on GK104 will be gone in favor of much more flexible execution units.

I'm excited. :)
 
I'm not too up on my nvidia history - the last "green" hardware I had was a GTX 280.

I suppose in my mind I always attributed each "100" iteration to a generation, such that having a 680 and a 780 makes it sound like they are creating two generations from one (which is confusing).

With such massive headroom on the 7000 series chips - I'm excited to see AMD come out with something newer here after a while to keep the 'fight' going.

If nvidia comes out with their actual top level chip in 6 months, it will be interesting for sure. Not enough to make me want to jump ship, but I enjoy a good consumer goods war.

While $499 is a great price for supposed top-end hardware, the luster wears off when you realize the price range for the GK110 is sitting at $600-700. For a SINGLE GPU solution.

From where I sit right now, it looks like nvidia is trying to make $499 the new midrange price point. Yes, yes, I'm sure there will be a 660, 650, whatever. Wrapping everything up, I get the notion that the 680 was a stopgap solution to take a bit of the heat off, or just another tier of price/performance they wanted to add to an already saturated market structure.
 
I suppose in my mind I always attributed each "100" iteration to a generation, such that having a 680 and a 780 makes it sound like they are creating two generations from one (which is confusing).

My guess is GK110 is gonna be called GTX 685 or something.
 
..... (Shakes head side to side with frown)

Uhmm

1. The GTX 680 has the same memory bandwidth as a GTX 580 due to its higher memory clock (quick math at the end of this post)
2. In your own example at 2560x1600, it beats the 580 in minimum FPS (which matters more than at a low res like 1080p)
3. Your Skyrim example was at 1280x1024, which is CPU-bound and which no sane person would use with a GTX 680 SLI setup

Unless you're trolling, in which case well done, sir.
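
On point 1, the math is just bus width times effective data rate:

Code:
# Memory bandwidth = (bus width in bytes) x (effective GDDR5 data rate).
def bandwidth_gbs(bus_bits, effective_mtps):
    return bus_bits / 8 * effective_mtps / 1000  # GB/s

print(bandwidth_gbs(384, 4008))  # GTX 580: ~192.4 GB/s
print(bandwidth_gbs(256, 6008))  # GTX 680: ~192.3 GB/s -- a wash

The narrower 256-bit bus is fully offset by the faster GDDR5, so the two cards land within a fraction of a GB/s of each other.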
 
Just because something is priced as an enthusiast card, labeled as an enthusiast card, and performs better than the competition's enthusiast card doesn't mean it is one.

The 256-bit bus, 2GB of memory, small die size, and 6+6 pin power were dead giveaways.

So price, performance, and naming don't make it an enthusiast product.

But being a big, inefficient power hog does? :rolleyes:

I think an enthusiast product is better defined by its positive qualities than by its negative ones. So much the better if it is more efficient and uses less power.
 
Uhmm

1. The GTX 680 has the same memory bandwidth as a GTX 580 due to its higher memory clock
2. In your own example at 2560x1600, it beats the 580 in minimum FPS (which matters more than at a low res like 1080p)
3. Your Skyrim example was at 1280x1024, which is CPU-bound and which no sane person would use with a GTX 680 SLI setup

Unless you're trolling, in which case well done, sir.

Shhh, don't confuse him with facts.
 
This settles it in my mind.

The card really was intended to be a mid-range card, but then NVIDIA got greedy and changed the name just to get more money.
 
This settles it in my mind.

The card really was intended to be a mid-range card, but then NVIDIA got greedy and changed the name just to get more money.

As long as it's #1 for at least 6 months, I don't care what it's called or what it's based off of. If they turn around in 3 months with a new card, I suspect there will be some uproar.
 
As long as it's #1 for at least 6 months, I don't care what it's called or what it's based off of. If they turn around in 3 months with a new card, I suspect there will be some uproar.

The GTX 680 is a mid-range PCB design and will not be hard for NVIDIA to improve upon. GTX 680 owners should enjoy it for what it is.
 
Really, give it a rest already.
Report his post as trolling and let the mods deal with him. Obvious troll is obvious. He has been doing this for over a year now and already got banned for it once.
 
Report his post as trolling and let the mods deal with him. Obvious troll is obvious. He has been doing this for over a year now and already got banned for it once.

How about following your own advice? You're posting advice, in a thread, to just report it and let the mods deal with it instead of talking about it in the thread... :confused:

I really hate giving infractions to people because they decide to post some rant against a troll, but frankly, you're just about as much of a contributor to keeping the thread off-topic as the original troll post was.

Sorry for the interruption to those of you actually wanting to have a meaningful conversation.
 
GTX 680 owners should enjoy it for what it is.

With a little more research and time, I am able to confidently clarify myself here.

The GTX 680 is a top-tier-performing video card by current standards.

The GTX 680 may have started out with the intention of being a mid-range card, and based on the design of the PCB and TPU's findings, that certainly appears very likely.

It would have been very kind of NVIDIA to offer this card at release for $399, but being as efficient and powerful as it is, it certainly merits a top-tier price tag.

I recently read that the GTX 680 is not capable of extreme overclocking and would fall behind the HD 7970, but I am finding out that isn't true.

The GTX 680 has only been out a day, and it has already claimed a world title in 3DMark 11 benchmarking: over 15,000 marks in Performance mode.

http://hwbot.org/submission/2267660_kingpin_3dmark11___performance_geforce_gtx_680_15327_marks


Given that the PCB design certainly seems mid-range, I am sure NVIDIA will be able to build upon the new Kepler GPUs. It took NVIDIA a long time to get to this point, but right now they are looking great.

The only reason I bitch so much about the price is that my budget doesn't permit this type of luxury video card, but I'd love to have one.
 