GTX 790 details surface!

Meanwhile the mid-range suffers.

The 770 is just a re-badged 680, a 2-year-old GPU.
The 280X is a re-badged 7970, a 3-year-old GPU.

Mid-range suffers? Innovation and new products are always focused on the two ends -- the high end and the low end (mobile, etc.). The mid-range is simply older high-end products, be it the used market or rebrands of former high-end parts.

If you have a 680 or a 7970, there is not much you can't do at 1080p (the primary mid-range resolution).
 
Mid-range suffers? Innovation and new products are always focused on the two ends -- the high end and the low end (mobile, etc.). The mid-range is simply older high-end products, be it the used market or rebrands of former high-end parts.

If you have a 680 or a 7970, there is not much you can't do at 1080p (the primary mid-range resolution).
Keeping the same GPU for 2-3 years is a stretch even by the standards of Nvidia, the masters of re-badging.

I don't think they kept the G80 that long and they beat that thing into the ground.

I should forgive them, though; Maxwell should help. AMD is the worst offender here.
 
Has SLi/Xfire changed, or is it still just 5 GB of usable memory per card?
Because if it is... I thought these forums were smarter than that.

How many YEARS have we had to say "No it's actually just half of that vram"? Like, 8? 9?

Too many years...
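The arithmetic behind that perennial correction is simple. Here is a minimal Python sketch (the function name is mine) of why usable VRAM works this way under alternate-frame-rendering SLI/CrossFire, where each GPU mirrors the full frame's resources:

```python
# Under alternate-frame rendering (AFR), each GPU holds its own complete
# copy of textures and buffers, so the advertised total is not additive.

def usable_vram_gb(total_gb: float, num_gpus: int) -> float:
    """Effective VRAM for an AFR multi-GPU card that mirrors memory."""
    return total_gb / num_gpus

# The rumored GTX 790: 10 GB printed on the box, two GK110s.
print(usable_vram_gb(10, 2))  # 5.0 -- less than a single 6 GB Titan
```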
 
Keeping the same GPU for 2-3 years is a stretch even by the standards of Nvidia, the masters of re-badging.

I don't think they kept the G80 that long and they beat that thing into the ground.

I should forgive them, though; Maxwell should help. AMD is the worst offender here.

Keeping the same GPU is definitely a stretch for a high-end enthusiast. My 7950, overclocked, would have been good enough for me if I had stayed at 1080p, and I am a fairly high-end enthusiast. When I upgraded to 1440p and 120 Hz, I needed more power.

At 1080p, there is no reason a mid-range card should not last 2-3 years with good settings in good games for mid-range enthusiasts.

High end? Upgrading every 1 to 2 years will continue.
 
I'm talking about AMD/Nvidia rehashing the same GPU on new cards for 3 years, not people owning the same card for 3 years.

The 280X came out in October, and it's based on the same GPU as the 7900 series from 2011. It turns out those cards just turned 2 years old (Dec 2011), so my math is off anyway. :/
 
I'm talking about AMD/Nvidia rehashing the same GPU on new cards for 3 years, not people owning the same card for 3 years.

The 280X came out in October, and it's based on the same GPU as the 7900 series from 2011. It turns out those cards just turned 2 years old (Dec 2011), so my math is off anyway. :/

My mistake. You quoted my post, which was a response in the context of mid-range GPUs, so I figured that was what you were talking about. On your subject, I think the gradual 'de-crippling' trend of GPUs will continue into the future, as releasing completely different chips at a quicker pace is neither feasible nor warranted given the current demand from games. Plus, pushing multi-card solutions and extending the life of each architecture revision is really in both vendors' best interest.
 
Will most likely get one of these, if it really has higher VRAM then it's a guaranteed buy to replace my 690 lol
 
Will most likely get one of these, if it really has higher VRAM then it's a guaranteed buy to replace my 690 lol

Not to derail this thread too much, but is the desire for more (greater than 3 GB) VRAM a triple-monitor thing, or a future-proofing thing?
 
If I can't get the Nofan CR80 to work decently, and thus enable SLI, then this card will be my entry to 4K. I only hope it works in a vertical mount.
 
The article says GK110 x2, so it's still SLI, just like the 690.
 
A few things on my mind: it will definitely need PCI-E 3.0, and this card will pretty decently outperform two 780s regardless of VRAM.
 
A few things on my mind: it will definitely need PCI-E 3.0, and this card will pretty decently outperform two 780s regardless of VRAM.

I'm not so sure. Power limits will require a substantial downclock.
 
For the low price of 999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999.99, wow that's a lot of 9's
 
The world isn't even worth that much money in USD.
 
Gotta cost a grand... Titans are like $800, and 780 Tis are not far behind.
 
For me it would be surround

I figured. 10 GB of VRAM -- geebus, if it's anywhere near that, it would be astounding. How much VRAM usage are you seeing on a surround setup? If a game uses 2 GB with a single monitor, how much more would surround add?
 
I'm confused... if it's a dual-GPU card and it has 10 GB of VRAM, doesn't that mean only 5 GB of it is actually usable?

So if this were true, it's 1 GB less than the 6 GB Titan...

Clarification, please...
 
I'm confused... if it's a dual-GPU card and it has 10 GB of VRAM, doesn't that mean only 5 GB of it is actually usable?

So if this were true, it's 1 GB less than the 6 GB Titan...

Clarification, please...

That is most likely what it means. However, it just doesn't make sense. The 780 has 3 GB of RAM, and supposedly this is going to be two of those GPUs on a single PCB, so 6 GB total would make sense; but then the Titan has 6 GB with a single GPU. So why wouldn't it be 12 GB of VRAM, giving each GPU 6 GB? Unless they discovered that the Titan's 6 GB of RAM wasn't usable.

Still seems a bit fishy, and I don't see anyone else reporting or supporting that rumor.
 
That is most likely what it means. However, it just doesn't make sense. The 780 has 3 GB of RAM, and supposedly this is going to be two of those GPUs on a single PCB, so 6 GB total would make sense; but then the Titan has 6 GB with a single GPU. So why wouldn't it be 12 GB of VRAM, giving each GPU 6 GB? Unless they discovered that the Titan's 6 GB of RAM wasn't usable.

Still seems a bit fishy, and I don't see anyone else reporting or supporting that rumor.

Supposedly because they would slash the VRAM bus width from 384-bit to 320-bit.
 
That is most likely what it means. However, it just doesn't make sense. The 780 has 3 GB of RAM, and supposedly this is going to be two of those GPUs on a single PCB, so 6 GB total would make sense; but then the Titan has 6 GB with a single GPU. So why wouldn't it be 12 GB of VRAM, giving each GPU 6 GB? Unless they discovered that the Titan's 6 GB of RAM wasn't usable.

Still seems a bit fishy, and I don't see anyone else reporting or supporting that rumor.


Because it's going to be a flagship card like the Titan. It's what enthusiasts think they need, and they have more money than brains, so they buy it. I say good for them.
 
Why did you edit your reply? Still doesn't make sense that they'd do that.

Hell if I know. Maybe they don't want to sell a 12 GB card just yet? Can't they just stack 4.5 GB per GPU at 384-bit width? Maybe they can't mix VRAM densities (3 + 1.5) like that, I dunno. It seems like a lot of trouble to spin out another rev with a 320-bit bus just to hit 5 GB. Again, these are just total WAGs; who knows. Knowing Nvidia, they'll put the 12 GB model on sale 3 months later, making the 1%-ers buy new cards yet again, haha.
 
My guess is it will cut costs / boost profits to drop the bus to 320-bit/10 GB vs. 384-bit/12 GB.

Curious if ASUS will release a MARS 780 Ti x2 card with five 8-pin connectors and its own built-in TEC.
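For what it's worth, the 5 GB figure follows mechanically from the bus width: GDDR5 chips hang off 32-bit channels, so capacity per GPU is (bus width / 32) x chip density, doubled in clamshell mode. A rough Python sketch (the function name and the 4-Gbit-chip assumption for the rumored card are mine):

```python
def vram_per_gpu_gb(bus_width_bits: int, chip_gb: float,
                    clamshell: bool = False) -> float:
    """GDDR5 capacity: one chip per 32-bit channel, twice that in clamshell."""
    chips = bus_width_bits // 32
    if clamshell:
        chips *= 2  # two chips share each channel, as on the Titan
    return chips * chip_gb

print(vram_per_gpu_gb(384, 0.25))                  # 3.0 -- GTX 780
print(vram_per_gpu_gb(384, 0.25, clamshell=True))  # 6.0 -- GTX Titan
print(vram_per_gpu_gb(320, 0.5))                   # 5.0 -- rumored 790, per GPU
```

So a 320-bit bus with ten 4-Gbit chips lands exactly on 5 GB per GPU, 10 GB on the box.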
 
Interesting card. I speculated before that they should release something with this CUDA core count (or less) and call it a 770 Ti. Hopefully it's priced to move!
 
When has a single dual GPU card EVER been better or cheaper than multi-card solutions?
Did you read what you quoted? He said ITX systems, which only have a single expansion slot.

The only way to get dual-GPU on ITX at all is with a dual-GPU card.

Meanwhile the mid-range suffers.

The 770 is just a re-badged 680, a 2-year-old GPU.
Two things...

First off, the GTX 770 isn't just a re-badged GTX 680. The physical hardware is different, the power regulation is beefier, the clocks are higher, and it uses entirely different RAM chips capable of 7 GHz (whereas the GTX 680 shipped with 6 GHz RAM). Power requirements are higher. The reference cooler has been upgraded to the one the Titan uses, and the new fan-control algorithm makes it a much quieter card than the GTX 680. Also consider that the GTX 770 has full GPU Boost 2.0, which allows for better out-of-the-box clock-speed control than the GTX 680.

That's more than enough to warrant a new card and a new name.

Second, the GK110 core (used in the GTX Titan, GTX 780, GTX 780 Ti, and now the GTX 790) is about the same age as the GK104 used in the GTX 680 and GTX 770. If you're going to complain about the mid-range "suffering" due to the re-use of old GPU designs, then the high end is doing no better.
 
Single Titan user here (see sig).

I'd jump on this for $1200 if it comes out (I paid that much for my Titan anyway).
 
I'm skeptical that this will happen.

Cutting the memory bus from 384 to 320 is not insignificant and there has been no mention of clock speed which always takes a hit on dual cards.

I am guessing that, at best, it would be 90% as fast as a pair of GTX 780s.
 
Not surprised Nvidia would release a 790. That's a bit overkill though, wonder how far it will knock the Titan down in price.
 