nVidia "It will be a long time"? Computex

tangoseal

[H]F Junkie
Joined
Dec 18, 2010
Messages
9,743
This is from Anand's feed over at Computex, where they asked nV when the next GeForce is going to be released...

This is simply a low-information thread and nothing more. All we have to go on is "it will be a long time," but what defines a long time?

What do you guys/gals think about this very ambiguous exchange between AnandTech's reporter and the nV representative, regarding the line indicated by the red arrow in the screenshot?

They indicated the following:
 

Attachments

  • computex Geforce.jpg (390.3 KB)
They have no competition. Their "old" technology flies off the shelves. There's little incentive to drop a new card :(

I bet they milk it for another year
 
Why does everyone want new cards? If it is for better performance, why not buy the current top of the line card?
 
It's not good enough for 4K in many games...

I kinda feel sorry for those who jumped in so early on 4K. I remember those extremely early adopters who got 4K display when their best option was a 980ti/Titan X and honestly it doesn't seem much better now for them as lots of games have higher requirements. But then again you'd have to be kind of stupid to think you wouldn't be turning down lots of settings going in that early on 4k.
 
I kinda feel sorry for those who jumped in so early on 4K. I remember those extremely early adopters who got 4K display when their best option was a 980ti/Titan X and honestly it doesn't seem much better now for them as lots of games have higher requirements. But then again you'd have to be kind of stupid to think you wouldn't be turning down lots of settings going in that early on 4k.

Oh, I agree. I'm happily playing at 1440p, but I also think it irks people that we're 2 years into this architecture and 11 months since the 1080 Ti. There are expectations. Intel did the same thing until AMD finally became competitive again.
 
He could have said no comment. But with old cards flying off the shelf at prices still above launch MSRP, what is the incentive to come out with anything new right now? Not much. People are assuming Nvidia shut off production of Pascal, but I haven't seen any solid confirmation of that. They can drop the new cards any time within a few months. AMD is quiet and will likely compete in 2019. Nvidia can drop this later in the year or even at the beginning of next year and still be fine.
 
This is from Anand's feed over at Computex, where they asked nV when the next GeForce is going to be released...

This is simply a low-information thread and nothing more. All we have to go on is "it will be a long time," but what defines a long time?

What do you guys/gals think about this very ambiguous exchange between AnandTech's reporter and the nV representative, regarding the line indicated by the red arrow in the screenshot?

They indicated the following:
Too bad you don't read the HardOCP news page, you would have known that 5 hours earlier. ;)
https://www.hardocp.com/news/2018/06/04/dont_expect_new_geforce_gpu_for_long_time
 
This is the problem with no real competition. Nvidia tries to wind down production on last-gen cards to transition to new ones; the mining spike happens; Nvidia ramps up production on the old cards because the new ones aren't ready; mining falls off; and now Nvidia has all these last-gen cards it wants to clear out before launching new cards.
 
I'm not surprised. Not. One. Bit.

Crypto aside, the 1060 alone has definitely earned its place in history as a great 1080p card.

I can't think of a better value since, say, the GeForce4 Ti 4200.

(for the price segment etc etc)
 
This is the problem with no real competition. Nvidia tries to wind down production on last-gen cards to transition to new ones; the mining spike happens; Nvidia ramps up production on the old cards because the new ones aren't ready; mining falls off; and now Nvidia has all these last-gen cards it wants to clear out before launching new cards.
Plus, manufacturing costs on old cards continue to go down, but prices haven't had to come down. It makes zero sense, business-wise, to put out a new card.
 
Plus, manufacturing costs on old cards continue to go down, but prices haven't had to come down. It makes zero sense, business-wise, to put out a new card.

True, 2 years later it's probably costing Nvidia way less to manufacture those cards. Plus, on top of that, card prices are still at MSRP. Crazy!
 
The only thing that has changed in the last 2 years is that memory prices have increased which does hurt their margin. It is probably comparable to the original manufacturing costs.
 
The only thing that has changed in the last 2 years is that memory prices have increased which does hurt their margin. It is probably comparable to the original manufacturing costs.
Honestly, Nvidia is probably still getting a damn good deal on memory at the rate they are selling. Straight to consumer, that might be a different story.
 
I've been extremely pleased with my 1080 Ti at 1440p, but I'd hate to have to buy one for more than $700.
 
I've been extremely pleased with my 1080 Ti at 1440p, but I'd hate to have to buy one for more than $700.

Not worth the cost-to-value ratio; SLI isn't guaranteed in every game. I just ordered a Z35P 3440x1440 off eBay yesterday while the 10% cash back was going on. Ended up being $680 out the door, shipped, brand new but open box. I think it will do me just right for some time to come. The display also overclocks to 120Hz, but I wanted to try out G-Sync more than anything.
 
Not worth the cost-to-value ratio; SLI isn't guaranteed in every game. I just ordered a Z35P 3440x1440 off eBay yesterday while the 10% cash back was going on. Ended up being $680 out the door, shipped, brand new but open box. I think it will do me just right for some time to come. The display also overclocks to 120Hz, but I wanted to try out G-Sync more than anything.

I've read that 3 times and still can't find the relevancy.

You bought a new monitor. Cool!
 
I've read that 3 times and still can't find the relevancy.

You bought a new monitor. Cool!

Yeah, the monitor is pretty damn cool, can't wait to try it. I don't know what part you didn't get: SLI support is shitty, so a G-Sync monitor will add more to my experience than getting another card would. Any questions, let me know lol
 
It's not good enough for 4K in many games...
Umm, the 1080 Ti with an OC is a 4K/60 single-GPU solution. Yeah, turning on AA will hurt framerates, but if you feel you don't need AA at 4K, then a 1080 Ti is perfect. Yes, it needs a good overclock, but still.
 
Kepler to Maxwell v2 was a longer wait than this. The rebadged/refreshed 700 series only masked the transition period, and Nvidia also held back the 780 Ti for a long time.

GTX 680 - GTX 980: March 2012 - September 2014 = 30 months
GTX 780 (closest to modern x80 Ti) to GTX 980 Ti: May 2013 - June 2015 = 25 months

And what if you were a GTX 580 (November 2010) owner who didn't want to upgrade to the 680? That's 30 months to the GTX 780, and 36 months to the true fully unlocked equivalent (780 Ti).

There have been much longer waits. At least the Titan V is there if you want it. It clobbers the Titan Xp in games, and will probably be faster than the 2080/1180 anyway.
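Those gaps are straight month counts; a quick sketch to check them, using only the release dates quoted in the post above:

```python
# Month gap between two (year, month) release dates.
def months_between(start, end):
    (y1, m1), (y2, m2) = start, end
    return (y2 - y1) * 12 + (m2 - m1)

# GTX 680 (Mar 2012) -> GTX 980 (Sep 2014)
print(months_between((2012, 3), (2014, 9)))   # 30
# GTX 780 (May 2013) -> GTX 980 Ti (Jun 2015)
print(months_between((2013, 5), (2015, 6)))   # 25
# GTX 580 (Nov 2010) -> GTX 780 (May 2013)
print(months_between((2010, 11), (2013, 5)))  # 30
# GTX 580 (Nov 2010) -> GTX 780 Ti (Nov 2013)
print(months_between((2010, 11), (2013, 11))) # 36
```

All four figures match the numbers in the post.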
 
Yeah, I'm surprised there hasn't been an across-the-board re-brand yet, with the longevity we're now expecting from Pascal.

The last new part added to the mix was in November. And I suppose the 1050 3GB, which has an otherwise fully-enabled 1050 Ti core with cut memory bandwidth. But it's so unofficial you can't buy cards yet.

Nvidia is moving at the speed of slow.
 
I kinda feel sorry for those who jumped in so early on 4K. I remember those extremely early adopters who got 4K display when their best option was a 980ti/Titan X and honestly it doesn't seem much better now for them as lots of games have higher requirements. But then again you'd have to be kind of stupid to think you wouldn't be turning down lots of settings going in that early on 4k.

Are you kidding me? I actually picked up performance going to 4K. There were several of us hoping for large enough displays to drop from multiple displays down to a single larger-format display. I was running 3x27" ROG Swift monitors with a combined resolution of 7680x1440. Before that I had 3x30" Dell 3007WFP-HCs running at 7680x1600. 3840x2160 is cake by comparison. Of course, no single card has been able to drive my setup at any point; I've basically had to run SLI or CrossFire the entire time. For the most part, I haven't had to turn down anything for 4K.

4K is demanding, but many of us were already dealing with resolutions that were as demanding or even more so long before we ever heard of 4K. Early adopters tend to know what they are in for before diving in.
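The resolutions named above make the point concrete; a quick sketch comparing total pixels per frame (all figures from the post):

```python
# Total pixels per frame for each setup mentioned above.
setups = {
    "3x Dell 3007WFP-HC (7680x1600)": 7680 * 1600,
    "3x ROG Swift (7680x1440)": 7680 * 1440,
    "4K (3840x2160)": 3840 * 2160,
}
for name, px in setups.items():
    print(f"{name}: {px:,} pixels")
# 4K pushes roughly a third fewer pixels per frame than the 7680x1600 surround setup.
```

So moving from triple-monitor surround down to a single 4K panel really is a reduction in GPU load, as the post says.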
 
Are you kidding me? I actually picked up performance going to 4K. There were several of us hoping for large enough displays to drop from multiple displays down to a single larger-format display. I was running 3x27" ROG Swift monitors with a combined resolution of 7680x1440. Before that I had 3x30" Dell 3007WFP-HCs running at 7680x1600. 3840x2160 is cake by comparison. Of course, no single card has been able to drive my setup at any point; I've basically had to run SLI or CrossFire the entire time. For the most part, I haven't had to turn down anything for 4K.

4K is demanding, but many of us were already dealing with resolutions that were as demanding or even more so long before we ever heard of 4K. Early adopters tend to know what they are in for before diving in.

Well in your case you kind of "downgraded" to 4k. I've had to turn down settings in 1440p to maintain roughly 144 fps on my display. You haven't had to turn something down to maintain a solid 60fps on your 4k display?
 
Well in your case you kind of "downgraded" to 4k. I've had to turn down settings in 1440p to maintain roughly 144 fps on my display. You haven't had to turn something down to maintain a solid 60fps on your 4k display?

On rare occasion, I've had to sacrifice one or two settings like higher levels of anti-aliasing, often dropping to FXAA from TXAA or something like that to get a solid 60 FPS. It's pretty rare, though. I can't think of anything I've had to turn settings down for since getting my 1080 Tis. Back when I had Titan X Maxwell cards in SLI, they worked well in a few games that supported NV Surround. However, many didn't do it well, and I ended up running on a single 30" at 2560x1600 most of the time. The purchase of my 1080 Tis coincided with my 4K display, so I've never run it without them.

Initially, before I could get Andromeda to run on both graphics cards there were performance issues, but only on the Tempest. Anywhere else the game ran smoothly. When I got both cards running in SLI, it crushed the game for everything but a few areas that are bad due to the game's design.
 
Man, I can't even game under 140 FPS... My kill ratio is through the roof in games like Titanfall 2 if I am running in excess of 140 fps. I run on an Acer 240Hz panel and I absolutely love it.

I have no idea how to even game at 60Hz... it would punish my eyes with its slideshow-like frame rate.

4K will have to wait for me until we can reliably get 120 fps minimum at high or better settings.

I am not adopting 4K until that time comes. Maybe if we get cards with really high VRAM, like 32GB of exceptionally high-bandwidth memory like GDDR6 or better, we'll be fine. Maybe two more generations.
 
Newegg sold something like 500 of the Zotac 1070 Tis today with the 20% eBay deal and a special listing $80 below their normal price, so to me that points to the "long time" comment just being an attempt not to slow sales while people wait for the new card. I actually picked one up, as I think it will be a good bridge to the 1180, which I want to get. I have no interest in the Founders versions, as noise is a huge criterion for me, but I expect really quiet cards from Asus and EVGA to be available for Thanksgiving deals, and the 1070 Ti should still have some resale value then. And if I can find a buyer, the 1060 I have now, being replaced by the 1070 Ti, should have some value too, so I won't lose much getting an interim card while waiting for the non-Founders 1180s.

But I think the July 30 date is probably pretty likely at this point. I don't think it will stretch into 2019 as some seem to think.
 
Man, I can't even game under 140 FPS... My kill ratio is through the roof in games like Titanfall 2 if I am running in excess of 140 fps. I run on an Acer 240Hz panel and I absolutely love it.

I have no idea how to even game at 60Hz... it would punish my eyes with its slideshow-like frame rate.

4K will have to wait for me until we can reliably get 120 fps minimum at high or better settings.

I am not adopting 4K until that time comes. Maybe if we get cards with really high VRAM, like 32GB of exceptionally high-bandwidth memory like GDDR6 or better, we'll be fine. Maybe two more generations.

It's not RAM size that's the problem, it's memory bandwidth. 16GB of GDDR6 will be overkill for a very long time, even at 4K, but the memory bandwidth jump from GDDR6 is necessary.
 
Man, I can't even game under 140 FPS... My kill ratio is through the roof in games like Titanfall 2 if I am running in excess of 140 fps. I run on an Acer 240Hz panel and I absolutely love it.

I have no idea how to even game at 60Hz... it would punish my eyes with its slideshow-like frame rate.

4K will have to wait for me until we can reliably get 120 fps minimum at high or better settings.

I am not adopting 4K until that time comes. Maybe if we get cards with really high VRAM, like 32GB of exceptionally high-bandwidth memory like GDDR6 or better, we'll be fine. Maybe two more generations.
GTX 1080 Ti SLI would probably do okay. I can run most games at 6K resolution at 60 fps.
 
GTX 1080 Ti SLI would probably do ok. I can run most games at 6k resolution at 60fps.

Yeah I run 2x 1080ti.

I also game on my 3440x1440 Acer, which is sub-4K but much higher than standard 2K.

I find that DSR punishes my cards though.
 