
Official NVIDIA GeForce GTX 1060 Announcement @ [H]

This is the same horseshit argument as "PC gaming is dead!" Nonsense. PC gaming is growing worldwide. Yes, there is a huge number of mobile devices, but mobile gaming is garbage. This is like saying Mercedes-Benz is going to go out of business because bicycles are cheaper and everyone has one. There will ALWAYS be discrete GPUs, mark my words. Maybe not quad SLI anymore, but GPUs are here to stay.
Yes, here to stay, but in smaller numbers. A better analogy would be that there will always be Mercedes, but there are a lot more Volkswagens.
 
The 1060 seems like a strong card, but not really a direct competitor to the RX480 4GB @ $200. nVidia needs something at the $200 price point.

1050ti, anyone? :p
 
Nice card. Now I have to figure out what route to take to move on from my 980, either a 1070 or a 1080. The 1060 is in a nice sweet spot and looks like it would be good for budget-oriented builds.
 
How many of those PCs are using integrated graphics? It is easy to sell things as fast as you make them when you make so few.

"Not big" is not the same thing as "dying." I just built this sig rig last week, and in researching everything there doesn't seem to be any lack of components to build such a thing. There will always be a market for better than average.
 
Yes, here to stay, but in smaller numbers. A better analogy would be that there will always be Mercedes, but there are a lot more Volkswagens.

Also think of it like this: someone who buys a Mercedes wouldn't buy five Volkswagens as a substitute. They want a Mercedes.
 
If GPUs are selling like hotcakes, what are laptops and smartphones selling like?

Most laptops sold cost less than one of the 1080s many people on this forum bought. And smartphones are subsidized by the service provider. Find me anyone that would buy two of the same smartphones for $1400 upfront, the MSRP of two 1080 FEs currently. Talk about tiny.
 
Motley Fool thinks Nvidia has a 1050Ti in the works. But I don't see that their speculation is much more than just -- speculation.

Whatever they call it, surely nVidia is going to release something competitive at $200? I have a very hard time imagining they'll leave a hole in their lineup and just cede the $200 price point to AMD!
 
The 1060 seems like a strong card, but not really a direct competitor to the RX480 4GB @ $200. nVidia needs something at the $200 price point.
1050ti, anyone? :p
Thing is, the 1060 isn't all that much faster. Nvidia claims 15%, and a leaked Fire Strike run shows it at 10%. Driver updates can really close that gap, along with DX12 titles. AMD is pretty good at making their graphics cards perform faster with driver updates. Just look at the R9 390.
 
Thing is, the 1060 isn't all that much faster. Nvidia claims 15%, and a leaked Fire Strike run shows it at 10%. Driver updates can really close that gap, along with DX12 titles. AMD is pretty good at making their graphics cards perform faster with driver updates. Just look at the R9 390.

However, you forget that the leaked Fire Strike run was done without an official driver from nVidia; once the driver is released, wouldn't the result be better?

I am also still wondering how people are trying to "sell" the $200 price point of the RX 480 while forgetting that the card shown and reviewed everywhere was the 8 GB one (which starts at $239). And regarding DX12 titles and "async compute"? Please inform yourself here:

nVidia’s GeForce GTX 1080, and the enigma that is DirectX 12

I would always prefer the 1060 over the RX 480, as nobody knows if AMD will survive for the next 2 years.

I've been reading [H] for more than 5 years now and never bothered registering and replying, because I thought it would be a waste of my time.

Now I've gotten to the point where I would love to tip my hat to Kyle for the hard work and the wonderful editorial, and go drink a nice cold beer.
 
This card should do pretty well, especially with all the shenanigans around the RX480.
 
Thing is, the 1060 isn't all that much faster. Nvidia claims 15%, and a leaked Fire Strike run shows it at 10%. Driver updates can really close that gap, along with DX12 titles. AMD is pretty good at making their graphics cards perform faster with driver updates. Just look at the R9 390.

You can buy wine now that you won't drink for three years and there's a market for that. If I'm spending ~250 to ~300 USD for a graphics card I want it to be usable in today's games now; not three years down the line when AMD gets around to unlocking the full potential of the card. I say this as someone who has only had AMD cards in the PCs in this house for the last decade.
 
Whatever they call it, surely nVidia is going to release something competitive at $200? I have a very hard time imagining they'll leave a hole in their lineup and just cede the $200 price point to AMD!

They very well might, to target esports players.
 
You are kidding, right? The amount of capital needed to start up and compete is astounding. You can't compare today's market to the mid-'90s.

Kidding about what? Market disruption always seems impossible until somebody does it. The industry is definitely more mature now than it was back in the Voodoo and Rage days, but it was also harder to attract VC money back then too.

Today you can get a business started with a few smart guys and a cloud subscription. Obviously you can't beat the established guys at their game. You have to change the game.
 
The only reason I would be reluctant to choose a 1060 over a 480 would be DX12 performance concerns. I'm not sure if the 480's extra 1.5 TFLOPS will pull it ahead in DX12. Either way, the 1060 vs. 480 is going to be a fun battle to watch play out over the next several months.

The way I see it, DX12 won't become a significant factor, at least for me, until games show an actual improvement (performance or IQ) when running in DX12 mode versus DX11 mode, or until a game ONLY supports DX12. Sure, DX12 IS the future, but nobody knows for sure when that future will get here, and when it does, the hardware landscape may have changed significantly.

Using RoTR as an example, DX12 is a showpiece in that game more than anything else. The RX 480 is able to run it in DX12 with nearly the same performance as in DX11, whereas the 980 drops off a cliff. But in the grand scheme of things, the RX 480 still cannot beat the 980 running in DX11 mode, and AFAIK there is no visual-quality benefit to running the game in DX12 rather than DX11.

Also, ordinary laymen, and even moderately informed gamers, will just choose whichever mode runs best for them, which is DX11 for nVidia and, depending on the game, DX12 for AMD. In the example I used, many would choose DX11 mode in RoTR with nVidia and probably DX12 with AMD.

If the RX 480 in DX12 mode beat the 980 in DX11 mode, the argument would change completely, but that has not yet happened.

Personally, I'd be most interested in how the 1060 performs in DX12 mode compared to DX11 mode.
 
"Not big" is not the same thing as "dying." I just built this sig rig last week, and in researching everything there doesn't seem to be any lack of components to build such a thing. There will always be a market for better than average.
And I had been looking at something in the roughly $300 USD range to throw into the rig in my own sig; now I actually have a choice in that price range without breaking my wallet OR my PSU, and for once I can consider NEW in that range. In either case, not only will it handle 1080p in today's games, but at the upper end (GTX 1070/RX 490) it could even do 4K (or even VR, if that were to float my boat) without quibbling.
 
Kidding about what? Market disruption always seems impossible until somebody does it. The industry is definitely more mature now than it was back in the Voodoo and Rage days, but it was also harder to attract VC money back then too.

Today you can get a business started with a few smart guys and a cloud subscription. Obviously you can't beat the established guys at their game. You have to change the game.
Software. All the VC money is going to software and the occasional IoT device; creating a new GPU company from scratch would be a monumental task.
 
Software. All the VC money is going to software and the occasional IoT device; creating a new GPU company from scratch would be a monumental task.

Indeed. Probably several tens of millions of dollars, maybe even hundreds of millions to even get a product out the door. Especially if the new company has to make most of the tech, hardware, and drivers from scratch.
 
The GTX 1060 FE won't be from AIBs - it will, in fact, be sold only by nVidia itself (per the same launch announcement).

Therefore, as far as AIBs are concerned, it's a non-issue (the same applies to retailers). nV has also announced exactly squat about the FE being available anywhere except its existing web storefront (though they could indeed sell direct via eBay or Amazon).

Source? Because actually a lot of AIB vendors offer Founders Editions of both the 1070 and the 1080.
 
The way I see it, DX12 won't become a significant factor, at least for me, until games show an actual improvement (performance or IQ) when running in DX12 mode versus DX11 mode, or until a game ONLY supports DX12. Sure, DX12 IS the future, but nobody knows for sure when that future will get here, and when it does, the hardware landscape may have changed significantly.

Using RoTR as an example, DX12 is a showpiece in that game more than anything else. The RX 480 is able to run it in DX12 with nearly the same performance as in DX11, whereas the 980 drops off a cliff. But in the grand scheme of things, the RX 480 still cannot beat the 980 running in DX11 mode, and AFAIK there is no visual-quality benefit to running the game in DX12 rather than DX11.

Also, ordinary laymen, and even moderately informed gamers, will just choose whichever mode runs best for them, which is DX11 for nVidia and, depending on the game, DX12 for AMD. In the example I used, many would choose DX11 mode in RoTR with nVidia and probably DX12 with AMD.

If the RX 480 in DX12 mode beat the 980 in DX11 mode, the argument would change completely, but that has not yet happened.

Personally, I'd be most interested in how the 1060 performs in DX12 mode compared to DX11 mode.
However, there aren't just full-priced DX12 games any more. One game I would advise anyone to add to the test rota is Forza 6: Apex, and for two reasons: Forza Horizon 3 is coming not just to XBOX ONE but to Windows 10, and this year; meanwhile, Forza 6: Apex (based on Forza Motorsport 6) is available for Windows 10 *right now*, for a grand total of zilch. (If you also have Origin, the trial version of the current Need for Speed is available as well, and, like Forza Apex, it is ALSO DX12-driven.)

The DX12 software landscape IS going to undergo massive, if not monstrous, change, and that is just in terms of Windows-based gaming. Dismissing all the XBOX ONE-based games showcased at E3 would have been a massive mistake, though it would have made sense merely a year ago: all of them are coming to Windows 10, no exceptions whatsoever, and all are also going to support DX12, no exceptions whatsoever.

DX11 and DX12 are going to co-exist on Windows 10, at every price point and gaming point, from AAA to free. Both APIs are going to matter, especially going forward. I don't plan on tossing so much as ONE DX11 game from my current rota, which runs from Ashes of the Singularity (in DX11 mode) to ANNO 2205 (my go-to city sim, which replaced both Civ V and SimCity 2013 in that spot) to the entirety of the StarCraft II trilogy, and I am still leaving room for surprises between now and Christmas.

The only real weak spot on the hardware landscape is notebooks and portables, and I don't expect that to remain weak between now and Christmas, either.
 
Software. All the VC money is going to software and the occasional IoT device; creating a new GPU company from scratch would be a monumental task.

Definitely. It would have to be something new and exciting to grab people's attention.

Or even better. ARM or one of the other established mobile GPU players could decide to take a shot at the desktop if they come up with something special.
 
Definitely. It would have to be something new and exciting to grab people's attention.

Or even better. ARM or one of the other established mobile GPU players could decide to take a shot at the desktop if they come up with something special.
The only company even in a position to do this is Imagination Technologies, but their GPU side, PowerVR, has had declining sales for a while now, even though they make the mobile GPUs for Apple.
 
The only company even in a position to do this is Imagination Technologies, but their GPU side, PowerVR, has had declining sales for a while now, even though they make the mobile GPUs for Apple.

Samsung as well, potentially.
 
Samsung as well, potentially.
The issue with Samsung is that it already seems spread too thin, like Sony, and, like Sony, almost all of its divisions have been losing money. I do not see them making a move like this at all, and if they did, it would probably be after shedding the dross first.
 
The way I see it, DX12 won't become a significant factor, at least for me, until games show an actual improvement (performance or IQ) when running in DX12 mode versus DX11 mode, or until a game ONLY supports DX12. Sure, DX12 IS the future, but nobody knows for sure when that future will get here, and when it does, the hardware landscape may have changed significantly.

Using RoTR as an example, DX12 is a showpiece in that game more than anything else. The RX 480 is able to run it in DX12 with nearly the same performance as in DX11, whereas the 980 drops off a cliff. But in the grand scheme of things, the RX 480 still cannot beat the 980 running in DX11 mode, and AFAIK there is no visual-quality benefit to running the game in DX12 rather than DX11.

Also, ordinary laymen, and even moderately informed gamers, will just choose whichever mode runs best for them, which is DX11 for nVidia and, depending on the game, DX12 for AMD. In the example I used, many would choose DX11 mode in RoTR with nVidia and probably DX12 with AMD.

If the RX 480 in DX12 mode beat the 980 in DX11 mode, the argument would change completely, but that has not yet happened.

Personally, I'd be most interested in how the 1060 performs in DX12 mode compared to DX11 mode.

I think you are completely wrong here. AMD cards consistently gain performance going from DX11 to DX12 in all games which support DX12. In the price segment where the RX 480 launched, AMD wins the majority of the DX12 games currently released by a significant margin; it's a thrashing of current Nvidia cards like the GTX 980 / GTX 970. It remains to be seen how the GTX 1060 will fare in DX12 games. Of the DX12 games released, AMD Gaming Evolved titles like Ashes and Hitman clearly run faster in DX12 on AMD cards, while Nvidia cards seem to perform slightly slower in DX12. RoTR, which is a GameWorks title, shows a performance regression for Nvidia cards when comparing DX11 vs. DX12, while AMD cards again gain slightly.

The Radeon RX480 8GB Performance Review - Page 24
The AMD Radeon RX 480 Review - The Polaris Promise | Gears of War: Ultimate Edition
The AMD Radeon RX 480 Review - The Polaris Promise | Hitman (2016)
The AMD Radeon RX 480 Review - The Polaris Promise | Rise of the Tomb Raider

Right now DX12 performance is an embarrassment for Nvidia Maxwell cards, especially the GTX 980 and GTX 970. For AMD cards it's their key advantage. If the GTX 1060 does not fix that, it's going to get ugly pretty soon, with upcoming major AAA titles like Deus Ex: Mankind Divided, Battlefield 1, Watch Dogs 2, and Star Citizen all being DX12 based.
 
The issue with Samsung is that it already seems spread too thin, like Sony, and, like Sony, almost all of its divisions have been losing money. I do not see them making a move like this at all, and if they did, it would probably be after shedding the dross first.

Yeah, true. I really don't see any company trying to get into the dedicated GPU market right now. Intel tried and completely gave up. If Intel didn't want to attempt it nearly a decade ago, I doubt anyone else is going to want to bother now. Unless Nvidia or AMD goes belly up and someone buys their GPU division and patents, we're stuck with the two we've got.
 
I think you are completely wrong here. AMD cards consistently gain performance going from DX11 to DX12 in all games which support DX12. In the price segment where the RX 480 launched, AMD wins the majority of the DX12 games currently released by a significant margin; it's a thrashing of current Nvidia cards like the GTX 980 / GTX 970. It remains to be seen how the GTX 1060 will fare in DX12 games. Of the DX12 games released, AMD Gaming Evolved titles like Ashes and Hitman clearly run faster in DX12 on AMD cards, while Nvidia cards seem to perform slightly slower in DX12. RoTR, which is a GameWorks title, shows a performance regression for Nvidia cards when comparing DX11 vs. DX12, while AMD cards again gain slightly.

The Radeon RX480 8GB Performance Review - Page 24
The AMD Radeon RX 480 Review - The Polaris Promise | Gears of War: Ultimate Edition
The AMD Radeon RX 480 Review - The Polaris Promise | Hitman (2016)
The AMD Radeon RX 480 Review - The Polaris Promise | Rise of the Tomb Raider

Right now DX12 performance is an embarrassment for Nvidia Maxwell cards, especially the GTX 980 and GTX 970. For AMD cards it's their key advantage. If the GTX 1060 does not fix that, it's going to get ugly pretty soon, with upcoming major AAA titles like Deus Ex: Mankind Divided, Battlefield 1, Watch Dogs 2, and Star Citizen all being DX12 based.

Async shaders are to nVidia cards what tessellation is to AMD cards. Whereas tessellation is explicitly supported in DX, async shaders are not. Pascal discards async shaders just like Polaris discards tessellation. I'd call it a wash amongst current hardware, which is Pascal and Polaris.

As DX12 is adopted, matures, and is implemented correctly amongst IHVs, it remains to be seen whether Pascal is neutered compared to Polaris.
 
Yeah, true. I really don't see any company trying to get into the dedicated GPU market right now. Intel tried and completely gave up. If Intel didn't want to attempt it nearly a decade ago, I doubt anyone else is going to want to bother now. Unless Nvidia or AMD goes belly up and someone buys their GPU division and patents, we're stuck with the two we've got.
Anyone tried eating their own words? Mine tasted a bit sour...

Must have remembered it completely wrong, or I was reading the wrong graph... oops..
 
Async shaders are to nVidia cards what tessellation is to AMD cards. Whereas tessellation is explicitly supported in DX, async shaders are not. Pascal discards async shaders just like Polaris discards tessellation. I'd call it a wash amongst current hardware, which is Pascal and Polaris.

As DX12 is adopted, matures, and is implemented correctly amongst IHVs, it remains to be seen whether Pascal is neutered compared to Polaris.

Your analogy is horrible. AMD implements tessellation according to the DX11 spec. AMD cards perform well with tessellation unless you go for obscene tessellation levels with zero or negligible image quality gain. Remember Crysis 2?

Crysis 2 tessellation: too much of a good thing?

Or the 64x HairWorks tessellation in The Witcher 3, which had a terrible performance cost and negligible IQ gain compared to the 8x/16x that AMD enabled through its drivers.

Optimizing The Witcher 3: Wild Hunt Performance on AMD Radeon™ Graphics

Funnily enough, the developer introduced a tessellation slider a few months later to give gamers the option to choose the performance vs. IQ tradeoff themselves.

Game Settings - The Witcher 3 Wild Hunt Gameplay Performance Review

BTW, asynchronous compute, or async shaders (concurrent execution of graphics and compute), is a DX12 feature:

Synchronization and Multi-Engine (Windows)
Gaming: Major new features of DirectX® 12 | Community

So, in contrast to your claims, AMD supports both DX11 tessellation and DX12 async shaders, while Nvidia does not support DX12 async shaders.
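
For anyone wondering what "multi-engine" actually means at the API level, here is a minimal C++/D3D12 sketch (assuming the Windows 10 SDK and d3d12.lib): it creates a graphics queue and a separate compute queue on the same device, plus a fence to order them. This is the mechanism the Synchronization and Multi-Engine page above describes; whether work on the two queues actually overlaps on the GPU is up to the hardware and driver, which is exactly the Maxwell vs. GCN argument.

```cpp
// Minimal DirectX 12 multi-engine sketch: one "direct" (graphics) queue and one
// compute queue on the same device, with a fence for cross-queue synchronization.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter; feature level 11_0 is the DX12 minimum.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // Graphics ("direct") queue: accepts draw, compute, and copy commands.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // Separate compute queue: work submitted here may run concurrently with the
    // graphics queue ("async compute") if the hardware and driver can overlap it.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // Fence used to order the two queues wherever they share resources.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

    // Typical pattern: the compute queue signals the fence when its dispatches are
    // done, and the graphics queue waits on that value before consuming the results
    // (command-list recording and submission omitted in this sketch).
    computeQueue->Signal(fence.Get(), 1);
    gfxQueue->Wait(fence.Get(), 1);

    return 0;
}
```

Note that nothing in the API forces the GPU to run the two queues in parallel; a driver is free to serialize them, which is why the same code can behave very differently across vendors.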
 
Nobody has the technology and patents right now to take on Nvidia. If AMD folds or sells off RTG, then that might become possible.
 
Not sure if it means anything, but I dug up a comparison of the "leaked" GTX 1060 Fire Strike score against a (basically) stock RX 480, both using an i7-6700K.

GTX 1060
NVIDIA GeForce GTX 1060 3DMark Firestrike Performance Revealed

RX 480
I scored 11 622 in Fire Strike


Based on Fire Strike alone, the RX 480 is slightly ahead.
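
If you want to put a number on "slightly ahead," it's just the relative difference between the two overall scores. A trivial sketch follows; the GTX 1060 value below is a hypothetical placeholder that you would replace with the leaked score from the linked article.

```cpp
#include <cstdio>

int main()
{
    // Overall Fire Strike score for the RX 480 quoted above.
    const double rx480Score = 11622.0;

    // Hypothetical placeholder: substitute the leaked GTX 1060 overall score
    // from the linked article before drawing any conclusions.
    const double gtx1060Score = 11000.0;

    const double leadPct = (rx480Score - gtx1060Score) / gtx1060Score * 100.0;
    std::printf("RX 480 leads by %.1f%% in overall Fire Strike score\n", leadPct);
    return 0;
}
```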
 
However, you forget that the leaked Fire Strike run was done without an official driver from nVidia; once the driver is released, wouldn't the result be better?
Can you run the 1060 without official working drivers?
I am also still wondering how people are trying to "sell" the $200 price point of the RX 480 while forgetting that the card shown and reviewed everywhere was the 8 GB one (which starts at $239).
Because the 8GB version has zero difference in performance, and it makes sense to buy the 4GB. Nobody cares about the missing 2GB on the 1060 because it'll make zero difference as well. None of these cards can effectively make good use of 8GB of VRAM.

I would always prefer the 1060 over the RX 480, as nobody knows if AMD will survive for the next 2 years.
Good thing AMD has open-source drivers that work. How are those Nouveau drivers working out, Nvidia? Not that I expect AMD to go anywhere soon.
You can buy wine now that you won't drink for three years and there's a market for that. If I'm spending ~250 to ~300 USD for a graphics card I want it to be usable in today's games now; not three years down the line when AMD gets around to unlocking the full potential of the card. I say this as someone who has only had AMD cards in the PCs in this house for the last decade.
You're going to spend $50 more than on the RX 480, and eventually the RX 480 may catch up to the 1060. I'm willing to bet the $300 FE is going to push 1060 prices well above $250, judging from the fiascos with the 1080s/1070s.

There's a Gigabyte GTX 1080 FE for $886 on Newegg. That's in stock now, and if you hurry you can buy it. That's what we could see with the 1060.

Async shaders are to nVidia cards what tessellation is to AMD cards. Whereas tessellation is explicitly supported in DX, async shaders are not.
You sure async shaders aren't part of the DX12 spec? Sounds like nonsense you pulled out of your arse.
Pascal discards async shaders just like Polaris discards tessellation.
If Polaris discards tessellation then we have a problem here, because how the hell is the RX 480 rendering games without it? BTW, Polaris is now better at tessellation.
I'd call it a wash amongst current hardware, which is Pascal and Polaris.
I call it made up on the spot, and really badly done as well. Next thing you're going to tell me is that polygons are something AMD discards as well. How about textures? No wait, AMD's tile-based rendering actually does discard textures.
As DX12 is adopted, matures, and is implemented correctly amongst IHVs, it remains to be seen whether Pascal is neutered compared to Polaris.
If it isn't apparent, it depends on which company the game developers work with. If they work with AMD, you get Ashes of the Singularity. If they work with Nvidia, you get Rise of the Tomb Raider.
 
I feel rather irrational about it, but I'm seriously tempted to sell off the 970s in my HTPC and just get a 1060. For what I can realistically sell two 970s for here locally, I could probably buy the 1060 outright.

Would it be worth the raw performance drop to ditch SLI and gain 2.5GB of video memory? The TV is an older plasma (1080p), so I'm not sure how soon the "3.5GB" 970s are going to be inadequate.

I'm not concerned about saving electricity, but lower case temps would also be nice. I feel like I have the upgrade itch and just want to scratch it!
 