Maybe, but the problem is you'll have better-than-2080 Ti performance on a much smaller die. And it's on EUV, so Nvidia can make big dies with defect rates comparable to 14nm on average, much better than TSMC's 7nm, which is the last node possible on DUV and which is now great for small dies but only okay for mid-size ones. It means small and mid-size dies (say up to 250mm²) will cost Nvidia very little to produce compared to what AMD pays now, and in line with the chip sizes on Turing.

On top of a more than 50% performance increase for the same transistor count, Ampere also gets a shrink to roughly a quarter of the die area. So a GPU with the same features as the 2080 Ti would come in under 200mm² on Samsung's 7nm and run 50% faster at the same TDP. Mind that Nvidia has announced it will prefer to cut power consumption rather than keep the full speed benefit, so in practice it may end up around 20-30% faster. And mind that at 200mm² this won't even be the middle of the Ampere lineup; it will be more like the replacement for the GTX 1660 Ti.

AMD on TSMC 7nm uses a 251mm² GPU for its 5700 XT. TSMC's 7nm DUV is not quite as efficient and dense as Samsung's 7nm EUV, but that doesn't explain everything: Nvidia's Turing architecture is simply far ahead of AMD's RDNA 1 used in Navi. AMD needs to show what it can do better with RDNA 2. If it's not much more efficient than RDNA 1, they won't even be close to Ampere.

Knowing Nvidia's typical cadence, it's probably a small die that comes out first. The x80 Ampere will be 10-15% faster than the Titan RTX for $799 FE. The reason the next x80 won't get a 980 Ti-to-1080-style 30% increase from the die shrink is that the 2080 Ti is gigantic: at 754mm² it's 25% bigger than the 980 Ti's 601mm².
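As a quick sanity check on the numbers above, here's the back-of-the-envelope math in Python. The 4x density gain and the 50% per-transistor uplift are this post's speculative assumptions, not figures Nvidia has confirmed:

```python
# Die-area math behind the claims above. The density factor and
# perf uplift are speculative assumptions from this post, not
# confirmed Nvidia numbers.

TU102_MM2 = 754     # 2080 Ti die (TSMC 12nm)
GM200_MM2 = 601     # 980 Ti die (TSMC 28nm)
NAVI10_MM2 = 251    # 5700 XT die (TSMC 7nm DUV)

density_gain = 4.0  # assumed 12nm -> Samsung 7nm EUV area shrink
ampere_2080ti_class = TU102_MM2 / density_gain
print(f"2080 Ti-class Ampere die: ~{ampere_2080ti_class:.0f} mm^2")  # ~189, under 200

area_delta = (TU102_MM2 / GM200_MM2 - 1) * 100
print(f"2080 Ti vs 980 Ti die area: +{area_delta:.0f}%")  # ~25% bigger
```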
You'll have to wait until 2021 for the big Titan/Ti Ampere card.
In fact, Turing is such a superior architecture that Nvidia could afford to use a much older process technology (16nm+) than AMD, and also to devote half of the chip to ray-tracing and AI parts that the raster part doesn't use, knowing that the raster part alone, on only half of the chip, beats AMD's whole chip dedicated to raster on a recent 7nm process. That's Nvidia competing with two handicaps and still beating AMD. And mind that Nvidia makes huge money from its GPU sales, unlike AMD, which uses its CPU success to prop up its GPU business against Nvidia. And Nvidia has now had three free years since launching Turing to prepare an all-new architecture, this time built for a cutting-edge process.
I'm not a big fan of any company, but I've been very impressed by Nvidia lately. I'd look at AMD only if it's cheaper, and AMD knows that.