Well. Now that you can use a FreeSync monitor with Nvidia GPUs pretty much just fine, there's no longer a 'G-Sync tax' on monitors. At $699, the Radeon VII and the RTX 2080 cost the same and perform about the same, so the question now is 16GB of memory vs 8GB plus ray tracing and DLSS. Seems to me that AMD was forced to go with HBM2 memory again because of R&D costs already sunk into Vega: it was easier for them to release a Vega 2.0 than to redo the memory interface to support cheaper GDDR6. I still question the amount of memory though; 16GB is really only useful for professional applications, AI, data center usage, etc. I feel a cut-down 12GB version with ~768 GB/s of memory bandwidth (three HBM2 stacks instead of four) for, say, $150 or $200 less would sell very well and see pretty much no performance dropoff compared to the 16GB, 1 TB/s version.
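The capacity/bandwidth math behind that cut-down idea can be sketched quickly. This assumes the commonly cited Radeon VII configuration of four 4GB HBM2 stacks at roughly 256 GB/s per stack; dropping one stack scales both capacity and bandwidth by 3/4:

```python
# Back-of-envelope HBM2 scaling (assumed per-stack figures:
# 4 GB capacity and ~256 GB/s of bandwidth per stack).
GB_PER_STACK = 4
BW_PER_STACK_GBPS = 256

def hbm2_config(stacks):
    """Return (capacity in GB, bandwidth in GB/s) for a given stack count."""
    return stacks * GB_PER_STACK, stacks * BW_PER_STACK_GBPS

full = hbm2_config(4)  # shipping Radeon VII: 16 GB, ~1 TB/s
cut = hbm2_config(3)   # hypothetical cut-down card: 12 GB, 768 GB/s
print(full, cut)       # -> (16, 1024) (12, 768)
```

Since HBM2 bandwidth comes from the stack count, a three-stack card keeps 75% of the bandwidth along with 75% of the capacity, which is why the performance dropoff at gaming resolutions would likely be small.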