Been using a 4K 40" display for some time with a custom widescreen res, 3840x1620. That's about 25% fewer pixels than full UHD, so the performance gain vs UHD is useful. Problem is, after extended use (about a year), there is slight burn-in at the edges of the black bars.
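Quick sanity check on the pixel math (a rough Python sketch; the two resolutions are the ones mentioned above):

```python
# Pixel counts: full UHD vs the custom 3840x1620 widescreen res
uhd = 3840 * 2160          # 8,294,400 pixels
custom = 3840 * 1620       # 6,220,800 pixels

saving = 1 - custom / uhd  # fraction of pixels the GPU no longer renders
print(f"{saving:.0%} fewer pixels than UHD")  # → 25% fewer pixels than UHD
```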
If AMD raster = Nvidia, if power draw is roughly the same, if price is within $50 of Nvidia, and if RT is better on Nvidia, I think the majority of buyers will go Nvidia. Reviews will show RT performance in games and that will be a determining factor. People are swayed by benchmarks. +/- $50 won't make much difference...
More important to me is 4K raster performance. RT is not important for me at this stage, though with next-gen cards (Hopper, RDNA 3) things should look more interesting, with more RT titles and more capable HW to run RT @ 4K.
Techpowerup, because they test the most games and, of course, for their relative performance charts.
Guru3d, since I've been a member there a long time, and the reviewer (Hilbert) uses a FLIR thermal imaging camera for temps, which can detect weaknesses in GPU cooling designs.
Anandtech but they test...
RGB Full. Even though that's 4:4:4, it will be 8-bit, since you need HDMI 2.1 to get above that. Still better than the other options at this stage, until HDMI 2.1 arrives.
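The bandwidth reasoning behind the 8-bit limit can be sketched roughly like this (assumptions on my part: the standard 594 MHz pixel clock for 4K60 and HDMI 2.0's ~14.4 Gbit/s effective rate after 8b/10b encoding; numbers are ballpark, not from the spec text itself):

```python
# Rough HDMI 2.0 bandwidth check for 4K60 RGB (4:4:4)
PIXEL_CLOCK = 594e6        # Hz, common CTA-861 timing for 3840x2160 @ 60 Hz
HDMI20_EFFECTIVE = 14.4e9  # bit/s usable on HDMI 2.0 (18 Gbit/s raw, 8b/10b)

def link_rate(bits_per_channel):
    # RGB = 3 channels per pixel
    return PIXEL_CLOCK * bits_per_channel * 3

print(link_rate(8) <= HDMI20_EFFECTIVE)   # True  - 8-bit 4:4:4 just fits
print(link_rate(10) <= HDMI20_EFFECTIVE)  # False - 10-bit 4:4:4 does not
```

Which is why 10-bit over HDMI 2.0 forces chroma subsampling (4:2:2/4:2:0), and why RGB Full stays at 8-bit until HDMI 2.1.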
Two subcontractors responsible for the Founders Edition manufacturing are now under investigation: Foxconn and BYD (Build Your Dreams). It is also said that even NVIDIA product and sales managers have not seen the design yet, which may or may not be true, but if it is then it just proves...
Some speculation that there will not be a 3080 Ti (initially, at least), and that the 3080 will be the top card, based on a GA102 chip. Sounds logical to me, since Nvidia has always done that with the Ti since Kepler (releasing it several months after a xx80). Only Turing bucked the trend.
Never had a card that bothered me noise-wise over the last decade. There may be a few hot and loud cards this may apply to, but I doubt the 2070 is one of them.
The best way to resolve your issues is to have a secondary display, preferably 1080p since that is a standard res compatible with AV receivers. Plug your GPU directly into your main monitor (with DP), then plug the HDMI from the GPU directly into the receiver, which is connected to your second display. The...
This was a unique case involving an inferior architecture, Kepler, which had not kept up with the times. To use it as a general, all-encompassing example for your argument is misleading, really. Try comparing other Nvidia vs AMD cards, i.e., 980 Ti vs Fury X or GTX 1080 vs Vega 64 (release date...
Nvidia brightness, contrast and gamma settings do not affect the actual monitor hardware. Meaning if you have the monitor set to full brightness and it's drawing 80w, but you reduce brightness from the Nvidia controls, it will still draw 80w. Adjusting these settings directly from the monitor is...
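A minimal sketch of why driver-side brightness doesn't touch the panel: the driver just remaps output pixel values (effectively a lookup table), so a darker image is sent while the backlight keeps drawing the same power. The function below is hypothetical, purely to illustrate the idea:

```python
# Hypothetical driver-style brightness: a simple LUT remap of 8-bit values.
# The backlight is untouched; only the pixel data sent to the panel changes.
def brightness_lut(scale):
    return [min(255, round(v * scale)) for v in range(256)]

lut = brightness_lut(0.8)  # "80% brightness" in the driver
print(lut[255])            # → 204: peak white is dimmed in the signal only
# The monitor's backlight still runs at full power, and you compress the
# tonal range, vs actually turning the monitor's own brightness down.
```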