1080 ti no doubt.
No Gsync? No problem.
Should have said a dusty two-year-old 1080 Ti with half-worn fans vs. a two-month-old Navi card. Another thing to consider: future driver support for Pascal.
If it were an RTX 2070, then different story.
By the time drivers aren't specifically developed for an architecture, all the available performance has already been squeezed out of it and the notion that newer drivers degrade performance has been debunked many times over. Game-Ready drivers apply to all supported cards.
Nah, they still happen, but a lot of the time it's not advertised because they want people to upgrade. E.g. AMD constantly says X fix is for the 5700 series, but the fix actually affects Vega/Polaris as well and it's never mentioned. Or X performance improvement for the 5700 series in some game, where Polaris and Vega also see a boost in the same game, but it's never mentioned. Nvidia does the same thing until they EOL a card series in their drivers, after about 4-5 years.

It's likely that driver optimization is mostly replacing game shaders with a faster AMD/Nvidia-specific one that "hopefully" produces the same result. The general-purpose optimizations probably happen very early in the life of each architecture, with everything else being game specific.
At some point they will stop doing the game specific stuff for older hardware. Hard to tell if that’s already happened for Pascal/Polaris/Vega.
You realize multi-GPU sucks, right? Microstutter (or worse) is a very common issue with multi-GPU from both vendors, even back when it was better supported.
There's a reason I watercool, BIOS mod and hard mod the best single card rather than run multi-GPU.
I actually really enjoyed PhysX in Borderlands... if only Nvidia didn't hardware-lock GPU PhysX, maybe we'd have seen it more often.
It’s a BS technology that has been superseded by software physics models long ago. That’s why it’s not used any longer. Wavy cloth isn’t hard to do.
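For what it's worth, the "wavy cloth isn't hard to do" claim holds up: the standard software approach is a mass-spring system with Verlet integration and distance constraints, and a toy version fits in a few dozen lines. A minimal sketch (names are illustrative, not from any real engine):

```python
# Minimal "wavy cloth" sketch: a pinned chain of points simulated with
# Verlet integration plus distance constraints -- the basic building block
# of software cloth physics. Names here are illustrative, not from any
# real engine's API.

def step(pos, prev, rest, dt=1/60, gravity=-9.8, damping=0.96, iters=8):
    """Advance a chain of 2D points one timestep; pos[0] stays pinned."""
    # Verlet integration: carry forward damped velocity, add gravity.
    for i in range(1, len(pos)):
        x, y = pos[i]
        px, py = prev[i]
        prev[i] = [x, y]
        pos[i] = [x + (x - px) * damping,
                  y + (y - py) * damping + gravity * dt * dt]
    # Project neighbors back toward their rest distance.
    for _ in range(iters):
        for i in range(len(pos) - 1):
            (ax, ay), (bx, by) = pos[i], pos[i + 1]
            dx, dy = bx - ax, by - ay
            d = (dx * dx + dy * dy) ** 0.5 or 1e-9
            corr = (d - rest) / d * 0.5
            if i == 0:  # pinned end: move only the free neighbor
                pos[1] = [bx - dx * corr * 2, by - dy * corr * 2]
            else:
                pos[i] = [ax + dx * corr, ay + dy * corr]
                pos[i + 1] = [bx - dx * corr, by - dy * corr]

# A five-point rope hanging from the origin, started out horizontal.
n, rest = 5, 0.2
pos = [[i * rest, 0.0] for i in range(n)]
prev = [p[:] for p in pos]
for _ in range(300):  # ~five seconds: the rope swings down and settles
    step(pos, prev, rest)
```

A real cloth is the same idea on a 2D grid of points with structural, shear and bend constraints, which is why CPU-side physics caught up and GPU PhysX effects became optional eye candy.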
Have an RTX 2080 Super, so the 5700 XT, as long as it works. My last one did not, so I went with the 2080 Super. I really wanted the 5700 XT to work so I could say I had a PCIe 4.0 card to go with my X570 board.
That’s a strange reason considering there’s no perceivable advantage to using PCIe 4.0.
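For context on that claim: the raw per-direction bandwidth of an x16 link is easy to compute from the per-lane transfer rate (the rates and 128b/130b encoding below are from the PCIe spec; the helper function name is mine):

```python
# Rough per-direction bandwidth of a PCIe x16 link from per-lane GT/s.
# Transfer rates (8 GT/s for 3.0, 16 GT/s for 4.0) and the 128b/130b
# line encoding are standard PCIe figures; the function name is made up.
ENCODING = 128 / 130  # 128b/130b encoding used by PCIe 3.0 and later

def x16_bandwidth_gbs(gt_per_s: float) -> float:
    """Per-lane transfer rate (GT/s) -> x16 link bandwidth in GB/s."""
    usable_bits = gt_per_s * 1e9 * ENCODING  # usable bits/lane/second
    return usable_bits / 8 * 16 / 1e9        # to bytes, 16 lanes, to GB

print(f"PCIe 3.0 x16: {x16_bandwidth_gbs(8):.1f} GB/s")   # ~15.8 GB/s
print(f"PCIe 4.0 x16: {x16_bandwidth_gbs(16):.1f} GB/s")  # ~31.5 GB/s
```

4.0 doubles the headroom on paper, but games rarely saturate even a 3.0 x16 link, which is why the difference doesn't show up in frame rates.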
Well, you are talking to a person with a 3900X and a 2080 Super who only plays games once in a while and mostly uses the system to surf the web. So I bought the card to have something new and to give AMD a shot again. And I had just gone from a B450 board to an X570 board and figured, why not try this new card, the only PCIe 4.0 one out right now.
Sadly I could not get the drivers to install, and I tried for two days. I even reset my BIOS to defaults in case a setting was screwing it up. The drivers would just lock the system and reboot it after a while, never installing. I ran every file-cleaning trick I could think of to make sure it wasn't leftover Nvidia drivers. The fans on the card were spinning like crazy, and finally I pulled it out and returned it. Got the RTX 2080 Super instead.
I feel for AMD, being, as they are, the underdog. So it always makes sense for them to try to undercut the competition by driving/leading open standards, hoping to win by simply being first.
But, as Steve Jobs pointed out, being first is irrelevant: you have to be the first thing that actually COUNTS.
Luckily, for AMD, their CPUs are certainly doing exactly that right now.
p.s. Forget Intel driving GPU prices down; if anything, Intel would like it the other way around...
p.p.s. Intel won't open with a 'value' discrete GPU, I practically guarantee it. No one at Intel gives an actual fuck about enthusiast desktop graphics, most notably Raja...
That’s astounding you know exactly what nine out of ten people need without knowing their hardware or use case.