I usually use the TechPowerUp GPU database when comparing GPUs, primarily looking at pixel rate, since I'm not using them for gaming or anything that makes heavy use of the texture rate (unless browsers are getting smart about that now).
There's also a limit to driver support for my application beyond a certain generation of cards, so I was treating that as a ceiling on which cards I could use, until it dawned on me: do drivers really matter when the raw pixel rate is just super fast anyway?
Here's an example:
GTX 770 - 34.72 GPixel/s and has driver support in my application:
https://www.techpowerup.com/gpu-specs/geforce-gtx-770.c1856
vs
GTX 1060 3GB - 81.98 GPixel/s but no driver support:
https://www.techpowerup.com/gpu-specs/geforce-gtx-1060-3-gb.c2867
Between these two cards in the same hardware setup, my instinct says the 1060 would still be much faster, even without drivers, for drawing general (non-gaming) stuff on the screen. Or am I way off base here? Thoughts?
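For reference, here's a quick sanity check of the on-paper gap, using the TechPowerUp figures quoted above. Note this is peak theoretical throughput only; whether a card without a proper driver (falling back to a generic display path) gets anywhere near its rated fill rate is exactly the open question here.

```python
# Theoretical fill-rate comparison using the TechPowerUp spec numbers above.
# Peak hardware figures only; a card running on a generic fallback driver
# may perform nowhere near its rated throughput.
gtx_770_gpixels = 34.72    # GTX 770 peak pixel rate (GPixel/s)
gtx_1060_gpixels = 81.98   # GTX 1060 3GB peak pixel rate (GPixel/s)

ratio = gtx_1060_gpixels / gtx_770_gpixels
print(f"Theoretical fill-rate advantage: {ratio:.2f}x")  # ~2.36x
```

So on paper the 1060 has roughly a 2.4x fill-rate advantage, but that number assumes the hardware is actually being driven at full speed.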