Trepidati0n
[H]F Junkie
Joined: Oct 26, 2004
Messages: 9,269
This article must have taken a cubic shit ton of time to both gather the data and make it all purty for us. I think it answers a lot of questions but also opens some others. Thanks again.
Thank you for taking the extra time to ensure you got things right.
This really raises the question, though: do we need a faster CrossFire / SLI bridge technology, or how else can we make multi-GPU solutions faster?
Change the whole way multi-GPU is currently done. Though AFR is the most efficient method right now, it's entirely inefficient in the grand scheme of things; the framebuffers, for one, are not shared. If you want to make multi-GPU better, figure out a way to combine both video cards' framebuffers into one usable, combined virtual memory space and negate the need to copy framebuffers between cards.
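Rough numbers help here. Under AFR, each card renders alternate frames, but every frame finished on the secondary card has to be shipped to the card driving the display. A back-of-the-envelope Python sketch (the resolution, frame rate, and pixel size are illustrative assumptions, not figures from the article):

```python
# Back-of-the-envelope: bridge/bus traffic generated by AFR, where the
# secondary GPU's finished frames must be copied to the display GPU.

def afr_copy_bandwidth(width, height, fps, bytes_per_pixel=4):
    """Bytes/sec the secondary card must ship over, assuming it
    renders every other frame (i.e. fps / 2 of them)."""
    frame_bytes = width * height * bytes_per_pixel
    return frame_bytes * (fps / 2)

bw = afr_copy_bandwidth(2560, 1600, 60)
print(f"{bw / 1e9:.2f} GB/s")  # ~0.49 GB/s at 2560x1600 @ 60 fps
```

Half a GB/s is tiny next to a x16 link's bandwidth, which fits the article's finding that link speed rarely matters for AFR; a truly shared framebuffer would eliminate even that copy.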
WOW!
Kyle's a fucking asshole, and the performance difference between PCI-E modes is little more than the IPC difference, and pretty much no net difference for AMD cards.
Why isn't Captain Rectal Headgear chiding Nvidia about a clear CPU dependence? Pretty obvious to me that NV cards are leaning on the CPU for something that ATI covers on-card.
Go ahead and ban me. You're a biased asshole. Oh, and you act completely unprofessionally, given that you have official capacity here.
Clean the green jizz off your chin.

wtf is this rant about?
Just a thought, but does PCIe 3 affect level loading times? I'm guessing but maybe at the start of each level the game loads as many textures as possible into the video memory, and PCIe 3 might make that much faster.
didn't notice any differences
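For a sense of scale on the level-loading question above, here's a quick sketch (the 2 GB texture payload is just an assumed figure for illustration; link speeds are theoretical one-way maxima for x16 slots):

```python
# How long would a bulk texture upload take over various PCIe x16 links?
LINK_GBPS = {
    "PCIe 1.1 x16": 4.0,
    "PCIe 2.0 x16": 8.0,
    "PCIe 3.0 x16": 15.75,  # 8 GT/s * 16 lanes * 128b/130b encoding
}

def upload_seconds(payload_gb, link_gbps):
    """Seconds to push payload_gb over a link at link_gbps GB/s."""
    return payload_gb / link_gbps

for name, bw in LINK_GBPS.items():
    print(f"{name}: {upload_seconds(2.0, bw):.2f} s for 2 GB of textures")
```

Even over a PCIe 2.0 link, 2 GB of textures moves in about a quarter of a second, so most level-load time is likely dominated by disk reads and decompression, which would explain not noticing any difference.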
Without calling names, it's obvious you have an AMD bias here, which is not appreciated and certainly won't garner you any respect.
To add to that, consider that the game typically has to load stuff into main memory before piping it to the GPU(s). Main memory is considerably slower than any PCIe 16x interface.
You sure about that? PCIe 3.0 x16 is 16 GB/sec one-way, while main memory (on Sandy, YMMV) is 20+ GB/sec depending on speed. I guess it depends how you define it, but main memory isn't considerably slower, I don't think.
Thing is, it doesn't matter: even if the Nvidia cards are a little more CPU-dependent, we have such a relative excess of CPU performance that their CPU dependence just won't make a difference.
Great review! This reminds me of the old days of DDR speeds and everyone pushing the highest speeds with little to no real-world performance difference.
I'm also happy to hear this, since I was considering a mobo/CPU upgrade along with a new higher-end video card, but it really won't make enough of a difference to justify the money spent.
Thanks again!