athenian200 (Gawd) · Joined: Mar 29, 2012 · Messages: 837
My old Z77 board I just upgraded from used to have that technology (Lucidlogix Virtu MVP, if I remember right), and I'm kind of curious what happened to it, since I haven't heard anything about it since. Specifically features like HyperFormance and Virtual V-Sync. I put this in the "Video Cards" section, even though you could argue it's about software and virtualization to some extent (I won't mind if the thread gets moved), because the goal is getting more performance out of graphics hardware.
The way it worked was: you plugged your monitor into the integrated GPU's output, turned the technology on, and it would let the integrated GPU and the dedicated one work together for slightly higher performance in games. I used to occasionally enable HyperFormance to get maybe another 5-10 FPS and smoother framerates. It really only worked well at 1080p, but it was pretty good for that. It didn't always work, but when it did it was a neat trick to have.
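Since the display hung off the iGPU, the dedicated card's finished frames had to be shuffled over to the iGPU for scanout. Here's some rough back-of-the-envelope math (my numbers and assumptions, not anything official) on what that copy traffic looks like at 1080p, which maybe explains why it only really worked well at that resolution:

```python
# Rough estimate of the frame-copy traffic in a Virtu-style hybrid
# setup, where the dGPU renders but the iGPU drives the display.
# Assumptions: 1080p, 32-bit color (RGBA8), 60 FPS target.

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4          # RGBA8
FPS = 60

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
copy_rate_gbs = frame_bytes * FPS / 1e9

# Approximate usable PCIe 3.0 x16 bandwidth (assumed figure)
PCIE3_X16_GBS = 15.75

print(f"One frame: {frame_bytes / 1e6:.1f} MB")
print(f"Copy traffic at {FPS} FPS: {copy_rate_gbs:.2f} GB/s")
print(f"Share of PCIe 3.0 x16: {copy_rate_gbs / PCIE3_X16_GBS:.1%}")
```

So at 1080p/60 the copies only eat a few percent of the bus, but bump the resolution (and the framerates HyperFormance was chasing) and that overhead grows fast, on top of whatever latency the extra hop adds.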
Did that technology pretty much die off? If so, it's kind of a shame, because I'd really like to see what would happen if I paired an older card like a GTX 560 Ti or 670 with a UHD 750 taking some of the load off it. More out of curiosity than because it's practical, but still. Technology that shares the load between a dedicated GPU and an integrated one seems like it would be pretty useful right about now. Or really any technology that can load up the CPU and/or the iGPU to get more performance out of old or low-end graphics hardware paired with newer components.
It seems like nothing really happened there and everything is back to how it was before... basically, everyone accepts that you only get the performance your chosen GPU gives you, plus maybe whatever you can achieve from overclocking, and not many people attempt bizarre or quirky technical tricks like this to try to get more out of it.