P67 vs Z68

Skulltrail

Is the Z68 really worth it? I don't think I'll be using an SSD anytime soon, so I won't benefit from SSD caching. I don't convert videos much and am okay with waiting a little longer when I do, so the Quick Sync feature seems useless. And from what I've read about transcoding and hybrid IGP+GPU gaming, Virtu doesn't do much to increase FPS in-game.

What else is there to look forward to with Z68? The use of the Intel HD 3000? I really don't see how spending $50 more than a P67 board just to get these crappy features is worth it. Or am I misinformed?
 
Virtu doesn't increase gaming performance at all. It hurts it in certain configurations. It sounds like you've pretty much answered your own question. P67 sounds like the way to go for you.
 
The point of Virtu was never to increase FPS. In fact, it decreases FPS because of the latency between the GPU and the IGP.

The point of Virtu is to use the IGP for regular desktop applications, shutting off the GPU and saving electricity. When you start something that demands the power of the GPU, the GPU (in theory) seamlessly turns on, renders everything, and transmits the finished frames to the IGP, which then outputs them to your monitor. That hand-off is where the decrease in FPS comes in: each frame has to be transmitted to the IGP before it can go out to the display, instead of being output directly. I would say it's not really worth it if you have a single mid-range GPU, but it may be worth it for high-end/multi-GPU setups.
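
If it helps to picture the hit, here's a toy calculation of my own (made-up numbers, nothing measured from Virtu) showing why that extra copy to the IGP framebuffer costs a few FPS:

Code:
# Toy numbers only -- why the extra PCIe copy to the IGP framebuffer
# shaves off FPS compared to the dGPU outputting directly.

def fps_direct(render_ms):
    # dGPU renders the frame and scans it out straight to the display
    return 1000.0 / render_ms

def fps_via_igp(render_ms, copy_ms):
    # dGPU renders, then the finished frame is copied over PCIe into the
    # IGP's framebuffer before the IGP can output it
    return 1000.0 / (render_ms + copy_ms)

render_ms = 14.0  # assumed render time per frame, ms
copy_ms = 2.0     # assumed copy/sync overhead per frame, ms

print(f"direct output: {fps_direct(render_ms):.0f} FPS")
print(f"through IGP:   {fps_via_igp(render_ms, copy_ms):.0f} FPS")

With those made-up numbers you drop from roughly 71 FPS to about 63. Not huge, but it's pure overhead with no benefit while you're gaming.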

As for everything else Z68 offers, it seems like you don't need it.
 
Not even for high-end/multi-GPU setups. The software can't power down the GPUs, and even high-end GPUs idle at low power, so even under the best of circumstances the power savings are almost nothing: 20-30 watts at best, assuming the thing worked, which it doesn't.
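
Just to put a number on it, a quick back-of-envelope (my own assumptions: the box idles 8 hours a day and electricity runs about $0.12/kWh):

Code:
# Rough estimate of what a 25 W idle saving would actually be worth per year.
# All inputs are my own assumptions, not measurements.
watts_saved = 25         # midpoint of the 20-30 W best-case figure
idle_hours_per_day = 8   # assumed
price_per_kwh = 0.12     # assumed, USD

kwh_per_year = watts_saved * idle_hours_per_day * 365 / 1000.0
print(f"{kwh_per_year:.0f} kWh/year, roughly ${kwh_per_year * price_per_kwh:.2f}/year")

That works out to about 73 kWh, under $10 a year, and that's assuming the feature actually delivered the savings.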
 
They need to find a way to completely cut power to the discrete GPU until a game is launched. This Virtu feature is only halfway there lol. Couldn't you just connect the GPU to the system by USB so that software could switch it on/off on demand? I don't understand why they haven't developed something like that yet, considering how ridiculously power hungry GPUs have become even at idle.
 
20 watts is power hungry at idle? Go back only about two generations and make that comparison; you'll find that GPUs are using less and less power at idle all the time. USB wouldn't work for anything GPU related. Too slow, too much latency. What they need to do is add the technology to actually disable and power down PCI-Express slots, as they can in some servers. The only problem at that point will be the need to have drivers loaded and available without re-detecting the device and reloading drivers, which normally requires a reboot.
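
For what it's worth, Linux already has the logical half of this: you can remove a PCI device and rescan the bus from software without rebooting. Rough sketch below using the sysfs PCI interface (the device address is just an example, and note this only detaches the device from the OS; actually cutting power to the slot still needs the platform support I mentioned):

Code:
# Sketch: logically detach and re-detect a PCIe device via Linux sysfs (needs root).
# This unbinds the driver and drops the device without a reboot, but it does
# NOT cut power to the slot -- that part still needs platform/BIOS support.
import time

GPU_ADDR = "0000:01:00.0"  # example PCIe address of a discrete GPU; yours will differ

def detach(addr):
    # kernel removes the device and unbinds its driver
    with open(f"/sys/bus/pci/devices/{addr}/remove", "w") as f:
        f.write("1")

def rescan():
    # kernel rescans the bus; the device and driver come back without a reboot
    with open("/sys/bus/pci/rescan", "w") as f:
        f.write("1")

detach(GPU_ADDR)
time.sleep(5)
rescan()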
 
Wasn't nVidia able to develop something that did exactly that, though? Hybrid SLI, specifically the part called Hybrid Power as I recall. The graphics card was essentially shut off; the fan wasn't even spinning, from the reports I heard. Power consumption was only about 3 watts higher than it would have been without the graphics card at stock. Unfortunately, they discontinued it with the GTX 275/285 generation and drivers above 180. Does nVidia's Synergy work better than Virtu, perhaps in a similar way that Hybrid Power worked in the past?
 
It is my understanding that Hybrid SLI was problematic. I've never tried it so I can't say much about it. As for Synergy, I've not tried that either. Every Z68 board I've seen thus far used the Lucid solution rather than NVIDIA / AMD.
 
Computex came and went and I didn't see any banter or news about Nvidia Synergy. I thought that was strange because there was some talk about it beforehand.
 
Does nVidia's Synergy work better than Virtu, perhaps in a similar way that Hybrid Power worked in the past?
I was trying to find answers, and at first I thought it was just like Virtu after reading some comments about Nvidia Optimus. However, after searching again I found the 'Optimus Whitepaper' (PDF document) on Nvidia's technologies webpage. I only skimmed the document, but from page 11 onwards there are several references to the GPU being shut off completely, and the power usage comparison shows idle power consumption with the GPU powered off as being only a fraction more than idle power consumption without the GPU. Synergy is alternatively called Optimus for Desktop, so if the desktop version is the same as the notebook solution, it may be better than I previously thought.

EDIT: actually there are conflicting statements in the paper; on page 11 it says both of these:
If the application can benefit from running on the GPU, the GPU is powered up from an idle state and is given all rendering calls.
When less critical or less demanding applications are run, the discrete GPU is powered off and the Intel IGP handles both rendering and display...
page 14:
When using non-taxing applications to accomplish basic tasks, like checking email or creating a document, Optimus recognizes that the workload does not require the power of the GPU. As a result, Optimus completely shuts off the GPU (and associated PCIe lanes) to provide the highest possible efficiency and battery life.
AMD also has an equivalent system to do the same thing. I'm not aware of a name for it yet, but it should be announced soon.
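
The way I read the whitepaper, the decision is basically profile-driven: demanding apps wake the dGPU, everything else stays on the IGP with the dGPU powered off. Something conceptually like this (a made-up sketch on my part, not Nvidia's actual code or profile format):

Code:
# Made-up sketch of profile-style routing as the whitepaper describes it.
GPU_PROFILES = {"game.exe", "encoder.exe"}  # hypothetical list of demanding apps

def choose_renderer(app_name, gpu_powered):
    if app_name in GPU_PROFILES:
        if not gpu_powered:
            print("powering up dGPU")    # bring the GPU out of its off state
        return "dGPU", True
    if gpu_powered:
        print("powering down dGPU")      # nothing needs it, shut it back off
    return "IGP", False

powered = False
for app in ["email.exe", "game.exe", "notepad.exe"]:
    renderer, powered = choose_renderer(app, powered)
    print(f"{app} -> rendered on {renderer}")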
 