ASUS P8Z68-V Pro Z68 Chipset Motherboard Review @ [H]

I dunno Dan, I think they tested both modes:

modes.png


Take a close look at that chart. One line has "Native HD Graphics, Virtualize Discrete Graphics" (I MODE) while the other has "Native Discrete Graphics, Virtualize HD Graphics" (D MODE). Both have (roughly) the same power consumption at idle.

It's possible that the Virtu software isn't capable of deactivating all GPUs. It could simply be that what we are seeing is a lack of maturity in the software and the technology in general. I don't have an answer for that right now.
 
I see.
I guess we'll have to wait for proper UD7 testing. It'd be a shame if it weren't possible. The GPU is there and the chipset supports it, and we've seen the outputs don't need to be used.
The UD7 being the high-end board, I find it surprising that they would prevent Quick Sync usage just to cut software costs.
 
re: Transcoding

What about this scenario: I have a 30" monitor that runs at 2560x1600, so using the 'I' mode doesn't sound like an option since the board's DVI output is single-link. I do however want to take advantage of Virtu for transcoding video. Will Virtu work in 'D' mode? I thought it would, but now I'm not so sure.

If the onboard video must be in use to get the advantages of Virtu, then what if I run a second lower-resolution monitor off the onboard video and use the discrete card for my 30"?

Finally, if I was considering an i7 2600k for the added processing power to transcode and now it's handled by the motherboard, would it make more sense to go with the i5 2500k?
 
I thought that having the Lucid chip enabled would increase overall performance, though. Wonder if some BIOS or driver updates will fix that?
It only increases performance for media encoding in terms of allowing Quick Sync.

Even if you use Virtu in "D" mode?
Could you install the Virtu app on a headless Z68 motherboard?
Asus will allow this for some of their upcoming boards. Gigabyte doesn't.

Why would you need or want them to be headless? That's what I don't get. Sure if you had a farm for them or something, but I don't see a reason for anyone to have a transcoding farm in their house. Maybe I'm missing something.
Cheaper production costs, saving space on the back plate for other connectors, etc.
Most enthusiasts probably won't use the motherboard's graphics connectors anyway.

I think the 10% performance drop you noticed was probably the result of using such a high-performance graphics card. My suspicion is that the data channel through the motherboard practically throttled the throughput.
With a less potent graphics card the performance impact should likewise be far less. A drop from a 40fps average to a 39fps average seems more like the results I've seen elsewhere.
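For scale, here's the quick arithmetic behind those figures (a sketch in Python; the 40fps/39fps pair is the result reported elsewhere, and 10% is the drop measured in the article):

# Relative drop between two average framerates
def pct_drop(before_fps, after_fps):
    return (before_fps - after_fps) / before_fps * 100

print(pct_drop(40, 39))  # 2.5 -- the drop reported elsewhere
print(pct_drop(40, 36))  # 10.0 -- a 10% drop would be 40fps falling to 36fps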
 
Hot stuff! Looks like a nice board!

Edit: A question: Can you use both the integrated and discrete graphics at the same time for two different monitors? Right now my 30" is on my 6850 and a 16" is on the integrated graphics on my mobo. Can I do that with the Z68, or is it one or the other?


Not only will you be able to connect a single monitor to your system and take advantage of both the integrated and discrete graphics, but you will also be able to utilize both sets of display outputs at the same time so you can connect quite a few monitors to your computer based on the Z68 platform. Above you can see we are using one DVI connection on the discrete NVIDIA graphics card and the DVI output on our ASUS P8Z68-V Pro motherboard as well.

http://pcper.com/reviews/Motherboar...iGPU-living-together-SSD-Caching-and-Overcloc
 
I run a 3-monitor setup now with a 5850 (DVI/DVI/DisplayPort). So if I get this Z68 board, I can attach the DisplayPort monitor to the DVI port on the mobo and have it work fine?

Will it work fine in an Eyefinity setup for gaming?
 
re: Transcoding

What about this scenario: I have a 30" monitor that runs at 2560x1600, so using the 'I' mode doesn't sound like an option since the board's DVI output is single-link. I do however want to take advantage of Virtu for transcoding video. Will Virtu work in 'D' mode? I thought it would, but now I'm not so sure.

If the onboard video must be in use to get the advantages of Virtu, then what if I run a second lower-resolution monitor off the onboard video and use the discrete card for my 30"?

Finally, if I was considering an i7 2600k for the added processing power to transcode and now it's handled by the motherboard, would it make more sense to go with the i5 2500k?

Transcoding / Quick Sync works in both "I" and "D" modes. You can connect your monitor to whichever port you like. In your case you must use "D" mode. However, with "D" mode you do not have any chance of taking advantage of "potential" power savings by turning your discrete GPU off when you don't need it. As for which CPU to go with, I'd choose the 2600K. That's just me.

It only increases performance for media encoding in terms of allowing Quick Sync.

Asus will allow this for some of their upcoming boards. Gigabyte doesn't.

Cheaper production costs, saving space on the back plate for other connectors, etc.
Most enthusiasts probably won't use the motherboard's graphics connectors anyway.

I think the 10% performance drop you noticed was probably the result of using such a high-performance graphics card. My suspicion is that the data channel through the motherboard practically throttled the throughput.
With a less potent graphics card the performance impact should likewise be far less. A drop from a 40fps average to a 39fps average seems more like the results I've seen elsewhere.

I think you are confused. When I say headless, I mean monitor-less. This is what I assumed the other poster meant initially, which is why I was confused. The term "headless" has always referred to a lack of connected monitor. However, that poster was asking about using the transcoding / Quick Sync features on Z68 boards that do not have iGPU connectivity. As far as I know, you can't.

I'm well aware of why a motherboard manufacturer would want to leave out iGPU connectivity. The Z68X-UD7-B3 is an example of a board targeted towards people that won't be interested in using the iGPU features. As for the performance impact, well it did the same thing with the GeForce GTX 560Ti and the 6950. So I don't know how low you'd have to go but as I pointed out in the article, this performance hit was in the one game alone. There isn't a performance hit when using the "D" mode. Keep that in mind.

I run a 3 monitor setup now with a 5850 (dvi/dvi/displayport). So if I get this z68 board, I can attach the displayport monitor to the dvi port on the mobo and have it work fine?

Will it work fine in an eyefinity setup for gaming?

I didn't try that exact thing, but honestly, I'd doubt it. What you really need to use is "D" mode and then you'd be able to use your setup like you do now, but you'd also be able to use the Quick Sync / Transcoding feature as well.
 
I understand some of you think we did not do our jobs (as with every single other article we have ever published). The fact of the matter is that the technology and support is immature and we are not going to spend a lot more time on it right now till it gets a bit more stable.
 
So of these many board manufacturers, which are any good? I know ASUS kicks ass, and I'm not personally crazy about Gigabyte, but how is MSI or ASRock?
 
So of these many board manufacturers, which are any good? I know ASUS kicks ass, and I'm not personally crazy about Gigabyte, but how is MSI or ASRock?

ASRock is ASUS. That's just their lower-tier offerings. I've found them to be OK, but they lack the polish of ASUS' branded boards. Kind of like Honda is to Acura. Gigabyte is great and MSI is fine. I've experienced those two as slightly quirkier than some of ASUS' offerings at times, but not recently. However, ASUS has by far the best UEFI BIOS implementation of the three. Gigabyte's is non-existent; I don't think they are shipping their boards with UEFI yet. MSI's is not only badly laid out, but I've also had some issues with flickering and dialog boxes not clearing / screen refresh problems.
 
ASRock is ASUS. That's just their lower-tier offerings. I've found them to be OK, but they lack the polish of ASUS' branded boards. Kind of like Honda is to Acura. Gigabyte is great and MSI is fine. I've experienced those two as slightly quirkier than some of ASUS' offerings at times, but not recently. However, ASUS has by far the best UEFI BIOS implementation of the three. Gigabyte's is non-existent; I don't think they are shipping their boards with UEFI yet. MSI's is not only badly laid out, but I've also had some issues with flickering and dialog boxes not clearing / screen refresh problems.


Yeah, that's kinda what I assumed. I guess ASUS is it for me. Didn't know that about ASRock though. Thank you for the speedy reply, and thanks for the awesome article too by the way. Keep up the good work.
 
Gigabyte's TouchBIOS is on the Z68 boards, which is their version of EFI. Asus' implementation looks like it will still be better though, at least in how it's laid out if not in actual functionality.
 
Kyle:
Could you ask AMD if they have plans to implement their own take on switchable graphics for people who have AMD chipsets and AMD GPUs?

Considering they know the inner workings of their GPUs better than Lucid I would expect it to work a lot better.

My poor iGPU just sits there doing nothing...
 
Thanks for the review. Depending on factors, Z68 is probably the platform I'll end up building on ("factors" being when X79 comes out, and how much it costs).
 
I hate to grammar nazi this, but good lord! It's "discrete" not "discreet".:mad:
 
Seems the Lucid Virtu still needs to be ironed out a bit. It's an awesome feature nonetheless.
 
I hate to grammar nazi this, but good lord! It's "discrete" not "discreet".:mad:
I dunno, you could say it's being discreet behind the iGPU with Virtu :D. But seriously, you're not the only one. I was fine with the mistake on page 1, but then it bugged me when I saw it repeated.
 
Found some typos... I think. The whole paragraph looks messed up to me, but I decided to only include this one part. :)

Page 5:

"We have included WinRAR, a very popular zipping used by many when sending files or posting for download. We have also included the synthetic Cinebench 11.5 as well that should give you an idea about how video production programs will handle rendering scenes."
 
ASUS chose PCIe 2.0 vs PCIe 2.1 for this also. Most current cards support 2.1, but I can't find any new boards with it; they're all still shipping 2.0. When will we see boards with 2.1?

Also, how is ASUS' new ASMedia (a subsidiary owned by ASUS) USB 3.0 controller working out versus the NEC USB 3.0 controller most boards have on them?
The P8P67 Pro revision 3.0 uses NEC; the recently updated revision 3.1 uses the new ASMedia USB 3.0 controller. This new board looks like it's using 2x ASMedia controllers for USB 3.0. Any chance to test and compare them? I hear the new one actually gets close to real USB3 speeds versus the buggy, slower NEC controller.

Is this controller better or worse than the NEC USB 3.0 controller, which has QA/driver issues?
 
ASUS chose PCIe 2.0 vs PCIe 2.1 for this also. Most current cards support 2.1, but I can't find any new boards with it; they're all still shipping 2.0.

Also, does this USB 3.0 use ASUS' new controller or the NEC controller? The NEC controller has QA / driver issues, and I want to know if all that's fixed with their new one.

If you look at the P8P67 Pro, they have revision 3.0 and now revision 3.1, where the change is the USB 3.0 controller. I couldn't find spec details on which USB 3.0 controller was used here.
Anyone know?

RTFA. :D This question is answered in the "Detailed Motherboard Specifications" table. ASUS is now using the ASMedia controllers. These do in fact provide better performance than the NEC controllers they used to use.
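To put "close to real USB3 speeds" in perspective, here's the back-of-the-envelope ceiling (my own arithmetic, not a figure from the review):

# USB 3.0 SuperSpeed signals at 5 Gbit/s using 8b/10b encoding,
# so only 8 of every 10 bits on the wire are payload.
signal_gbps = 5.0
payload_gbps = signal_gbps * 8 / 10    # 4.0 Gbit/s of actual data
ceiling_mbs = payload_gbps * 1000 / 8  # ~500 MB/s before protocol overhead
print(ceiling_mbs)                     # 500.0 -- real drives and controllers land well below this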
 
ASUS chose PCIe 2.0 vs PCIe 2.1 for this also. Most current cards support 2.1, but I can't find any new boards with it; they're all still shipping 2.0.

Also, does this USB 3.0 use ASUS' new controller or the NEC controller? The NEC controller has QA / driver issues, and I want to know if all that's fixed with their new one.

If you look at the P8P67 Pro, they have revision 3.0 and now revision 3.1, where the change is the USB 3.0 controller. I couldn't find spec details on which USB 3.0 controller was used here.
Anyone know?

I know the P8P67-M Pro uses some ASMedia USB3 ASIC; however, the driver site includes a download for the NEC USB3 chip too... so maybe some boards have the other?

As for PCIe 2.0 vs 2.1, that's a choice Intel and AMD (chipsets, not GPUs) made. No PCIe 2.1 or PCIe 3.0 support until the next generation of chipsets/platforms.
 
The only problems with the technology are that with "D" mode you don't get the power savings, and with "I" mode you lose a slight amount of performance in games.

I think those are rather large problems considering 10% is more than a "slight" drop and typically represents a full drop-down in video card class. Kind of confused about the excitement. It all seems like a great proof of concept but kind of awkward in reality. Combine that with the awkwardness and unpredictability of the Smart Response caching and it's like the tagline for Z68 should be "Hey, if you want to go half-ass with SSD or multi-GPU, Z68 is your option!"
 
I think those are rather large problems considering 10% is more than a "slight" drop and typically represents a full drop-down in video card class. Kind of confused about the excitement. It all seems like a great proof of concept but kind of awkward in reality. Combine that with the awkwardness and unpredictability of the Smart Response caching and it's like the tagline for Z68 should be "Hey, if you want to go half-ass with SSD or multi-GPU, Z68 is your option!"

To be fair, that was one canned benchmark test. It doesn't represent what kind of performance you could lose in all games. You might lose next to none in some games, and more in others. This is an issue that needs to be explored more fully. That however is really a job for Brent / Mark, not I. It was a test I saw a change in that I felt represented what can happen when using it. And again, the "D" mode gives you Quick Sync support and you lose no performance. What's not to like about that? The SSD caching works, and when you are caching your OS volume I could see it being a major advantage. It's not quite as good as a real SSD, but it really can give you a middle ground between performance and storage capacity.

These features aren't for everyone, but you get everything P67 and H67 have to offer and more in one package.
 
To be fair, that was one canned benchmark test. It doesn't represent what kind of performance you could lose in all games. You might lose next to none in some games, and more in others. This is an issue that needs to be explored more fully. That however is really a job for Brent / Mark, not I. It was a test I saw a change in that I felt represented what can happen when using it.

After reading other reviews, and doing your own in-house tests, I think you'll find that the 10% drop can be very real. Wars have been fought on this board for much less. How is I-mode anything other than a software hack? I expected more from Intel when I first heard about Z68.

And again, the "D" mode gives you Quick Sync support and you lose no performance. What's not to like about that?

Obviously, what I don't like about D-mode is you lose the power savings (which would be useful 24/7/365). Clunky. Also, switching between the I- and D-modes requires rebooting. Clunky middleware, again.

I simply don't value transcoding (too much streamed and/or precompressed video is available in 2011). I would have been interested in 2005 when I was trying to compress DVDs for my Treo. Now, not so much.

The SSD caching works, and when you are caching your OS volume I could see it being a major advantage. It's not quite as good as a real SSD, but it really can give you a middle ground between performance and storage capacity.

I think any SSD + large conventional drive already gives you a middle ground between performance and storage capacity. I see no point to this technology because, as I said, I feel it's half-ass SSD. The concept was basically universally panned with hybrid HDDs and already done with products like the $40 SilverStone HDDBoost. It's not like this cache is basically free like with ReadyBoost.

Ultimately, I worry this tech will give the SSD producers yet another excuse to keep 100+GB SSDs (what most of us need to put 99.99% of our day-to-day stuff on a single SSD) from being truly affordable to the masses. Kingston is the only one really trying (barring closeouts from other manufacturers) and only with rebates. Food for thought.

These features aren't for everyone, but you get everything P67 and H67 have to offer and more in one package.

Yeah, but you pay for it. P67 has already dropped in price. My position is it's still the better buy.
 
...it's like the tagline for Z68 should be "Hey, if you want to go half-ass with SSD or multi-GPU, Z68 is your option!"

I couldn't agree more. I really hope Intel is spending its R&D budget on more compelling products/features than this snooze-fest. Seriously, no VT-d support? Yawn...:rolleyes:
 
Kyle, you made it a point to include this TWICE in the review:

"We have consulted with both ASUS and Intel on this but have not come away with any firm conclusions as how to fix the issue. What it comes down to is that the CPU clock is either running at its idle state of 1.6GHz, or its highest Turbo value of 3.8GHz. There is no scaling between the two clock values."

"We saw this in the last P67 ASUS motherboard we tested as well. If you leave the BIOS to full defaults, even with Windows 7 running in performance mode, you will see proper Turbo scaling, which we have never seen reach 3.8GHz under any kind of multi-threaded mode. Any changes to the BIOS, such as increasing the DDR3 memory clock to 1600MHz alone (as we test all motherboards) gives us the "3.8GHz" problem."

So excuse the question, but at first you just say no scaling [whatsoever], and in the second part you actually mention "Turbo scaling".

You're right that people who OC wouldn't really care about turbo scaling, but can you clarify whether EIST works properly even when you change the memory speed, or are ALL CPU scaling functions compromised? When I OC I turn off turbo anyway, but I wouldn't want the CPU running @ the max speed all the time...
 
As I understood Kyle's notes, the motherboard couldn't be set to run the processor with the same turbo frequency / settings we run the other test boards at. Basically they are set to reach a maximum of 3.4GHz. This didn't work on the ASUS board; it allows turbo scaling up to 3.8GHz all the time. So the tests were actually showing favor towards the ASUS boards, as they were running 400MHz higher than other boards with similar test configurations. Is this a problem? Yes and no. It isn't a problem for anyone who actually uses the board, but it is a problem for us benchmarking, as it unfairly skews the results towards favoring the ASUS boards. The wording may not have been clear enough in the article. If I'm off base Kyle can correct me, but I believe that's what is going on. He did those tests, not I, so I can't speak to that. This is, however, what I understood to be going on.

Also, there is no point in altering power saving features or turning turbo mode off on P67 / Z68 boards. In fact, turning turbo mode off basically kills your overclocking. There is no bus clocking to speak of anymore; anything past about 103.3MHz on the base clock results in total system instability. Everything is done through turbo multipliers now. If you leave your system's CPU power saving features enabled (EIST, Turbo, Speedstep, C1E, etc.) and you increase the turbo multiplier to, say, 44x, this will result in a maximum clock speed of about 4.4GHz. Every CPU we've seen so far can do this. However, when the system is more or less idle, it operates at about 1.6GHz. Because the turbo multipliers weren't scaling down to what they were set to, what we saw were clock speeds of 3.8GHz (about a 38x multiplier) instead of the 34x it was set to. So turbo scaling was off. (As in not scaling to what it was manually set to.)
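The clock math is simple enough to sanity-check yourself (a quick sketch; 100MHz is the stock base clock, and the multipliers are the ones discussed above):

# Sandy Bridge effective clock = base clock (BCLK) x turbo multiplier
bclk_mhz = 100.0  # stock; ~103.3MHz is about the practical ceiling noted above

for multiplier in (16, 34, 38, 44):
    print("%dx -> %.1fGHz" % (multiplier, bclk_mhz * multiplier / 1000))
# 16x -> 1.6GHz  (idle)
# 34x -> 3.4GHz  (what the other test boards were set to)
# 38x -> 3.8GHz  (what the ASUS board actually ran)
# 44x -> 4.4GHz  (the overclocking example above)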

Hopefully this clears things up. Kyle, correct me if I missed anything, or misunderstood your notes.

Note:
Sandy Bridge is a different animal. Gone are the days when you would turn off all your power savings features, turbo mode, etc. and mess with the bus clock. Hell, Intel's even managed to do things the old way with their DX58SO2 and Core i7 990X. (I think the 990X works as the 980X did, but only on other vendors' boards.)
 
Anyone know the difference between the P8Z68-V PRO and the P8Z68-V? TigerDirect has them both, at a $20 price difference, but it's hard to tell what the difference between them is without going line-by-line through all the marketing stuff.

[Edit] Found the Asus compare page. Here's what the PRO has extra:
Marvell® PCIe SATA 6Gb/s controller:
2 x SATA 6Gb/s ports, navy blue

Intel® LAN: dual interconnect between the integrated LAN controller and the Physical Layer (PHY)

Absolute Pitch 192kHz / 24-bit True BD Lossless Sound

VIA® 6308P controller:
2 x IEEE 1394a ports (2 at mid-board)

1 x ASUS USB 3.0 bracket (both have USB 3.0 on the back panel; the PRO just includes the extra bracket)
 
As I understood Kyle's notes, the motherboard couldn't be set to run the processor with the same turbo frequency / settings we run the other test boards at. Basically they are set to reach a maximum of 3.4GHz. This didn't work on the ASUS board; it allows turbo scaling up to 3.8GHz all the time. So the tests were actually showing favor towards the ASUS boards, as they were running 400MHz higher than other boards with similar test configurations. Is this a problem? Yes and no. It isn't a problem for anyone who actually uses the board, but it is a problem for us benchmarking, as it unfairly skews the results towards favoring the ASUS boards. The wording may not have been clear enough in the article. If I'm off base Kyle can correct me, but I believe that's what is going on. He did those tests, not I, so I can't speak to that. This is, however, what I understood to be going on.

Also, there is no point in altering power saving features or turning turbo mode off on P67 / Z68 boards. In fact, turning turbo mode off basically kills your overclocking. There is no bus clocking to speak of anymore; anything past about 103.3MHz on the base clock results in total system instability. Everything is done through turbo multipliers now. If you leave your system's CPU power saving features enabled (EIST, Turbo, Speedstep, C1E, etc.) and you increase the turbo multiplier to, say, 44x, this will result in a maximum clock speed of about 4.4GHz. Every CPU we've seen so far can do this. However, when the system is more or less idle, it operates at about 1.6GHz. Because the turbo multipliers weren't scaling down to what they were set to, what we saw were clock speeds of 3.8GHz (about a 38x multiplier) instead of the 34x it was set to. So turbo scaling was off. (As in not scaling to what it was manually set to.)

Hopefully this clears things up. Kyle, correct me if I missed anything, or misunderstood your notes.

OK, so we have to leave turbo enabled on SB for better OCing...

Now the next thing would be to confirm that EIST works properly when you set the memory to anything other than default.

Because it would also be silly for the CPU to run @ 3.4/3.8GHz at anything other than idle: I don't care that turbo doesn't scale between 1-4 extra bins and appears to be locked to run 4 bins higher full time, which I assume is what you guys are calling the turbo scaling bug, but I DO care if it won't use any of the multipliers between 16x and 34x when the CPU is running at a small load [not idle], say 20%.
 
OK, so we have to leave turbo enabled on SB for better OCing...

Now the next thing would be to confirm that EIST works properly when you set the memory to anything other than default.

Because it would also be silly for the CPU to run @ 3.4/3.8GHz at anything other than idle: I don't care that turbo doesn't scale between 1-4 extra bins and appears to be locked to run 4 bins higher full time, which I assume is what you guys are calling the turbo scaling bug, but I DO care if it won't use any of the multipliers between 16x and 34x when the CPU is running at a small load [not idle], say 20%.

We always set the memory frequency to DDR3 1600MHz. I've never seen any problems with turbo scaling between 16x and 34x when doing so.
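If anyone wants to watch this on their own box, here's a rough sketch (assuming the third-party psutil package; the sample count is arbitrary) that shows whether the reported clock actually moves between idle and load:

import psutil  # pip install psutil

# Sample the reported CPU frequency; on a healthy setup it should move
# between ~1600MHz at idle and the turbo ceiling under load.
for _ in range(10):
    load = psutil.cpu_percent(interval=1)  # blocks for 1 second while measuring
    freq = psutil.cpu_freq()               # current/min/max in MHz
    print("%7.0fMHz at %4.1f%% load" % (freq.current, load))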
 
When I say headless, I mean monitor-less. ... that poster was asking about using the transcoding / Quick Sync features on Z68 boards that do not have iGPU connectivity. As far as I know, you can't.
iGPU connectivity and video output are two different things. The latter requires the former, but you can have the connectivity without any on-board outputs.
Asus provides this with the P8Z68 Deluxe and MAXIMUS IV Extreme-Z, which both provide Virtu (D-mode) support.

... with "D" mode you do not have any chance of taking advantage of "potential" power savings by turning your discrete GPU off when you don't need it.
Obviously, what I don't like about D-mode is you lose the power savings ...
Actually, right now it's the other way around: D-mode offers some power savings (by turning off the iGPU) that I-mode doesn't. I-mode can't run power-efficiently as long as the discrete cards can't be turned off while not in active use.
Perhaps a new generation of graphics cards will "fix" this...?
 
Actually, right now it's the other way around: D-mode offers some power savings (by turning off the iGPU) that I-mode doesn't. I-mode can't run power-efficiently as long as the discrete cards can't be turned off while not in active use.
Perhaps a new generation of graphics cards will "fix" this...?

I'm sorry but I've read this several times and it still makes no sense. I'm just going to assume you're confused.
 
Right now there are evidently no power savings to be had. In D-mode the iGPU would be idle unless you are running Quick Sync. That basically means you'd be drawing 1-3 watts at most, I suspect, which is small enough to fall within the margin of error in testing, making validation of that theory hard to accomplish. With I-mode there should be some savings in power usage, but since the Virtu application can't actually deactivate your discrete adapter, it seems, it will always draw its normal idle power regardless of what you do.
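To see why 1-3 watts disappears into the noise, plug in some round, purely hypothetical numbers (none of these are measured values from the review):

# Hypothetical idle draws, for illustration only
system_idle_w = 100.0  # whole-system idle draw
igpu_idle_w = 2.0      # iGPU sitting unused in D-mode

print("%.1f%% of idle draw" % (igpu_idle_w / system_idle_w * 100))
# 2.0% of idle draw -- easily within run-to-run variance on a power meter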
 
Dan,

Were you able to run a primary monitor on the discrete graphics card and run an additional monitor(s) on the on-board iGPU? I have the P8Z68-V Pro motherboard and a Radeon 6970 video card. No matter what I do in the BIOS, I can't get both video sources to output at the same time.

Ideally, I'd like to run my 30" Dell 3007 at 2560x1600 off the 6970 and run two 20" Dell 2001s in portrait mode at 1200x1600 off the integrated graphics.

Thanks!
 
Dan,

Were you able to run a primary monitor on the discrete graphics card and run an additional monitor(s) on the on-board iGPU? I have the P8Z68-V Pro motherboard and a Radeon 6970 video card. No matter what I do in the BIOS, I can't get both video sources to output at the same time.

Ideally, I'd like to run my 30" Dell 3007 at 2560x1600 off the 6970 and run two 20" Dell 2001s in portrait mode at 1200x1600 off the integrated graphics.

Thanks!

I didn't perform that exact test. As I understand it, you should be able to use both at the same time, just like when you run a given GPU as your primary and some cheap PCI/PCIe card for the other monitors. However, you aren't going to be able to run more than one monitor off a single iGPU port. The only way you could is with some kind of signal splitter, but that will only result in cloned displays.
 
I've been trying with just a single monitor on the discrete GPU and a single monitor on the integrated GPU but I can't get it to work at all.

At one time, after trying to remove drivers and I can't remember what all else, I did see what must have been the unrecognized Intel graphics adapter in Device Manager, but when I installed the driver it resulted in a blue screen. During other attempts I did see the Intel graphics adapter properly recognized, but it said the device could not start.

My concern is that I don't know if this is misconfigured, unsupported, or defective. If anyone has had any success running both video adapters, please let me know!
 