No Samsung Series 9 Ultrabook with Discrete GPU?

They honestly look like junk to me. Even in the "good" pictures you linked you can see MASSIVE panel gaps.

I wouldn't end up carrying around something that big; it would just end up sitting in one place. Might as well be using a desktop, from my perspective.

I'm sorry, what are panel gaps?

Are you also suggesting that any laptop thicker than 1" or heavier than 6 lb. is not worth having?

The NP6110 is perhaps not as sleek as some of the Asus models:

http://www.powernotebooks.com/specs/images/6110/gallery/12-6110_2.jpg
http://www.powernotebooks.com/specs/images/6110/gallery/13-6110_3.jpg

But it does get the job done and offers an excellent weight-to-performance ratio, giving you a full-speed quad-core i7 and the GT 650M.

I suppose I'm a bit disheartened with the Sager models.

If they ever release something without plastic, I'll likely buy it.
 
I'm sorry, what are panel gaps?

Are you also suggesting that any laptop thicker than 1" or heavier than 6 lb. is not worth having?



I suppose I'm a bit disheartened with the Sager models.

If they ever release something without plastic, I'll likely buy it.

Panel gaps, as in you can see a space between where two panels meet. Try looking for panel gaps on a Samsung Series 9, a MacBook, or a Lenovo ThinkPad.


Yes, I am suggesting that about a big laptop. They're no longer laptops at that point, so they do not serve any useful purpose to me.

You started out looking at ultraportables; it is a truly massive step to go to something thicker than 1". Have you carried a desktop replacement around before? It's rather tiresome and annoying. Actually using them as a laptop is annoying too; you have to find space for them. They're not something you just pull out and use on the spot.

Their only real use is serving as a portable workstation: you bring it to work, leave it on your desk for the week, then take it home on the weekend.
 
What you pay in thickness, and maybe a little in design, with the Sager/Clevo, you get back in performance and cooling.

It's great for not being too flashy but not looking cheap at the same time.
 
You're not going to be at the Turbo speeds for more than a fraction of a second.

http://www.anandtech.com/show/6194/asus-ux31a-putting-the-ultra-in-ultrabooks/8

Run any 3D game where the GT cores have to work and the story changes dramatically. The CPU cores at 100% load and 2.8GHz consume around 15W of the allowed 17W TDP, but the GT cores under load appear to be capable of drawing 10-11W. Try to use them both at the same time and what follows is a balancing act (i.e. throttling) in order to stay within the allowed power and thermal envelope. The CPU package does manage to exceed the 17W TDP for a time, but after seven or eight minutes it drops to 17W before eventually stabilizing around 15W (±5%). The GPU clocks are also all over the map initially, as in this case Batman is busy loading and we’re watching the intro videos and navigating the menus. After about five minutes we’re in the actual game and we can see the CPU and GPU clocks (mostly) stabilize. Even after more than an hour, however, we still see GPU clocks as low as 500 MHz and as high as 900 MHz, with CPU clocks ranging from 1.0GHz to 2.5GHz—all while we’re sitting still and watching over Arkham City from a high perch. Not surprisingly, the result in terms of actual frame rates is that they can vary upwards of 50%, which makes for a generally less than desirable experience even if average frame rates are 30+ FPS in some titles.

The net takeaway here is that the ULV Ivy Bridge processors can’t actually hit max clocks on both the GPU and CPU cores without exceeding their 17W TDP. There’s potential for configurable TDP to allow plugged-in Ultrabooks to run ULV chips at a higher power envelope to improve performance. In fact, you can set the UX31A to 25W TDP, but it appears the cooling solution isn’t actually able to deal with the higher TDP for longer periods of time and thus the CPU ends up dropping back to 17W after a few minutes of heavy lifting. That’s hardly surprising, considering how thin the UX31A is—there’s just not much space for air to flow through.

More to the point, other Ultrabooks often omit the ability to change the TDP levels, so even with better cooling it wouldn’t be possible to run the CPU and GPU at full tilt; for that, you’d need a 25W TDP in practice—around 10W for the HD 4000 and another 15W for the CPU cores. Dustin tested the HP Envy 14 Spectre, which tended to run quite a bit cooler than the UX31A (and it’s also quite a bit larger). While we didn’t perform a full throttling analysis of the Spectre, we can already see from the above results what would happen. If you’re hoping to run an Ultrabook (i.e. a ULV CPU) at max Turbo Boost speeds all the time while loading up both the CPU and GPU, that just doesn’t look possible. Unless Intel can do something unexpected, I don’t think Haswell will even fix the problem. The simple fact is that loading up all areas of an approximately 1 billion transistor processor die at high clock speeds uses too much power to fit within the ULV TDP, and clock speeds are the way to address the issue.

Advertised clocks != real world clocks. In fact, depending on the situation (and varying load), you might even have a CPU that dips down below stock clock speeds in order to give the GPU more room.
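
To make that balancing act concrete, here's a toy sketch in Python. The proportional-throttle logic is an illustrative assumption, not Intel's actual algorithm; the wattages are the rough figures from the AnandTech article above.

Code:
# Toy model of CPU/GPU power sharing under a 17W package TDP.
# Numbers are rough figures from the AnandTech article; the proportional
# scaling is a simplification of what the real power manager does.
TDP = 17.0          # package power budget, watts
CPU_MAX_W = 15.0    # approx. CPU draw at full turbo
GPU_MAX_W = 10.5    # approx. HD 4000 draw under load

def balance(cpu_demand_w, gpu_demand_w):
    """Scale both sides back until their combined draw fits the budget."""
    total = cpu_demand_w + gpu_demand_w
    if total <= TDP:
        return cpu_demand_w, gpu_demand_w   # no throttling needed
    scale = TDP / total                     # crude proportional throttle
    return cpu_demand_w * scale, gpu_demand_w * scale

# Dual load: both sides want their max, but 15 + 10.5 = 25.5W > 17W,
# so each ends up at roughly two-thirds of its desired power
# (and clocks drop along with it).
print(balance(CPU_MAX_W, GPU_MAX_W))        # -> roughly (10.0, 7.0)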
 
Panel gaps, as in you can see a space between where two panels meet. Try looking for panel gaps on a Samsung Series 9, a MacBook, or a Lenovo ThinkPad.

Are you saying that the monitor end or "lid" doesn't close properly, leaving a gap between it and the other end which houses the components and the keyboard on top?

Does this matter?

Yes, I am suggesting that about a big laptop. They're no longer laptops at that point, so they do not serve any useful purpose to me.

I'm afraid I disagree with you quite a bit on this one.

I've recently been looking at what it would take to transport a desktop, and I've found that it simply isn't doable.
You'd have to take all the components apart, wrap them up in bubble wrap, find a way to transport the case (which is likely at least a mid-tower), reassemble the entire build (hoping it still works), and carry all the necessary cables and hook them back up. Not to mention you'd need a monitor, and that's another set of thick cables.

All in all, it's simply not doable. That's why I try to keep my needs on the go minimal, so I can do the rest of the processing at home.

You started out looking at ultraportables; it is a truly massive step to go to something thicker than 1". Have you carried a desktop replacement around before? It's rather tiresome and annoying. Actually using them as a laptop is annoying too; you have to find space for them. They're not something you just pull out and use on the spot.

Well, I agree. That's why I intended on buying two laptops (which works well for me and my gf). But at $1,000 apiece (because I need a 1080p screen on both), this is not going well.

It seems that two laptops are too expensive, and one laptop that has the features of both is either also too expensive or simply not physically possible as of 2013.

Although who the heck knows with manufacturers not even listing new/upcoming models on their own websites!

Their only real use is serving as a portable workstation: you bring it to work, leave it on your desk for the week, then take it home on the weekend.

Yes, that sounds good. But in my case, my workstation is on a mountain :)

What you pay in thickness, and maybe a little in design, with the Sager/Clevo, you get back in performance and cooling.

It's great for not being too flashy but not looking cheap at the same time.

No, I totally agree.

I'd just rather they didn't use plastic.

Honestly, if it were the same exact design but with a stronger material, I'd probably have bought it by now.

You're not going to be at the Turbo speeds for more than a fraction of a second.

Advertised clocks != real world clocks. In fact, depending on the situation (and varying load), you might even have a CPU that dips down below stock clock speeds in order to give the GPU more room.

Are you saying that Turbo doesn't exist when there's a Discrete GPU or that it doesn't exist at all?

Because obviously it works long enough to test Cinebench 11.5, and that's several minutes. I've checked the scores and they're on par with my 2500k per the Turbo clock.
 
Not under dual-load scenarios. Turbo is there to give one side, the CPU or the GPU, a bit more TDP headroom when the other side is essentially sitting idle and acting as dark silicon. When both are loaded, as in gaming or certain productivity applications, the clock speeds will fluctuate dynamically and neither side will hit its turbo clocks for more than a fraction of a second. In gaming, this causes the HD 4000 to stutter severely in frame rates, and the CPU almost never hits its turbo speeds.
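
To put that policy in code form, here's a hypothetical sketch (Python; the idle threshold and the gating logic are my assumptions for illustration, not Intel's actual implementation):

Code:
# Hypothetical sketch of the turbo policy described above: one side only
# gets full turbo headroom while the other side is essentially dark silicon.
IDLE_THRESHOLD_W = 1.0   # assumed cutoff for "essentially idle"

def can_turbo(side, cpu_draw_w, gpu_draw_w):
    """True if `side` ('cpu' or 'gpu') may hold its max turbo clocks."""
    other_draw = gpu_draw_w if side == "cpu" else cpu_draw_w
    return other_draw < IDLE_THRESHOLD_W

print(can_turbo("cpu", cpu_draw_w=14.0, gpu_draw_w=0.5))   # True: GPU idle
print(can_turbo("cpu", cpu_draw_w=14.0, gpu_draw_w=9.0))   # False: dual load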
 
Not under dual-load scenarios. Turbo is there to give one side, the CPU or the GPU, a bit more TDP headroom when the other side is essentially sitting idle and acting as dark silicon. When both are loaded, as in gaming or certain productivity applications, the clock speeds will fluctuate dynamically and neither side will hit its turbo clocks for more than a fraction of a second. In gaming, this causes the HD 4000 to stutter severely in frame rates, and the CPU almost never hits its turbo speeds.

So, you're saying that Turbo can only apply to either the CPU or the GPU?

Then what's the point of it..? You can't play any game that stresses the CPU..
 
So, you're saying that Turbo can only apply to either the CPU or the GPU?

Then what's the point of it..? You can't play any game that stresses the CPU..

Sure, you can play games, just don't expect consistent or even acceptable frame rates :p

The Cinebench OpenGL benchmark is a GPU-only benchmark, thus the GPU turbo was hitting its max speeds while the CPU sat idle. Most, or rather all, real applications aren't like that. They'll have dual loads of varying degree, so turbo is never held for more than a millisecond before dropping back down.
 
Sure, you can play games, just don't expect consistent or even acceptable frame rates :p

The Cinebench OpenGL benchmark is a GPU-only benchmark, thus the GPU turbo was hitting its max speeds while the CPU sat idle. Most, or rather all, real applications aren't like that. They'll have dual loads of varying degree, so turbo is never held for more than a millisecond before dropping back down.

You mean dropping back down in order to increase the turbo on the other end, right?

I mean, we'll still have "half-Turbo" on both the GPU and the CPU instead of "full-Turbo" on either, right?

Do you know why this happens? Is it because it can't go over 17W? If so, how can other Ultrabooks deal with 17W + 35W? Surely there's enough room for the CPU/GPU to get to 25W in Turbo... no?

And if not, then isn't this even more of a reason to get a dedicated card?
 
Read the link, but, yes, I'll explain it to you anyway.

The 17W TDP refers to the power consumption (more accurately, the heat to be dissipated from that power draw) that's allowed for a specific chip. The 17W ULV chips have a TDP - a thermal design power - of 17 watts. The CPU can consume close to 15 watts under load at full turbo, thus leaving only 2 watts of room for the GPU (and the rest). This also applies in reverse to the GPU, and the Cinebench OpenGL benchmark serves as a good model: the GPU can consume 10ish watts, so the CPU is left with (slightly less than) 7W.
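
Here's that budget math as a trivial Python sketch (the wattages are the rough figures above, purely for illustration):

Code:
# Headroom left for one side of the chip when the other is fully loaded,
# under a 17W package TDP. Figures are the rough estimates quoted above.
TDP = 17.0

def headroom(other_side_draw_w):
    return TDP - other_side_draw_w

print(headroom(15.0))   # CPU at full turbo -> ~2W left for the GPU
print(headroom(10.5))   # GPU under load    -> ~6.5W left for the CPU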

Obviously, that's going to present a problem. In dual-load scenarios (games) you're not going to have enough room for both the CPU and GPU to hit their desired max clock speeds. That results in stuff like this:

[Image: asus-ux31a-stresstest-clocks.png (ASUS UX31A CPU/GPU clock speeds under stress test)]


The clock speeds are all over the place, and as a result:

Even after more than an hour, however, we still see GPU clocks as low as 500 MHz and as high as 900 MHz, with CPU clocks ranging from 1.0GHz to 2.5GHz—all while we’re sitting still and watching over Arkham City from a high perch. Not surprisingly, the result in terms of actual frame rates is that they can vary upwards of 50%, which makes for a generally less than desirable experience even if average frame rates are 30+ FPS in some titles.

Note that neither the CPU nor the GPU actually hits its turbo clocks in gaming: they're both under load, and each consumes a good chunk of that 17W TDP even when not in turbo. The CPU seems to average around 1.8GHz while the GPU averages around 900MHz.

So why limit it to 17W at all? Why not just allow the chip to chew through more power? Remember that I said TDP is the thermal design power, so the power draw correlates very directly with heat. Temperatures and cooling limitations are the reason.

[Image: asus-ux31a-gaming-temps.png (ASUS UX31A temperatures while gaming)]


So even though at 17W TDP the laptop has only half the heat to cool compared to a standard 35W laptop, it still has issues dissipating it. That 17W, even at just a measly 17W, is actually too high. Therefore the chip has to resort to stricter throttling, and the OEMs have to more carefully design their cooling in thin form factors, consequently raising prices. The ULV processors are also specifically binned, so that too increases prices (fewer of those parts per wafer). *Important to note here that the TJmax (max temperature allowed for a given chip/architecture) is 105C, but Asus restricts the temperature to 90C so the machine doesn't get blistering hot to the touch. As a result, throttling occurs at an even lower temperature than it would for a desktop processor, which doesn't worry about the case getting hot ;P
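
In code terms, the OEM's thermal check is something like this simplified sketch (the 90C cap and 105C TJmax are the values mentioned above; the actual firmware logic is of course more involved):

Code:
# Simplified thermal-throttle check. TJ_MAX is the silicon limit for the
# architecture; Asus enforces a stricter 90C cap on the UX31A so the
# chassis stays tolerable to touch.
TJ_MAX = 105    # degrees C, silicon limit
OEM_CAP = 90    # degrees C, the stricter cap Asus uses

def should_throttle(core_temp_c, cap=OEM_CAP):
    # Throttling kicks in at the OEM cap, well before the silicon limit,
    # which is why a thin laptop throttles earlier than a desktop chip would.
    return core_temp_c >= cap

print(should_throttle(92))   # True: clocks get pulled down
print(should_throttle(85))   # False: still within the envelope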

So why not just add discrete GPUs? Well, because discrete GPUs don't exist at the 17W level, only at 35W and up (because they can't bin them that low and retain performance. Contrary to popular belief, clock speeds don't scale linearly with voltage. Seasoned overclockers know this fact very well. At some point you hit a "ceiling" where going higher or lower just isn't worth the added [or decreased, in the case of downclocking] voltage). An Ultrabook that already has issues cooling a mere 17W ULV will have a nightmare of a time trying to dissipate an additional 35W TDP on top of that. It *can* be done, but the cost required would make the Ultrabook such a low-volume, high-cost item that it wouldn't even be worth making.
 
pelo, thank you. That was VERY helpful.

But one thing still doesn't make sense..

If the ASUS UX31A, which is 0.68" thick, can barely dissipate 17W..

How can the ASUS UX51Vz-XH71, which has almost the same thickness, dissipate something around 70W?

This seems more like marketing and/or price-binning than actual physical limitations.
 
Because it's also a larger laptop at 15.6" and a larger footprint. When you've got a thin design that's also small in size then the cooling issues are exacerbated.

It's most definitely physical limitations. It's the reason Intel is binning 17W ULV chips and why there's such a concerted effort for Intel to go even further down - hence their introduction of 13W TDP chips labelled as 8W "SDP," whatever the fuck that means.

With lower TDPs you'll also get more throttling and higher prices. The only thing that can turn this ship around is a truly revolutionary cooling method that's also cheap and can be produced in high volume. This isn't an OEM- or Intel-specific issue, but rather one the entire industry is grappling with. I'm pinning my hopes on piezoelectric cooling coming to market sometime in the near future, but whether I'm disappointed or pleasantly surprised depends on a whole host of factors :D

*Additional nerdy tidbit: piezoelectric cooling works by accelerating air out of a tiny gap that closes after intake. The accelerated airflow reduces the air temperature and in turn cools the chip.
 
Are you saying that the monitor end or "lid" doesn't close properly, leaving a gap between it and the other end which houses the components and the keyboard on top?

Does this matter?

I'm not talking about the lid closing; I'm talking about the edge of the laptop.

http://cukimages.com/ebayproductpics/Sager_NP9150_front_angled.jpg

In that picture you linked earlier, look along the edge of the laptop: see that line that runs all along the side where the palmrest panel and the bottom panel meet?

That's a gigantic gap in my view, and it speaks to the fit and finish of that laptop. It will probably creak with use, and I question whether the keyboard is nice and firm. It looks like the junk HP and Dell have mostly been making for years.

Those hinges also look very plasticky and undersized.

Have you reconsidered the Samsung Series 7 line? Discrete graphics, under 1" thick, numpad on the 15" model and pretty damn good battery life.
 
Because it's also a larger laptop at 15.6" and a larger footprint. When you've got a thin design that's also small in size then the cooling issues are exacerbated.

It's most definitely physical limitations. It's the reason Intel is binning 17W ULV chips and why there's such a concerted effort for Intel to go even further down - hence their introduction of 13W TDP chips labelled as 8W "SDP," whatever the fuck that means.

With lower TDPs you'll also get more throttling and higher prices. The only thing that can turn this ship around is a truly revolutionary cooling method that's also cheap and can be produced in high volume. This isn't an OEM- or Intel-specific issue, but rather one the entire industry is grappling with. I'm pinning my hopes on piezoelectric cooling coming to market sometime in the near future, but whether I'm disappointed or pleasantly surprised depends on a whole host of factors :D

*Additional nerdy tidbit: piezoelectric cooling works by accelerating air out of a tiny gap that closes after intake. The accelerated airflow reduces the air temperature and in turn cools the chip.

I've honestly given up on new forms of cooling. Nothing has managed to replace the old metal-and-airflow approach, and it's been a long fricking time.

One question:

Are you saying that the 3317U is fucked and cannot Turbo both the CPU and GPU because it cannot be made to go over 17W?

If so, then wouldn't the 3520M be able to Turbo both the GPU and CPU, since it is 35W?

In this case, wouldn't TDP = Turbo Diagnostic Performance?

I'm not talking about the lid closing; I'm talking about the edge of the laptop.

http://cukimages.com/ebayproductpics/Sager_NP9150_front_angled.jpg

In that picture you linked earlier, look along the edge of the laptop: see that line that runs all along the side where the palmrest panel and the bottom panel meet?

That's a gigantic gap in my view, and it speaks to the fit and finish of that laptop. It will probably creak with use, and I question whether the keyboard is nice and firm. It looks like the junk HP and Dell have mostly been making for years.

Those hinges also look very plasticky and undersized.

Have you reconsidered the Samsung Series 7 line? Discrete graphics, under 1" thick, numpad on the 15" model and pretty damn good battery life.

Oh, you mean the gaps in the portion of the chassis that houses the components.. as if the screws are loose or something? Isn't this by design.. as in, it's just been poorly designed?

You're probably going to think less of me for this, but..

I can't buy a laptop that is not black. It's just not going to happen.

The inside, maybe, yes. It might even be nice. But the outside, hells no.
 
I've honestly given up on new forms of cooling. Nothing has managed to replace the old metal-and-airflow approach, and it's been a long fricking time.

One question:

Are you saying that the 3317U is fucked and cannot Turbo both the CPU and GPU because it cannot be made to go over 17W?

If so, then wouldn't the 3520M be able to Turbo both the GPU and CPU, since it is 35W?

In this case, wouldn't TDP = Turbo Diagnostic Performance?



Oh, you mean the gaps in the portion of the chassis that houses the components.. as if the screws are loose or something? Isn't this by design.. as in, it's just been poorly designed?

You're probably going to think less of me for this, but..

I can't buy a laptop that is not black. It's just not going to happen.

The inside, maybe, yes. It might even be nice. But the outside, hells no.

Aesthetics matter; you have to carry the thing around in public.
 
I've honestly given up on new forms of cooling. Nothing has managed to replace the old metal-and-airflow approach, and it's been a long fricking time.

One question:

Are you saying that the 3317U is fucked and cannot Turbo both the CPU and GPU because it cannot be made to go over 17W?

If so, then wouldn't the 3520M be able to Turbo both the GPU and CPU, since it is 35W?

In this case, wouldn't TDP = Turbo Diagnostic Performance?

The TDP is obviously correlated with where the chip is going. If you've got a big workstation notebook with lots of cooling then a 45W CPU will work fine, but when you're working with Ultrabook dimensions and form factors then you've got to aim for a 17W TDP.

Yes, the 35W and 45W CPUs don't have the same throttling issues as the 17W ULVs.

The clock speeds are whatever they are (and they fluctuate a lot), but it's the TDP that provides the ceiling.
 
The TDP is obviously correlated with where the chip is going. If you've got a big workstation notebook with lots of cooling then a 45W CPU will work fine, but when you're working with Ultrabook dimensions and form factors then you've got to aim for a 17W TDP.

Yes, the 35W and 45W CPUs don't have the same throttling issues as the 17W ULVs.

The clock speeds are whatever they are (and they fluctuate a lot), but it's the TDP that provides the ceiling.

And there's no way to take off the ceiling?

Well, then I'd say the 17W chips are useless.
 
There is configurable TDP (cTDP), but whether it's implemented depends on the chip, the BIOS, and the manufacturer. It can allow you to go up or down a bit in TDP via software, but not enough that both the CPU and GPU have room to really stretch their legs.

If you're looking for performance, particularly gaming, then yes, the 17W chips are utterly useless.
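
A quick sketch of why even cTDP-up doesn't save you (Python; the ~25W dual-load demand and the UX31A's 25W setting are the figures from the AnandTech quote earlier):

Code:
# Does a full dual load fit under the nominal TDP vs. a cTDP-up setting?
# Per the AnandTech article: ~15W for the CPU cores + ~10W for HD 4000.
NOMINAL_TDP = 17.0
CTDP_UP = 25.0            # what the UX31A can be set to, if the BIOS allows it

dual_load_demand = 15.0 + 10.0

print(dual_load_demand <= NOMINAL_TDP)   # False: permanent throttling at 17W
print(dual_load_demand <= CTDP_UP)       # True on paper, but the UX31A's
                                         # cooling forced it back to 17W anyway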
 
There is configurable TDP (cTDP), but whether it's implemented depends on the chip, the BIOS, and the manufacturer. It can allow you to go up or down a bit in TDP via software, but not enough that both the CPU and GPU have room to really stretch their legs.

If you're looking for performance, particularly gaming, then yes, the 17W chips are utterly useless.

Thank you.
 
If you can wait for Haswell, you can keep the Ultrabook form factor with a more powerful iGPU; first rumors place it close to a GT 650M in performance, and it's less than 6 months away from release. If you wish to read more, check AnandTech's "Intel Haswell GT3e GPU Performance Compared to NVIDIA's GeForce GT 650M".

But to be as powerful as a GT 650M, it would have to be more than 4x as powerful as the HD 4000.

Also, as has been said in this thread, it wouldn't make a difference in a 17W CPU, because the TDP would bottleneck nearly all of the GPU's performance.
 