First Hybrid Intel-AMD Chip Benchmarks Show Vega M Obliterating Intel UHD and MX 150

rgMekanic

[H]ard|News
Joined
May 13, 2013
Messages
6,943
HotHardware has had a chance to run some benchmarks on the first Intel-AMD hybrid chip with the Dell XPS 15 convertible. The XPS 15 2-In-1 tested was configured with a Core i7-8705G processor (3.1GHz base, 4.1GHz boost), the Radeon RX Vega M GL graphics engine with 4GB of HBM2, 16GB of DDR4 memory, and an ultra-fast NVMe SSD.

Running at 1920x1080 resolution, the XPS 15 2-in-1 was able to maintain an average frame rate of nearly 35 frames per second with High image quality settings dialed in (29.69 FPS on Very High in the video). Intel's UHD 620 could only manage 8 FPS at Medium settings, and could not run the game on High at all, as it would run out of frame buffer memory. An 8th-generation Core processor paired with an NVIDIA MX 150 GPU put up an average frame rate of around 23 frames per second in the same benchmark at High IQ settings.
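Purely as back-of-envelope math on the averages quoted above (approximate figures only, and note the integrated-graphics run was at Medium settings, so that ratio isn't apples-to-apples):

```python
# Relative performance from the quoted 1920x1080 averages (approximate).
vega_m_gl_fps = 35.0   # Core i7-8705G + Radeon RX Vega M GL, High settings
mx150_fps = 23.0       # 8th-gen Core U + GeForce MX 150, High settings
intel_igp_fps = 8.0    # Intel integrated graphics, Medium settings

print(f"Vega M GL vs MX 150: {vega_m_gl_fps / mx150_fps:.2f}x")       # ~1.52x
print(f"Vega M GL vs Intel iGPU: {vega_m_gl_fps / intel_igp_fps:.2f}x")  # ~4.38x
```

So roughly a 50% lead over the MX 150 in this one title, for whatever a single benchmark is worth.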

It's been almost a year since we first heard the rumors of Intel getting Radeon graphics, and the result seems to be quite impressive. I'm really looking forward to when these get fully into the hands of reviewers so we can see how Intel plus Radeon stacks up not only in performance, but in battery life, etc. Thanks to cageymaru for the story.

Needless to say, graphics performance definitely looks promising for these new 8th generation Intel Core processors with AMD graphics. We don't know how long this AMD partnership will last, but the bitter rivals should be commended for likely hitting it out of the park on the first try here.
 
How does the MX150 compare to the 560ti?
 
 
As is usual for laptops:

Intel® Core™ i7-8705G Processor with Radeon™ RX Vega M GL graphics
4c/8t
base 3.1GHz; turbo 4.1GHz
65W TDP

Intel® Core™ i5-8250U
4c/8t
base 1.6GHz; turbo 3.4GHz
15W TDP

GeForce MX150
TDP guessed to be between 20W-30W
 
Maybe Intel will finally use better thermal interface material with this partnership, just to keep AMD cooled off.
 
I wonder why they did not partner w/Nvidia?

Because Intel and AMD more or less only compete in the server sector, and while that market is valuable, the growth is in the AI, mobile chip, Deep Learning, etc. space.

Intel in some respects solves a problem for AMD, and AMD in some respects solves a problem for Intel. For AMD it's fabs, and for Intel it's a GPU wing.

It's actually a good match. Unfortunately they likely can't merge and this is a way to collude without getting shot down by regulatory issues.

I think it's interesting. I feel like we will likely see Nvidia and Samsung team up for mobile also.
 
It's worth keeping in mind that this SKU only has 32 ROPs, while the top binned version features 64. Will be very interesting to see what kind of difference that makes.
 
Battery life is going to be the big one; AMD hasn't been big on perf/watt, and that's gonna make a bit of a difference.


[personally, I have no dog in the fight, I'm long past trying to get decent gaming performance out of a laptop- the Intel mess in my ultrabook can play League at 1080p min settings, and that's about all I care about...]
 
Looks like a good start. Compared to Intel's past integrated offerings, I'm not surprised. Intel never could figure out how to make a GPU.
 
Granted it's a different form factor, but I have a Dell Inspiron 15 7000 Gaming with a 7700HQ, 16GB RAM, a 1050 Ti 4GB, and a 1080p IPS panel, and I paid $800 + tax for it. It blows this away at 1080p Ultra settings on a 130-watt platform. So I really don't know what to make of this at the price; they said the target market was the 1060, but it can't even compete with a 1050 Ti.
 
Granted it's a different form factor, but I have a Dell Inspiron 15 7000 Gaming with a 7700HQ, 16GB RAM, a 1050 Ti 4GB, and a 1080p IPS panel, and I paid $800 + tax for it. It blows this away at 1080p Ultra settings on a 130-watt platform. So I really don't know what to make of this at the price; they said the target market was the 1060, but it can't even compete with a 1050 Ti.

The Vega M GL targets the 1050, while the Vega M GH targets roughly a 1060 Max-Q.
 
What is the price premium for this over a standard Intel chip?
 
What is the price premium for this over a standard Intel chip?

That's not going to be so easy to distill; one of the advantages of this part is that it enables an entirely new form-factor, of sorts, so there won't be any 'direct' comparisons.

[and while there may be unit prices published, these parts are only going to be available in other products, and won't come ready to plug into some consumer socket]
 
I wonder why they did not partner w/Nvidia?

Long list of competition reasons, but the big one is that most of Intel's GPU patent licensing was with AMD anyway, so both sides win. AMD gets more access to the mobile market and Apple, while Intel probably gets to save money on patent licensing.
 
I thought laptop chips did not have IHS?

But the grey goop they have been using for 20+ years has issues with separating and causing hot spots.

They really should just change to something like Arctic Ceramique 2.
 
this is where I see AMD doing well, whether paired with Intel or paired with their own Ryzen

their desktop cards are pretty much non-existent flops, but the mobile option is quite nice
 
How susceptible is this new chip to spectre/meltdown?

Edited to add: ;)
 
this is where I see AMD doing well, whether paired with Intel or paired with their own Ryzen

their desktop cards are pretty much non-existent flops, but the mobile option is quite nice

Miners will likely disagree, considering mining > gaming for sheer sales and value.
 
Pretty cool. All the more impressive considering how demanding ROTR can be, even at 1080p. Can't wait for the reviews to start popping up.
 
But the grey goop they have been using for 20+ years has issues with separating and causing hot spots.

They really should just change to something like Arctic Ceramique 2.

That's not really how it works. Intel uses two different types of TIM as their recommended attach for heatsinks. For the pre-applied goop on the bottom of their OEM heatsinks it's typically Dow Corning TC5622, which has an insanely long lifetime at mildly OK temperatures. For die-attach heatsinks on laptops and such I believe they recommend Shin-Etsu X23, which again has a long lifetime and decent temps. The problem with pointing to a single goop as a replacement is that it can take years to fully qualify a new TIM for production, and it needs to come from a vendor with a long history of doing things right. Arctic Silver, Noctua, Thermal Grizzly, Arctic Cooling, etc. are nothing but boutique vendors who just source pastes from Dow Corning, Shin-Etsu, DuPont, etc.

That being said, I have no idea what Intel uses for the internal TIM between the IHS and the processor die.
 
As is usual for laptops:

Intel® Core™ i7-8705G Processor with Radeon™ RX Vega M GL graphics
4c/8t
base 3.1GHz; turbo 4.1GHz
65W TDP

Intel® Core™ i5-8250U
4c/8t
base 1.6GHz; turbo 3.4GHz
15W TDP

GeForce MX150
TDP guessed to be between 20W-30W

You're comparing a 1.6GHz quad-core 15W chip with a 3.1GHz quad-core chip.
Exclude the turbo speeds, as they are never attained under all-thread usage; the base clock is what's always used.
The MX150 delivers less performance at 30 watts, and its footprint is larger, so I see an issue here for Nvidia.

https://ark.intel.com/products/122589/Intel-Core-i7-8550U-Processor-8M-Cache-up-to-4_00-GHz
This is inferior to the 8705G: at 2GHz @ 25W (cTDP-up), an 8550U plus MX150 comes to roughly 55W, 10W less than the 8705G's 65W, for an inferior GPU and CPU in a larger, thicker laptop.

Edit:
Deeper dive into the TDP:
Intel rates the 8550U at 2GHz @ 25W, and the 8705G at 65W for both CPU and GPU. The CPU must use at least 10W more to attain 3.1GHz, so it's safe to assume the GPU is a 30W part or less, not the power-hungry monster one would assume from the comparison above.

My 8550U, 4700HQ, and 6700HQ never go above their base frequency for more than a few seconds; the only exception is the 8550U under sustained single-core load, which can hold above base clock for extended periods, but it never drops below base even with all cores in use. Base frequency is the most valuable property of any non-K Intel laptop CPU, and only owners of unlocked desktop Intel CPUs are oblivious to this fact.
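As a back-of-envelope sketch, the TDP arithmetic above works out like this (all wattages are this post's own guesses, not Intel-published per-block figures, and the 10W of extra clock headroom is an assumption from the post):

```python
# Rough GPU power-budget estimate for the i7-8705G package, using only
# the guessed wattages from the post above (not official figures).
package_tdp = 65        # i7-8705G combined CPU+GPU TDP, in watts
u_cpu_at_2ghz = 25      # 8550U-class CPU at ~2GHz (cTDP-up), in watts
extra_for_3_1ghz = 10   # assumed extra power to sustain a 3.1GHz base

cpu_estimate = u_cpu_at_2ghz + extra_for_3_1ghz  # ~35W for the CPU side
gpu_estimate = package_tdp - cpu_estimate        # ~30W left for Vega M GL

print(f"Estimated Vega M GL power budget: ~{gpu_estimate}W")  # ~30W
```

Which lands the Vega M GL in roughly the same power class as the guessed 20W-30W MX150, not in monster territory.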
 
I wonder why they did not partner w/Nvidia?

From the first time I read about the Intel/AMD hybrid, I wondered the same thing. While I never found an answer, my thoughts on why not:

  • Nvidia was too powerful a company at the moment, while AMD was the weaker link. Any negotiating leverage would be on Nvidia's side, compared to AMD, who for the last several years before the Ryzen release was struggling. Business-wise that is better for Intel, who doesn't need to release high-end gear, just a better low/mid-end product to flood the market.
  • Nvidia's foray into AI, automation, and compute might be something Intel wants to do in the future, and they didn't want to partner with that level of competition.
  • Very far down the theory list for me: Intel peeps don't like leather jackets, and Jensen Huang scares them.

These were all just thoughts, and again, none proven. But I do think the hybrid is pretty damn cool.
 
From the first time I read about the Intel/AMD hybrid, I wondered the same thing. While I never found an answer, my thoughts on why not:

  • Nvidia was too powerful a company at the moment, while AMD was the weaker link. Any negotiating leverage would be on Nvidia's side, compared to AMD, who for the last several years before the Ryzen release was struggling. Business-wise that is better for Intel, who doesn't need to release high-end gear, just a better low/mid-end product to flood the market.
  • Nvidia's foray into AI, automation, and compute might be something Intel wants to do in the future, and they didn't want to partner with that level of competition.
  • Very far down the theory list for me: Intel peeps don't like leather jackets, and Jensen Huang scares them.

These were all just thoughts, and again, none proven. But I do think the hybrid is pretty damn cool.

I can only smile at your post. Basically it comes down to this: Nvidia takes market revenue from Intel, and Intel gets nothing from Nvidia, because when laptops sell with an Nvidia card added in, Nvidia probably gets paid more for the machine than Intel does. Intel's philosophy is that it can pay AMD a percentage of the extra it earns by selling a complete solution, because if you look at the numbers, why would an OEM even bother with Nvidia again (outside the niche markets)? Intel gets paid more, AMD gets a little, and Nvidia isn't selling anything any time soon. And Intel gets paid more, something they neglected to realize several decades ago.

Nvidia is all about self-promoting Nvidia; that you can get caught up in their scheme is proof that they do it very well.
I wonder why they did not partner w/Nvidia?

You are funny ;)
 
Would like to see these compared to the Raven Ridge APUs. I know a dedicated GPU die and dedicated memory will wipe the floor with a shared-memory APU, but I'd like to know by how much.

Because these things are not going to be cheap, and RR should be affordable.
 
This looks promising, 1080p at high settings with those frame rates, not bad.
 
Impressive. I'm more interested in the Vega M GH performance and power draw, so hopefully we start seeing some deep-dives on that soon.
 
That's not really how it works. Intel uses two different types of TIM as their recommended attach for heatsinks. For the pre-applied goop on the bottom of their OEM heatsinks it's typically Dow Corning TC5622, which has an insanely long lifetime at mildly OK temperatures. For die-attach heatsinks on laptops and such I believe they recommend Shin-Etsu X23, which again has a long lifetime and decent temps. The problem with pointing to a single goop as a replacement is that it can take years to fully qualify a new TIM for production, and it needs to come from a vendor with a long history of doing things right. Arctic Silver, Noctua, Thermal Grizzly, Arctic Cooling, etc. are nothing but boutique vendors who just source pastes from Dow Corning, Shin-Etsu, DuPont, etc.

That being said, I have no idea what Intel uses for the internal TIM between the IHS and the processor die.

Doesn't matter if they use two different TIMs that look exactly the same.

Going back 12+ years of working on Intel-based machines, some of them a lot older than that, the heatsink TIM has always had the same issues. I've even had issues on laptops a year old or less where the TIM has started separating, causing hot spots, which makes the CPU overheat and the fan ramp up to full speed.

This is still happening as of right now.

It really started showing up in the Pentium 4 era. Pretty much every single P4 machine I worked on that was having stability issues related to the CPU was because the TIM had separated. Most of those machines were less than 2 years old when it happened.

As of late, I have had issues with laptops less than a year old.

Screw the stock TIM. I'm about to the point of replacing it with something 100x better before I deploy brand-new machines.
 