Intel Xe / Odyssey

erek
[H]F Junkie · Joined Dec 19, 2005 · Messages: 10,785
opinion?

[Attached image: 114300_O.jpg]
 
I had to google what this was.

I'm whelmed. It seems irrelevant to most of us until they have hardware to sell, but I guess it's good that we're starting to see the trappings of Intel's graphics card lineup. The size of the card in the picture doesn't inspire a whole lot of confidence. If that's what they're showing, it looks like they're more interested in the 1080p esports market than they are in the high end.
 
I don't care about looks... I care about performance/features.
 
I'm excited to see them pushing it and not just leaving us with rumours in the shadows. It makes me feel like this might actually happen.

I'm not too worried about card size; we don't know anything about the design / price / performance. Even if it only competes in the $150-$200 segment, it's still a start. My memory could be failing, but wasn't their last attempt some super-scalable bundle of tiny cores? If the new design works that way, scaling up to a larger / better-performing PCB should be relatively simple.
 
I'm excited to see them pushing it and not just leaving us with rumours in the shadows. It makes me feel like this might actually happen.

I'm not too worried about card size; we don't know anything about the design / price / performance. Even if it only competes in the $150-$200 segment, it's still a start. My memory could be failing, but wasn't their last attempt some super-scalable bundle of tiny cores? If the new design works that way, scaling up to a larger / better-performing PCB should be relatively simple.

Larrabee was a bunch of modified x86 Pentium Pro cores AFAIR
 
No drivers though, right?

:(

I'm embarrassed to admit this, but upon further research and discussion my card is a "QDF S Spec" and potentially (not 100% confirmed) a thermal sample...

I have heard of someone working on Xeon Phi drivers for gaming that might be workable on these Larrabee samples that aren't mechanical/thermal samples.
 

Attachments

  • DSCN6168.JPG (654.2 KB)
Larrabee was a bunch of modified x86 Pentium Pro cores AFAIR

No, just Pentium. Way simpler architecture, and thus way easier to produce en masse on a single chip.

First-gen Atom was in-order dual-issue, with a single 128-bit SSE unit. They worked around the in-order execution by giving it Hyper-Threading. The engineers working on Larrabee started with Atom, then tacked on a 4x-larger vector unit.

This is just pointless marketing fluff, showing off the obvious PCB layouts and chosen coolers. It means they're getting close enough to release to be sending out board samples, but they're nowhere near ready.

They said 2020, and it's still looking like that.
 
I'd really like Intel to succeed in the discrete GPU market. If their cards are a good upgrade from my GTX 1070 next year, I'd definitely consider one. Things have been really boring when it comes to the GPU market.
 
Personally, I'm not expecting Intel to be smashing it out of the park on their first go. What I'd like to see is decent performance for the price (whatever price point this ends up being launched at) with a view to cranking up the raw performance over time. If it's competitive with AMD and Nvidia's offerings at that price bracket then that's great, but it's not the end of the world if it lags behind a bit, so long as Intel can deliver improvements subsequently.

A third player in the market could be the shot in the arm that the GPU space badly needs. But writing Intel off instantly isn't going to help anyone.
 
Don't get your hopes up. Intel has been subsidizing their GPU architecture development through their CPU business for 10 years (now to the point of >2:1 die area in favor of the GPU), and even with this massive default presence it has been extremely unimpressive for anything but media acceleration.
 
Don't get your hopes up. Intel has been subsidizing their GPU architecture development through their CPU business for 10 years (now to the point of >2:1 die area in favor of the GPU), and even with this massive default presence it has been extremely unimpressive for anything but media acceleration.
Intel integrated GPUs aren't bad at all for what they are: an addition to the CPU that has a tiny bit of die space and TDP allocated to it, and no dedicated memory (in most processors).
The Broadwell iGPU was actually faster than the AMD APUs of the time.

Next year will be very interesting for GPU enthusiasts. Both AMD and Intel may or may not support hardware DXR, and either implementation may or may not be any good.
At this point the only sure thing is that in 2020 Nvidia will have great GPUs, as they almost always do, and will most probably remain on top.
So even if both AMD and Intel screw this up, I will be able to upgrade. Hopefully AMD and Intel do their job right and force Nvidia to lower prices on xx70 cards instead of the planned increase :(
 
I don't see why Intel couldn't hit it out of the park on the first go. Maybe not a high-end takeover, but they have the experience to do it right.

Remember, Intel owns 70% of the GPU market with their iGPU solutions. So they do know something, they are not starting from scratch here.
 
Intel integrated GPUs aren't bad at all for what they are: an addition to the CPU that has a tiny bit of die space and TDP allocated to it, and no dedicated memory (in most processors).
The Broadwell iGPU was actually faster than the AMD APUs of the time.

Been gaming on them for years; currently I'm experimenting with one on Linux through Wine.

That's a pretty high level of support from the software side, and given that it can hit 1080p60 in low-end games, the hardware is plenty stout already.
 
It likely uses HBM, thus the size.

I dunno, I would like them to add 4" of extra blank PCB, make me feel like I'm getting my money's worth, right? If it's little it has to be slow!!!

But on a more serious note, this is an actual problem for Intel; they need to let more marketing guys have their say.

The customer doesn't make logical decisions; he just knows that big card = high end = desirable, because that's the way it has always been. After all, big mansions and yachts are yet to go out of style.

It's the same as when you sell a customer a digital code: it helps to give them a 0.0013-cent green plastic shell to go with it, it increases the perceived value of the item.
 
I dunno, I would like them to add 4" of extra blank PCB, make me feel like I'm getting my money's worth, right? If it's little it has to be slow!!!

But on a more serious note, this is an actual problem for Intel; they need to let more marketing guys have their say.

The customer doesn't make logical decisions; he just knows that big card = high end = desirable, because that's the way it has always been. After all, big mansions and yachts are yet to go out of style.

It's the same as when you sell a customer a digital code: it helps to give them a 0.0013-cent green plastic shell to go with it, it increases the perceived value of the item.

No, not big card = high end = desirable.

Heavy = high end = desirable!

I see Intel using HBM2 or something faster.
I also predict them adding a high-end option for an Optane add-on card for memory in rendering, like the AMD Pro SSG, which offers 2 PCIe card slots for an additional 1TB. If Intel can find a way to market more use of Optane drives, they will.

Intel doesn't "just" want gamers, they want the whole enchilada.
They want render farms, professional workstations, and gamers.
 
No, not big card = high end = desirable.

Heavy = high end = desirable!

I see Intel using HBM2 or something faster.
I also predict them adding a high-end option for an Optane add-on card for memory in rendering, like the AMD Pro SSG, which offers 2 PCIe card slots for an additional 1TB. If Intel can find a way to market more use of Optane drives, they will.

Intel doesn't "just" want gamers, they want the whole enchilada.
They want render farms, professional workstations, and gamers.

I recently saw an RTX 2070 FE IRL; the backplate was the sturdiest-looking backplate I've seen on any modern card. Very impressive; it looked like a really expensive product to me.

That perceived-value thing really fucks with your mind. I was happy with the card before I even plugged it in.
 
I recently saw an RTX 2070 FE IRL; the backplate was the sturdiest-looking backplate I've seen on any modern card. Very impressive; it looked like a really expensive product to me.

That perceived-value thing really fucks with your mind. I was happy with the card before I even plugged it in.

They do the same thing with how car doors close. Premium trims will have a distinctive sound and feel sturdy, while cheaper trims will sound hollow and feel like you're lucky it latched properly lol.
Marketing stunt at its best ;) Much like the RGB fad... Some people (normal people, if I may say) are drawn to that eye candy since they don't understand the rest of the specifications anyway.
 
They do the same thing with how car doors close. Premium trims will have a distinctive sound and feel sturdy, while cheaper trims will sound hollow and feel like you're lucky it latched properly lol.
Marketing stunt at its best ;) Much like the RGB fad... Some people (normal people, if I may say) are drawn to that eye candy since they don't understand the rest of the specifications anyway.

I must just be a sucker for the stuff; I fell in love with that card over a fucking metal plate.

And I can't get enough of RGB. I must be some sort of simpleton, because those shiny lights make me ooh and aah.
 
If Intel continues its open-source friendliness, Intel Xe might actually become Linux's new best friend.

Perhaps among the FOSS faithful. Personally, I think all three vendors have very good Linux support, for gaming and otherwise.

What's nice is that Intel is already on that ball, though. I couldn't care less whether they went open or closed, so long as they got it running well, and based on what we already have, they do appear to have set themselves up to succeed in the future.

[personal note: I've been experimenting with pushing as much as I can to Linux; my company is doing the same primarily due to ease of licensing and deployment in addition to the obvious cost benefits]
 
Intel could definitely make inroads on the FOSS Linux community, however small it may be.

For example, the open-source Nvidia driver, Nouveau, lagged so far behind that the GTX 1080 I used to have wasn't supported even ~2 years after launch.

In order to install Ubuntu (and not get a black screen), I needed to swap in an old Nvidia card (a 750 Ti) for the OS install, then load the closed-source Nvidia drivers before swapping the GTX 1080 back in.

Certainly Intel could do better than this, and I hope they do.
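For anyone who hits the same black screen on an older Ubuntu install, the usual workaround at the time (a sketch under assumptions: the exact driver branch varied by release, and nvidia-381 below is just an era-appropriate example, not a confirmed fix for this poster's setup) was to boot the installer with the nomodeset kernel parameter and then pull a newer proprietary driver from the community graphics-drivers PPA, rather than swapping cards:

```shell
# At the GRUB boot menu, press 'e' and append "nomodeset" to the line
# starting with "linux", then boot. This skips the Nouveau modesetting
# path the unsupported GPU chokes on and falls back to a basic framebuffer.

# After installation, add the community PPA that carries newer NVIDIA drivers:
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt-get update

# Install a proprietary driver branch recent enough for Pascal cards
# (assumed example; pick whatever branch the PPA offered for your release):
sudo apt-get install nvidia-381

sudo reboot
```

Not as clean as a driver that just works out of the box, which is exactly the gap an Intel open-source driver could fill.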
 
In order to install Ubuntu (and not get a black screen), I needed to swap in an old Nvidia card (a 750 Ti) for the OS install, then load the closed-source Nvidia drivers before swapping the GTX 1080 back in.

Which version?

I can't say that I could address the issue, but loading up Ubuntu MATE 18.04.2 LTS (the current LTS) had my 1080 Ti with one monitor and RX 560 with three monitors working from the live install image. Once installed, I was able to arrange them properly.
 
I believe it was 16.04. I looked for an answer for a while; most people said to use integrated video for the install, but I didn't have that on this machine at the time.

Don't want to get too side-tracked; the point was that there is room for Intel to release a solid open-source driver, better than what's out there now from Nvidia (or, to a lesser extent, AMD).
 
Don't want to get too side-tracked; the point was that there is room for Intel to release a solid open-source driver, better than what's out there now from Nvidia (or, to a lesser extent, AMD).

Well, that's kind of the point I was getting at above: all three look to be doing quite well in their own way. I see it less as Intel having an opportunity to 'step up' (they've already been doing that) and more that they just need to keep up the hard work.
 