Intel Teases Its New Intel Graphics UI

cageymaru

Intel has released a teaser video showing off the new Intel Graphics UI, along with an announcement that it is "Coming this month." #JoinTheOdyssey

The new control panel for Intel Graphics is coming. Get a taste of what that means for the future of visual computing. The Odyssey awaits.
 
Hard to get excited about the GUI; I don't install GeForce Experience on most of my machines. The Gen11 integrated graphics sound like a big step up, though, and I suppose they'll be launching pretty soon (or I hope so, for Intel's sake). I'm interested in seeing variable rate shading in action. If NVIDIA, Intel in Gen11, and (I believe) AMD in Navi all have it, we should start seeing it in games. I'm skeptical that it can avoid noticeably affecting image quality, but Intel claims it's up to 30% faster on their integrated graphics. For slow graphics, that may be too good to pass up.
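
For anyone curious what variable rate shading looks like from the developer side, here is a minimal sketch of the per-draw (Tier 1) path in the D3D12 API. The `DrawWithCoarseShading` helper is hypothetical, and the `device` and `commandList5` objects are assumed to have been created elsewhere with a recent Windows SDK:

```cpp
#include <d3d12.h>

// Hypothetical helper: shade coarsely for draws that can tolerate it.
void DrawWithCoarseShading(ID3D12Device* device,
                           ID3D12GraphicsCommandList5* commandList5)
{
    // Check that the adapter actually supports Tier 1 VRS. If the query
    // fails, options6 stays zeroed, which reads as NOT_SUPPORTED.
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                &options6, sizeof(options6));
    if (options6.VariableShadingRateTier ==
        D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED)
        return; // fall back to normal full-rate rendering

    // Shade once per 2x2 pixel block instead of once per pixel. This is
    // where a speedup like the claimed 30% would come from: the pixel
    // shader runs on roughly a quarter of the samples.
    commandList5->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);

    // ... record draws that tolerate coarser shading here
    //     (sky, motion-blurred geometry, peripheral regions) ...

    // Restore full-rate shading for detail-critical draws (UI, text).
    commandList5->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
}
```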
 
Well, when AMD revamped its UI some years ago, it was remarkably faster and more responsive; everything was much more modern, easier to use, and simply more intuitive. That's what I believe Intel is shooting for here.

I can't wait to see how their new GPUs perform!
 
So, what I gather from that video is that the new Intel graphics will be able to run some games with mid-range requirements, but only in a small window showing part of the screen at any one time.

:ROFLMAO:
 
Intel has enough video hardware out there that it's about time they got their own version of GeForce Experience to announce driver updates and offer "optimized" settings for their GPUs. If they are actually going to get serious about bringing their own standalone GPUs to market, it will be a requirement at this point.
 
The NVIDIA optimization functionality is pretty terrible, so hopefully this is better than that. Being able to properly target an exact FPS at all costs could be great for power users and casual players alike.
I'm hoping for universal settings like VSync permanently on/off, triple buffering, permanent FPS locks, etc.
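
For illustration, here is a minimal application-side sketch of what a hard FPS lock boils down to; a driver-level limiter would implement the same pacing below the API. The 60 FPS budget and the `RunFrameLoop` function are arbitrary examples, not anything Intel has announced:

```cpp
#include <chrono>
#include <thread>

// Hypothetical frame loop with a hard ~60 FPS cap.
void RunFrameLoop()
{
    using clock = std::chrono::steady_clock;
    constexpr std::chrono::microseconds kFrameBudget{16667}; // ~1/60 s

    auto nextDeadline = clock::now() + kFrameBudget;
    for (;;)
    {
        // ... update and render one frame here ...

        // Sleep out whatever is left of this frame's time slice, then
        // advance the deadline by one budget. Pacing against a fixed
        // schedule (deadline += budget) rather than "now + budget"
        // avoids slow drift away from the target frame rate.
        std::this_thread::sleep_until(nextDeadline);
        nextDeadline += kFrameBudget;
    }
}
```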
 
Based on the hints, it almost makes me think they're going to turn their iGPUs into a cloud render farm.
 
I wonder if NVIDIA will EVER update their antiquated control panel? Probably not... And even if they did, it would probably take up 2 GB+ of install space.
 
A teaser video for a new Intel driver?!? Exactly who is that going to get excited enough to hold off on buying a discrete GPU between now and the actual driver release?


Raja hype train is picking up steam
 
I pine for the day when graphics hardware manufacturers no longer need to tune their drivers for each new piece of software coming out.
 
The NVIDIA control panel could use some updates, but it looks better than most full-screen, touch-friendly UIs these days, and it's easier to navigate than they are.
 
Intel has been trying to develop a GPU for over a decade. The problem is that they can't figure out how to make it efficient; in other words, they have no clue how to make something competitive. A GPU is nothing more than a giant calculator, so Intel has to get its software stack sorted out. To do that, they first need to sort out the silicon so that they don't need translation layers between the software and the hardware. And by now, using x86 as the basis for a GPU is clearly not a good idea, but I guess Intel finally figured that one out.

I can only wonder what Raja promised them. Going by his latest interview, I don't think he can get Intel's GPU to perform competitively in gaming applications, so now he's talking about a full hardware stack for the data center.

Jim Keller is a genius, but I doubt that GPUs are his specialty. I read his interview with AnandTech after he moved from AMD to Intel, and it's headache-inducing: due to the Intel NDA, he can't talk about anything of substance, so it's almost a waste of time reading it. He could say more about Zen around 2014, when he was at AMD, than about what he's doing now at Intel. I really do admire Jim Keller. Raja Koduri, on the other hand, is more of a corporate bullshit artist.

Here is Jim Keller talking about Zen:

 
Iris Pro wasn't bad, but it's not like they're going to be competitive anyway. They try new architectures and then abandon them.
 
I might be remembering this wrong, but back circa 2006-2008 Intel ran its mouth about how it would put NVIDIA out of business in graphics with Larrabee; a decade-plus later, they can't even compete. It would be nice if they did, though... GPUs are way too expensive.
Somewhat of a source here: http://vrworld.com/2009/10/12/an-inconvenient-truth-intel-larrabee-story-revealed/

Ironically, Larrabee promised to be what the RTX series is today: the first real-time ray tracing graphics card.
 
Unless they're adding FreeSync support for iGPUs, like they said they were going to do years ago but still haven't, I don't see how this is news.
 
As much as I would like for Intel to succeed, just to have more competition in the space, the things Raja has said in recent days are part of the problem.

While he was likely attempting to bash AMD by saying he has more resources on his Intel team, my takeaway was that Intel has had those resources available to its graphics team for a while now and still hasn't come up with anything competitive.
 
Any GPU manufacturer should have a phone companion app like AMD Link.

I much prefer it to overlays, etc.
 
Raja did similar things when he was at AMD. He launched the rumor that Dr. Lisa Su had moved people from his team over to Navi, specifically for the custom chip AMD was designing for Sony. It turned out to be nothing more than hogwash.

Long story short, let's put it this way: if Raja Koduri gets the boot from Intel, he has nowhere else to go where he will enjoy fame and fortune as he does at Intel. He burned his bridges at Apple and AMD. The guy is a corporate whore, and there are only so many tricks he can turn before he has to settle for a mediocre job where no one will ever hear from him again. I can imagine that if he tries to go to NVIDIA, Jensen Huang will personally walk him out of the building while uttering the words: "Here at NVIDIA, we don't reward incompetence, empty promises, and BS techno-babble. We only reward hard work and excellence. Good day, sir!"

Don't get me wrong, Raja Koduri probably knows a lot about GPU design, but most likely not enough to make a good product, especially one designed from scratch. He comes off a lot like a Hooli employee from "Silicon Valley"; maybe he's a bit like Gavin Belson himself. At AMD he had only one job, and that was to bring Vega to market on time and make it a decent product. Given what we learned about the architecture with the Radeon VII launch, Vega likes low-latency, high-bandwidth memory: Vega 56 and Vega 64 shipped with 2048-bit HBM2, while Radeon VII shipped with 4096-bit HBM2, and before all of that, Fury shipped with 4096-bit first-generation HBM. Maybe they didn't want to make the interposer big enough to accommodate four HBM2 stacks for Vega, I have no idea, but I am sure that memory latency and bandwidth make a huge difference for Vega; the bandwidth side of that is simple arithmetic (sketched below). Regardless, Vega was about a year late and costly at launch, and for a good chunk of time after that. As the last three NVIDIA generations have shown, efficiency is key to a successful product that performs well. Maybe, after spending several billion dollars in R&D, Intel will figure it out two decades from now, once they first figure out that Raja Koduri is not the man for the job.
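
The bus-width comparison is easy to sanity-check with back-of-the-envelope math. A quick sketch; the per-pin data rates are the commonly cited retail figures, not something from this thread:

```cpp
#include <cstdio>

int main()
{
    // Peak bandwidth (GB/s) = bus width (bits) / 8 * per-pin rate (Gbps).
    struct Card { const char* name; int busBits; double gbps; };
    const Card cards[] = {
        { "R9 Fury X (HBM1)",  4096, 1.0  }, // ->  512 GB/s
        { "Vega 64 (HBM2)",    2048, 1.89 }, // -> ~484 GB/s
        { "Radeon VII (HBM2)", 4096, 2.0  }, // -> 1024 GB/s
    };
    for (const Card& c : cards)
        std::printf("%-20s %6.0f GB/s\n", c.name, c.busBits / 8.0 * c.gbps);
    return 0;
}
```

So the narrower Vega 10 bus left Vega 64 with roughly half of Radeon VII's bandwidth, which fits the claim that the doubled 4096-bit interface is a big part of what Radeon VII gained.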

As depressing as this might sound, I'm not getting my hopes up for any competition in the consumer GPU space. If you want the best, you will have to give your money to NVIDIA for at least another decade or so. AMD has a better chance than Intel of becoming competitive again in this segment. However, I'm sure AMD has done the math and figured out that a considerable upfront investment in better gaming GPUs isn't worth it right now. I think AMD will evolve with time, staying around the "good enough" market segment. As for Intel, I'll believe it when I see their first product that can at the very least perform like an RX 580.
 