Intel Teases Its New Intel Graphics UI

Discussion in 'HardForum Tech News' started by cageymaru, Mar 12, 2019.

  1. cageymaru

    cageymaru [H]ard as it Gets

    Messages:
    19,725
    Joined:
    Apr 10, 2003
    Intel has released a teaser video that shows off the new Intel Graphics UI with an announcement of "Coming this month." #JoinTheOdyssey

    The new control panel for Intel Graphics is coming. Get a taste of what that means for the future of visual computing. The Odyssey awaits.
     
  2. the-one1

    the-one1 2[H]4U

    Messages:
    2,982
    Joined:
    Jan 16, 2003
    Ooooooooooo. A GUI!
     
    auntjemima likes this.
  3. andrewaggb

    andrewaggb Limp Gawd

    Messages:
    424
    Joined:
    Oct 6, 2004
    Hard to get excited about the GUI. I don't install GeForce Experience on most of my machines. The Gen11 integrated graphics sound like a big step up, though, and I suppose they'll be launching pretty soon (or so I hope, for Intel's sake). I'm interested in seeing variable rate shading in action. If NVIDIA, Intel with Gen11, and (I believe) AMD with Navi all have it, we should start seeing it in games. I'm skeptical that it won't noticeably affect image quality, but Intel claims it's up to 30% faster on their integrated graphics. For slow graphics, that may be too good to pass up.
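    The "up to 30% faster" claim for variable rate shading comes down to simple arithmetic: regions flagged as low-detail get shaded once per pixel block instead of once per pixel. A toy back-of-envelope sketch (the function name, parameters, and numbers here are illustrative assumptions, not Intel's figures):

```python
# Toy model of variable rate shading (VRS): low-detail screen regions are
# shaded once per 2x2 (or 4x4) pixel block instead of once per pixel.
# All names and numbers here are illustrative, not from Intel.

def shading_invocations(width, height, coarse_fraction, block=2):
    """Pixel-shader invocations when `coarse_fraction` of the screen
    is shaded at one invocation per `block` x `block` pixels."""
    total = width * height
    coarse = total * coarse_fraction
    fine = total - coarse
    return fine + coarse / (block * block)

full_rate = shading_invocations(1920, 1080, 0.0)          # no VRS
with_vrs = shading_invocations(1920, 1080, 0.5, block=2)  # half the screen at 2x2
savings = 1 - with_vrs / full_rate                        # 0.375 -> ~37% fewer invocations
```

    Fewer shader invocations only approximate a frame-time win; how much shows up as FPS depends on how shader-bound the workload is, which is presumably why the claim is phrased as "up to."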
     
  4. DeeFrag

    DeeFrag [H]ardness Supreme

    Messages:
    5,501
    Joined:
    Jan 14, 2005
    Intel deserves to win a participation award for this.
     
    Auer, lenne0815 and PeaKr like this.
  5. cageymaru

    cageymaru [H]ard as it Gets

    Messages:
    19,725
    Joined:
    Apr 10, 2003
    Well, when AMD revamped their UI some years ago, it became remarkably faster and more responsive. Everything was much more modern, easier to use, and simply more intuitive. That's what I believe Intel is shooting for here.

    I can't wait to see how their new GPUs perform!
     
    Auer and andrewaggb like this.
  6. katanaD

    katanaD [H]ard|Gawd

    Messages:
    1,987
    Joined:
    Nov 15, 2016
    So... what I gather from that video is that the new Intel graphics will be able to run some mid-range games, but only in a small window showing part of the screen at any one time.

    :ROFLMAO:
     
  7. Lakados

    Lakados [H]ard|Gawd

    Messages:
    1,490
    Joined:
    Feb 3, 2014
    Intel has enough video hardware out there that it's about time they got their own version of GeForce Experience, announced driver updates, and offered "optimized" settings for their GPUs. If they're actually going to get serious about bringing their own standalone GPUs to market, it will be a requirement at this point.
     
  8. Domingo

    Domingo [H]ard as it Gets

    Messages:
    16,999
    Joined:
    Jul 30, 2004
    The NVIDIA optimization functionality is pretty terrible, so hopefully this is better than that. Being able to properly target an exact FPS, at all costs, could be cool for power users and casual players alike.
    I'm hoping for universal settings like vsync permanently on/off, triple buffering, permanent FPS caps, etc.
     
  9. TheBuzzer

    TheBuzzer HACK THE WORLD!

    Messages:
    12,472
    Joined:
    Aug 15, 2005
    Based on the hints, it almost makes me think they're going to turn their iGPUs into a cloud render farm.
     
  10. Stimpy88

    Stimpy88 [H]ard|Gawd

    Messages:
    1,273
    Joined:
    Feb 18, 2004
    I wonder if nVidia will EVER update their antiquated control panel? Probably not... And even if they did, it would probably take up 2 GB-plus of install space.
     
  11. ecmaster76

    ecmaster76 [H]ard|Gawd

    Messages:
    1,152
    Joined:
    Feb 6, 2007
    A teaser video for a new Intel driver?!? Exactly who is that going to get excited enough to hold off on buying a discrete GPU between now and the actual driver release?

    The Raja hype train is picking up steam.
     
  12. STEM

    STEM Gawd

    Messages:
    561
    Joined:
    Jun 7, 2007
    Ah, that Raja flavoured bulls... hype-pie-in-the-sky...
     
  13. horskh

    horskh Limp Gawd

    Messages:
    135
    Joined:
    Jan 19, 2018
    I pine for the day when graphics hardware manufacturers no longer need to tune their drivers for each new piece of software coming out.
     
    auntjemima likes this.
  14. KrS

    KrS n00b

    Messages:
    42
    Joined:
    Jan 29, 2015
    WOOT!!! Dark mode!!!

    ...
     
  15. cbutters

    cbutters Gawd

    Messages:
    512
    Joined:
    Dec 30, 2005
  16. Flogger23m

    Flogger23m [H]ardForum Junkie

    Messages:
    9,810
    Joined:
    Jun 19, 2009
    It could use some updates, but it looks better than most full-screen, touch-friendly UIs these days, and it's easier to navigate than they are.
     
  17. STEM

    STEM Gawd

    Messages:
    561
    Joined:
    Jun 7, 2007
    Intel has been trying to develop a GPU for over a decade. The problem is that they can't figure out how to make it efficient; in other words, they have no clue how to make something competitive. A GPU is nothing more than a giant calculator, so Intel has to get their software stack sorted out. To do that, they first need to sort out the silicon so that they don't need translation layers between the software and the hardware. Using x86 as the basis for a GPU is not a good idea, but I guess Intel has finally figured that one out by now.

    I can only wonder what Raja promised them. Going by his latest interview, I don't think he can get Intel's GPU to perform competitively in gaming applications, so now he's talking about a full hardware stack for the data center.

    Jim Keller is a genius, but I doubt that GPUs are his specialty. I read his interview with AnandTech after he moved from AMD to Intel, and it's headache-inducing: because of the Intel NDA he can't talk about anything of substance, so it's almost a waste of time to read. He could say more about Zen around 2014, when he was at AMD, than about what he's doing now at Intel. I really admire Jim Keller; he is a true genius. Raja Koduri, on the other hand, is more of a corporate bullshit artist.

    Here is Jim Keller talking about Zen:

     
  18. Nausicaa

    Nausicaa [H]Lite

    Messages:
    120
    Joined:
    Mar 9, 2015
    Iris Pro wasn't bad, but it's not like they're going to be competitive anyway. They try new architectures and abandon them.
     
  19. wyqtor

    wyqtor Limp Gawd

    Messages:
    393
    Joined:
    Dec 30, 2011
    Ironically, Larrabee promised to be what the RTX series is today: the first real-time ray-tracing graphics card.
     
  20. Staples

    Staples [H]ardness Supreme

    Messages:
    7,786
    Joined:
    Jul 18, 2001
    Unless they are adding FreeSync support for iGPUs, like they said they were going to do years ago but still haven't, I don't see how this is news.
     
  21. tangoseal

    tangoseal [H]ardness Supreme

    Messages:
    7,307
    Joined:
    Dec 18, 2010
    Is it a 14nm++++++++++ GUI too?
     
  22. dvsman

    dvsman 2[H]4U

    Messages:
    2,638
    Joined:
    Dec 2, 2009
    As much as I would like Intel to succeed, just to have more competition in the space, the things Raja has said recently are part of the problem.

    While he was likely trying to take a dig at AMD by saying he has more resources on the Intel team, my takeaway was that Intel has had those resources available to its graphics team for a while and still hasn't come up with anything competitive.
     
    STEM likes this.
  23. Auer

    Auer Gawd

    Messages:
    632
    Joined:
    Nov 2, 2018
    Any GPU manufacturer should have a phone companion app like AMD link.

    I much prefer it to overlays etc.
     
  24. STEM

    STEM Gawd

    Messages:
    561
    Joined:
    Jun 7, 2007
    He did similar things when he was at AMD. He launched the rumor that Dr. Lisa Su moved people from his team over to Navi, specifically for the custom chip design that AMD was making for Sony. It turned out to be nothing more than hogwash.

    Long story short, let's put it this way: if Raja Koduri gets the boot from Intel, he has nowhere else to go where he will enjoy the fame and fortune he does at Intel. He burned bridges at Apple and AMD. The guy is a corporate whore, and there are only so many tricks he can turn before he has to settle for a mediocre job where no one will ever hear from him again. I can imagine that if he tries to go to NVIDIA, Jensen Huang will personally walk him out of the building while uttering the words: "Here at NVIDIA, we don't reward incompetence, empty promises, and BS techno-babble. We only reward hard work and excellence. Good day, Sir!"

    Don't get me wrong, Raja Koduri probably knows a lot about GPU design, but most likely not enough to make a good product, especially one designed from scratch. He comes off a lot like a Hooli employee from "Silicon Valley"; maybe he's a bit like Gavin Belson himself. At AMD he had only one job, and that was to bring Vega to market on time and make it a decent product. Given what we have learned about the architecture with the Radeon VII launch, Vega likes low-latency, high-bandwidth memory. Vega 56 and Vega 64 shipped with 2048-bit HBM2, while Radeon VII shipped with 4096-bit HBM2. Before all of that, Fury shipped with 4096-bit first-generation HBM. Maybe they didn't want to make the interposer big enough to accommodate four HBM2 stacks for Vega, I have no idea, but I am sure that memory latency and bandwidth make a huge difference for Vega. Regardless, it was about a year late and costly at launch, and for a good chunk of time after that. As we have seen with the last three NVIDIA generations, efficiency is key to a successful product that performs well. Maybe after spending several billion dollars in R&D, two decades from now they will figure it out at Intel, once they first figure out that Raja Koduri is not the man for the job.
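    The bus widths mentioned above translate directly into bandwidth. A rough sketch; the per-pin data rates are approximate retail specs I'm supplying for illustration, not figures from this thread:

```python
# Back-of-envelope memory bandwidth: bus width (bits) times per-pin
# data rate (Gbit/s), divided by 8 bits per byte.
# Data rates below are approximate retail specs, supplied for illustration.

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

vega_64 = bandwidth_gbs(2048, 1.89)    # ~484 GB/s
radeon_vii = bandwidth_gbs(4096, 2.0)  # 1024 GB/s, roughly double Vega 64
```

    Doubling the bus width at a similar data rate is exactly what Radeon VII did, which fits the observation that Vega scales with memory bandwidth.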

    As depressing as this might sound, I'm not getting my hopes up for any competition in the consumer GPU space. If you want the best, you will have to give your money to NVIDIA for at least another decade or so. AMD has a better chance than Intel of becoming competitive again in this segment. However, I'm sure AMD has done the math and figured out that it's not worth a considerable upfront investment to make better gaming GPUs right now. I think AMD will evolve with time, staying around the "good enough" market segment. As for Intel, I'll believe it when I see their first product that can at least perform like an RX 580.
     
    Auer likes this.