Time for another manufacturer to jump into graphics cards?

Discussion in 'Video Cards' started by Nukester, Feb 23, 2018.

  1. DeathFromBelow

    DeathFromBelow [H]ardness Supreme

    Messages:
    7,263
    Joined:
    Jul 15, 2005
I'm hearing a lot of 'it can't be done.'

Somebody call Elon Musk, he must need GPUs for all his cars and rockets and engineers...
     
    Khahhblaab likes this.
  2. braamer

    braamer [H]ard|Gawd

    Messages:
    1,522
    Joined:
    Jun 28, 2004
    How much money in government grants would we need to give Elon in order for him to make a graphics card?
     
  3. defaultluser

    defaultluser [H]ardForum Junkie

    Messages:
    12,615
    Joined:
    Jan 14, 2006

    It certainly CAN be done. Just that Intel has yet to put in the required effort (high-performance, low-bug drivers, more efficient architecture).

    I would be with you on a guy like Musk DISRUPTING the graphics market if it was standing still. But both Nvidia and AMD are making progress on peak performance and performance/watt.

It's much easier to come in and be disruptive when your competition has bled out. The graphics card market has a lot more active competitors, AND (up until last year) was STEADILY LOSING SALES NUMBERS to integrated.
     
  4. James21

    James21 Limp Gawd

    Messages:
    148
    Joined:
    Jan 8, 2009
I don't really think it "can be done" in a profitable business sense.
    Even Intel is not going to try to jump in and make a video card that is performance- and price-competitive with Nvidia's high-end gaming cards (such as the 1080 Ti). The money just isn't there given how much you would have to spend to get there.

    Where Intel is focused is in the compute market that Nvidia, AMD and several others are all working hard on, that is where the profits lie and the chance to get your products to excel for various specific use cases.

Mid-to-low-range integrated stuff, sure, but has anyone been whining about the lack of availability of 1030-class cards?

Nothing that Intel or AMD is currently working on with integrated graphics, or Intel's own graphics design, is going to make any dent in the availability of the "precious" high-end video cards for "gamers".
     
  5. Randall Stephens

    Randall Stephens Limp Gawd

    Messages:
    478
    Joined:
    Mar 3, 2017
    I'm hoping XGI makes a comeback. Hell, maybe Cyrix will jump into the market :D
     
  6. cdabc123

    cdabc123 2[H]4U

    Messages:
    2,264
    Joined:
    Jun 21, 2016
Cards that are very good for compute can often be easily made into a gaming card if the company wants to. If Intel made a high-performance GPU for datacenters, they'd already have a high-performance gaming card if they edit the drivers and possibly the PCB/fan a bit.
     
  7. vegeta535

    vegeta535 2[H]4U

    Messages:
    3,320
    Joined:
    Jul 19, 2013
The only way another player would enter is if Nvidia and AMD went belly up and left a void in the market. If that happens we would go backwards for a bit, because an upstart company will not make a competitive GPU for a while, all while dumping billions into R&D to get something going from scratch. Not to mention all the patent issues that would arise from legit owners and trolls. Samsung or Apple could do it, but they don't see the ROI atm.
     
  8. DeathFromBelow

    DeathFromBelow [H]ardness Supreme

    Messages:
    7,263
    Joined:
    Jul 15, 2005
    The government subsidizes everything, even marriage and home ownership.

Musk financed his own company. SpaceX won their NASA money via a competition with Boeing et al. I don't see what's wrong with that. In a similar vein, I imagine there's money involved in the high-performance computing initiative. The industry doesn't just need a shake-up; we need to address the problem of developing post-silicon computers.
     
    John721 likes this.
  9. Batboy88

    Batboy88 Limp Gawd

    Messages:
    323
    Joined:
    Dec 25, 2017
    Actually would like to see that...Nvidia has definitely always monopolized it sooo bad lol.
     
  10. Icy006

    Icy006 [H]Lite

    Messages:
    76
    Joined:
    Jan 4, 2004
    I'm pretty curious what Apple could pull off in the desktop graphics space. Their A11 is pretty damn impressive in a tiny, power-efficient form-factor.
     
  11. braamer

    braamer [H]ard|Gawd

    Messages:
    1,522
    Joined:
    Jun 28, 2004
    Tesla and SpaceX have received over $5B in government money. The government provides a $7500 rebate on purchases of the Tesla. The average Tesla household makes $503,000 per year. Do you think they need more government handouts?
     
  12. dook43

    dook43 2[H]4U

    Messages:
    2,915
    Joined:
    Sep 9, 2005
    The G400/G400 Max were competitive.

    They didn't survive the jump to the hardware T&L capabilities, and by the time Parhelia was released, they were focusing on features that weren't in games yet (surround, dual/quad texturing), and weren't fast enough to compete with GF3/GF4/Radeon 8500/9700.
     
  13. mouacyk

    mouacyk n00b

    Messages:
    57
    Joined:
    Dec 5, 2015
The problem is a shortage due to mining idiocy... introducing a new GPU manufacturer will just bottleneck the fabs even more. Right now is literally the worst time for a newcomer.
     
  14. cdabc123

    cdabc123 2[H]4U

    Messages:
    2,264
    Joined:
    Jun 21, 2016
Not really. If a manufacturer can make a half-decent card that can mine a bit, it will sell very well even if it can't game well. That would give them the time and money to perhaps put together a future card that's more competitive. Mining has been fantastic so far for the manufacturers. Vega would not fly off the shelves, and AMD wouldn't have been anywhere near as successful this year, if it wasn't for mining. However, if mining crashes it will have horrific effects on both AMD and Nvidia.
     
  15. Gasaraki_

    Gasaraki_ Gawd

    Messages:
    617
    Joined:
    Oct 27, 2016
    How is another "manufacturer" going to fix supply?

One, AMD and nVidia are not "manufacturers". They are chip designers/makers. They sell the GPU processors to other manufacturers like EVGA, ASUS, MSI, Zotac, Gigabyte, PNY, etc.
    Two, they don't make the GPUs themselves. TSMC and Samsung are the "manufacturers" of GPUs.
    Three, the only way another manufacturer can increase supply is to bring in fab capacity that is not currently used to make GPUs. In other words: Intel. If Intel started making discrete GPUs, they would bring their own manufacturing and therefore increase supply.
     
  16. noxqzs

    noxqzs Limp Gawd

    Messages:
    177
    Joined:
    Aug 2, 2013
I believe within 2 years Intel will have their own GPU, one that will be on par with the competition but offer the benefit of being an SoC device. The chief architect from ATI/AMD is due to start this year, and he has hinted that he already has something up his sleeve.
     
  17. Randall Stephens

    Randall Stephens Limp Gawd

    Messages:
    478
    Joined:
    Mar 3, 2017
    If his VEGA track record is any indication...
     
    Denpepe likes this.
  18. PhaseNoise

    PhaseNoise [H]ard|Gawd

    Messages:
    2,000
    Joined:
    May 11, 2005
    The thing is - you have to sustain that development relentlessly to fight in this market, and that costs an absolute fortune.
     
    razor1 likes this.
  19. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005

Yep, dev costs are crazy for GPUs now. I was shocked to see how much more nV has in patent development over AMD in graphics in the past 3 years; they literally have twice the number of patents in the past 3 years. That really shows how far AMD has fallen off the radar when it comes to GPU technology, and to get that back, they need to spend 3 to 4 years of R&D to become competitive again.
     
  20. cdabc123

    cdabc123 2[H]4U

    Messages:
    2,264
    Joined:
    Jun 21, 2016
I would say they have managed to stay competitive throughout all of this. The 290 was a good card, Fury wasn't bad, and Vega is alright (and can mine :p ). I don't know if they are able to pull off a card much better than what Nvidia has, but they keep producing cards that are all decent enough on price/performance.
     
  21. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005
They haven't been able to at all; it shows in the number of graphics patents AMD has filed in the past 3 years vs what nV has done. That is where the R&D nV is spending is going. AMD's total R&D for CPUs and GPUs is the same as what nV has for just graphics. It's a staggering difference. It shows in what happened with Maxwell vs the 3xx series, and Fiji, and Polaris and Vega. AMD has been standing still when it comes to pushing development; they just didn't, and still don't, have the money to do it.
     
  22. Khahhblaab

    Khahhblaab Limp Gawd

    Messages:
    481
    Joined:
    Apr 23, 2017
Many companies depend on 'fabless' manufacturing. It just shows how expensive it is to build a new place to make stuff really, really small. Big money to go down to 14 nm dimensions. The parts that come out of a fab are still the intellectual property of the designer; the factory just made my copyrighted design. If Intel got into the game, supply would increase because they already have a new fab coming online.

    I don't have numbers on how much capacity Intel has in reserve, but I bet it's not much. Memory especially is in a worldwide shortage. An opportunity to fill in some gaps.
     
    razor1 likes this.
  23. UnknownSouljer

    UnknownSouljer [H]ardness Supreme

    Messages:
    6,007
    Joined:
    Sep 24, 2001
Right. Intel is one of the few tech companies that owns its own fabs. They are also the one with the most to lose to GPGPU, as Volta is set to eat Intel's lunch in machine learning and the data center. GPGPU is showing that it scales far better than processors: it costs less and it's faster. If they do nothing, then nVidia will become the new Intel within less than a decade (and that entire time will be the decline of Intel). If they don't get into the fight soon, the "uphill battle" will become an impossible one.

    I don't think there is any way that Intel doesn't make waves, but here's the thing: I don't know if that will have any effect on gamers. Their interest will be workstation and server, then the bottom end with integrated, without too much interest in between. Vertical integration will help them at both the top end and the bottom. I'm not sure they care enough about the middle.
     
    razor1 likes this.
  24. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005

It's already an uphill battle. It took nV 12 going on 13 years to make the CUDA and HPC markets what they are now; it's very hard for Intel to do the same in a decade. It's going to take them 5 years minimum even to think about taking on nV, and in the meantime? 5 years: 3 years to make the hardware, and then another 2 years to get people on board to do software. 2 years for software is very fast, in a market that nV will be entrenched in.

    If they don't hit it out of the park with their first discrete GPU (which I don't see how they will, with nV iterating so fast with vastly different approaches for different market segments), expect another 3 years for the next generation of hardware. In the meantime, software is the key, and that is where Phi failed to deliver: its premise of x86 code sounded great, but the upgrade paths were not good, as many things changed between generations. This can't happen; they need to be backward compatible code-wise for at least 3 or 4 gens, something they overlooked, and now developers aren't using Phi as much.
     
    Last edited: Mar 2, 2018
  25. cdabc123

    cdabc123 2[H]4U

    Messages:
    2,264
    Joined:
    Jun 21, 2016
Depends on where they are trying to compete. Intel already knows how to make a high-powered accelerator; they just don't have much software backing it. I have a lot of the x200 Intel Phis (the bootable CPUs), and if the program can take advantage of the processor to the fullest, the Phi can absolutely kill any GPU out there. For example, mining Monero (which likes a lot of super-fast memory): the Phi will do ~3000 H/s, and the next highest (I don't know what the Tesla V100 does) is a Vega 64 at 2000 H/s. And the Phi is more power efficient and much more versatile, being an actual CPU. I would love to see Intel double the power envelope and double the core count, but they would need to do that via PCIe cards, and they have kinda abandoned those cards.
     
  26. UnknownSouljer

    UnknownSouljer [H]ardness Supreme

    Messages:
    6,007
    Joined:
    Sep 24, 2001
    ...oy, I hate to nitpick, but that's what I said.


I don't disagree with you, in the sense that there will probably have to be work done around the clock to get into the market. However, even with all this, there is no choice. Adapt or die, more or less point blank. GPGPU is the most disruptive thing to happen to processors in probably 30 years.
    However, Intel is uniquely positioned with huge stacks of cash, a massive R&D budget, and the ability to headhunt the best. Under good direction I think they could have something competitive in the next 3 years (with several generations of product that are basically evolutions of where they are trying to go). And if Intel can at least do okay in performance per watt and performance per dollar, it won't hurt nearly as much as being unable to compete in any aspect of the workstation/server market at all.

    Direction is critical. They definitely need strong leadership with vision in this regard. However, Intel obviously has huge numbers of partnerships. If they are smart, which they mostly are, they'll leverage everything they can to get people onto their platforms.
     
    razor1 likes this.
  27. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005

    I agree with ya, I just elaborated a bit more :)

Intel's partnerships only work where they have a stronghold in their current markets, not new ones or ones they are weak in. This is because Intel needs to provide something that will not force their partners to spend exorbitant amounts of money, AKA rewriting software to fit their processors; this has never worked in the entire computer industry at any time. Intel tried it with Itanium: failed miserably. They tried it with Atom and smartphones: failed again. They tried it with tablets: failed a third time, even with contra revenue on tablets. Intel tried to pay developers to program for Itanium as well, and failed there too. These were not small amounts; we are talking 2 billion a year and more!

    By the time Intel has a GPU that can match nV, they will need to have their software stack ready to go too. If they don't, forget it: the HPC/DL markets won't shift at all, and it will all be for naught. Judging by the way AMD has been doing their open-source HSA programs, we won't see anything out of Intel. Intel is part of HSA as well, and they still haven't pushed HSA much. I don't see that changing at all; I haven't heard anything new yet for Phi on this front. HSA is going at a snail's pace, while every version of CUDA is getting better, with more features and speed-specific features; we aren't seeing the same adoption of hardware or software techniques from HSA members.
     
    UnknownSouljer likes this.
  28. Shintai

    Shintai [H]ardness Supreme

    Messages:
    5,691
    Joined:
    Jul 1, 2016
Let's be honest, RTG in AMD has been running on fumes for quite a few years due to minimal funding.

    However, don't really expect Intel to be much different besides bigger R&D and better nodes. A CPU company is always a CPU company first.
     
  29. noxqzs

    noxqzs Limp Gawd

    Messages:
    177
    Joined:
    Aug 2, 2013
    IMHO, VEGA is only a small piece of the pie Koduri helped create. He was originally with ATI when the Radeon line was created and that has been a driving force as far as competition with Nvidia. I have much respect for the guy, and hope he can pull something out of his hat and we profit.
     
  30. PhaseNoise

    PhaseNoise [H]ard|Gawd

    Messages:
    2,000
    Joined:
    May 11, 2005
    Once the Bitboys Oy Glaze3d finally ships, NV, 3dfx, and AMD will be in serious trouble I tell you.
     
    razor1 likes this.
  31. Mega6

    Mega6 [H]ard|Gawd

    Messages:
    1,863
    Joined:
    Aug 13, 2017
    yeah 3dfx is in for it.
     
    razor1 likes this.
  32. WhoBeDaPlaya

    WhoBeDaPlaya 2[H]4U

    Messages:
    2,485
    Joined:
    Dec 16, 2002
S3 ViRGE + Cyrix = party like it's the '90s!
     
    razor1 likes this.
  33. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005

    Quite literally!
     
    Khahhblaab, Maddness and WhoBeDaPlaya like this.
  34. Khahhblaab

    Khahhblaab Limp Gawd

    Messages:
    481
    Joined:
    Apr 23, 2017
Just don't forget that a graphics card without a CPU is a brick. GPGPU only leverages a GPU when it gets code that can be processed efficiently because it's easily parallel.

    It still follows the old axiom:

    Use a CPU to do complex manipulations to a small set of data.
    Use a GPU to do simple manipulations to a large set of data.
    <paraphrased>
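The axiom above can be sketched in plain Python. This is a stand-in for real CUDA/OpenCL kernels, not an actual GPU program; `cpu_style` and `gpu_style` are made-up names purely for illustration:

```python
def cpu_style(records):
    """Complex, branchy manipulation of a SMALL data set.

    Divergent control flow like this is what CPUs handle well
    and what stalls lockstep GPU thread groups.
    """
    out = []
    for r in records:
        if r["kind"] == "a":
            out.append(r["value"] * 2 + 1)
        elif r["kind"] == "b":
            out.append(r["value"] ** 2)
        else:
            out.append(-r["value"])
    return out


def gpu_style(xs):
    """One simple, uniform manipulation over a LARGE data set.

    On a real GPU, each element would be computed by its own
    thread; the list comprehension stands in for a kernel launch.
    """
    return [x * 2.0 for x in xs]


small = [{"kind": "a", "value": 3}, {"kind": "b", "value": 4}]
print(cpu_style(small))             # [7, 16]
print(gpu_style([1.0, 2.0, 3.0]))   # [2.0, 4.0, 6.0]
```

The same uniform-operation shape is why GPGPU workloads (matrix multiplies, hashing, pixel shading) map so cleanly onto thousands of GPU threads, while the branchy bookkeeping stays on the CPU.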
     
    razor1 likes this.
  35. UnknownSouljer

    UnknownSouljer [H]ardness Supreme

    Messages:
    6,007
    Joined:
    Sep 24, 2001
Sure. But things are going to get reversed in big data. It's likely that there will be a move from 8-CPU systems to 2-CPU systems in favor of having 4+ video cards.

    Additionally, there has been no major movement on the CPU side since Sandy Bridge, with mostly just a focus on optimization and decreased TDP.

    Hell, there's even a chance that ARM will take over with lower-cost processors, with Nvidia taking the data/AI side and Intel just getting squeezed out. (Nvidia's autonomous driving platform, as an example, utilized two custom ARM processors, which isn't precisely a server, but it starts there.)

    I'll freely admit that I'm spitballing about the consequences, but what I'm not spitballing about is that there is going to be a big shift to graphics cards that will for sure affect Intel in terms of how much they make and their prominence in the market.
     
    razor1 likes this.
  36. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005

True, both of you are correct, but the future is a bit different. One of Volta's key changes in its SIMT is enabling fine-grained transitions between threads at the ALU level, something that CPUs were capable of doing and GPUs weren't.
     
  37. dexvx

    dexvx [H]ard|Gawd

    Messages:
    1,104
    Joined:
    Aug 14, 2002
That's not entirely the reason. It's true that the G400/MAX was competitive with (and beat) the TNT2/Ultra, but the GeForce 256 was released shortly after. Matrox just didn't want to raise venture capital for R&D purposes. Nvidia/ATI raised a LOT of venture capital and/or took on corporate debt; it was the primary reason they rocketed ahead of everyone else. Matrox just played it safe and thought they could survive in their niche market. But the Quadro NVS pretty much forced Matrox back in, and by that time it was too late. I remember Matrox stating they would need hundreds of millions of dollars to develop a competitive GPU from scratch. It was not a gamble they were willing to take.

Aside from Intel, the only other tech company that could remotely come close to a competitive GPU is PowerVR. They would need to scale their phone GPUs to desktop TDPs.
     
  38. Factum

    Factum [H]ard|Gawd

    Messages:
    1,863
    Joined:
    Dec 24, 2014
Shortage in the US, you mean, right?
    Cards are in stock in EU/Asia, both for consumers and enterprise.
    The problem is that people in the US think that the US being 3rd priority for SKUs means there is a global shortage. There isn't; the US is just allocated cards last, simple as that. Hence why some people in the US think there is a global shortage.
     
    Shintai likes this.
  39. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,155
    Joined:
    Jul 14, 2005
     
  40. defaultluser

    defaultluser [H]ardForum Junkie

    Messages:
    12,615
    Joined:
    Jan 14, 2006
Yeah, Matrox was doomed unless they bet it all on performance 3D. It was fairly easy to release a card competitive with 3dfx, but taking it to the next level with T&L and competent performance was a task too hard for most companies.

Between 1998 and 2002, six "competent" consumer 3D companies were completely plowed over by Nvidia:

    Rendition
    3DLabs
    3dfx
    S3
    Real3D
    Matrox

    Because doing more than basic 3D takes some real engineering talent. That's why ATI's purchase of ArtX was so important, because Nvidia already started with a few SGI engineers (who knew what lighting and effects were). And 3dfx wasted their SGI talent, until Nvidia bought them.

    http://boards.fool.com/battle-of-the-ex-sgi-engineers-10143854.aspx

    It wasn't new technology, just new for the consumer space. SGI didn't know how to manage that transition, so they bled engineers and eventually died.

    Nvidia and ATI didn't really start innovating until DX9. Everything prior to that was available on professional cards in the 1990s. Just see RealityEngine2, which has hardware MSAA, and real-time lighting and environment mapping.

    Up until that, they just copied stuff from the engineers they bought.
     
    Last edited: Mar 5, 2018
    razor1 likes this.