Time for another manufacturer to jump into graphics cards?

I'm hearing a lot of 'it can't be done.'

Somebody call Elon Musk; he must need GPUs for all his cars and rockets and engineers...
 


It certainly CAN be done. Just that Intel has yet to put in the required effort (high-performance, low-bug drivers, more efficient architecture).

I would be with you on a guy like Musk DISRUPTING the graphics market if it was standing still. But both Nvidia and AMD are making progress on peak performance and performance/watt.

It's much easier to come in and be disruptive when your competition has bled out. The graphics card market has a lot more active competitors, AND (up until last year) was STEADILY LOSING SALES NUMBERS to integrated.
 
I don't really think it "can be done" in a profitable business sense.
Even Intel is not going to try to jump in and make a video card that is performance- and price-competitive with Nvidia's high-end gaming cards (such as the 1080 Ti). The money just isn't there given how much you would have to spend to get there.

Where Intel is focused is the compute market that Nvidia, AMD, and several others are all working hard on; that is where the profits lie, along with the chance to get your products to excel at various specific use cases.

Mid/low-range integrated stuff, sure, but has anyone been whining about the lack of availability of 1030-class cards?

Nothing that Intel or AMD are currently working on with integrated graphics, or Intel's own graphics design, is going to make any dent in the availability of the "precious" high-end video cards for "gamers".
 

Cards that are very good for compute can often be turned into gaming cards easily if the company wants to. If Intel made a high-performance GPU for datacenters, they would already have a high-performance gaming card; they'd just need to tweak the drivers and possibly the PCB/fan a bit.
 
The only way another player would enter is if Nvidia and AMD went belly up and left a void in the market. If that happened we would go backwards for a bit, because an upstart company would not make a competitive GPU for a while, all while dumping billions into R&D to get something going from scratch. Not to mention all the patent issues that would arise from legitimate owners and trolls. Samsung or Apple could do it, but they don't see the ROI at the moment.
 
How much money in government grants would we need to give Elon in order for him to make a graphics card?

The government subsidizes everything, even marriage and home ownership.

Musk financed his own company. SpaceX won their NASA money via a competition with Boeing et al. I don't see what's wrong with that. In a similar vein, I imagine there's money involved in the high-performance computing initiative. The industry doesn't just need a shake-up; we need to address the problem of developing post-silicon computers.
 
Actually would like to see that...Nvidia has definitely always monopolized it sooo bad lol.
 
I'm pretty curious what Apple could pull off in the desktop graphics space. Their A11 is pretty damn impressive in a tiny, power-efficient form-factor.
 
The government subsidizes everything, even marriage and home ownership.

Musk financed his own company. SpaceX won their NASA money via a competition with Boeing et al. I don't see what's wrong with that. In a similar vein, I imagine there's money involved in the high-performance computing initiative. The industry doesn't just need a shake-up; we need to address the problem of developing post-silicon computers.

Tesla and SpaceX have received over $5B in government money. The government provides a $7500 rebate on purchases of the Tesla. The average Tesla household makes $503,000 per year. Do you think they need more government handouts?
 
I liked Matrox cards as well (I had/have a G400 Max, a G450, and a Parhelia 256). They were great for 2D, but Matrox never had anything that could compete with their peers of the time in 3D gaming. Plus, their cards always carried a premium price. They were wise to get out of the market--I doubt we'll ever see them back in it again.

The G400/G400 Max were competitive.

They didn't survive the jump to hardware T&L, and by the time Parhelia was released, they were focusing on features that weren't in games yet (surround, dual/quad texturing) and weren't fast enough to compete with the GF3/GF4/Radeon 8500/9700.
 
The problem is a shortage due to mining idiocy... introducing a new GPU manufacturer will just bottleneck the fabs even more. Right now is literally the worst time for a newcomer.
 

Not really. If a manufacturer can make a half-decent card that can mine a bit, it will sell very well even if it can't game well. That would give them the time and money to perhaps put together a future card that's more competitive. Mining has been fantastic so far for the manufacturers. Vega would not be flying off the shelves, and AMD wouldn't have been anywhere near as successful this year, if it wasn't for mining. However, if mining crashes it will have horrific effects on both AMD and Nvidia.
 
Would you purchase a card from a new manufacturer? Not ATI/AMD or NVidia for your gaming card? I think it's time for a new "3dfx" to pop into the game. I'm fed up with the prices and the supply chain. We need more competition. It's the only way to fix this f'in problem brought about by cryptoassholes.

How is another "manufacturer" going to fix supply?

One, AMD and nVidia are not "manufacturers". They are chip designers/makers. They sell the GPU processors to other manufacturers like EVGA, ASUS, MSI, Zotac, Gigabyte, PNY, etc.
Two, they don't make the GPUs themselves. TSMC and Samsung are the "manufacturers" of GPUs.
Three, the only way another manufacturer can increase supply is to bring in manufacturing capacity that is not currently used to make GPUs. In other words, Intel. If Intel started making discrete GPUs, they would bring their own manufacturing and therefore increase supply.
 
I believe within two years Intel will have their own GPU, one that will be on par with the competition but offer the benefit of being an SoC device. The chief architect from ATI/AMD is due to start this year, and he has already hinted that he has something up his sleeve.
 

If his VEGA track record is any indication...
 
The G400/G400 Max were competitive.

They didn't survive the jump to hardware T&L, and by the time Parhelia was released, they were focusing on features that weren't in games yet (surround, dual/quad texturing) and weren't fast enough to compete with the GF3/GF4/Radeon 8500/9700.

The thing is - you have to sustain that development relentlessly to fight in this market, and that costs an absolute fortune.
 


Yep, dev costs are crazy for GPUs now. I was shocked to see how much more nV has in patent development over AMD in graphics in the past 3 years; they literally have 2x the number of patents in that span. That really shows how far AMD has fallen off the radar when it comes to GPU technology, and to get that back, they need to spend 3 to 4 years of R&D to become competitive again.
 

I would say they have managed to stay competitive throughout all of this. The 290 was a good card, Fury wasn't bad, and Vega is alright (and can mine :p ). I don't know if they are able to pull off a card much better than what Nvidia has, but they keep producing cards that are all decent enough on price/performance.
 

They haven't been able to at all; it shows in the number of graphics patents AMD has filed in the past 3 years vs. what nV has done. That is where the R&D money nV is spending is going. AMD's total R&D for CPUs and GPUs is the same as what nV has for graphics alone. It's a staggering difference. It shows in what happened with Maxwell vs. the 3xx series, and Fiji, and Polaris, and Vega. AMD has been standing still when it comes to pushing development; they just didn't, and still don't, have the money to do it.
 
How is another "manufacturer" going to fix supply?

One, AMD and nVidia are not "manufacturers". They are chip designers/makers. They sell the GPU processors to other manufacturers like EVGA, ASUS, MSI, Zotac, Gigabyte, PNY, etc.
Two, they don't make the GPUs themselves. TSMC and Samsung are the "manufacturers" of GPUs.
Three, the only way another manufacturer can increase supply is to bring in manufacturing capacity that is not currently used to make GPUs. In other words, Intel. If Intel started making discrete GPUs, they would bring their own manufacturing and therefore increase supply.

Many companies depend on the "fabless" manufacturing model. It just shows how expensive it is to build a new place to make stuff really, really small; it's big money to get down to 14 nm dimensions. The parts that come out of a fab are still the intellectual property of the designer; the factory just made my copyrighted design. If Intel got into the game, supply would increase, but only because they already have a new fab coming online.

I don't have numbers on how much spare capacity Intel has on hand, but I bet it's not much. Memory especially is in a worldwide shortage. There's an opportunity to fill in some gaps.
 
Right. Intel is one of the few tech companies that owns its own fabs. They are also the one with the most to lose to GPGPU, as Volta is set to eat Intel's lunch in machine learning and the data center. GPGPU is showing that it scales far better than CPUs; it costs less and it's faster. If they do nothing, then nVidia will become the new Intel within less than a decade (and that entire time will be the decline of Intel). If they don't get into the fight soon, the "uphill battle" will become an impossible one.

I don't think there is any way that Intel doesn't make waves, but here's the thing: I don't know if that will have any effect on gamers. Their interest will be workstation and server, then the bottom end with integrated, without too much interest in between. Vertical integration will help them at both the top end and the bottom. I'm not sure they care enough about the middle.
 


It's already an uphill battle. It took nV 12 going on 13 years to make the CUDA and HPC markets what they are now; it's very hard for Intel to do the same in a decade. It's going to take them 5 years minimum to even think about taking on nV, and in the meantime? Of those 5 years, 3 are to make the hardware, and then another 2 to get people on board to do software. 2 years for software is very fast in a market that nV will be entrenched in.

If they don't hit it out of the park with their first discrete GPU (and I don't see how they will, with nV iterating so fast with vastly different approaches for different market segments), expect it to be another 3 years for the next generation of hardware. In the meantime, software is the key, and that is where Phi failed to deliver: its premise of x86 code sounded great, but upgrade paths were not good, as many things changed between generations. That can't happen again; they need to be backward compatible, code-wise, for at least 3 or 4 generations, something they overlooked, and now developers aren't using Phi as much.
 

It depends on where they are trying to compete. Intel already knows how to make a high-powered accelerator; they just don't have much software backing it. I have a lot of the x200 Intel Phis (the bootable CPUs), and if the program can take advantage of the processor to the fullest, the Phi can absolutely kill any GPU out there. For example, mining Monero (which likes a lot of super-fast memory), the Phi will do ~3000 H/s; the next highest (I don't know what the Tesla V100 does) is a Vega 64 at 2000 H/s, and the Phi is more power-efficient and much more versatile, being an actual CPU. I would love to see Intel double the power envelope and double the core count, but they would need to do that via PCIe cards, and they have kind of abandoned those cards.
 
It's already an uphill battle...

...oy, I hate to nitpick, but that's what I said.


It took nV 12 going on 13 years to make the CUDA and HPC markets what they are now; it's very hard for Intel to do the same in a decade. It's going to take them 5 years minimum to even think about taking on nV, and in the meantime? Of those 5 years, 3 are to make the hardware, and then another 2 to get people on board to do software. 2 years for software is very fast in a market that nV will be entrenched in.

If they don't hit it out of the park with their first discrete GPU (and I don't see how they will, with nV iterating so fast with vastly different approaches for different market segments), expect it to be another 3 years for the next generation of hardware. In the meantime, software is the key, and that is where Phi failed to deliver: its premise of x86 code sounded great, but upgrade paths were not good, as many things changed between generations. That can't happen again; they need to be backward compatible, code-wise, for at least 3 or 4 generations, something they overlooked, and now developers aren't using Phi as much.

I don't disagree with you, in the sense that there will probably have to be work done around the clock to get into the market. But even with all this, there is no choice. Adapt or die, more or less point blank. GPGPU is the most disruptive thing to processors in probably 30 years.
However, Intel is uniquely positioned with huge stacks of cash, a massive R&D budget, and the ability to headhunt the best. Under good direction I think they could have something competitive in the next 3 years (with several generations of product that are basically evolutions of where they are trying to go). And if Intel can at least do okay in performance per watt and performance per dollar, it won't hurt nearly as much as being unable to compete in any aspect of the workstation/server market.

Direction is critical. They definitely need strong leadership with vision in this regard. However Intel obviously has huge amounts of partnerships. If they are smart, which they mostly are, they'll leverage everything they can to get people on their platforms.
 


I agree with ya, I just elaborated a bit more :)

Intel's partnerships only work where they have a stronghold in their current markets, not in new ones or ones they are weak in. This is because Intel needs to provide something that will not force their partners to spend exorbitant amounts of money, a.k.a. rewriting software to fit their processors; that has never worked in the entire computer industry at any time. Intel tried it with Itanium and failed miserably. They tried it with Atom and smartphones, failed again; they tried it with tablets, failed a third time, even with contra revenue on tablets. Intel tried to pay developers to program for Itanium as well, and failed there too. These were not small amounts; we are talking 2 billion a year and more!

By the time Intel has a GPU that can match nV, they will need to have their software stack ready to go too. If they don't, forget it: the HPC/DL markets won't shift at all, and it will all be for naught. Judging by the way AMD has been doing their open-source HSA programs, we won't see anything out of Intel either. Intel is part of HSA as well, and they still haven't pushed HSA much. I don't see that changing at all; I haven't heard anything new for Phi on this front. HSA is going at a snail's pace, while every version of CUDA is getting better, with more features and speed-specific features, and we aren't seeing the same adoption of hardware or software techniques from HSA members.
 
If his VEGA track record is any indication...

Let's be honest, RTG within AMD has been running on fumes for quite a few years due to minimal funding.

However, don't really expect Intel to be much different, aside from a bigger R&D budget and better nodes. A CPU company is always a CPU company first.
 
IMHO, VEGA is only a small piece of the pie Koduri helped create. He was originally with ATI when the Radeon line was created, and that line has been a driving force in the competition with Nvidia. I have much respect for the guy, and I hope he can pull something out of his hat that we all profit from.
 
Once the Bitboys Oy Glaze3d finally ships, NV, 3dfx, and AMD will be in serious trouble I tell you.
 
GPGPU is the most disruptive thing to processors in probably 30 years

Just don't forget that a graphics card without a CPU is a brick. GPGPU only leverages the GPU when it gets code that is easily parallelizable.

It still follows the old axiom:

Use a CPU to do complex manipulations to a small set of data.
Use a GPU to do simple manipulations to a large set of data.
<paraphrased>
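
To make that axiom concrete, here's a rough CUDA sketch (my own toy example, not from anything above; the kernel, sizes, and names are purely illustrative). The GPU side applies one trivial operation to millions of independent elements, while the CPU side runs a small, serially dependent loop that a GPU would handle badly:

Code:
#include <cstdio>
#include <cuda_runtime.h>

// GPU: one simple operation applied independently to millions of elements
// ("simple manipulations to a large set of data").
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

// CPU: a branchy loop where every step depends on the previous one
// ("complex manipulations to a small set of data").
float cpu_serial(const float *x, int n)
{
    float acc = 0.0f;
    for (int i = 0; i < n; ++i)
        acc = (x[i] > acc) ? x[i] - acc : acc + x[i];
    return acc;
}

int main()
{
    const int n = 1 << 24;  // ~16M elements for the GPU side
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %.1f, cpu_serial(1024) = %.1f\n", y[0], cpu_serial(x, 1024));
    cudaFree(x);
    cudaFree(y);
    return 0;
}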
 

Sure. But things are going to get reversed in big data. It's likely that there will be a move from 8-CPU systems to 2-CPU systems in favor of having 4+ video cards.

Additionally, there has been no major movement on the CPU side since Sandy Bridge, with mostly just a focus on optimization and decreased TDP.

Hell, there's even a chance that ARM will take over with lower-cost processors, with Nvidia taking the data/AI side and Intel just getting squeezed out. (Nvidia's autonomous driving platform, as an example, uses two custom ARM processors; that isn't precisely a server, but it starts there.)

I'll freely admit that I'm spitballing about the consequences, but what I'm not spitballing about is that there is going to be a big shift to graphics cards, and it will for sure affect Intel in terms of how much they make and their prominence in the market.
 

Just don't forget that a graphics card without a CPU is a brick. GPGPU only leverages the GPU when it gets code that is easily parallelizable.

It still follows the old axiom:

Use a CPU to do complex manipulations to a small set of data.
Use a GPU to do simple manipulations to a large set of data.
<paraphrased>


True, both of you are correct, but the future is a bit different. One of Volta's key changes to its SIMT model is to enable fine-grained transitions between threads at the ALU level, something that CPUs are capable of and GPUs weren't.
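
For anyone curious what that scheduling change means in practice, here's a rough CUDA sketch (CUDA 9+; the kernel is made up purely to illustrate the point). Pre-Volta, a warp runs the two branches one after the other in lockstep with the other lanes masked off; on Volta, each thread keeps its own program counter, so divergent paths can interleave and the code has to ask for reconvergence explicitly:

Code:
#include <cstdio>
#include <cuda_runtime.h>

// Odd and even lanes of the warp take different branches.
__global__ void divergent(int *out)
{
    int lane = threadIdx.x % 32;
    int v;

    if (lane % 2 == 0)
        v = lane * 2;       // path A
    else
        v = lane * 3 + 1;   // path B

    // With independent thread scheduling you can no longer assume the warp
    // re-forms on its own right after the if/else, so warp-synchronous code
    // should state the reconvergence point explicitly (CUDA 9+).
    __syncwarp();

    out[threadIdx.x] = v;
}

int main()
{
    int *out;
    cudaMallocManaged(&out, 32 * sizeof(int));
    divergent<<<1, 32>>>(out);
    cudaDeviceSynchronize();
    for (int i = 0; i < 32; ++i)
        printf("%d ", out[i]);
    printf("\n");
    cudaFree(out);
    return 0;
}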
 
The G400/G400 Max were competitive.

They didn't survive the jump to hardware T&L, and by the time Parhelia was released, they were focusing on features that weren't in games yet (surround, dual/quad texturing) and weren't fast enough to compete with the GF3/GF4/Radeon 8500/9700.

That's not entirely the reason. It's true that the G400/MAX was competitive with/beat the TNT2/Ultra, but the GeForce 256 was released shortly after. Matrox just didn't want to raise venture capital for R&D purposes. Nvidia/ATI raised a LOT of venture capital and/or took on corporate debt; it was the primary reason they rocketed ahead of everyone else. Matrox just played it safe and thought they could survive with their niche market. But the Quadro NVS pretty much forced Matrox back in, and by that time it was too late. I remember Matrox stating they needed hundreds of millions of dollars to develop a competitive GPU from scratch. It was not a gamble they were willing to take.

Aside from Intel, the only other tech company that could remotely come close to a competitive GPU is PowerVR. They would need to scale their phone GPUs to desktop TDPs.
 
The problem is a shortage due to mining idiocy... introducing a new GPU manufacturer will just bottleneck the fabs even more. Right now is literally the worst time for a newcomer.

Shortage in the US you mean, right?
Cards are in stock in the EU/Asia... both for consumers and enterprise.
The problem is that people in the US think that the US being third priority for SKUs means there is a global shortage... there isn't... the US is just allocated cards last, simple as that... hence why some people in the US think there is a global shortage.
 
The problem is a shortage due to mining idiocy... introducing a new GPU manufacturer will just bottleneck the fabs even more. Right now is literally the worst time for a newcomer.

https://www.massdrop.com/buy/aorus-geforce-gtx-1080-ti-11g/talk/1976701

may need to create an account to view this post

Pretty much the primary reason for the graphics card supply issues is a memory supply issue. Cryptomining is a secondary cause of this.

Unfortunately for this market everything hit at the same time.

This might be affecting nV's next gen launch!
 
Yeah, Matrox was doomed unless they bet it all on performance 3D. It was fairly easy to release a card competitive with 3dfx, but taking it to the next level with T&L and competent performance was a task too hard for most companies.

Between 1998 and 2002, six "competent" consumer 3D companies were completely plowed over by Nvidia:

Rendition
3DLabs
3dfx
S3
Real3D
Matrox

Because doing more than basic 3D takes some real engineering talent. That's why ATI's purchase of ArtX was so important: Nvidia already started with a few SGI engineers (who knew what lighting and effects were), while 3dfx wasted their SGI talent until Nvidia bought them.

http://boards.fool.com/battle-of-the-ex-sgi-engineers-10143854.aspx

It wasn't new technology, just new for the consumer space. SGI didn't know how to manage that transition, so they bled engineers and eventually died.

Nvidia and ATI didn't really start innovating until DX9; everything prior to that had been available on professional cards in the 1990s. Just look at the RealityEngine2, which had hardware MSAA and real-time lighting and environment mapping.

Up until then, they just copied stuff from the engineers they bought.
 