Facebook Is Allegedly Working on Custom Machine Learning Hardware

AlphaAtlas ([H]ard|Gawd, Staff member) · Joined Mar 3, 2018 · Messages: 1,713
Nvidia GPUs are the undisputed kings of the machine learning hardware market today, but more and more companies are throwing their hats into the AI ring. Google has already introduced their machine learning-focused TPU, other giants like Amazon and Intel are reportedly following suit, and a number of smaller startups are filling in niches or taking riskier approaches to compete with the bigger players. Last year, various reports surfaced claiming that Facebook was working on their own custom ASICs, but an EE Times report said that it was "not the equivalent of [Google's] TPU." Now, according to a Bloomberg report published earlier this week, some of Facebook's upcoming custom silicon may focus on machine learning after all. Facebook's chief AI researcher, Yann LeCun, says that "the company is working on a new class of semiconductor that would work very differently than most existing designs," and mentioned that future chips will need radically different architectures.

"We don't want to leave any stone unturned, particularly if no one else is turning them over," he said in an interview ahead of the release Monday of a research paper he authored on the history and future of computer hardware designed to handle artificial intelligence... LeCun said that for the moment, GPUs would remain important for deep learning research, but the chips were ill-suited for running the AI algorithms once they were trained, whether that was in datacenters or on devices like mobile phones or home digital assistants.
 
This could really play into AMD's hand, given their use of chiplets. It would let them add new custom AI chips to the package relatively easily: six Ryzen cores, one AI core, and one GPU core in a single chip package.

Could be.
 
I think the future will be specialized hardware. I don't think that AMD will have a play in the space with Ryzen other than at the hobbyist level. It would be hard to compete with Google, Amazon, and Facebook in this space.
 
Nvidia rules the ecosystem currently. Their software is way ahead of and better supported than anyone else's, but if the silicon is in-house, that won't matter for these companies with the massive resources they have. Xilinx, with their FPGAs, is a nice crossover between general-purpose and application-specific chips. I'm hoping they continue to make strides. I own both Nvidia and Xilinx, though I have reduced my position in Nvidia by 65% since October and increased Xilinx.
 
We don't want Nvidia to be the only show in town; the market needs competition. Still, FB? I have no trust in that company at all.
 
Exactly, once it comes online and finds out through Facebook that we are weak & misguided, it will be the end for us.

Exactly. Once the AI gets done processing data from likes on cute kitten videos, it will decide that we all need to be controlled or eliminated.
 
I wonder what about Tensor cores is insufficient for them. Google invested quite a bit of capital into that, and it's pretty much become (or is becoming) a standard. I always question it whenever someone wants to deviate: why? What doesn't work for you? Can it be improved to make it work?

I have a hard time believing their work is so fundamentally different from Google's that they would need custom hardware. Seems wasteful.
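For context, and assuming the poster means Nvidia's Tensor Cores (as opposed to Google's TPU), this is roughly what today's "standard" path looks like: a minimal sketch of my own, not from the thread, of exercising Tensor Cores through mixed precision in PyTorch. It needs a Volta-or-newer CUDA GPU to run.

```python
# Minimal sketch, my own illustration: mixed-precision matmuls via torch.autocast,
# which is how Tensor Cores are usually exercised in practice.
import torch

a = torch.randn(1024, 1024, device="cuda")
b = torch.randn(1024, 1024, device="cuda")

# Under autocast, eligible ops run in float16 and are routed to Tensor Cores;
# numerically sensitive ops stay in float32.
with torch.autocast(device_type="cuda", dtype=torch.float16):
    c = a @ b

print(c.dtype)  # torch.float16
```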
 
It's wasteful in the sense that this technology will only serve more ads and speed up/enhance data massaging -- what a fucking waste.
I find myself agreeing more and more with blogposts like this: http://tonsky.me/blog/disenchantment/
 