Nvidia's GPUs are the undisputed kings of the machine learning hardware market today, but more and more companies are throwing their hats into the AI ring. Google has already introduced its machine learning-focused TPU, and other giants like Amazon and Intel are reportedly following suit, while a number of smaller startups are filling in niches or taking riskier approaches to compete with the bigger players.

Last year, various reports surfaced claiming that Facebook was working on its own custom ASICs, but an EE Times report said that the effort was "not the equivalent of [Google's] TPU." Now, according to a Bloomberg report published earlier this week, some of Facebook's upcoming custom silicon may focus on machine learning after all. Facebook's chief AI researcher, Yann LeCun, says that "the company is working on a new class of semiconductor that would work very differently than most existing designs," and mentioned that future chips will need radically different architectures.

"We don't want to leave any stone unturned, particularly if no one else is turning them over," he said in an interview ahead of Monday's release of a research paper he authored on the history and future of computer hardware designed to handle artificial intelligence... LeCun said that for the moment, GPUs would remain important for deep learning research, but the chips were ill-suited for running AI algorithms once they were trained, whether in data centers or on devices like mobile phones and home digital assistants.