Nvidia chips are the gold standard for big tech firms and startups that need compute power to develop and run AI platforms. For some time, however, Nvidia's stock has faced headwinds as investors fear an AI bubble.
Graphics processing units, or GPUs, from Nvidia were originally created to accelerate the rendering of graphics, mainly in video games and other visual-effects applications. These GPUs turned out to be well suited to training AI models because they can perform huge numbers of computations over large amounts of data in parallel.
Google, by contrast, uses TPUs, or tensor processing units: application-specific integrated circuits, meaning chips designed for one discrete purpose. Google built these tensor chips as accelerators for the AI and machine-learning workloads in its own applications.
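The workload both kinds of accelerator target is dominated by dense tensor operations, above all matrix multiplication. A minimal sketch in plain Python with NumPy illustrates the idea (NumPy runs this on the CPU; accelerator libraries such as JAX or PyTorch dispatch the same operation to a GPU or TPU; the shapes below are arbitrary illustration values, not from the article):

```python
import numpy as np

# A single dense layer of a neural network is essentially one matrix
# multiplication: activations (batch x features) times weights
# (features x outputs).
batch, features, outputs = 64, 512, 256
activations = np.random.rand(batch, features).astype(np.float32)
weights = np.random.rand(features, outputs).astype(np.float32)

# This one call performs batch * features * outputs multiply-adds
# (about 8.4 million here). GPUs and TPUs exist to run exactly this
# kind of operation massively in parallel, layer after layer.
result = activations @ weights
print(result.shape)  # (64, 256)
```

Training a large model repeats operations like this trillions of times, which is why general-purpose CPUs fall behind and specialized parallel hardware dominates.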
Google and DeepMind both develop cutting-edge AI models (such as Gemini), and the lessons learned from that work are fed back to the chip designers.
Google now wants to partner with other organizations to place TPUs in their data centers, positioning itself as a rival to Nvidia in AI hardware. Meta (Facebook) plans to use Google's TPUs, and Google Cloud already offers both TPUs and Nvidia GPUs. Anthropic has agreed to buy up to one million TPUs from Google. All of this suggests that third-party LLM providers are likely to use Google as a secondary supplier of accelerator chips for inference in the near future.