Nvidia has become an important supplier of the GPU chips used to run AI. Microsoft is expected to release its own AI chip, code-named Athena, in collaboration with AMD. OpenAI relies on a supercomputer built by Microsoft that is loaded with thousands of Nvidia chips.

Google has developed its own chips, tensor processing units (TPUs), which are optimised for neural-network workloads; they are not suited to general-purpose tasks such as word processing or executing bank transactions. Google also rents TPUs to other enterprises, at prices ranging from roughly $3,000 to $100,000 per month, and is planning to launch chips for its Chromebooks.

Baidu, the Chinese search giant, has designed chips called Kunlun, used in autonomous driving and natural language processing. Tencent, the Chinese company that operates WeChat, is designing chips to process vast amounts of data for AI, including image and video processing.
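To give a rough sense of the kind of workload these accelerator chips, Google's TPUs included, are built for, here is a minimal JAX sketch of a dense neural-network layer. The device check and the dense_layer function are illustrative assumptions for this sketch, not details drawn from the text above; the same code falls back to a GPU or CPU if no TPU is present.

```python
import jax
import jax.numpy as jnp

# List the accelerators JAX can see; on a Cloud TPU VM this reports TPU devices.
print(jax.devices())

# A toy neural-network-style workload: a dense layer (matrix multiply plus bias).
# TPUs are designed for exactly this kind of large, regular linear algebra,
# not for general-purpose tasks like word processing or bank transactions.
def dense_layer(x, w, b):
    return jax.nn.relu(x @ w + b)

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (1024, 512))   # a batch of 1024 input vectors
w = jax.random.normal(key, (512, 256))    # weight matrix
b = jnp.zeros(256)                        # bias vector

# jit compiles the function with XLA for whichever backend is available.
y = jax.jit(dense_layer)(x, w, b)
print(y.shape)  # (1024, 256)
```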
It may take four to five years for other companies to catch up with the leader, Nvidia; some predict that rivals will need a decade. Nvidia itself has not stopped innovating.
Nvidia, as we have already observed, has partnered with Foxconn to set up ‘AI factories’, a new kind of data centre.
There are many challenges in developing AI chips: complex supply chains, a dearth of talent, the difficulty of getting the design right, and long design and development cycles. Designing a single complex GPU on a 3 nm process costs about $1.5 billion. Are there enough buyers to justify such a huge investment?
Demand for AI chips will keep rising and the market will expand; it is projected to reach $400 billion by 2032, a market too big for Big Tech to ignore.