AI has become a fiercely competitive market, and chip makers AMD, Intel, and Nvidia are racing to claim a share of it.
In AI computing, Nvidia has a commanding lead: its GPUs dominate AI training. However, once AI systems are deployed to enterprises and individual users to make predictions and decisions, the workload shifts to AI inferencing. It is at the inferencing stage that businesses see tangible returns on their AI investments, and AMD and Intel are positioning themselves to capitalise on this opportunity.
Nvidia’s GPUs are, without doubt, the gold standard for AI training. But inferencing is expected to grow into a larger market than training over time, and AMD and Intel are positioning their CPUs and GPUs to capitalise on this transition. Their chips could prove to be power-efficient and cost-efficient alternatives for enterprises, which may in turn force Nvidia to lower its prices.
Nvidia currently draws most of its revenue from data centres, which is why AMD and Intel are eyeing the inferencing market; success there could alter the competitive landscape. While AI training is concentrated in data centres, inferencing is expected to take place closer to users on edge devices such as smartphones, autonomous vehicles, and IoT systems.
Nvidia, for its part, is not sitting idle. It is expanding its portfolio to include CPUs and GPUs optimised for inferencing.