As we have observed in previous articles, generative AI has already been incorporated by Microsoft into its Bing search engine, and Google is likely to follow suit by deploying the technology in its own search engine.
We have also observed that in traditional search, web crawlers scan the internet to compile an index of information.
ChatGPT-style AI search, on the other hand, uses huge computing power: chips worth billions of dollars, which are deployed over a useful life of several years and have a long payback period. Electricity is another cost, and the companies also have to meet their carbon-footprint goals. AI-powered search queries are handled by a process called inference, in which a neural network, loosely modelled on the human central nervous system, infers the answer to a query. Google's in-house chips, called Tensor Processing Units (TPUs), along with other optimisers, handle this.
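To make the scale of inference concrete, here is a minimal back-of-envelope sketch in Python. The model size, answer length, and accelerator throughput below are illustrative assumptions, not figures from Google or OpenAI.

```python
# Back-of-envelope sketch: why LLM inference is compute-heavy.
# All figures are illustrative assumptions, not vendor numbers.

MODEL_PARAMS = 175e9         # assumed model size in parameters (GPT-3 scale)
TOKENS_PER_ANSWER = 500      # assumed prompt plus response length in tokens
FLOPS_PER_TOKEN = 2 * MODEL_PARAMS  # rough rule of thumb: ~2 FLOPs per parameter per token

total_flops = FLOPS_PER_TOKEN * TOKENS_PER_ANSWER

# Assumed effective accelerator throughput after utilisation losses.
EFFECTIVE_FLOPS_PER_SEC = 100e12  # 100 TFLOP/s, illustrative

seconds_of_compute = total_flops / EFFECTIVE_FLOPS_PER_SEC
print(f"~{total_flops:.2e} FLOPs per answer, "
      f"~{seconds_of_compute:.2f} accelerator-seconds")
```

With these assumed inputs, a single chat answer works out to well over a second of dedicated accelerator time, orders of magnitude more compute than a keyword lookup against a pre-built index.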
All said and done, the AI model accounts for a high expense. ChatGPT's computing costs a couple of cents or more per conversation, and a language-model query is likely to cost about 10 times more than a standard keyword search. There could be additional revenue from chat-based advertising, but it still may not be enough to prevent erosion of the bottom line of the current search engines. The additional costs run into several billions of dollars: Google could see a $6 billion increase in costs by 2024 even if it handles only 50 per cent of the queries it receives with short answers.
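As a rough illustration of how a fraction of a cent per query can add up to billions, consider the sketch below. The query volume and per-query cost are assumptions chosen so the result lands near the $6 billion scenario cited above; the actual inputs behind that estimate are not given in this article.

```python
# Hypothetical scaling of per-query AI cost to an annual bill.
# Query volume and cost figures are assumptions for illustration only.

QUERIES_PER_YEAR = 3.3e12          # assumed annual search volume
SHARE_WITH_AI_ANSWERS = 0.5        # the 50 per cent scenario cited above
EXTRA_COST_PER_AI_QUERY = 0.0036   # assumed extra cost in dollars (~0.36 cents)

extra_annual_cost = (QUERIES_PER_YEAR
                     * SHARE_WITH_AI_ANSWERS
                     * EXTRA_COST_PER_AI_QUERY)
print(f"Extra annual cost: ${extra_annual_cost / 1e9:.1f} billion")
# With these assumed inputs, the result is roughly $5.9 billion,
# close to the $6 billion figure mentioned above.
```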
Microsoft expects to gain additional users and more advertising revenue after empowering Bing with generative AI.
AI chat also involves charts, videos and other generative output, which raises expenses by a further 30-50 per cent.
However, as we know, when technology is scaled up it becomes more economical, and that could happen here over a period of time.