The web today holds 2.35 crore petabytes of data, and it grows by 70 terabytes every second; it is thus a vast data store that doubles in size every year. ChatGPT was trained on internet data only up to 2021 and misses everything published thereafter, whereas a search engine has access to the latest updates on the web.
The computing power required by generative models and search engines differs. Generative models are trained on massive amounts of data and consume considerable computational resources: training involves multiple layers of neural networks, which need huge computing power to process the data. A generative model can handle complex natural-language queries and produce creative answers in real time. A rough sense of why the layered computation is so heavy is sketched below.
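The following toy sketch (illustrative sizes only, not any real model) shows why stacking layers is expensive: every layer multiplies its input by a weight matrix, so the count of multiply-add operations grows with both the depth and the width of the network.

```python
import numpy as np

def forward_pass_cost(layer_sizes):
    """Number of multiply-add operations for one forward pass
    through fully connected layers of the given sizes."""
    return sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))

def forward_pass(x, weights):
    """Run one input vector through the layers, applying a ReLU after each."""
    for w in weights:
        x = np.maximum(w @ x, 0.0)   # matrix multiply, then ReLU
    return x

# Hypothetical layer sizes; real generative models use thousands of units
# per layer and dozens of layers, multiplying this cost enormously.
sizes = [512, 1024, 1024, 512]
weights = [np.random.randn(b, a) * 0.01 for a, b in zip(sizes, sizes[1:])]

print("multiply-adds per forward pass:", forward_pass_cost(sizes))
print("output vector length:", forward_pass(np.random.randn(sizes[0]), weights).shape[0])
```

Even this tiny example needs over 1.5 million multiply-adds per input; scaling the layers up to the size of a modern generative model pushes the cost into the billions per query, which is why such models demand specialised hardware.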
Search engines, by contrast, use deterministic algorithms that draw on pre-existing information to produce results quickly and accurately, and they do not consume the same computational resources as a generative model. Search engines do not generate any new content; they rank existing web links according to certain factors, as the sketch below illustrates. Keeping these algorithms up to date also requires significant computational resources, and they must handle huge volumes of user traffic. All said and done, being simpler, these algorithms use comparatively less computing power.
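A minimal sketch of this lookup-and-rank behaviour, using made-up pages and scores in place of the many ranking factors real engines apply: the engine consults a pre-built index and orders existing links, rather than generating any text.

```python
from collections import defaultdict

# Hypothetical pre-existing information: which pages mention which words,
# plus a pre-computed importance score per page.
index = {
    "monsoon":  ["weather.example/forecast", "news.example/rains"],
    "forecast": ["weather.example/forecast"],
}
page_score = {
    "weather.example/forecast": 0.9,
    "news.example/rains": 0.6,
}

def search(query):
    """Look up each query word in the index and rank matching pages
    by (number of query words matched, pre-computed page score)."""
    matches = defaultdict(int)
    for word in query.lower().split():
        for page in index.get(word, []):
            matches[page] += 1
    return sorted(matches,
                  key=lambda p: (matches[p], page_score.get(p, 0.0)),
                  reverse=True)

print(search("monsoon forecast"))
# ['weather.example/forecast', 'news.example/rains']
```

Because the heavy work (crawling and indexing) is done ahead of time, answering a single query is little more than a lookup and a sort, which is far cheaper per request than running a large neural network.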
Search engines are suited to delivering accurate results quickly; generative models are suited to handling complex, open-ended queries and providing creative solutions. The two offerings may overlap, but they serve different purposes, and they can coexist.