Viva Tech is an annual technology conference for startups held in Paris. Speaking there, Yann LeCun, Meta's chief AI scientist, advised young students interested in AI systems not to work on LLMs.
His reasoning: young people should aim at next-generation AI systems rather than LLMs, which are already in the hands of Big Tech. Instead, they should build AI systems that overcome the limitations of LLMs.
Mufeed, the young creator of Devika (a Devin alternative), spoke along similar lines about moving away from the Transformer architecture and developing new ones, e.g. RWKV (an RNN-based architecture), which offers a larger context window and cheaper inference. Such an approach, he suggested, could lead to something as impressive as GPT-4.
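The appeal of RNN-style architectures can be sketched in a few lines. The toy code below is not RWKV's actual formulation (RWKV uses learned time-mixing and decay terms); it is only an illustration, with made-up parameters, of why a recurrent model does constant work and memory per token while plain dot-product attention must revisit the entire history:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy hidden size

# Illustrative parameters only -- not RWKV's real equations.
W = rng.standard_normal((d, d)) * 0.1
decay = 0.9  # exponential decay applied to the running state

def rnn_step(state, x):
    """Consume one token embedding: O(d^2) work and O(d) memory,
    no matter how many tokens came before."""
    return decay * state + np.tanh(W @ x)

def run_rnn(tokens):
    state = np.zeros(d)
    for x in tokens:  # linear in sequence length
        state = rnn_step(state, x)
    return state

def run_attention(tokens):
    """Plain dot-product attention for the last token: it must
    score every past token, so memory grows with sequence length."""
    q = tokens[-1]
    scores = tokens @ q                  # one score per past token
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ tokens              # weighted sum over all tokens

tokens = rng.standard_normal((100, d))
print(run_rnn(tokens).shape)        # fixed-size state: (8,)
print(run_attention(tokens).shape)  # same output size, but needed all 100 tokens
```

The recurrent path carries everything forward in a fixed-size state, which is what makes very long contexts and fast inference plausible for this family of models.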
LeCun also recommends open source. In his view, all our interactions with the digital world will ultimately be mediated by AI assistants, so there should be a large and diverse number of them.
Though LeCun is not in favour of Transformer models, they too are evolving. GPT-4o, for instance, understands video and audio natively.
How much smarter can AI get? Much, much smarter. Sam Altman has said that data will no longer be a problem, addressing a common concern about training future LLMs.