AI: Need for a New Technological Breakthrough

Speaking at Nvidia's GTC 2025 event, Yann LeCun, Meta's Chief AI Scientist, described LLMs as mere token generators operating in a discrete space. He is more interested in next-generation model architectures, which should be able to do four things: understand the physical world, have persistent memory, and be capable of planning and reasoning.

LLMs learn from text data, and text alone cannot get us to human-level AI. A typical LLM is trained on about 20 trillion tokens, which at roughly 3 bytes per token comes to about 10 to the power of 14 bytes — a one followed by 14 zeros. That is a huge amount of information. However, a four-year-old child has received roughly the same amount of information through the visual system. In four years, a child is awake for about 16,000 hours, and information reaches the brain through the optic nerve at about 2 megabytes per second. The calculation works out to roughly 10 to the power of 14 bytes. Thus, a four-year-old has been exposed to as much information as the biggest LLMs, which suggests we will never reach human-level AI by training on text alone.
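As a sanity check, here is a minimal sketch of the back-of-the-envelope arithmetic in Python, using the figures quoted above; the 3-bytes-per-token factor is an assumed rough average, not a precise measurement:

```python
# Back-of-the-envelope comparison: LLM training data vs. a child's visual input.
# Figures follow the article; bytes-per-token is an assumed rough average.

TOKENS = 20e12              # ~20 trillion training tokens for a typical LLM
BYTES_PER_TOKEN = 3         # assumption: a text token averages ~3 bytes

llm_bytes = TOKENS * BYTES_PER_TOKEN

AWAKE_HOURS = 16_000        # hours a child is awake in its first four years
BYTES_PER_SECOND = 2e6      # ~2 MB/s reaching the brain via the optic nerve

child_bytes = AWAKE_HOURS * 3600 * BYTES_PER_SECOND

print(f"LLM training data:  {llm_bytes:.1e} bytes")   # ~6.0e13
print(f"Child visual input: {child_bytes:.1e} bytes") # ~1.2e14
```

Both quantities land on the order of 10 to the power of 14 bytes, which is exactly the comparison LeCun draws.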

Meta itself has not released a new version of Llama in a while, while OpenAI continues to bet on scaling LLMs further. It remains to be seen which approach works out. Meta is bearish on LLMs alone and expects a new technological breakthrough will be needed to enhance the performance of AI systems.
