When discussing LLMs, we often talk about parameters: an LLM consists of billions of them. These parameters are the numerical weights assigned to the connections between nodes in the neural network architecture. Each weight determines the strength and influence of its connection, and together the weights encode the patterns and relationships the model learns from language, ultimately shaping the LLM's understanding of language and its ability to generate text.
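To see where those billions come from, here is a minimal sketch that counts the weights and biases of a small, hypothetical fully connected network (the layer sizes are illustrative, not from any real LLM):

```python
# Hypothetical fully connected network: 10 inputs -> 32 hidden -> 32 hidden -> 2 outputs.
# Each layer contributes a weight matrix (one weight per connection) plus a bias vector.
layer_sizes = [10, 32, 32, 2]

total_params = 0
for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
    weights = n_in * n_out   # one weight per connection between the two layers
    biases = n_out           # one bias per node in the receiving layer
    total_params += weights + biases

print(total_params)  # 1474
```

Even this tiny network has 1,474 parameters; scaling the same idea up to thousands of dimensions and dozens of layers is how LLMs reach billions.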
Hyperparameters, by contrast, are the settings that control the training process. They are not part of the LLM's internal network; instead, they influence how the model learns from the training data. Examples include the learning rate, the batch size, and the number of training epochs.
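The three hyperparameters named above can be seen in action in a toy training loop. This is a simplified sketch, not how an LLM is actually trained: it fits a single parameter w with stochastic gradient descent on made-up data, and all values are illustrative.

```python
import random

random.seed(0)

# Hyperparameters: they shape HOW training runs, not the model itself.
learning_rate = 0.1
batch_size = 4
num_epochs = 20

# Toy data: samples of y = 3 * x, so the ideal learned value is w = 3.0.
data = [(i / 16, 3.0 * i / 16) for i in range(1, 17)]

w = 0.0  # the single parameter being learned
for epoch in range(num_epochs):
    random.shuffle(data)
    for i in range(0, len(data), batch_size):
        batch = data[i:i + batch_size]
        # Gradient of the mean squared error with respect to w.
        grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
        w -= learning_rate * grad  # gradient descent update

print(w)  # converges toward 3.0
```

Changing the learning rate, batch size, or epoch count changes how quickly and how reliably w converges, while the parameter w itself is what training adjusts.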
In short, parameters are the adjustable weights: the model tunes them during training based on the training data, and they govern how information is processed as it flows through the network.