The input layer receives data from external sources and passes it to the hidden layer. Each neuron in the hidden layer receives the outputs of the previous layer, computes a weighted sum of them, and passes the result on to the neurons in the next layer.
Weighted means that each input from the preceding layer is multiplied by its own distinct weight, so some inputs influence the neuron more than others. These weights are adjusted during training to improve the model's performance.
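The weighted sum described above can be sketched in a few lines of NumPy. The input, weight, and bias values here are made up purely for illustration:

```python
import numpy as np

# Outputs of the previous layer, arriving as inputs to one neuron.
inputs = np.array([0.5, -1.2, 3.0])

# One distinct weight per input; in a real network these start
# random and are adjusted during training.
weights = np.array([0.8, 0.1, -0.4])
bias = 0.2  # a learned offset, also adjusted during training

# Weighted total: each input scaled by its weight, then summed.
weighted_sum = np.dot(inputs, weights) + bias
print(weighted_sum)  # roughly -0.72
```

The same operation is performed by every neuron in the layer, each with its own weight vector, which is why a whole layer's computation is usually written as a single matrix-vector product.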
Neurons are also called units. Whether a layer has a dozen units or a million depends on the complexity of the data — the patterns in the dataset.
Each neuron receives input from the previous layer and produces an output that becomes the input of the neurons in the next layer. The process continues until the final layer, which produces the output of the network.
Units (neurons) are linked to one another, and each link carries a weight that controls how much one unit influences the other. The network learns by adjusting these weights as data moves from one unit to the next.
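The layer-to-layer flow described above can be sketched as a tiny forward pass. The layer sizes, the random weights, and the ReLU activation are all illustrative assumptions — the text itself does not fix any of them:

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, W, b):
    # Every unit in the layer computes its weighted sum of x,
    # then applies a ReLU activation (an assumed choice here).
    return np.maximum(0.0, W @ x + b)

x = np.array([0.2, -0.5, 1.0])                 # external inputs

W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)  # input -> hidden links
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)  # hidden -> output links

hidden = layer(x, W1, b1)       # hidden units fire on the weighted inputs
output = layer(hidden, W2, b2)  # final layer produces the network's output
print(output.shape)  # (2,)
```

Each row of `W1` holds the weights of one hidden unit's incoming links, so the matrix product computes all four weighted sums at once; the hidden outputs then become the inputs to the two final-layer units.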