Hardware-based Neural Networks

Traditional neural networks are built by networking perceptrons, which are simulations of neurons in the brain. A large number of perceptrons makes for a powerful model, but it consumes a huge amount of energy.

Perceptrons, as we know, are software abstractions, and their networks run on GPUs. Moving them onto chips amounts to translating the network into the language of hardware.

Researchers have devised ways to make computer vision (CV) systems more efficient by building such networks directly out of the logic gates on computer chips. Felix Petersen carried out such research at Stanford.

Logic gates are made up of a few transistors. They accept two bits (1s and 0s) as inputs, and a specific arrangement of transistors outputs a single bit. Like perceptrons, logic gates can be chained into networks. To make these networks trainable with backpropagation, the gates are 'relaxed': they are replaced by functions that behave like logic gates on 0s and 1s but also give answers for intermediate values. Petersen ran simulated networks of these relaxed gates through training, then converted the trained relaxed network back into something that can be implemented in computer hardware. This kind of training is tough. Each node can end up as any one of 16 different logic gates, and the 16 probabilities associated with those gates have to be tracked and adjusted continuously. That takes time and energy, and it takes longer than training a comparable neural network on GPUs.
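To make the idea concrete, here is a minimal sketch in this spirit, not Petersen's actual code: each of the 16 two-input Boolean functions gets a real-valued relaxation, a node keeps one learnable probability per candidate gate, and backpropagation adjusts those probabilities. The names `RelaxedGateNode` and `relaxed_gates`, and the use of PyTorch, are assumptions for illustration only.

```python
# A minimal sketch (assumed, not Petersen's implementation) of a "relaxed" logic gate node.
import torch
import torch.nn.functional as F

# Real-valued relaxations of the 16 two-input Boolean functions.
# Each agrees with its Boolean counterpart when a and b are exactly 0 or 1,
# but also returns values for inputs in between, so gradients can flow.
def relaxed_gates(a, b):
    return torch.stack([
        torch.zeros_like(a),        # FALSE
        a * b,                      # AND
        a - a * b,                  # A AND NOT B
        a,                          # A
        b - a * b,                  # NOT A AND B
        b,                          # B
        a + b - 2 * a * b,          # XOR
        a + b - a * b,              # OR
        1 - (a + b - a * b),        # NOR
        1 - (a + b - 2 * a * b),    # XNOR
        1 - b,                      # NOT B
        1 - b + a * b,              # A OR NOT B
        1 - a,                      # NOT A
        1 - a + a * b,              # NOT A OR B
        1 - a * b,                  # NAND
        torch.ones_like(a),         # TRUE
    ], dim=-1)                      # shape (..., 16)

class RelaxedGateNode(torch.nn.Module):
    """One node that learns which of the 16 gates it should become."""
    def __init__(self):
        super().__init__()
        self.logits = torch.nn.Parameter(torch.zeros(16))  # one logit per gate

    def forward(self, a, b):
        probs = F.softmax(self.logits, dim=-1)           # 16 gate probabilities
        return (relaxed_gates(a, b) * probs).sum(-1)     # expected gate output

# Toy usage: train a single node to behave like XOR.
node = RelaxedGateNode()
opt = torch.optim.Adam(node.parameters(), lr=0.1)
a = torch.tensor([0., 0., 1., 1.])
b = torch.tensor([0., 1., 0., 1.])
target = torch.tensor([0., 1., 1., 0.])                  # XOR truth table
for _ in range(200):
    loss = F.mse_loss(node(a, b), target)
    opt.zero_grad(); loss.backward(); opt.step()
print(node.logits.argmax().item())                       # expected: 6, the XOR slot
```

A real network would chain many such nodes in layers; the point of the toy is only to show how the 16 probabilities per node are tracked and nudged during training.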

Once the network is trained, however, running it becomes economical. It is a cute idea, and we have yet to see how well it scales. Petersen plans to keep pushing the abilities of his logic gate networks so as to create what he calls a 'hardware foundation model'. Though it may not be the best model, it could be the cheapest.
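To illustrate why inference gets cheap, here is a continuation of the hypothetical sketch above: after training, each node is 'hardened' to its single most probable gate, and the network reduces to plain bitwise operations with no floating-point arithmetic. The `harden` helper and `HARD_GATES` table are made up for illustration; the gate order matches `relaxed_gates`.

```python
# Hardening the hypothetical trained node from the sketch above.
HARD_GATES = [
    lambda a, b: 0,             # FALSE
    lambda a, b: a & b,         # AND
    lambda a, b: a & ~b & 1,    # A AND NOT B
    lambda a, b: a,             # A
    lambda a, b: ~a & b & 1,    # NOT A AND B
    lambda a, b: b,             # B
    lambda a, b: a ^ b,         # XOR
    lambda a, b: a | b,         # OR
    lambda a, b: ~(a | b) & 1,  # NOR
    lambda a, b: ~(a ^ b) & 1,  # XNOR
    lambda a, b: ~b & 1,        # NOT B
    lambda a, b: a | (~b & 1),  # A OR NOT B
    lambda a, b: ~a & 1,        # NOT A
    lambda a, b: (~a & 1) | b,  # NOT A OR B
    lambda a, b: ~(a & b) & 1,  # NAND
    lambda a, b: 1,             # TRUE
]

def harden(node):
    """Keep only the most probable gate; the result is a plain bitwise function."""
    return HARD_GATES[node.logits.argmax().item()]

gate = harden(node)             # after the toy training above, this is XOR
print(gate(0, 1), gate(1, 1))   # 1 0
```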

The idea is well suited to edge deployment. Its advantages are energy efficiency, speed, and dedicated functionality.

The challenges are lack of flexibility; development complexity, which demands significant engineering effort; scalability, since the number of gates grows with the complexity of the CV task; and loss of learning capability, since these networks cannot be retrained the way neural networks can.

A promising compromise is neuromorphic computing, or a hybrid system in which logic gates perform the fundamental operations and neural networks are emulated on efficient hardware.

This research can redefine how we think about AI on edge devices and IoT systems.
