Deep Learning Hardware

Google claims that its Tensor Processing Units (TPUs) are 1.7 times faster than Nvidia's A100 chips, which power most AI applications, and 1.9 times more energy efficient, making Google's AI processing greener.

Nvidia's A100 Tensor Core GPU is based on the Nvidia Ampere GPU architecture. It adds many new features and delivers faster performance for HPC, AI, and data-analytics workloads.
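As a rough illustration of one of those Ampere features, the sketch below (an assumption-laden example, not from the article: it presumes PyTorch and a CUDA-capable Ampere GPU such as the A100) opts in to TF32, the Ampere Tensor Core mode that accelerates float32 matrix multiplies at slightly reduced precision:

```python
import torch

# A minimal sketch, assuming PyTorch with CUDA and an Ampere-class GPU
# (e.g. the A100). TF32 runs float32 matmuls on Tensor Cores.
if torch.cuda.is_available():
    # Opt in to TF32 for matrix multiplies and cuDNN convolutions.
    torch.backends.cuda.matmul.allow_tf32 = True
    torch.backends.cudnn.allow_tf32 = True

    a = torch.randn(4096, 4096, device="cuda")
    b = torch.randn(4096, 4096, device="cuda")
    c = a @ b  # executed via TF32 Tensor Cores on Ampere GPUs
    print(c.shape, torch.cuda.get_device_name(0))
else:
    print("No CUDA GPU available; this sketch requires one.")
```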

Google's TPUs are application-specific integrated circuits (ASICs) designed specifically to accelerate AI workloads. They are liquid cooled and built to slot into server racks. They deliver up to 100 petaflops of compute and power Google products such as Google Search, Google Photos, Google Translate, Google Assistant, Gmail, and the Google Cloud AI APIs.
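To see what running on a TPU looks like in practice, here is a minimal sketch assuming JAX, a library that targets TPUs, GPUs, and CPUs alike. On a Cloud TPU VM, jax.devices() lists the attached TPU chips and the compiled matmul runs on them; elsewhere the same code falls back to whatever accelerator is present:

```python
import jax
import jax.numpy as jnp

# A minimal sketch, assuming JAX is installed (with TPU support on a
# Cloud TPU VM). jax.devices() reports the accelerators backing this
# host: TpuDevice entries on a TPU host, GPU or CPU devices elsewhere.
print(jax.devices())

# A jit-compiled tensor op is placed on the default accelerator --
# exactly the kind of dense tensor workload TPUs are built for.
@jax.jit
def matmul(a, b):
    return a @ b

a = jnp.ones((1024, 1024))
b = jnp.ones((1024, 1024))
print(matmul(a, b).shape)
```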

CPUs (central processing units) have a general-purpose architecture. GPUs (graphics processing units) were originally designed to accelerate graphics rendering, and they offer flexibility and a range of precision options for general compute. TPUs are optimized for tensor operations. GPUs have greater memory bandwidth than TPUs but higher power consumption, while TPUs trade that flexibility for energy and performance efficiency.
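The gap between a general-purpose CPU and a massively parallel accelerator can be felt with a crude timing sketch. This is an assumption-laden example (it presumes PyTorch and, for the GPU numbers, a CUDA device), and a proper benchmark would average many runs:

```python
import time
import torch

# A rough sketch, assuming PyTorch; the GPU path needs a CUDA device.
# Large matmuls expose the accelerator's parallelism and bandwidth.
def time_matmul(device: str, n: int = 2048) -> float:
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    _ = a @ b  # warm-up (first CUDA call includes kernel-load overhead)
    if device == "cuda":
        torch.cuda.synchronize()
    start = time.perf_counter()
    _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()  # CUDA matmul launches are asynchronous
    return time.perf_counter() - start

print(f"cpu:  {time_matmul('cpu'):.4f} s")
if torch.cuda.is_available():
    print(f"cuda: {time_matmul('cuda'):.4f} s")
```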
