AI Hardware

Google utilises supercomputers to train its AI models, and says these systems are faster and more power-efficient than comparable systems built on Nvidia chips. Google has designed its own customised chip, the Tensor Processing Unit (TPU), and is now running its fourth generation of TPU.
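
To give a rough sense of how developers target these chips in practice, here is a minimal sketch in JAX, assuming a runtime with TPU access (for example a Cloud TPU VM); nothing in it comes from the article itself, and on an ordinary machine it simply falls back to the CPU.

    # A minimal sketch, assuming a JAX install with access to a TPU runtime;
    # the function and array shapes are illustrative.
    import jax
    import jax.numpy as jnp

    # List the accelerator chips JAX can see; on a TPU host this prints one
    # TpuDevice entry per chip core, on an ordinary machine it prints the CPU.
    print(jax.devices())

    @jax.jit  # compile with XLA for whatever backend is attached (TPU here)
    def predict(weights, inputs):
        return jnp.tanh(inputs @ weights)

    weights = jnp.ones((128, 64))
    inputs = jnp.ones((8, 128))
    print(predict(weights, inputs).shape)  # (8, 64)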

Google has strung more than 4,000 of these chips together into a single supercomputer, using its own custom-developed optical switches to interconnect them.
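
To illustrate how one computation can be spread across many interconnected chips, the following hypothetical JAX sketch shards a single array over every device the runtime exposes; the device count, shapes and axis name are illustrative assumptions, not details taken from the article.

    # A minimal sketch of splitting one computation across every chip the
    # runtime exposes; shapes and the "data" axis name are assumptions.
    import jax
    import jax.numpy as jnp
    from jax.experimental import mesh_utils
    from jax.sharding import Mesh, NamedSharding, PartitionSpec

    n = jax.device_count()                                  # number of attached chips
    devices = mesh_utils.create_device_mesh((n,))           # arrange them in a 1-D mesh
    mesh = Mesh(devices, axis_names=("data",))
    sharding = NamedSharding(mesh, PartitionSpec("data"))   # split axis 0 over the chips

    # Place one large array so each chip holds a slice of the rows, then compute;
    # XLA runs the element-wise work on all chips in parallel.
    x = jax.device_put(jnp.ones((n * 128, 256)), sharding)
    y = jnp.tanh(x) * 2.0
    print(y.shape, y.sharding)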
