Technology

Bard to Gemini and TPUs: Decoding Google's multi-pronged AI strategy


Google has transitioned from Bard to Gemini AI models and introduced new Tensor Processing Units (TPUs) to improve its AI capabilities. The company's custom AI chips and infrastructure are designed to support both training and inference of AI models.

Google has been investing heavily in artificial intelligence research and infrastructure. The company has transitioned from Bard to Gemini AI models and introduced a new generation of custom AI chips called Tensor Processing Units (TPUs). TPUs are designed to accelerate the matrix and tensor operations that AI models rely on, and Google has been building its own chips since 2015. Initially, TPUs focused on inference, but later generations were designed to handle both training and inference. Google connects hundreds of TPUs into clusters known as pods, effectively creating a training supercomputer. The company's strategy is to build the entire AI infrastructure stack, keeping TPUs flexible enough to support a wide range of models.
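To make the workload concrete: the operations TPUs accelerate are, at their core, large dense matrix multiplications, the kind that dominate a neural network's inference step. The sketch below is purely illustrative (the layer sizes and function names are invented for this example, not taken from any Google model) and uses plain NumPy to show the arithmetic a TPU would run at far larger scale.

```python
import numpy as np

# A single fully connected layer: one matrix multiply, a bias add,
# and a ReLU activation. This matmul-heavy pattern, repeated across
# many layers, is the workload TPUs are built to accelerate.
def dense_layer(x, w, b):
    return np.maximum(x @ w + b, 0.0)

# Illustrative shapes only: a batch of 8 inputs, 256 features in,
# 128 features out.
rng = np.random.default_rng(0)
batch, d_in, d_out = 8, 256, 128
x = rng.standard_normal((batch, d_in))
w = rng.standard_normal((d_in, d_out))
b = np.zeros(d_out)

y = dense_layer(x, w, b)
print(y.shape)  # (8, 128)
```

Training adds the reverse pass (gradients), which is again mostly matrix multiplications, which is why the same hardware can serve both training and inference once it supports higher-precision accumulation.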

This content was automatically generated and/or translated by AI. It may contain inaccuracies. Please refer to the original sources for verification.
