- Google unveils Ironwood, its seventh-generation TPU
- Ironwood is designed for inference, the next big challenge for AI
- It offers major gains in performance and efficiency, and even outstrips the El Capitan supercomputer
Google has revealed its most powerful artificial intelligence hardware to date as it looks to take another important step forward in inference.
Ironwood is the seventh generation of the Tensor Processing Unit (TPU), the hardware that powers AI training and workload management for Google Cloud and its customers.
The hardware was unveiled at the company's Google Cloud Next 25 event in Las Vegas, where Google was keen to highlight major advances in efficiency, which should also mean workloads can run more cost-effectively.
Google Ironwood TPU
The company says Ironwood marks “a significant shift” in the development of AI, forming part of the move away from responsive AI models that simply present real-time information for users to process, and towards proactive models that can interpret and infer on their own.
This is essentially the next generation of AI computing, Google Cloud believes, allowing its most demanding customers to set up and scale ever-larger workloads.
At the top end, Ironwood can scale up to 9,216 chips per pod, for a total of 42.5 exaflops, more than 24 times the computing power of El Capitan, the world's largest supercomputer.
Each individual chip offers peak compute of 4,614 TFLOPs, which the company says is a major leap forward in capability, even in its smaller configuration of “just” 256 chips.
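Those two numbers are consistent with each other; a quick back-of-envelope check, using only the figures quoted in this article, shows the pod-level total following directly from the per-chip peak:

```python
# Back-of-envelope check using only the figures quoted above.
chips_per_pod = 9_216         # chips in a full Ironwood pod
peak_tflops_per_chip = 4_614  # peak compute per chip, in TFLOPs

pod_tflops = chips_per_pod * peak_tflops_per_chip
pod_exaflops = pod_tflops / 1_000_000  # 1 exaflop = 1,000,000 TFLOPs

print(f"{pod_exaflops:.1f} exaflops")  # ~42.5 exaflops, matching the quoted pod total
```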
The scale can go further still, however, as Ironwood lets developers use the Pathways software stack, developed by Google DeepMind, to harness the combined computing power of tens of thousands of Ironwood TPUs.
Ironwood also offers a significant increase in high-bandwidth memory (HBM) capacity, at 192GB per chip, up to six times more than Trillium, the previous sixth-generation TPU, as well as in memory bandwidth, reaching 7.2TBps, 4.5x that of Trillium.
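For context, those multipliers imply roughly 32GB of HBM and about 1.6TBps of bandwidth per Trillium chip; the sketch below simply divides the quoted Ironwood figures by the quoted ratios rather than citing separately published Trillium specifications:

```python
# Illustrative only: per-chip Trillium figures implied by the ratios quoted above.
ironwood_hbm_gb = 192    # HBM capacity per Ironwood chip (GB)
ironwood_hbm_tbps = 7.2  # HBM bandwidth per Ironwood chip (TBps)

implied_trillium_hbm_gb = ironwood_hbm_gb / 6         # "up to six times" -> ~32 GB
implied_trillium_hbm_tbps = ironwood_hbm_tbps / 4.5   # "4.5x" -> ~1.6 TBps

print(implied_trillium_hbm_gb, round(implied_trillium_hbm_tbps, 1))  # 32.0 1.6
```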
“For more than a decade, TPUs have powered Google's most demanding AI training and serving workloads, and have enabled our Cloud customers to do the same,” said Amin Vahdat, VP/GM, ML, Systems & Cloud AI.
“Ironwood is our most powerful, capable and energy-efficient TPU yet. And it is purpose-built to power thinking, inferential AI models at scale.”