Alphabet on Wednesday introduced Ironwood, its seventh-generation AI chip, designed to boost the speed and performance of artificial intelligence applications. Built for "inference" computing, the rapid data processing that delivers real-time responses in AI applications such as OpenAI's ChatGPT, Ironwood gives customers an alternative to Nvidia's AI processors.
This new chip is part of Google’s ongoing multi-billion-dollar, decade-long initiative to advance AI technology. Google’s Tensor Processing Units (TPUs), which have provided a competitive advantage for the company’s internal AI efforts, are primarily used by Google engineers or through its cloud services.
For previous generations, Google had split its TPU chips into two categories: one for building large AI models and another designed to reduce the costs of running AI applications.
The Ironwood chip is specifically built for running AI applications and can work in clusters of up to 9,216 chips, providing increased memory for better efficiency in serving AI tasks.
The chip brings significant gains in energy efficiency, delivering twice the performance per unit of power compared with Google's Trillium chip, announced last year. Google did not disclose who manufactures Ironwood, but the chip is integral to the development of its Gemini AI models.
Alphabet's stock rose 9.7% on the day of the announcement, though the jump was driven largely by President Donald Trump's unexpected reversal on tariffs rather than the chip unveiling.