Business Standard

Google pushes its AI chips into rival data centres, challenging Nvidia

Google is deploying its TPUs in rival data centres, pushing into Nvidia’s GPU-dominated turf and offering billions in backing to win developer adoption.

TPU vs GPU: Google expands TPU access beyond its own cloud, challenging Nvidia's chip dominance. (Photo: Reuters)

Vasudha Mukherjee New Delhi



Google has started placing its in-house artificial intelligence (AI) chips, known as ‘tensor processing units’ (TPUs), in data centres operated by smaller cloud providers. These firms have so far been almost entirely reliant on Nvidia’s market-dominating graphics processors.
 
According to a report by The Information, the company has approached several firms, including CoreWeave and Crusoe, about hosting TPUs. It has already secured a deal with London-based Fluidstack, which will install Google’s chips in a new data centre in New York.
 
Until now, TPUs were reserved for Google’s own services, such as its Gemini AI models, or offered selectively through Google Cloud to companies like Apple and the image-generation firm Midjourney. By allowing third-party providers to host the chips, Google is widening access to them and chipping away at the industry’s reliance on Nvidia, whose processors have long been the standard.
 
 

Nvidia: The industry’s default GPU supplier

Nvidia dominates AI hardware. Its graphics processing units (GPUs), originally designed for gaming, have become the default choice for building AI systems because they are flexible and backed by a mature set of software tools.
 
Cloud providers like CoreWeave and Crusoe have leaned into this, buying Nvidia chips in bulk and renting them out to AI startups and giants like OpenAI and Microsoft. Nvidia has even invested in some of these companies, deepening their reliance on its technology.
 

Google’s alternative to GPUs

Google’s TPUs work differently. They are custom-built for AI workloads, which makes them faster and more efficient at machine learning tasks, but they are less versatile than Nvidia’s GPUs, and until now they have been mostly locked inside Google’s own systems.

By supplying TPUs to the same companies that rent out Nvidia hardware, Google is stepping directly into Nvidia’s territory.

The challenge for Google, however, is not just selling a chip: it must also convince developers to move away from what they already know. Most are trained on Nvidia’s tools, and Google will need to offer real incentives to spark adoption.
 

Google’s offer

To sweeten the deal, Google is putting up cash. In Fluidstack’s case, it pledged up to $3.2 billion as a financial backstop for the New York data centre lease. That guarantee is helping Fluidstack raise funds to build the facility.
 

Bigger AI race

The report argues that while cloud firms and AI developers have a growing interest in diversifying away from a single supplier, getting developers to adopt TPUs will not be easy.
 
Google is the only company that makes TPUs, but it is not alone in developing specialised AI hardware. Amazon has built Inferentia and Trainium chips, Microsoft has designed its own Maia and Cobalt processors, Meta is working on MTIA chips, and Apple integrates a Neural Engine in its devices.
 
All of these efforts reflect a broader race to reduce dependence on Nvidia, whose GPUs remain the backbone of today’s AI systems.
 
Nvidia’s GPUs remain the default choice for now. The company's chief executive, Jensen Huang, has dismissed rivals, saying developers stick with Nvidia because of its versatility and software support.
 


First Published: Sep 08 2025 | 9:41 AM IST
