
Nvidia's big tech rivals are bringing their own AI chips to the table


In September, Amazon said it would invest up to $4 billion in Anthropic, a San Francisco artificial intelligence startup.

Shortly afterwards, an Amazon manager sent a private message to a manager at another company. He said Anthropic won the deal because it agreed to build its AI using specialized computer chips designed by Amazon.

Amazon, he wrote, wanted to create a viable competitor to chipmaker Nvidia, a key partner and kingmaker in the all-important field of artificial intelligence.

The boom in generative AI over the past year has highlighted just how dependent major tech companies have become on Nvidia. They can't build chatbots and other AI systems without a special kind of chip that Nvidia has mastered in recent years. They've spent billions of dollars on Nvidia's systems, and the chipmaker hasn't kept up with demand.

So Amazon and other industry giants – including Google, Meta and Microsoft – are building their own AI chips. With these chips, the technology giants could determine their own destiny. They could reduce costs, eliminate chip shortages, and ultimately sell access to their chips to companies that use their cloud services.

While Nvidia sold 2.5 million chips last year, Google spent between $2 billion and $3 billion building about a million of its own AI chips, said Pierre Ferragu, an analyst at New Street Research. Amazon spent $200 million on 100,000 chips last year, he estimates. Microsoft said it had started testing its first AI chip.

But this work is a balancing act: the companies must compete with Nvidia while still working closely with the chipmaker and its increasingly powerful CEO, Jensen Huang.

Mr Huang's company accounts for more than 70 percent of AI chip sales, according to research firm Omdia. It powers an even greater percentage of the systems used in creating generative AI. Nvidia's revenue has increased 206 percent in the past year and the company has added about a trillion dollars in market value.

What revenues are for Nvidia are costs for the tech giants. Orders from Microsoft and Meta have made up about a quarter of Nvidia's revenue over the past two full quarters, said Gil Luria, an analyst at investment bank DA Davidson.

Nvidia sells its chips for about $15,000 each, while Google averages just $2,000 to $3,000 each, according to Mr. Ferragu.

“When they encountered a vendor holding them over a barrel, they reacted very violently,” Mr. Luria said.

Companies are constantly courting Mr. Huang, trying to get to the front of the line for his chips. He regularly appears on event stages with their CEOs, and the companies are quick to say they remain committed to their partnership with Nvidia. They all plan to keep offering Nvidia's chips alongside their own.

As the big tech companies move into Nvidia's business, Nvidia is moving into theirs. Last year, Nvidia launched its own cloud service where companies can use its chips, and it is funneling chips to a new wave of cloud providers, such as CoreWeave, that compete with the big three: Amazon, Google and Microsoft.

“The tensions here are a thousand times greater than the usual battle between customers and suppliers,” said Charles Fitzgerald, a technology consultant and investor.

Nvidia declined to comment.

According to research firm Gartner, the AI chip market is expected to more than double to approximately $140 billion by 2027. Venerable chipmakers like AMD and Intel are also building specialized AI chips, as are startups like Cerebras and SambaNova. But Amazon and other tech giants can do things that smaller competitors can't.

“If they can achieve high enough volume and reduce costs, these companies should in theory be able to offer something even better than Nvidia,” said Naveen Rao, who founded one of the first AI chip startups and later sold it to Intel.

Nvidia builds so-called graphics processing units, or GPUs, which it originally designed to render graphics for video games. But a decade ago, academic researchers realized that these chips were also great at building the systems, called neural networks, that now power generative AI.

As this technology took off, Mr. Huang soon began adapting Nvidia's chips and associated software for AI, and they became the de facto standard. Most of the software systems used to train AI technologies were tailored to Nvidia's chips.

“Nvidia has great chips, and more importantly, they have an incredible ecosystem,” said Dave Brown, who heads Amazon's chip business. That makes it “very, very challenging” to get customers to use a new kind of AI chip, he said.

Rewriting software code to use a new chip is so difficult and time-consuming that many companies don't even try, says Mike Schroepfer, consultant and former Chief Technology Officer at Meta. “The problem with technology development is that a lot of it dies before it even gets started,” he said.

Rani Borkar, who oversees Microsoft's hardware infrastructure, said she and her colleagues need to make it “seamless” for customers to switch between chips from different companies.

Amazon, Mr. Brown said, is working to make switching between chips “as easy as possible.”

Some tech giants have had success making their own chips. Apple designs the silicon in iPhones and Macs, and Amazon has deployed more than two million of its own traditional server chips in its cloud computing data centers. But efforts like these take years of hardware and software development.

Google has the biggest lead in developing AI chips. In 2017, it introduced its tensor processing unit, or TPU, named after a type of computation essential for building artificial intelligence. Google used tens of thousands of TPUs to build AI products, including its online chatbot Google Bard. And other companies have used the chip through Google's cloud service to build similar technologies, including high-profile startup Cohere.

Amazon is now on the second generation of Trainium, its chip for building AI systems, and has created a second chip specifically for serving AI models to customers. In May, Meta announced plans to work on an AI chip tailor-made for its needs, although it is not yet in use. In November, Microsoft announced its first AI chip, Maia, which will initially focus on running Microsoft's own AI products.

“When Microsoft builds its own chips, it builds exactly what it needs at the lowest possible cost,” Mr. Luria said.

Nvidia's rivals have used their investments in high-profile AI startups to boost use of their chips. Microsoft has committed $13 billion to OpenAI, maker of the ChatGPT chatbot, and the Maia chip will deliver OpenAI's technologies to Microsoft's customers. Like Amazon, Google has invested billions in Anthropic, which also uses Google's AI chips.

Anthropic, which has used chips from both Nvidia and Google, is among a handful of companies working to build AI using as many specialized chips as they can get their hands on. Amazon said that if companies like Anthropic used its chips ever more widely, and even helped design future chips, that could reduce costs and improve the performance of the processors. Anthropic declined to comment.

But none of these companies will overtake Nvidia anytime soon. Its chips may be pricey, but they are among the fastest on the market, and the company will keep making them faster.

Mr. Rao said his company, Databricks, trained some experimental AI systems using Amazon's AI chips, but built the largest and most important systems using Nvidia chips because they delivered higher performance and worked well together with a wider range of software.

“We still have many years of hard innovation ahead of us,” said Amazon's Mr. Brown. “Nvidia will not stand still.”
