Nvidia is leveraging Groq’s low-latency AI chips as the data center race heats up

The move strengthens Nvidia’s dominance in AI infrastructure at a time when rivals like Google, Microsoft and Amazon are accelerating in-house chip efforts.

Nvidia Corp. has signed a licensing deal with the artificial intelligence startup Groq, furthering its investments in companies related to the AI boom and gaining the right to add a new type of technology to its products.

The world’s largest publicly traded company has paid for the right to use Groq’s technology and will integrate its chip design into future products. Some of the startup’s executives are leaving to join Nvidia to help with that effort, the companies said. Groq will continue as an independent company with a new CEO, according to a message on its website on Wednesday.

Nvidia’s technology already dominates data centers that are at the heart of the explosion in spending on new computers needed for AI software and services. The popularity of its existing offerings has made Nvidia by far the wealthiest company in the chip industry, and it has said it will use some of that money to fuel the adoption of AI across the economy.

Groq is one of a number of startups and established companies, such as Alphabet Inc.’s Google, that are developing their own AI chips to compete with Nvidia. The startup, which was founded in 2016, raised $750 million in September at a post-money valuation of $6.9 billion. Groq said at the time that it would use the money to expand the capacity of its data centers. The data center business, which offers outsourced computing, will continue, the company said in the post.

Groq Chief Executive Officer Jonathan Ross is a former Google chip executive who helped build that company’s Tensor Processing Unit (TPU), which powers AI workloads. As part of the deal, he and other top executives will join Nvidia “to help advance and scale its licensed technology,” Groq said in the statement.

Financial terms of the deal were not disclosed.

Groq’s low-latency chips are extremely responsive to input and will add new capabilities to Nvidia’s products and open up new areas of the market, Nvidia said. Under CEO Jensen Huang’s leadership, the chipmaker has added a host of new offerings aimed at strengthening its position and accelerating the speed at which companies find use for AI software. The company now sells networks, software and services, as well as complete computers.

The licensing agreement is similar in some respects to a partnership that Meta Platforms Inc. has struck with data labeling startup Scale AI, in which the large tech company made a significant investment in the smaller company, licensing technology and hiring its CEO.

Nvidia has been investing in companies across the AI infrastructure ecosystem and is trying to maintain a big lead in the market for inference: running models after they’ve been developed. The company’s leadership has already committed billions to a wide variety of projects that it believes will advance the overall AI industry. Nvidia agreed to invest as much as $100 billion in OpenAI and has even bought a stake in former nemesis Intel Corp.

By incorporating a new type of design into what it sells, Nvidia is showing a willingness to be flexible and add new capabilities. That approach is likely aimed at keeping the biggest customers and newcomers focused on its technology at a time when in-house chip efforts by Google, Microsoft Corp. and Amazon.com Inc. are gaining momentum, as the industry races to install as much computing capacity as possible as quickly as possible.

More stories like this are available at bloomberg.com

Published on December 25, 2025

