By Alimat Aliyeva
The American company OpenAI is reportedly dissatisfied with some of Nvidia’s latest chips and is exploring alternatives for powering artificial intelligence (AI) workloads, Azernews reports.
According to OpenAI, the shift in strategy is driven by the growing importance of specialized microprocessors built for inference, the stage in which AI models generate responses while interacting with users.
This move by OpenAI, along with similar decisions by other tech companies to seek alternatives in the AI chip market, represents a significant challenge to Nvidia’s dominance in the AI hardware sector, according to industry analysts. While Nvidia continues to lead in chips for training large AI models, inference—the process of running models to produce outputs—has become a new competitive frontier.
Nvidia CEO Jensen Huang recently commented on the company’s ongoing investment in OpenAI. He confirmed that Nvidia would continue to fund the startup, describing it as a “good investment,” but did not disclose the exact amount planned for the partnership.
The growing competition in AI hardware has led to innovations in AI-specific chips, such as processors designed for faster inference with lower energy consumption. Some emerging chips even aim to run complex AI models directly on smaller devices, like smartphones and edge servers, potentially reducing the need for massive data centers. This shift could fundamentally change how AI services are delivered in the near future.