At an AMD investor event on Wednesday, December 6, Meta, OpenAI, and Microsoft announced their intention to use the company's new artificial intelligence chip, the Instinct MI300X.
The announcement is a notable signal for the technology industry: these companies' interest in AMD's new AI chip reflects demand for an alternative to Nvidia's expensive GPUs, which are needed to build and run artificial intelligence programs such as OpenAI's ChatGPT.
If the Instinct MI300X proves efficient and cost-effective for the technology firms and cloud providers that develop and operate AI models, the cost of building artificial intelligence systems and related products will fall. Shipments of the new AMD chip are set to begin early next year. If the chip also delivers strong performance and acceptable quality, it could put competitive pressure on Nvidia's rapidly growing AI-chip sales.
At the Wednesday event, AMD CEO Lisa Su said that the industry's attention is focused on big iron and big GPUs for the cloud.
The MI300X is based on a new architecture that delivers significant performance improvements. Its headline feature is 192 GB of cutting-edge high-bandwidth memory, known as HBM3, which provides fast data transfer and can hold large artificial intelligence models.
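A rough back-of-the-envelope sketch shows why 192 GB matters for model capacity. The figures below are illustrative assumptions, not AMD or Nvidia specifications: a hypothetical 70-billion-parameter model stored in 16-bit precision needs about 140 GB just for its weights, which fits within 192 GB (real deployments also need extra memory for activations and the inference cache).

```python
def model_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate memory footprint of a model's weights in gigabytes.

    bytes_per_param=2 assumes 16-bit (FP16/BF16) weights; this ignores
    activations and other runtime buffers, so it is a lower bound.
    """
    return num_params * bytes_per_param / 1e9

# Hypothetical 70-billion-parameter model in 16-bit precision:
needed = model_memory_gb(70e9)    # 140.0 GB of weights
fits_on_mi300x = needed <= 192    # the MI300X offers 192 GB of HBM3
```

This kind of estimate is why per-chip memory capacity, not just raw compute, shapes which models can run without being split across multiple GPUs.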
Lisa Su compared the MI300X, and systems built on it, with Nvidia's flagship AI GPU, the H100. She said the new chip's high performance translates into a better user experience, noting that consumers want answers from AI models as quickly as possible, a demand that grows more pressing as the queries they pose become more complex.
As AMD begins shipping the new chip, much depends on whether companies that have built on Nvidia products are willing to invest the time and money required to work with another GPU supplier. Lisa Su acknowledged that adopting AMD takes some work, implying the company must make an effort to win over potential customers in the AI-chip market.
On Wednesday AMD also told partners and investors about improvements to its ROCm software stack, which it is positioning as a competitor to Nvidia's industry-standard CUDA platform.
Price will be crucial to AMD's potential to gain market share in advanced chips. The company has not yet disclosed the price of the MI300X. A single Nvidia chip can cost around $40,000, and Lisa Su said AMD's chip would have to be cheaper to buy and operate than its competitor's to convince customers to choose the MI300X.
AMD also announced on Wednesday that it has signed agreements to supply the new chip to several companies with a particular need for GPUs. According to a report by the research firm Omdia, Meta and Microsoft were the largest buyers of Nvidia H100 GPUs this year.
Meta said it will use MI300X GPUs for artificial intelligence inference workloads, such as generating AI stickers, editing images, and operating its virtual assistant.
Microsoft CTO Kevin Scott said the company will offer access to the MI300X through its Azure cloud service.
Oracle Cloud will also use AMD's new chip. OpenAI said it will support the MI300X in one of its software products, Triton, which is not a large GPT-style language model but a tool used in artificial intelligence research to access a chip's capabilities.
AMD does not expect the new chip to drive rapid profit growth in the near term: the company forecasts about $2 billion in total data-center GPU revenue next year. By comparison, Nvidia earned more than $14 billion from its comparable business in the third quarter of this year alone.
AMD now expects the market for AI GPUs to grow to $400 billion over the next four years, double its previous forecast. The upgraded outlook points to strong expectations and high demand for high-end artificial intelligence chips.
Lisa Su says AMD does not need to beat Nvidia to succeed in AI chips; she believes the company can capture a significant share of a growing market.
As we have reported earlier, AMD’s Profit Exceeds Analyst Expectations.