Microsoft Develops AI Chip

Microsoft is developing a new artificial intelligence chip under the internal code name Athena.

According to media reports, the chip could be made widely available both within the company and to OpenAI as early as next year. Experts say the move poses no threat to Nvidia’s commercial interests, but it is an unmistakable signal that the time has come for hyperscalers to start developing their own silicon.

Nvidia remains the undisputed market leader in the supply of artificial intelligence chips, with a market share of more than 85%. The firm’s partners compete with each other for the opportunity to reserve access to its A100 and H100 GPUs, which cost tens of thousands of dollars.

Athena, like the chips developed in-house by Google (TPU) and Amazon (the Trainium and Inferentia processor architectures), is designed to train large language models. The need is pressing because generative artificial intelligence models are growing in scale faster than the computing capacity required to train them.

Advanced generative AI models use hundreds of billions of parameters, which demands large-scale computing capacity. Next-generation models will reach into the trillions of parameters, creating a need for computational accelerators that speed up training while reducing both its cost and the time it takes.
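To make that scale concrete, here is a back-of-the-envelope sketch in Python. The figures are illustrative assumptions, not taken from the report: 16-bit (2-byte) weights and 80 GB of memory per accelerator, roughly what a current high-end AI GPU offers.

```python
# Back-of-the-envelope sketch: memory needed just to store model weights.
# Assumptions (illustrative only): 2-byte weights, 80 GB of memory per chip.

def chips_for_weights(num_params: int, bytes_per_param: int = 2, chip_mem_gb: int = 80) -> int:
    """Minimum number of accelerators needed to hold the weights alone."""
    weight_bytes = num_params * bytes_per_param
    chip_bytes = chip_mem_gb * 1024**3
    return -(-weight_bytes // chip_bytes)  # ceiling division

for params in (175_000_000_000, 1_000_000_000_000):
    print(f"{params:,} parameters -> at least {chips_for_weights(params)} chips for weights alone")
```

Even this understates the problem: training also has to keep gradients, optimizer state, and working activations in memory, which multiplies the requirement several times over and is why such models are split across many accelerators.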

Microsoft is trying to push forward its generative artificial intelligence strategy while reducing costs. For a tech giant, developing a custom AI accelerator makes sense, as significant economies of scale are possible in this case.

The need for acceleration also extends to chips that support machine learning inference. A trained model boils down to a set of weights, which are applied to incoming data to achieve practical goals; this is the computing infrastructure at work when, for example, ChatGPT generates responses to natural-language input.
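As a minimal illustration of what “a set of weights applied to data” means, the toy Python example below runs one inference step through a single linear layer. The sizes and values are made up for the illustration and do not correspond to any real model.

```python
import numpy as np

# Toy inference step: a "trained" model is just stored weights that are applied
# to incoming data to produce an output. Sizes and values are illustrative.
rng = np.random.default_rng(0)
weights = rng.standard_normal((3, 4))  # maps 4 input features to 3 outputs
bias = rng.standard_normal(3)

def infer(x):
    """Apply the stored weights to an input vector: multiply, add bias, ReLU."""
    return np.maximum(weights @ x + bias, 0.0)

print(infer(np.array([1.0, 0.5, -0.2, 2.0])))
```

Real services such as ChatGPT do the same thing at vastly larger scale, which is why inference workloads also benefit from dedicated accelerators.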

Demand for artificial intelligence is expanding, and with it the number of AI use cases. Against this background, Microsoft and other large companies have an incentive to create AI chips optimized for their own architectures and algorithms. Running cloud workloads and offering customers inexpensive options does not require an expensive Nvidia product. Technology giants are therefore likely to create their own chips to compete not only with Nvidia but also with Intel in general-purpose cloud computing.

Experts acknowledge that, following the logic of technological development, efforts to create GPUs and custom chips for specific rather than general purposes will at some point intensify. The implementation of this scenario would have a significant impact on the entire semiconductor industry. Technology providers still have to do the bulk of the work to meet the needs of an artificial intelligence market that is developing rapidly.

As we have reported earlier, Santander and Microsoft Launch AI Challenge.

Serhii Mikhailov


Serhii’s track record of study and work spans six years at the Faculty of Philology and eight years in the media, during which he has developed a deep understanding of various aspects of the industry and honed his writing skills. His areas of expertise include fintech, payments, cryptocurrency, and financial services. He keeps a close eye on the latest developments and innovations in these fields, believing they will have a significant impact on the future direction of the economy as a whole.