AMD Says AI Is Its No.1 Strategic Priority With Instinct MI300 Leading The Charge Later This Year

AMD is betting its future on AI, as CEO Dr. Lisa Su pointed out during the company's recent earnings call, with the Instinct MI300 leading the charge.

AMD's CEO states that AI is and will remain the company's number one strategic priority, and that it is working to bring the required hardware and software to market to meet the growing demand for AI. The red team has several AI-focused products coming soon, including the Instinct MI300, Ryzen 7040 "Phoenix" APUs with Ryzen AI, and a range of data center SoCs and HPC accelerators.

We are very excited about our opportunity in AI. This is our No. 1 strategic priority, and we are engaging deeply across our customer set to bring joint solutions to the market, led by our upcoming Instinct MI300 GPUs, Ryzen 7040 Series CPUs with Ryzen AI, Zynq UltraScale+ MPSoCs, Alveo V70 data center inference accelerators and Versal AI adaptive data center and edge SoCs.

AMD CEO, Dr. Lisa Su

For AMD, AI is a broader market with many more applications than the cloud segment alone; it also includes what the company is already doing in the client and embedded segments. As for the Instinct MI300, the exascale APU accelerator is positioned just as well for AI workloads as it is for HPC and supercomputing. AMD also says that the customer pipeline for the Instinct MI300 has expanded considerably over the past few months.

The company is working to expand the pipeline for its Instinct MI300 accelerator as interest in generative AI grows, and it is bringing its Xilinx and AMD AI teams together into a single organization to accelerate its AI software work and the broader generative AI platform.

Now as it relates to your question about MI300, look, we're really excited about the AI opportunity. I think success for us is having a significant part of the AI overall opportunity. AI for us is broader than cloud. I mean it also includes what we're doing in Clients and Embedded.

But specifically, as it relates to MI300, MI300 is actually very well-positioned for both HPC or supercomputing workloads as well as for AI workloads. And with the recent interest in generative AI, I would say the pipeline for MI300 has expanded considerably here over the last few months, and we're excited about that. We're putting a lot more resources. I mentioned in the prepared remarks the work that we're doing, sort of taking our Xilinx and sort of the overall AMD AI efforts and collapsing them into one organization that's primarily to accelerate our AI software work as well as platform work.

So success for MI300 is, for sure, a significant part of sort of the growth in AI in the cloud. And I think we feel good about how we're positioned there.

AMD CEO, Dr. Lisa Su

As for when AMD will release the Instinct MI300 accelerators, the chip will begin its initial revenue ramp in Q4 2023, when it ships to the first cloud AI customers, with more meaningful revenue expected in 2024. You can read more about the specs of the MI300's multi-chiplet, multi-IP design here.

We mentioned in the prepared remarks some of the work that was done on the LUMI supercomputer with generative AI models. We've continued to do quite a bit of library optimization with MI250 and software optimization to really ensure that we could increase the overall performance and capabilities. MI300 looks really good. I think from everything that we see, the workloads have also changed a bit in terms of -- whereas a year ago, much of the conversation was primarily focused on training.

Today, that has migrated to sort of large language model inferencing, which is particularly good for GPUs. So I think from an MI300 standpoint, we do believe that we will start ramping revenue in the fourth quarter with cloud AI customers and then it will be more meaningful in 2024.

AMD CEO, Dr. Lisa Su

AMD further reaffirmed that its EPYC Bergamo CPUs, featuring the new Zen 4C cores, will be launching later this quarter. The Genoa-X chips, with 3D V-Cache-boosted Zen 4 cores, are also expected to ship later this quarter.

We are on track to launch Bergamo, our first cloud-native server CPU, and Genoa-X, our fourth-gen EPYC processor with 3D chiplets for leadership in technical computing workloads, later this quarter. Although we expect server demand to remain mixed in the second quarter, we are well-positioned to grow our cloud and enterprise footprint in the second half of the year based on the strong customer response to the performance and TCO advantages of Genoa, Bergamo, and Genoa-X.

AMD CEO, Dr. Lisa Su

AMD is now following in NVIDIA's footsteps to make sure it gets its own slice of the AI market. Demand for NVIDIA's AI GPUs has grown enormously over the past several months, and NVIDIA's market capitalization recently saw a major uptick as a result. AMD isn't just waiting for Instinct MI300 servers to kick its AI efforts into gear; it is also working to optimize its existing MI250 accelerators to meet AI demand.
