AMD Advancing AI live: All the news and updates as they happen

ITPro is live on the ground at the AMD Advancing AI conference in San Francisco – stay up to date with all the news, updates, and announcements here

AMD Advancing AI sign pictured at the Moscone Center in San Francisco, California.
(Image: © ITPro/Ross Kelly)

Welcome to ITPro's live coverage of the AMD Advancing AI conference in San Francisco.

It's a chilly morning here in San Francisco, but we're up and raring to go ahead of the conference, which is due to kick off at 9am PST.

We've got a jam-packed agenda ahead of us at the Moscone Center today, and yesterday the assembled press got a sneak peek at the news and announcements we can expect in the opening keynote session – and we can assure you there will be plenty of talking points.

In the opening keynote we'll hear from a variety of executives at AMD, including chief executive Lisa Su.

Make sure to keep tabs on our rolling live coverage here throughout the day.

We've got a panel session coming up now with a host of industry stakeholders, including:

• Dani Yogatama, CEO at Reka AI

• Dmytro Dzhulgakov, CTO at Fireworks AI

• Ashish Vaswani, CEO of Essential AI

• Amit Jain, CEO at Luma AI

Panel discussion on stage at AMD Advancing AI conference in San Francisco, California.

(Image credit: Future)

So how is ROCm helping support development at these trailblazing companies?

The open source approach that AMD employs has clearly been highly beneficial to all four organizations. Moreover, AMD GPUs are underpinning the models each of these firms is currently developing.

"ROCm's open source platform really allowed us to move fast," Yogatama says.

Dzhulgakov at Fireworks AI says using the MI300 series has unlocked marked benefits for the company during its development process.

"This basically allows us to deliver really efficient deployment of AI models," he says.

While hardware is important, software is an equally crucial aspect of AMD’s current roadmap and strategy.

AMD platforms are powering some of the most powerful workloads on the planet.

We're getting a rundown on ROCm now, AMD's open software stack that complements its hardware offerings.

“ROCm really delivers for AI developers.”

For AMD, part of this focus includes close collaboration with the open source community. The chip maker works closely with PyTorch, for example, as well as Hugging Face and vLLM.

With Hugging Face in particular, more than one million models run on AMD hardware.

AMD really is pulling out the big hitters in terms of brand names here. Oracle, Microsoft, and now Meta.

Su is joined on stage now by Kevin Salvadori, VP for Infrastructure and Engineering at Meta.

Meta's sharpened focus on generative AI over the last two years has been underpinned by AMD Instinct GPUs and EPYC CPUs.

Salvadori admits that Meta is a demanding customer, but AMD has been on hand to match those expectations amid the company's big AI boom.

"We've deployed over one and a half million CPUs," Salvadori says. Huge numbers and serious scale.

Using the MI300X series in production has been "instrumental" in helping Meta scale its AI infrastructure and power inference. This has been crucial in supporting development of the Llama large language model.

Kevin Salvadori, VP of Infrastructure and Engineering at Meta, pictured on stage with AMD CEO Lisa Su at the AMD Advancing AI conference in San Francisco.

(Image credit: Future)

Satya Nadella, Microsoft CEO, joins Su now… just not on stage. Instead, we're getting a video chat with the duo discussing the pace of change and development in the AI space.

“It’s been incredible, the amount of innovation in the industry,” Su says.

AMD and Microsoft have deep, deep ties. Microsoft is currently creating an entirely new PC category – AI PCs – with AMD.

In the last four years, Microsoft has been tapping into AMD’s AI innovation to support its own cloud innovation. It’s a very beneficial feedback loop for both organizations that’s paying dividends.

Microsoft is using the MI300 series, which is delivering major benefits for key services, including Azure and its raft of AI solutions and tools.

"What we have to deliver is performance per dollar, per watt," Nadella says. There's a big focus on energy efficiency and performance to deliver cost benefits for enterprises.

That’s the one metric that matters, Nadella says. Given the current costs associated with AI development, unlocking these improvements is vital.

Microsoft CEO Satya Nadella and Lisa Su speaking on stage.

(Image credit: Future)

We now have another partner joining Su up on stage. This time it's Karan Batta, SVP at Oracle.

AI, naturally, is the biggest talking point here. Oracle is seeing "incredible levels" of performance and efficiency on AMD GPUs, Batta says.

The company is continuing to scale up capacity for MI300X series GPUs for customers, Batta says. There's more to come in the year ahead as well.

Lisa Su, AMD CEO, pictured on stage at AMD Advancing AI with Karan Batta, SVP at Oracle Cloud.

(Image credit: Future)

Rapid-fire customer chats here: Naveen Rao, VP of AI at Databricks, joins Su on stage now to discuss the company's partnership with AMD.

"We've been on this journey for a while," Rao says. "We've achieved remarkable results."

Leveraging the MI300X, the company has delivered marked compute improvements. It's a real "powerhouse" of a GPU, Rao says.

AMD Advancing AI keynote stage with Naveen Rao, VP of AI at Databricks, and AMD CEO Lisa Su.

(Image credit: Future)

So what's AMD's big selling point here then with regard to GPU capabilities?

The Instinct MI300X series is the chip maker's flagship product here, and in terms of performance it far exceeds what competitors can provide.

The MI300X outperforms Nvidia's H100 series on several key benchmarks, Su explains. All told, this represents 1.3x higher performance on Llama 3 and Mistral-based workloads.

AMD CEO Lisa Su on stage at AMD Advancing AI.

(Image credit: Future)

Data center AI demand is absolutely skyrocketing. If you're a regular reader of ITPro then you'll have come across some of the various bits of research we've covered on this subject.

Insatiable demand from enterprises globally has prompted a massive surge. This, Su says, represents a major opportunity for AMD to capitalize on with its Instinct GPU series.

Looking ahead, the numbers involved here are simply mind-boggling.

AMD CEO Lisa Su on stage at AMD Advancing AI.

(Image credit: Future)

So how's this benefiting Google Cloud? Well, leveraging EPYC machines, the tech giant has cut costs by 40% and delivered a 15% performance improvement.

Big numbers here and when you look at this from a financial perspective, very significant for the hyperscaler.

We've got our first partner on stage now from Google Cloud.

"The demand for AI compute power is insatiable."

We've moved on to the data center portfolio here. A really exciting area for AMD right now - big gains and big revenue boosts in recent months.

AMD EPYC now boasts a record market share - and it's growing, Su says. At the end of the second quarter, AMD exited with a 34% revenue share in this domain.

AMD CEO Lisa Su speaking on stage at AMD Advancing AI.

(Image credit: Future)

"When we think about AI, it really is about using the right compute for the right application," Su says.

There's really no 'one-size-fits-all' approach to AI, and AMD is keen to enable enterprises to get their creative juices flowing and adopt the technology on their terms, at their pace, and based on their unique individual needs.

We're getting a brief rundown of AMD's current strategy.

"It's about employing the entire ecosystem - the cloud, OEMs...our goal is to create an open ecosystem," Su says.

Su says AMD is "really committed" to driving open innovation in high performance computing and AI infrastructure. This has been a recurring talking point in the company's strategy of late - ecosystem building and collaboration.

And here we go! AMD chief executive Lisa Su is on stage to kick things off at Advancing AI.

"Let's talk a little bit about AI," Su says. "Actually, we're going to talk a lot about it."

Setting the tone here right off the bat.

AMD CEO Lisa Su pictured on stage at AMD Advancing AI.

(Image credit: Future)

We are seated now and good to go. Just a matter of time until we hear from Lisa Su in her opening remarks here at AMD Advancing AI. A good buzz around the keynote hall ahead of things commencing.

Keynote stage at the AMD Advancing AI conference at the Moscone Center, San Francisco.

(Image credit: ITPro/Ross Kelly)

Like we said, a real contrast in the volume of people here at the Moscone Center today.

But what's AMD doing in this regard? Its Ryzen series is a key focus for the company in its big AI PC push, so we'll likely hear more on this during the opening keynote.

There has been some concern from industry stakeholders on the Ryzen series, however.

When AMD unveiled its enterprise-focused Ryzen Pro processors, the company promised significant performance gains, but reports in April suggested the chips were unlikely to meet the steep performance threshold required to be designated as an 'AI PC' by Microsoft.

A key talking point here was that the AI processing power of these chips appeared to fall short of the 40 trillion operations per second (TOPS) standard set by Microsoft.

You can read more on this in our coverage below.

AMD's new series of AI-enabled processors likely fall short of Microsoft’s performance requirements for next-gen AI PCs

With this in mind, we could see AMD come out guns blazing with a new update to the Ryzen series.

We're expecting to hear about a big push on AI PCs from AMD today. The rise of the AI PC has been a big talking point over the last year, with manufacturers and chip makers alike ramping up development in this space.

Intel CEO Pat Gelsinger claimed last year that the AI PC would be the "star of the show" in 2024, and he wasn't wrong. This has been a big area of focus for Intel, one of AMD's key competitors in this space.

It's still early - and we are expecting a torrent of people to converge on the Moscone Center this morning - but I have to say the city center in San Francisco has been a breath of fresh air compared to the carnage that Dreamforce brought with it just a couple of weeks prior.

No sea of lanyards or the cacophony of car horns in chaotic traffic.

We've still got a little while until the opening keynote session kicks off, but in the meantime why not check out our preview ahead of the conference for a taste of what we can expect to hear about?

AMD has put in the groundwork for a major AI push while the tech industry fawned over Nvidia

It's clear that AMD has been cooking up a storm so far in 2024 as it primes itself for a battle with Nvidia.

All eyes have been on the latter over the last two years since the emergence of generative AI, but AMD has been putting in the groundwork of late to catch up with its great rival.

Today we'll hear more about how it plans to ramp things up in the year ahead, but we've already seen a raft of exciting acquisitions, including two in the last couple of months alone. These included a $4.9 billion deal to acquire ZT Systems, which marked a huge statement of intent from the chip maker.

In July, AMD also announced the $655 million acquisition of Silo AI amid its big AI push.