AMD reveals MI300 series accelerators for generative AI
MI300X is the star of the new family of processors with AMD ‘laser-focused’ on data center deployment

AMD has unveiled the AMD Instinct MI300X, a new accelerator optimized for generative artificial intelligence (AI), as part of a wider announcement for the MI300 series of accelerators.
The MI300X, which was launched at the company’s Data Center and AI Technology Premiere, is AMD’s response to the current thirst for all things generative AI, making it the star of the new lineup.
It’s based on the AMD CDNA 3 accelerator architecture and supports up to 192GB of high bandwidth memory (HBM) – the capacity needed for large language model (LLM) and generative AI workloads. With that memory headroom, even the largest and most demanding LLMs, such as the 40-billion parameter Falcon-40B model, can run on a single MI300X accelerator, according to AMD.
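To give a rough sense of why that capacity matters, the back-of-the-envelope arithmetic below sketches the weight footprint of a 40-billion parameter model. This is an illustrative assumption only – it assumes FP16 weights at two bytes per parameter and ignores activations, KV cache, and other runtime overheads.

```python
# Rough memory arithmetic (assumption: FP16 weights, 2 bytes per parameter,
# ignoring activations, optimizer state, and KV cache overheads).
params = 40e9                                  # Falcon-40B parameter count
bytes_per_param = 2                            # FP16
weights_gb = params * bytes_per_param / 1e9
print(f"~{weights_gb:.0f} GB of weights")      # ~80 GB, inside the MI300X's 192GB of HBM
```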
In addition to the MI300X, the MI300 series also includes the MI300A – described by AMD as the “world’s first APU accelerator for high-performance computing (HPC) and AI workloads”. Other notable announcements included the AMD Instinct Platform, which combines eight MI300X accelerators into an ‘ultimate solution’ for AI inference and training.
“AI is the defining technology shaping the next generation of computing and the largest strategic growth opportunity for AMD,” said AMD chair and CEO Lisa Su. “We are laser-focused on accelerating the deployment of AMD AI platforms at scale in the data center, led by the launch of our Instinct MI300 accelerators planned for later this year and the growing ecosystem of enterprise-ready AI software optimized for our hardware.”
Other AI announcements included AMD’s Pervasive AI vision, a platform spanning a portfolio of cloud-to-edge-to-endpoint hardware products and software collaborations, with the aim of delivering scalable and pervasive AI services.
The announcement was hailed by Su as a “significant step forward for [AMD’s] data center strategy”. The expanded 4th Gen EPYC processor family now supports cloud and technical computing workloads, along with new public instances and internal deployments with the largest cloud providers.
Along with the new capabilities, a burgeoning ecosystem of partners has been tipped to shape the future of computing – this includes the likes of Amazon Web Services (AWS), Meta, Microsoft, and Oracle. AWS already uses AMD EPYC processors in its Amazon Elastic Compute Cloud (Amazon EC2) M7a instances, while Oracle is planning to make its Oracle Cloud Infrastructure (OCI) E5 instances available with 4th Gen AMD EPYC processors.
“AWS has worked with AMD since 2018 to offer Amazon EC2 instances to customers. Today, we are seeing customers wanting to bring new types of applications to AWS, like financial applications, application servers, video transcoding, and simulation modeling," said David Brown, vice president of Amazon EC2 at AWS. “When we combine the performance of 4th Gen AMD EPYC processors with the AWS Nitro System, we’re advancing cloud technology for our customers by allowing them to do more with better performance on even more Amazon EC2 instances.”
AMD also showcased the ROCm software ecosystem, its open software stack for data center accelerators, pitched as the foundation of an open AI software ecosystem. The announcement came with a presentation from PyTorch on its work with AMD to fully upstream the ROCm software stack, delivering immediate “day zero” support for PyTorch 2.0 with ROCm release 5.4.2 on all AMD Instinct accelerators.
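In practice, PyTorch’s ROCm builds expose AMD GPUs through the same torch.cuda interface used elsewhere, so existing code can target an Instinct accelerator with minimal changes. The sketch below is a minimal, illustrative check, assuming a ROCm build of PyTorch 2.0 (for example, built against ROCm 5.4.2) is installed.

```python
# Minimal sketch: confirming a ROCm build of PyTorch 2.0 can see an AMD Instinct GPU.
# Assumes PyTorch 2.0+ built against ROCm (e.g. 5.4.2); on ROCm builds, AMD GPUs are
# exposed through the standard torch.cuda namespace and torch.version.hip is set.
import torch

print(torch.__version__)          # e.g. "2.0.1+rocm5.4.2"
print(torch.version.hip)          # non-None on ROCm builds, None on CUDA builds
print(torch.cuda.is_available())  # True if an Instinct accelerator is visible

if torch.cuda.is_available():
    device = torch.device("cuda")                   # "cuda" maps to the HIP/ROCm backend
    model = torch.nn.Linear(4096, 4096).to(device)
    compiled = torch.compile(model)                 # PyTorch 2.0 compile path
    x = torch.randn(8, 4096, device=device)
    y = compiled(x)
    print(y.shape, torch.cuda.get_device_name(0))
```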