AMD today launched the latest high performance computing solutions defining the AI computing era, including 5th Gen AMD EPYC™ server CPUs, AMD Instinct™ MI325X accelerators, AMD Pensando™ Salina DPUs, AMD Pensando Pollara 400 NICs and AMD Ryzen™ AI PRO 300 series processors for enterprise AI PCs. AMD and its partners also showcased how they are deploying AMD AI solutions at scale, the continued ecosystem growth of AMD ROCm™ open source AI software, and a broad portfolio of new solutions based on AMD Instinct accelerators, EPYC CPUs and Ryzen PRO CPUs.
“The data center and AI represent significant growth opportunities for AMD, and we are building strong momentum for our EPYC and AMD Instinct processors across a growing set of customers,” said AMD Chair and CEO Dr. Lisa Su. “With our new EPYC CPUs, AMD Instinct GPUs and Pensando DPUs we are delivering leadership compute to power our customers’ most important and demanding workloads. Looking ahead, we see the data center AI accelerator market growing to $500 billion by 2028. We are committed to delivering open innovation at scale through our expanded silicon, software, network and cluster-level solutions.”
Defining the Data Center in the AI Era
AMD announced a broad portfolio of data center solutions for AI, enterprise, cloud and mixed workloads:
- New AMD EPYC 9005 Series processors deliver record-breaking performance1 to enable optimized compute solutions for diverse data center workloads. Built on the latest “Zen 5” architecture, the lineup offers up to 192 cores and will be available in a wide range of platforms from leading OEMs and ODMs starting today.
- AMD continues executing its annual cadence of AI accelerators with the launch of AMD Instinct MI325X, delivering leadership performance and memory capabilities for the most demanding AI workloads. AMD also shared new details on next-gen AMD Instinct MI350 series accelerators expected to launch in the second half of 2025, extending AMD Instinct leadership memory capacity and generative AI performance. AMD has made significant progress developing the AMD Instinct MI400 Series accelerators based on the AMD CDNA Next architecture, planned to be available in 2026.
- AMD has continuously improved its AMD ROCm software stack, doubling AMD Instinct MI300X accelerator inferencing and training performance2 across a wide range of the most popular AI models. Today, over one million models run seamlessly out of the box on AMD Instinct, triple the number available when MI300X launched, with day-zero support for the most widely used models (see the sketch after this list).
- AMD also expanded its high performance networking portfolio to address evolving system networking requirements for AI infrastructure, maximizing CPU and GPU performance to deliver performance, scalability and efficiency across the entire system. The AMD Pensando Salina DPU delivers a high performance front-end network for AI systems, while the AMD Pensando Pollara 400, the first Ultra Ethernet Consortium ready NIC, reduces the complexity of performance tuning and helps improve time to production.
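To illustrate the out-of-the-box experience referenced above, the following is a minimal sketch, assuming a ROCm build of PyTorch and the Hugging Face transformers library on an AMD Instinct system; the model identifier is an illustrative example rather than one named in this announcement.

```python
# Minimal sketch (assumptions: ROCm build of PyTorch, transformers installed,
# an AMD Instinct GPU present; the model id is an illustrative example).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# ROCm builds of PyTorch expose AMD GPUs through the familiar "cuda" device
# alias, so existing GPU code paths typically run unchanged.
print("ROCm/HIP version:", torch.version.hip)   # non-None on a ROCm build
device = "cuda" if torch.cuda.is_available() else "cpu"

model_id = "facebook/opt-125m"  # small, openly available example model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16).to(device)

prompt = "Running a Hugging Face model on an AMD Instinct GPU:"
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```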
AMD partners detailed how they leverage AMD data center solutions to drive leadership generative AI capabilities, deliver cloud infrastructure used by millions of people daily and power on-prem and hybrid data centers for leading enterprises:
- Since launching in December 2023, AMD Instinct MI300X accelerators have been deployed at scale by leading cloud, OEM and ODM partners and are serving millions of users daily on popular AI models, including OpenAI’s ChatGPT, Meta Llama and over one million open source models on the Hugging Face platform.
- Google highlighted how AMD EPYC processors power a wide range of instances for AI, high performance, general purpose and confidential computing, including its AI Hypercomputer, a supercomputing architecture designed to maximize AI ROI. Google also announced EPYC 9005 Series-based VMs will be available in early 2025.
- Oracle Cloud Infrastructure shared how it leverages AMD EPYC CPUs, AMD Instinct accelerators and Pensando DPUs to deliver fast, energy efficient compute and networking infrastructure for customers like Uber, Red Bull Powertrains, PayPal and Fireworks AI. OCI announced the new E6 compute platform powered by EPYC 9005 processors.
- Databricks highlighted how its models and workflows run seamlessly on AMD Instinct and ROCm, and disclosed that its testing shows the large memory capacity and compute capabilities of AMD Instinct MI300X GPUs help deliver a more than 50% increase in performance on Llama and Databricks proprietary models.
- Microsoft CEO Satya Nadella highlighted Microsoft’s longstanding collaboration and co-innovation with AMD across its product offerings and infrastructure, with MI300X delivering strong performance on Microsoft Azure and GPT workloads. Nadella and Su also discussed the companies’ deep partnership on the AMD Instinct roadmap and how Microsoft is planning to leverage future generations of AMD Instinct accelerators, including the MI350 series and beyond, to deliver leadership performance-per-dollar-per-watt for AI applications.
- Meta detailed how AMD EPYC CPUs and AMD Instinct accelerators power its compute infrastructure across AI deployments and services, with MI300X serving all live traffic on Llama 405B. Meta is also partnering with AMD to optimize AI performance from silicon, systems, and networking to software and applications.
- Leading OEMs Dell, HPE, Lenovo and Supermicro are expanding on their highly performant, energy efficient AMD EPYC processor-based lineups with new platforms designed to modernize data centers for the AI era.
Expanding an Open AI Ecosystem
AMD continues to invest in the open AI ecosystem and expand the AMD ROCm open source software stack with new features, tools, optimizations and support to help developers extract the ultimate performance from AMD Instinct accelerators and deliver out-of-the-box support for today’s leading AI models. Leaders from Essential AI, Fireworks AI, Luma AI and Reka AI discussed how they are optimizing models across AMD hardware and software.
AMD also hosted a developer event joined by technical leaders from across the AI developer ecosystem, including Microsoft, OpenAI, Meta, Cohere, xAI and more. Luminary presentations from the inventors of popular AI programming languages, models and frameworks critical to the AI transformation, such as Triton, TensorFlow, vLLM, Paged Attention and FastChat, covered how developers unlock AI performance optimizations through vendor-agnostic programming languages, accelerate models on AMD Instinct accelerators, and benefit from the ease of porting to ROCm software and from an open-source approach.
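As a concrete illustration of that vendor-agnostic approach, below is a minimal sketch of a Triton kernel, assuming a ROCm build of PyTorch with the triton package installed on an AMD Instinct system (where the "cuda" device alias addresses the AMD GPU). It illustrates the workflow discussed at the event, not code from any of the presentations.

```python
# Minimal sketch (assumptions: triton and a ROCm build of PyTorch on an AMD
# Instinct system). Triton compiles the same Python-level kernel for whichever
# GPU backend is present, which is what vendor agnostic means in practice.
import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    pid = tl.program_id(axis=0)                     # one program instance per block
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements                     # guard the final partial block
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    n = out.numel()
    grid = lambda meta: (triton.cdiv(n, meta["BLOCK_SIZE"]),)
    add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
    return out

# On ROCm builds of PyTorch, "cuda" tensors live on the AMD GPU.
x = torch.rand(4096, device="cuda")
y = torch.rand(4096, device="cuda")
assert torch.allclose(add(x, y), x + y)
```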
Enabling Enterprise Productivity with AI PCs
AMD launched AMD Ryzen AI PRO 300 Series processors, powering the first Microsoft Copilot+ laptops enabled for the enterprise3. The Ryzen AI PRO 300 Series processor lineup extends AMD leadership in performance and battery life with the addition of enterprise-grade security and manageability features for business users.
- The Ryzen AI PRO 300 Series processors, featuring the new AMD “Zen 5” and AMD XDNA™ 2 architectures, are the world’s most advanced commercial processors4, offering best in class performance for unmatched productivity5 and an industry leading 55 NPU TOPS6 of AI performance with the Ryzen AI 9 HX PRO 375 processor to process AI tasks locally on Ryzen AI PRO laptops.
- Microsoft highlighted how Windows 11 Copilot+ and the Ryzen AI PRO 300 lineup are ready for next generation AI experiences, including new productivity and security capabilities.
- OEM partners including HP and Lenovo are expanding their commercial offerings with new PCs powered by Ryzen AI PRO 300 Series processors, with more than 100 platforms expected to come to market through 2025.