AMD Expands 4th Gen EPYC Processor Family
AMD unveils new AMD EPYC processors for cloud native and technical computing.
June 28, 2023
AMD recently announced product updates that, the company says, will shape the future of computing.
“Today, we took another significant step forward in our data center strategy as we expanded our 4th Gen EPYC processor family with new leadership solutions for cloud and technical computing workloads and announced new public instances and internal deployments with the largest cloud providers,” says AMD Chair and CEO Dr. Lisa Su. “AI is the defining technology shaping the next generation of computing and the largest strategic growth opportunity for AMD. We are laser focused on accelerating the deployment of AMD AI platforms at scale in the data center, led by the launch of our Instinct MI300 accelerators planned for later this year and the growing ecosystem of enterprise-ready AI software optimized for our hardware.”
AMD unveiled a series of updates to its 4th Gen EPYC family, designed to offer customers the workload specialization their businesses need.
AMD highlighted how the 4th Gen AMD EPYC processor continues to drive performance and energy efficiency. AMD introduced the 4th Gen AMD EPYC 97X4 processors, formerly codenamed “Bergamo.” With 128 “Zen 4c” cores per socket, these processors deliver high vCPU density, performance and energy efficiency for applications that run in the cloud. AMD also introduced 4th Gen AMD EPYC processors with AMD 3D V-Cache technology, high-performance x86 server CPUs for technical computing.
More details on the latest 4th Gen AMD EPYC processors, along with commentary from AMD customers, are available on the company’s website.
AMD AI Platform—The Pervasive AI Vision
AMD showcases its AI Platform strategy to develop scalable and pervasive AI solutions.
AMD revealed new details of the AMD Instinct MI300 Series accelerator family, including the introduction of the AMD Instinct MI300X accelerator, an advanced accelerator for generative AI. The MI300X is based on the next-gen AMD CDNA 3 accelerator architecture and supports up to 192 GB of HBM3 memory to provide the compute and memory efficiency needed for large language model training and inference for generative AI workloads.
AMD also introduced the AMD Instinct Platform, which brings together eight MI300X accelerators in an industry-standard design to offer a single solution for AI inference and training. In addition, AMD announced that the AMD Instinct MI300A, an APU accelerator for HPC and AI workloads, is now sampling to customers.
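To put the 192 GB figure in context, the back-of-envelope sketch below (our illustration, not AMD-published sizing guidance) assumes 16-bit weights at roughly 2 bytes per parameter and ignores activations, KV cache and framework overhead; under those assumptions, a 70-billion-parameter model’s weights occupy about 140 GB and fit within a single MI300X, while the eight-accelerator Instinct Platform offers roughly 1.5 TB of combined HBM3.

```python
# Illustrative memory estimate for large language model weights.
# Assumption: FP16/BF16 weights at 2 bytes per parameter; activations,
# KV cache and framework overhead are not counted and add to the real footprint.

HBM3_PER_MI300X_GB = 192        # per-accelerator HBM3 capacity cited above
ACCELERATORS_PER_PLATFORM = 8   # AMD Instinct Platform configuration

def weight_footprint_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Approximate memory needed just to hold model weights, in GB."""
    return params_billions * 1e9 * bytes_per_param / 1e9

for size_b in (7, 70, 175):
    need = weight_footprint_gb(size_b)
    fits_one = need <= HBM3_PER_MI300X_GB
    fits_platform = need <= HBM3_PER_MI300X_GB * ACCELERATORS_PER_PLATFORM
    print(f"{size_b:>4}B params: ~{need:.0f} GB of weights | "
          f"single MI300X: {fits_one} | 8-way platform: {fits_platform}")
```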
AMD showcased the ROCm software ecosystem for data center accelerators.
Networking Portfolio
AMD showcased a robust networking portfolio including the AMD Pensando DPU, AMD Ultra Low Latency NICs and AMD Adaptive NICs. Additionally, AMD Pensando DPUs combine a robust software stack with “zero trust security” and a leadership programmable packet processor to create an intelligent and performant DPU. The AMD Pensando DPU is deployed at scale across cloud partners such as IBM Cloud, Microsoft Azure and Oracle Cloud Infrastructure.
AMD highlighted the next generation of its DPU roadmap, codenamed “Giglio,” which aims to bring enhanced performance and power efficiency to customers and is expected to be available by the end of 2023.
AMD also announced the AMD Pensando Software-in-Silicon Developer Kit (SSDK), giving customers the ability to rapidly develop or migrate services to deploy on the AMD Pensando P4 programmable DPU, in coordination with the rich set of features already implemented on the AMD Pensando platform.
Sources: Press materials received from the company and additional information gleaned from the company’s website.
About the Author
DE Editors
DE’s editors contribute news and new product announcements to Digital Engineering.
Press releases may be sent to them via [email protected].