The AMD Instinct MI300A, described as the first data center APU, combines a CPU and a GPU on a single package and is meant to compete with the likes of Nvidia’s Grace Hopper. The MI300A is already coming to ...
HPE was showing off a server blade from its upcoming El Capitan supercomputer at the recent ISC High Performance event in Hamburg, Germany. The server blade had its front cover removed, showing off ...
— ROCm 6 open software ecosystem combines next-gen hardware and software to deliver ~8x generational performance increase, power advancements in generative AI and simplify deployment of AMD AI ...
AMD has run a distant second to Nvidia in the GPU-accelerated HPC market, even though its accelerators power the world’s fastest supercomputer. It’s looking to gain ground with the launch of ...
The MI300 chips—which are also getting support from Lenovo, Supermicro and Oracle—represent AMD’s biggest challenge yet to Nvidia’s AI computing dominance. AMD claims that the MI300X GPUs, which are ...
Advanced Micro Devices said it is unveiling its AMD Instinct MI300X and ...
The great thing about the Cambrian explosion in compute that has been forced by the end of Dennard scaling of clock frequencies and of Moore’s Law lowering the cost of transistors is not only that we ...
New 8-GPU Systems Powered by AMD Instinct™ MI300X Accelerators Are Now Available with Breakthrough AI and HPC Performance for Large Scale AI Training and LLM Deployments The new 2U liquid-cooled and ...
In the fiercely competitive high-performance computing (HPC) and artificial intelligence (AI) sectors, AMD made a substantial leap forward with the release of its new MI300X and MI300A accelerators.
CRN rounds up five cool AI and high-performance computing servers from Dell Technologies, Lenovo, Supermicro and Gigabyte that use AMD’s Instinct MI300 chips, which launched a few months ago to ...
AMD hosted its Data Center and AI Technology Premiere in San Francisco, CA today, where it outlined its vision and strategy for the “future of computing.” The company introduced its first batch of ...
NVIDIA's just-announced H200 Hopper AI GPU features 141GB of HBM3e memory from Micron, up from the 80GB of HBM3 used on its industry-leading H100 AI GPU. AMD ...