Wall Street is hypnotized by Nvidia. But AMD just made the loudest move of the year without raising its voice. The Instinct MI350 Series is out. It is not a refresh. It is a leap. Built on CDNA 4, the MI350X and MI355X deliver up to 4 times the compute of the MI300 generation and a 35 times jump in inference throughput. That is not marketing fluff. That is silicon math.
Each card packs 288 GB of HBM3E memory. That is enough to hold models of up to 520 billion parameters on a single GPU. Bandwidth hits 8 terabytes per second. An eight-GPU MI355X platform pushes 161 PFLOPS of FP4 performance. That is not theoretical. That is shipping this quarter. Meta, Oracle, Dell, HPE, Cisco, and ASUS are already integrating MI350 systems. These are not paper partners. These are deployment pipelines.
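The memory claim is easy to sanity-check with back-of-envelope math: capacity divided by bytes per weight gives a rough upper bound on model size. A minimal sketch, counting weights only and ignoring KV cache, activations, and framework overhead (so real limits are lower):

```python
# Rough capacity math for a 288 GB GPU: how many parameters fit,
# weights only, at each precision? Ignores KV cache, activations,
# and runtime overhead -- real limits are lower.

GPU_MEMORY_GB = 288

BYTES_PER_PARAM = {
    "FP16": 2.0,
    "FP8": 1.0,
    "FP4": 0.5,
}

def max_params_billions(memory_gb: float, bytes_per_param: float) -> float:
    """Upper bound on parameters (billions) whose weights fit in memory."""
    # GB -> bytes -> parameter count -> billions
    return memory_gb * 1e9 / bytes_per_param / 1e9

for precision, size in BYTES_PER_PARAM.items():
    print(f"{precision}: ~{max_params_billions(GPU_MEMORY_GB, size):.0f}B parameters")
    # FP16: ~144B, FP8: ~288B, FP4: ~576B
```

At FP4 the envelope comes out near 576 billion weights, which is why a 500-billion-parameter-class model on one card is plausible at low precision but not at FP16.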
Lisa Su says we are at an inflection point in AI inference. Not training. Inference. That is where the volume lives. That is where the money flows. She projects the data center AI accelerator market will hit $500 billion by 2028. That is not a typo. That is a five followed by eleven zeroes. AMD is not chasing that market. It is already in it. Seven of the ten largest AI developers are running workloads on Instinct accelerators today.
The MI300 family has already pulled in over $5 billion in revenue. The MI350 is built to carry that momentum into 2026. The MI400 is already on the roadmap. AMD is not guessing. It is executing. The ROCm 7 software stack is now delivering 3.5 times inference gains over the previous version. It supports major models like Llama and DeepSeek out of the box. It is open. It is fast. It is getting adopted.
Q1 2025 numbers back it up. Revenue hit $7.44 billion, up 36% year over year. Data center revenue jumped 57%. EPS climbed 55%. Guidance for Q2 is roughly flat sequentially at $7.4 billion, even with a $700 million hit from export controls. That is not softness. That is resilience.
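Those growth figures imply two numbers the paragraph leaves unstated: the year-ago quarter, and the underlying demand hiding behind the flat guide. A quick arithmetic sketch, using only the figures reported above:

```python
# Back out the year-ago quarter from Q1 2025 revenue and 36% YoY growth.
q1_2025_revenue_b = 7.44   # billions, reported
yoy_growth = 0.36

q1_2024_revenue_b = q1_2025_revenue_b / (1 + yoy_growth)
print(f"Implied Q1 2024 revenue: ~${q1_2024_revenue_b:.2f}B")  # ~$5.47B

# Flat ~$7.4B Q2 guidance despite a $700M export-control hit implies
# roughly $8.1B of underlying demand absent the restriction.
underlying_q2_b = 7.4 + 0.7
print(f"Implied underlying Q2 demand: ~${underlying_q2_b:.1f}B")
```

In other words, holding revenue flat while absorbing a $700 million headwind points to high-single-digit sequential growth in the underlying business.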
AMD stock is up 20% year to date. It still trades below its all-time high. Nvidia is grabbing headlines. AMD is grabbing market share. The breakout setup is sitting in plain sight.