According to Wccftech, AMD CEO Lisa Su revealed during the Q3 earnings call that multiple customers want AI chip deals similar to the OpenAI partnership, which is expected to generate $100 billion in revenue. The company is actively planning for several customers at what it calls "OpenAI-scale" to avoid concentration risk. AMD's Instinct MI355 series has already started its production ramp and is expected to carry strong momentum into 2026. Meanwhile, the next-generation Instinct MI450 series is scheduled to launch in H2 2026. These developments reflect massive industry interest in AMD's AI chips as the company positions itself as a serious competitor to NVIDIA in the AI hardware space.
AMD’s AI Ambitions Are Getting Real
This isn’t just corporate speak anymore. When Lisa Su talks about multiple “OpenAI-scale” customers, she’s basically saying AMD is becoming a real threat in the AI chip market that NVIDIA has dominated for years. The fact that they’re already planning production for multiple massive deals suggests they’ve got serious commitments, not just exploratory conversations.
Here’s the thing about that $100 billion OpenAI figure – that’s not just chip sales. That’s the total projected business impact, which means AMD is positioning itself as a foundational infrastructure provider, not just a component supplier. They’re thinking at the rack-scale level, which is exactly where you need to be to compete with NVIDIA’s DGX systems and similar offerings.
The Technical Roadmap Matters
The Instinct MI450 series represents AMD’s next big push to close the gap with NVIDIA. We’re talking about architectural improvements, but more importantly, power efficiency and rack-scale configurations. Power consumption is becoming a massive bottleneck in AI data centers, so any efficiency gains there could be a game-changer.
What’s interesting is the timing. The MI355 is ramping now, and the MI450 coming in H2 2026 gives them a solid product cadence. This isn’t a one-and-done approach – they’re building a sustained roadmap that enterprise customers can plan around. For companies making massive AI infrastructure investments, that predictability matters almost as much as performance.
Broader Industrial Implications
While AMD’s AI chips are targeting massive data center deployments, the underlying technology often trickles down to industrial applications. Companies that need reliable computing power for manufacturing automation, quality control, and industrial IoT watch these developments closely, since hardware deployed on factory floors has to meet both computing requirements and physical-durability demands.
The Competitive Landscape Is Shifting
NVIDIA has enjoyed what amounts to a monopoly in high-end AI training, but that’s clearly changing. With multiple “OpenAI-scale” customers in talks, AMD is demonstrating that the market wants alternatives. And let’s be honest – healthy competition drives innovation and could potentially lower costs for everyone.
The big question is whether AMD can actually deliver on both the hardware performance and the software ecosystem. NVIDIA’s CUDA platform has been their secret weapon for years. Can AMD’s ROCm platform reach parity? If they can solve the software challenge while delivering competitive hardware, we might be looking at a very different AI chip market by 2026.
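The software question is concrete: most AI code today is written against CUDA, so ROCm’s path to parity runs through its HIP layer, which deliberately mirrors CUDA’s runtime API so existing code ports with minimal changes. As a rough illustration (not AMD’s official guidance, and assuming a machine with the ROCm toolchain installed), here is what a minimal HIP kernel looks like – note how closely the calls track their CUDA counterparts:

```cpp
// Minimal HIP vector-add sketch. HIP's runtime API intentionally mirrors
// CUDA's (hipMalloc ~ cudaMalloc, hipMemcpy ~ cudaMemcpy), which is why
// AMD's hipify tools can convert many CUDA sources largely by renaming calls.
#include <hip/hip_runtime.h>
#include <vector>
#include <cstdio>

__global__ void vadd(const float* a, const float* b, float* c, int n) {
    // Same thread-indexing model as CUDA.
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n);

    float *da, *db, *dc;
    hipMalloc(&da, bytes);
    hipMalloc(&db, bytes);
    hipMalloc(&dc, bytes);
    hipMemcpy(da, a.data(), bytes, hipMemcpyHostToDevice);
    hipMemcpy(db, b.data(), bytes, hipMemcpyHostToDevice);

    // The triple-chevron launch syntax is also shared with CUDA.
    vadd<<<(n + 255) / 256, 256>>>(da, db, dc, n);

    hipMemcpy(c.data(), dc, bytes, hipMemcpyDeviceToHost);
    printf("c[0] = %f\n", c[0]);

    hipFree(da);
    hipFree(db);
    hipFree(dc);
    return 0;
}
```

The catch, and the reason parity is still an open question, is that API compatibility is the easy part: the hard part is matching the depth of CUDA’s tuned libraries, profilers, and framework integrations that developers have built on for over a decade.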
