According to CRN, AMD has acquired MK1, an AI software startup founded by Neuralink veterans Paul Merolla and Thong Wei Koh, as part of the chipmaker’s ongoing acquisition spree to compete with Nvidia. The deal follows AMD’s disclosure last week that it spent $36 million on acquisitions beyond its massive $4.9 billion purchase of ZT Systems earlier this year. AMD reported record third-quarter revenue of $9.2 billion, driven by what the company called a “sharp” jump in sales of its CPUs and Instinct data center GPUs. MK1’s team will join AMD’s Artificial Intelligence Group, where its technology already processes more than 1 trillion tokens daily on AMD hardware. The acquisition comes shortly after AMD secured a strategic partnership with OpenAI to deploy 6 gigawatts of Instinct-based infrastructure.
AMD’s AI Shopping Spree Intensifies
AMD isn’t just dipping its toes in the AI waters—it’s diving in headfirst with its checkbook wide open. This MK1 acquisition marks at least the fourth AI-focused purchase this year, following Enosemi, Brium, and the technical team from Untether AI. And here’s the thing: when you’re competing against Nvidia, which basically owns the AI infrastructure market, you can’t just build everything from scratch. You have to buy your way to relevance, and that’s exactly what AMD is doing.
What’s really interesting about MK1 is the Neuralink connection. These aren’t just random AI engineers—they’re people who worked on decoding brain signals and neural processing. That kind of background gives them a unique perspective on AI reasoning and inference, which is exactly what AMD needs to differentiate itself. When you’re up against a giant like Nvidia, you can’t just do the same thing slightly cheaper—you need a different approach.
The Reasoning AI Angle
AMD’s blog post about the acquisition specifically mentions “reasoning at scale” as the key value proposition. That’s not just a marketing buzzword: reasoning is what separates basic AI from the kind of agentic AI that can actually automate complex business processes. Most AI in production today is single-shot pattern recognition; reasoning means multi-step chains in which a model plans, calls tools, checks intermediate results, and decides what to do next. A rough sketch of what that loop looks like follows below.
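To make the distinction concrete, here is a minimal, purely illustrative agent loop in Python. It is a sketch of the general pattern, not MK1’s or AMD’s software: call_model() and lookup_inventory() are hypothetical stand-ins for an inference endpoint and a business-system tool.

```python
# Minimal, illustrative agent loop. call_model() and lookup_inventory() are
# hypothetical stand-ins, not MK1 or AMD APIs: in a real system call_model()
# would hit an inference server and the tool would query a business system.

def call_model(prompt: str) -> dict:
    """Stand-in for an LLM call that either requests a tool or gives an answer."""
    if "observation" not in prompt:
        # First pass: the "model" decides it needs data before it can answer.
        return {"action": "lookup_inventory", "arg": "SKU-1234"}
    # Second pass: with the observation in context, it produces a final answer.
    return {"action": "answer",
            "arg": "Stock covers about one day of demand; reorder 500 units."}

def lookup_inventory(sku: str) -> str:
    """Stand-in tool; in practice this might query an ERP or inventory database."""
    return f"{sku}: 12 units in stock, 30-day demand forecast of 480 units"

def agent(task: str, max_steps: int = 5) -> str:
    """The reasoning chain: plan, act, observe, repeat until the model answers."""
    context = task
    for _ in range(max_steps):
        step = call_model(context)
        if step["action"] == "answer":
            return step["arg"]
        context += "\nobservation: " + lookup_inventory(step["arg"])
    return "No answer within the step budget."

print(agent("Should we reorder SKU-1234?"))
```

A single call_model() invocation is the pattern-recognition case; the loop around it, with tool calls and intermediate observations, is the part that makes an agent useful for actual business workflows.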
MK1’s Flywheel and comprehension engines are apparently designed to take advantage of AMD’s Instinct GPU memory architecture. That matters because memory bandwidth, not raw compute power, is increasingly the real bottleneck in AI inference: generating each token means streaming the model’s weights through memory, so the memory system sets the ceiling on throughput (the back-of-envelope numbers below make the point). If AMD can deliver more efficient reasoning AI through this combination of hardware and software, it might actually have a shot at carving out a meaningful piece of the enterprise AI market. And as manufacturers and industrial operators adopt specialized computing hardware such as industrial panel PCs, efficient AI inference could become a real competitive advantage in those settings too.
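Here is a rough, assumption-laden roofline estimate of why bandwidth dominates. The 70B-parameter model and the MI300X-class bandwidth and FLOPs figures are approximate public numbers assumed for illustration, not figures from the article or from MK1.

```python
# Back-of-envelope roofline for single-stream LLM decoding, showing why memory
# bandwidth, not peak FLOPs, usually caps tokens/sec. Hardware numbers are
# rough, assumed figures for an Instinct MI300X-class GPU.

PARAMS = 70e9          # 70B-parameter model (assumed)
BYTES_PER_PARAM = 2    # FP16/BF16 weights
HBM_BW = 5.3e12        # ~5.3 TB/s HBM bandwidth (assumed)
PEAK_FLOPS = 1.3e15    # ~1.3 PFLOPS peak FP16 (assumed)

# Each decoded token (batch size 1) streams essentially all weights once and
# does roughly 2 FLOPs per parameter (one multiply, one add).
bytes_per_token = PARAMS * BYTES_PER_PARAM
flops_per_token = 2 * PARAMS

mem_limit = HBM_BW / bytes_per_token          # tokens/sec if bandwidth-bound
compute_limit = PEAK_FLOPS / flops_per_token  # tokens/sec if compute-bound

print(f"memory-bound ceiling:  {mem_limit:,.0f} tokens/s")
print(f"compute-bound ceiling: {compute_limit:,.0f} tokens/s")
```

Under these assumptions the bandwidth ceiling works out to a few dozen tokens per second while the compute ceiling is in the thousands, which is exactly why inference software that squeezes more out of the memory hierarchy is worth buying.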
The Nvidia Problem
Let’s be real: everyone in AI is playing catch-up to Nvidia. Their CUDA ecosystem is so entrenched that it’s basically the Windows of AI computing. But AMD’s strategy seems to be: if you can’t beat their software ecosystem, buy the pieces to assemble your own. The ZT Systems acquisition gave them rack-scale solution capabilities, and now MK1 gives them specialized inference technology.
The OpenAI partnership is huge too—6 gigawatts of Instinct infrastructure isn’t pocket change. But can AMD actually convert these partnerships and acquisitions into sustainable market share? That’s the billion-dollar question. Nvidia isn’t standing still, and they’ve got the resources to out-innovate and out-market pretty much anyone. Still, AMD’s aggressive moves show they’re not content to just be an also-ran in the AI revolution. They’re putting their money where their mouth is, and in this capital-intensive industry, that’s what it takes to even have a chance.
