Elon Musk Testifies xAI Used OpenAI Models to Train Grok
In a striking legal development, Elon Musk has testified that his artificial intelligence company xAI trained its flagship chatbot, Grok, using models developed by OpenAI — the very company he co-founded and later departed on acrimonious terms.
The testimony thrusts the practice known as model distillation into the global spotlight, reigniting a fierce debate about intellectual property, fair competition, and the future of AI development.
What Is Model Distillation?
Distillation, in AI terms, is a technique where a smaller or newer model is trained to mimic the outputs of a larger, more powerful one. Instead of learning from raw data alone, the student model learns from the responses generated by the teacher model — effectively absorbing its capabilities at a fraction of the cost.
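The teacher-student idea can be sketched with a toy example. This is a minimal illustration, not how frontier labs implement distillation: the "teacher" here is a hypothetical stand-in for a proprietary model that the student can only query for outputs, and the student fits its own parameters to mimic those responses.

```python
import random

def teacher(x):
    # Stand-in for an expensive proprietary model: the student never sees
    # its internals or training data, only its outputs.
    return 3.0 * x + 1.0

def train_student(queries, lr=0.01, epochs=500):
    """Fit a one-parameter-per-weight 'student' (y = w*x + b) to the
    teacher's responses via stochastic gradient descent on squared error."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x in queries:
            y_teacher = teacher(x)       # query the teacher model
            y_student = w * x + b
            err = y_student - y_teacher  # learn from outputs, not raw labels
            w -= lr * err * x            # gradient step for the weight
            b -= lr * err                # gradient step for the bias
    return w, b

random.seed(0)
queries = [random.uniform(-2.0, 2.0) for _ in range(50)]
w, b = train_student(queries)
print(w, b)  # the student converges toward the teacher's behavior (w≈3, b≈1)
```

The key point the sketch captures is that the student never touches the teacher's weights or data; repeated queries alone are enough to transfer the behavior, which is exactly why API access to a powerful model is commercially sensitive.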
It's a legitimate and widely used technique in machine learning research. But when the "teacher" model is a proprietary, commercially licensed product built by a competitor, the legal and ethical questions get complicated fast.
Frontier AI companies — those building the world's most advanced models — have grown increasingly alarmed by distillation as a competitive threat. Training a cutting-edge model from scratch requires hundreds of millions of dollars and years of research. Distillation, critics argue, lets competitors shortcut that investment by essentially piggybacking on another company's work.
A Legal Battleground Takes Shape
Musk's testimony that xAI used OpenAI models to train Grok adds a dramatic new chapter to the long-running feud between Musk and OpenAI. Musk was a co-founder of OpenAI but left its board in 2018; he has since become one of its most vocal critics, launching xAI and Grok as a direct competitor.
The revelation that Grok's training pipeline included OpenAI model outputs raises immediate questions: Was the use licensed? Did it violate OpenAI's terms of service? And does it constitute the kind of unauthorized copying that OpenAI has been pushing to prevent across the industry?
OpenAI has itself taken action against distillation in recent months, updating its usage policies and taking steps to detect when its models are being used as training data for third-party systems without authorization.
The Bigger Picture for AI
The Musk-xAI-OpenAI saga is emblematic of a broader reckoning happening across the AI industry. As the gap between frontier models and open-source alternatives narrows, the companies that spent the most to build those frontier systems are scrambling to defend their competitive moats.
Distillation is now at the center of that fight. If the courts ultimately rule that training on a competitor's outputs is infringement, it could reshape how AI companies build and license their technology — and potentially slow the rapid democratization of AI that has characterized the past few years.
For now, Musk's testimony has handed the world a rare, candid window into how even the biggest players in AI actually build their products — sometimes by standing on the shoulders of rivals.
Source: TechCrunch
