The AI Boom Has a Problem List
The artificial intelligence industry has spent the last few years projecting near-limitless confidence — trillion-dollar valuations, breathless product launches, and promises of transformative change around every corner. But at this week's Milken Global Conference in Beverly Hills, five people who collectively touch every layer of the AI supply chain offered a more sobering picture.
The panel — spanning hardware, infrastructure, investment, and model development — sat down with TechCrunch to discuss what's actually going wrong as the industry scales at unprecedented speed.
Chips Are Still the Bottleneck
Despite years of warnings, the semiconductor shortage remains one of the most stubborn constraints in the AI pipeline. Demand for high-end GPUs and specialized AI accelerators continues to vastly outpace supply, forcing companies to place hardware orders months or even years in advance.
The panelists noted that the concentration of advanced chip manufacturing — primarily in Taiwan and South Korea — creates geopolitical risk that few companies have adequately planned for. Any disruption to that supply chain, whether from natural disaster, trade policy, or conflict, could stall AI development globally.
Orbital Data Centers and the Infrastructure Race
One of the more striking topics raised was the emerging concept of orbital data centers — computing infrastructure housed in low-Earth orbit satellites. While the idea sounds like science fiction, at least one panelist indicated that serious engineering and investment work is already underway.
The appeal is straightforward: space-based data centers could bypass terrestrial energy and real estate constraints, tap near-constant solar power, and potentially cut latency on long intercontinental routes, since light travels faster through a vacuum than through optical fiber. The challenges are equally formidable: launch costs, hardware that cannot easily be serviced once in orbit, and thermal management in a vacuum, where waste heat can only be radiated away.
Is the Architecture Itself Wrong?
Perhaps the most provocative thread in the conversation was the suggestion that the transformer architecture underpinning most modern large language models may have fundamental limitations that can't be engineered away with more compute or better data.
Several researchers and investors in the AI space have quietly raised concerns about whether scaling current architectures indefinitely will yield the reasoning and generalization capabilities the industry is betting on. The panelists stopped short of predicting an imminent reckoning, but acknowledged that the field may be approaching a point where the next leap requires a genuinely new approach — not just more of the same.
The Stakes Are High
The Milken Global Conference has long served as a gathering point for the financial and business elite, and the presence of AI supply chain leaders on its main stage underscores just how central the technology has become to global economic planning.
But the candor on display was notable. In an industry that often speaks in relentlessly optimistic terms, the acknowledgment of systemic risks — hardware dependencies, infrastructure gaps, and potential architectural dead ends — signals that at least some insiders are taking the warning signs seriously.
For businesses, governments, and investors who have staked significant resources on AI's continued rise, the message from Beverly Hills this week was clear: the wheels aren't off yet, but the bolts are worth checking.
Source: TechCrunch
