People often envision tech bubbles as catastrophic events, but they don’t have to be that dire. In economic terms, a bubble is essentially a risky bet that has gone too far, producing supply that far outstrips demand.
The key takeaway: it’s not black and white, and even sound bets can falter if the approach is careless.
The complexity of the AI bubble arises from the mismatched timelines between the rapid development of AI software and the slow process of building and powering data centers.
These data centers take years to construct, meaning a lot can change by the time they start operating. The supply chain that supports AI services is incredibly intricate and constantly evolving, making it difficult to predict how much demand we’ll face a few years down the line. It’s not just about how much people will use AI in 2028; it’s also about how they’ll use it and whether we’ll see any advancements in energy, semiconductor technology, or power transmission in the meantime.
When investments of this magnitude are involved, numerous factors can lead to failure—and AI investments are certainly escalating.
Recently, Reuters reported that a data center campus linked to Oracle in New Mexico has secured up to $18 billion in credit from a group of 20 banks. Oracle has already arranged to supply $300 billion in cloud services to OpenAI, and together with SoftBank it aims to build $500 billion worth of AI infrastructure under the “Stargate” project. Meanwhile, Meta has committed to spending $600 billion on infrastructure over the next three years. We’ve been tracking all the major developments, and the sheer volume is overwhelming.
However, there remains genuine uncertainty regarding the pace at which demand for AI services will increase.
A recent McKinsey survey examined how leading firms are using AI tools, and the findings were mixed. Almost all of the organizations surveyed are incorporating AI in some form, but few are deploying it at scale. While AI has let some companies cut costs in specific areas, it hasn’t meaningfully changed their overall operations. In essence, most businesses are still taking a “wait and see” approach. If you’re hoping those companies will fill space in your data center, you might be in for a long wait.
Even if AI demand continues to grow, these projects could face straightforward infrastructure challenges. Satya Nadella surprised listeners on a recent podcast by expressing more concern about a shortage of data center space than of chips. As he put it, “It’s not a supply issue of chips; it’s the fact that I don’t have warm shells to plug into.” At the same time, some entire data centers are sitting idle because they can’t supply the power the latest generation of chips demands.
While Nvidia and OpenAI are pushing forward as quickly as possible, the electrical grid and the surrounding infrastructure are still progressing at their usual pace. This creates opportunities for costly bottlenecks, even if everything else aligns.
We dive deeper into this topic in this week’s Equity podcast, which you can listen to below.