Kevin O'Leary Says America Could Lose The AI Race To China Over One Crucial Factor — Highlights Canada's $70B AI Megaproject As World's Largest

News Summary
Investor and "Shark Tank" star Kevin O'Leary asserts that energy security, not capital, is the defining factor for the next generation of AI infrastructure. He emphasized that building an AI data center requires massive, low-cost, sustainable power, land, and a skilled workforce, conditions he deems rare and extremely difficult to secure. O'Leary believes that nations with dependable and affordable energy supplies will ultimately "win the AI race," framing the challenge as far greater than simply funding hardware or models. He highlighted his Wonder Valley AI Data Center project in Greenview City, Canada, as the world's largest AI data center project currently in the planning stage, with a planned value of $70 billion. This project's scale significantly surpasses other global megaprojects, including Group 42's $40 billion Project Stargate in the UAE and South Korea's $35 billion Jeonnam AI Data Center. O'Leary previously warned that China's massive power infrastructure, rather than its semiconductor production, represents the greatest challenge to the U.S. in the global AI race, echoing Nvidia CEO Jensen Huang's concerns about China's lower energy costs and looser regulatory restrictions.
Background
As global demand for artificial intelligence (AI) computing power continues to surge in 2025, the construction and operating costs of data centers, the core infrastructure of AI development, have come under increasing scrutiny. Energy consumption in particular has emerged as a critical bottleneck limiting both the scalability of AI systems and national competitiveness in the AI race. The current U.S. administration under President Donald J. Trump faces growing competitive pressure from China across a range of technological and strategic domains. In AI, competition between the two nations is especially intense, spanning innovation, talent pools, and infrastructure development. Executives at leading AI chip makers such as NVIDIA have repeatedly pointed to energy costs and regulatory environments as significant determinants of the pace of AI development.
In-Depth AI Insights
Why might energy security, rather than semiconductor supply, become the primary disadvantage for the U.S. in the AI race?

O'Leary's and Huang's perspectives reveal a profound shift in the AI infrastructure competition. For the past few years, semiconductors, particularly high-end AI chips, have been the focal point of U.S.-China tech rivalry. However, as global chip manufacturing capacity expands (e.g., TSMC's fabs in the U.S. and other nations' investments in semiconductor supply chains) and China advances its own mature-process chips, a simple shortage of chips may no longer be the sole or biggest constraint. Energy, specifically massive, low-cost, and sustainable power supply, is increasingly becoming the "new rare earth" of AI data center construction and operation. It shapes not only operating costs but also data center siting, construction speed, and long-term expansion capacity (the rough cost sketch at the end of this section illustrates how strongly electricity prices drive operating costs). China's investments in renewable energy (such as hydropower and solar) and nuclear power, coupled with the advantages of its centralized national grid, could give it a structural edge in power supply, translating into greater resilience and cost-effectiveness in the AI race.

What are the strategic implications of Canada's $70 billion "Wonder Valley" project for the North American and global AI landscape?

The project not only underscores North America's energy advantage but could also become a crucial node in the global AI infrastructure layout. Canada's abundant hydroelectric resources create favorable conditions for low-cost, sustainable power. With power increasingly the bottleneck for AI development, mega-scale, energy-self-sufficient data center projects like Wonder Valley can attract leading global AI companies by lowering their operating costs and improving compute efficiency. For North America, this could consolidate its lead in AI research and applications, especially as the U.S. contends with rising energy and land costs, with Canada offering a vital alternative base. Globally, it also reflects a shift in AI infrastructure investment away from traditional hubs such as Silicon Valley and established data center clusters toward regions with cheaper energy and greater geographic diversity, de-risking and optimizing the global distribution of computing power. This could prompt other nations with similar energy advantages to re-evaluate their AI development potential.

How should the Trump administration respond to China's AI competitive advantage in energy infrastructure?

Facing China's potential advantage in energy infrastructure, the Trump administration may need to adjust its AI strategy to integrate energy policy closely with technological competitiveness. Relying solely on technology export controls is unlikely to be sufficient to maintain long-term leadership. Potential responses include:
- Accelerating domestic clean energy projects: Significantly investing in wind, solar, and nuclear power to ensure abundant, low-cost, and stable electricity for AI data centers.
- Streamlining data center construction approval processes: Reducing bureaucratic hurdles related to land acquisition, environmental assessments, and similar requirements to accelerate the deployment of next-generation AI data centers.
- Encouraging deep integration between the energy and AI industries: Incentivizing energy companies to collaborate with tech giants to develop more energy-efficient and sustainable AI infrastructure solutions.
- Promoting international cooperation: Engaging in closer collaboration with allies like Canada on energy and AI infrastructure development to form a more resilient and competitive AI ecosystem, collectively addressing challenges from China.
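
To make the cost argument above concrete, here is a minimal back-of-envelope sketch in Python. The 1 GW campus size, the utilization factor, and the per-kWh rates are illustrative assumptions only; none of them are figures from the article or from any of the projects mentioned.

```python
# Back-of-envelope illustration: how the electricity price drives the annual
# operating cost of a hypothetical 1 GW AI data center campus.
# All figures below are assumptions for illustration only.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours


def annual_power_cost(capacity_gw: float, price_per_kwh: float, utilization: float = 0.9) -> float:
    """Return the annual electricity cost in dollars for a facility drawing
    `capacity_gw` gigawatts at `price_per_kwh` dollars per kWh, running at the
    given average `utilization` (0-1)."""
    kwh_per_year = capacity_gw * 1_000_000 * HOURS_PER_YEAR * utilization  # 1 GW = 1,000,000 kW
    return kwh_per_year * price_per_kwh


if __name__ == "__main__":
    for price in (0.04, 0.08, 0.12):  # hypothetical low / mid / high industrial rates, $/kWh
        cost = annual_power_cost(capacity_gw=1.0, price_per_kwh=price)
        print(f"${price:.2f}/kWh -> ~${cost / 1e9:.2f}B per year")
```

At these assumed rates, the spread between cheap and expensive power amounts to several hundred million dollars per gigawatt per year, which is the kind of structural operating-cost edge the section attributes to energy-rich jurisdictions.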