Consultancy Circle

Artificial Intelligence, Investing, Commerce and the Future of Work

Elon Musk Questions Stargate AI’s Funding Viability

When Elon Musk speaks, the tech world listens—and his recent skeptical take on Stargate AI’s funding trajectory has stirred up considerable discourse within the AI industry’s inner circles. In late June 2025, Musk openly questioned whether the colossal investments racing into Stargate AI, a powerful new entrant in the AGI (Artificial General Intelligence) arms race, could be financially sustainable. His concerns emerged during a lively exchange on X (formerly Twitter), where he commented, “It simply doesn’t make sense. The sheer number of NVIDIA GPUs required alone is a capital killer” (Yahoo Finance, 2025).

This statement reignited a broader conversation about how funding dynamics, hardware requirements, and long-term economic sustainability intersect in the race to build the world’s most powerful AI models. Stargate AI, developed as a collaborative effort led by OpenAI on Microsoft’s Azure infrastructure, represents a push toward artificial general intelligence. But the magnitude of compute, power, and money required is challenging conventions even within the deep-pocketed AI elite.

Massive Resource Demands: The Heart of Musk’s Concern

The root of Musk’s critique lies in the extravagant scale required to train AI models at the Stargate level. OpenAI’s Stargate project is reportedly built on Microsoft’s supercomputing infrastructure, involving more than 20,000 advanced NVIDIA H100 GPUs (NVIDIA Blog, 2025). Costs related to the GPUs alone exceed $800 million, with some estimates hovering closer to $1.2 billion when accounting for redundancy, networking, and uptime considerations. The total project expenditure, integrating custom cooling, energy provisioning, and Microsoft Azure optimizations, is projected to reach $2.5 billion in 2025, as reported by CNBC Markets (CNBC, 2025).
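As a rough sanity check on those figures, the GPU math can be sketched in a few lines. The ~$40,000 per-unit H100 price and the 1.5x overhead multiplier are assumptions chosen to reproduce the article's quoted range; actual negotiated prices and overheads vary widely:

```python
# Back-of-envelope check on the Stargate hardware figures quoted above.
# PRICE_PER_GPU_USD and OVERHEAD_MULTIPLIER are illustrative assumptions,
# not figures from the article.

GPU_COUNT = 20_000
PRICE_PER_GPU_USD = 40_000        # assumed H100 unit price
OVERHEAD_MULTIPLIER = 1.5         # assumed loading for redundancy, networking, uptime

base_gpu_cost = GPU_COUNT * PRICE_PER_GPU_USD          # $800M, the article's floor
loaded_gpu_cost = base_gpu_cost * OVERHEAD_MULTIPLIER  # $1.2B, the upper estimate

print(f"Base GPU spend:   ${base_gpu_cost / 1e9:.1f}B")
print(f"Loaded GPU spend: ${loaded_gpu_cost / 1e9:.1f}B")
```

The remaining gap up to the $2.5 billion projection would be covered by cooling, energy provisioning, and platform costs.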

While that number may shock the average investor, it’s becoming standard operating procedure among leading labs pushing the AGI envelope. However, as Musk pointed out, this scale may not be viable for most players, nor might it be rational in the absence of proven, monetizable results. Tesla’s AI lead, Ganesh Venkataramanan, backed Musk’s perspective during a Tesla shareholder Q&A in June 2025, noting that “compute resource costs are inflating faster than yield returns.”

Comparative Resource Requirements (2025)

AI Model                        | Estimated GPU Count | Total Cost (USD)
Stargate (OpenAI/Microsoft)     | >20,000 NVIDIA H100 | ~$2.5 billion
Gemini 2 Pro+ (Google DeepMind) | 18,000 NVIDIA H100  | ~$2.2 billion
Grok-2 (xAI)                    | 10,000 NVIDIA A800  | ~$900 million

Source: OpenAI Blog, DeepMind Blog, NVIDIA Blog, CNBC Markets (All 2025)

Economic Sustainability and ROI: The $64 Billion Question

Beyond sheer hardware costs, the broader question Musk raises concerns ROI—return on investment. According to McKinsey Global Institute (2025), large-scale LLMs must generate at least a 40% year-over-year commercial yield over five years to justify current capital burns. While cloud infrastructure providers like AWS, Microsoft Azure, and Google Cloud offer co-financing benefits through usage credits, these subsidies defer cost realization rather than eliminate it.

Deloitte’s 2025 AI-Growth Index notes that for every $10 billion poured into frontier AI model training, only about $1.6 billion has materialized in monetizable downstream applications during the first two years—a yield rate Musk might call a “non-starter” for sustainable economic modeling (Deloitte, 2025).
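The Deloitte figure implies a simple arithmetic comparison against McKinsey's cited benchmark. A minimal sketch using only the numbers quoted in this section (illustrative, not Deloitte's or McKinsey's actual model):

```python
# Yield arithmetic from the figures quoted above; an illustrative
# sketch, not the cited firms' methodology.

training_spend = 10e9   # $10B poured into frontier model training
monetized = 1.6e9       # $1.6B realized downstream in the first two years

two_year_yield = monetized / training_spend   # 0.16, i.e. a 16% realized yield

# McKinsey's cited sustainability bar: 40% year-over-year commercial yield.
required_annual_yield = 0.40

print(f"Realized two-year yield: {two_year_yield:.0%}")
print(f"Required annual yield:   {required_annual_yield:.0%}")
```

A 16% realized yield over two years against a 40% annual benchmark illustrates the gap Musk is pointing at.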

At present, monetization largely arises from licensing foundation models for enterprise use, AI copilots for productivity tools (like Microsoft 365 Copilot), and fine-tuning large models for specific industries (such as MedPaLM-3 in healthcare or legal GPTs). The World Economic Forum emphasizes that such “derivative AI ecosystems” will be key to unlocking sustainable value from flagship architectures like Stargate (WEF, 2025).

AI Hardware Wars and the GPU Bottleneck

Musk’s criticism becomes even more pointed considering the global GPU shortage. As of mid-2025, NVIDIA remains the dominant player in high-performance AI chipsets, with the H100 family commanding over 83% of market share for transformer-based model training workloads (NVIDIA Blog, 2025). This creates a scarcity bottleneck, driving up GPU rental prices globally. According to VentureBeat AI, hourly H100 instances on leading cloud providers have surged to $17.50/hour, up 45% from 2024.
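Those rental figures allow a quick back-computation of the implied 2024 rate, and a simple rent-versus-buy break-even. The $40,000 purchase price below is an assumed illustrative figure, not from the article, and the break-even ignores power, cooling, and depreciation:

```python
# Back-computing the implied 2024 H100 rental rate from the figures above,
# plus a naive rent-vs-buy break-even (purchase price is an assumption).

hourly_2025 = 17.50
increase_vs_2024 = 0.45
hourly_2024 = hourly_2025 / (1 + increase_vs_2024)   # implied 2024 rate

assumed_purchase_price = 40_000                       # illustrative H100 unit cost
breakeven_hours = assumed_purchase_price / hourly_2025

print(f"Implied 2024 rate: ${hourly_2024:.2f}/hour")
print(f"Break-even after:  {breakeven_hours:,.0f} rental hours")
```

At roughly 2,300 hours (a few months of continuous use), buying beats renting under these assumptions, which helps explain the scramble for owned capacity.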

To mitigate this, companies like Anthropic (Claude) and Mistral are investing in custom silicon and relying on open-weight models that require less compute. Meta has doubled down on its internally optimized Llama series to avoid dependence on external suppliers, leaning into smaller, “good enough” models for edge devices and consumer interaction (MIT Technology Review, 2025).

Key Players in Custom AI Hardware Development

Company   | Custom AI Hardware Program | Purpose
Anthropic | Claude Silicon Project     | Train energy-efficient AI models
Google    | TPU v6                     | Train Gemini 3 models at scale
Apple     | Apple Neural Engine        | Early-stage development for edge AI

Source: AI Trends, The Gradient, DeepMind Blog (2025)

Antitrust Winds and Ethical Implications

Regulatory scrutiny is intensifying: in January 2025, the Federal Trade Commission (FTC) launched a formal inquiry into potential anticompetitive behavior involving Microsoft’s exclusive relationship with OpenAI for Stargate development. The probe centers on procurement priority given to Microsoft for NVIDIA HGX boards and Microsoft’s strategic lock-in of AI modeling via Azure credits (FTC News, 2025).

Additionally, there is growing debate over the environmental footprint and labor practices associated with massive model development. Reports from the World Resources Institute indicate that Stargate-level model training consumes more power annually than 90,000 average U.S. households. Musk also flagged this in his commentary, suggesting that long-term feasibility concerns cross ethical lines once compute and power needs outstrip national grids in developing regions (Pew Research Center, 2025).
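For scale, the household comparison can be translated into continuous power draw. The per-household consumption figure below is an assumption based on typical published U.S. annual averages, not a number from the article:

```python
# Rough scale check on the power comparison above. The per-household
# annual consumption (~10,500 kWh) is an assumed typical U.S. average.

households = 90_000
kwh_per_household_per_year = 10_500   # assumed average annual consumption

total_kwh_per_year = households * kwh_per_household_per_year  # 945,000,000 kWh
avg_megawatts = total_kwh_per_year / (365 * 24) / 1_000       # continuous draw in MW

print(f"Annual energy:   {total_kwh_per_year / 1e6:,.0f} GWh")
print(f"Continuous draw: {avg_megawatts:.0f} MW")
```

A continuous draw on the order of 100 MW is comparable to a mid-size power plant's output, which is why grid capacity enters the feasibility debate.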

What the Future May Hold

The key takeaway from Musk’s critique is not pure skepticism but a demand for systemic rationalization. As xAI prepares for Grok-3 using in-house compute clusters on Tesla Dojo chips, the push toward scalability with sustainability grows louder. Balancing AGI ambitions with resource pragmatism will likely define the leaders of the second half of the decade.

At the same time, “smarter, not larger” approaches, such as open-weight models from Mistral or Tesla’s fine-tuned, domain-specific models for automakers and logistics, may carve out a “middle lane.” These lower-cost, high-efficiency LLMs could outcompete mega-models in every domain except general cognition and hyper-reasoning.

Whether or not Musk’s predictions hold true, the industry is undoubtedly entering a phase of valuation realism. Investors, consumers, and developers may soon pivot to prioritize lean development methodologies and open inference flexibility—factors that could overshadow any single model’s theoretical potential.

References (APA Style)

  • CNBC. (2025). AI infrastructure costs rise amid GPU scarcity. Retrieved from https://www.cnbc.com/markets/
  • Deloitte. (2025). AI-Growth Index. Retrieved from https://www2.deloitte.com/global/en/insights/topics/future-of-work.html
  • FTC. (2025). FTC investigates AI supply chain monopolization. Retrieved from https://www.ftc.gov/news-events/news/press-releases
  • McKinsey Global Institute. (2025). Scaling AI: Key financial frameworks. Retrieved from https://www.mckinsey.com/mgi
  • MIT Technology Review. (2025). Is AGI worth the cost? Retrieved from https://www.technologyreview.com/topic/artificial-intelligence/
  • NVIDIA. (2025). Powering the AI revolution. Retrieved from https://blogs.nvidia.com/
  • OpenAI. (2025). Stargate: Vision for AGI. Retrieved from https://openai.com/blog/
  • VentureBeat. (2025). AI compute pricing trends. Retrieved from https://venturebeat.com/category/ai/
  • World Economic Forum. (2025). AI and the future of digital capital. Retrieved from https://www.weforum.org/focus/future-of-work
  • Pew Research Center. (2025). Environmental impacts of AI. Retrieved from https://www.pewresearch.org/topic/science/science-issues/future-of-work/

Note that some references may no longer be available at the time of reading due to page moves or the expiration of source articles.