Theme 01: The Class Game of the AI Food Chain—Who is the Landlord, Who is the Tenant?
Core Argument
The competition in the AI industry is not about "whose model is smarter," but about who commands structural cost advantages and distribution channels. Understanding the class logic of this "food chain" is the foundation for comprehending all subsequent themes.
I. Why the Death of Sora Doesn't Mean AI is Finished
In 2025, OpenAI axed Sora, its most talked-about AI video generation product, just as its annual losses surged and a $1 billion partnership with Disney collapsed. The internet was flooded with panic: "Is AI finished?"
This is a classic error of perspective.
Sora's failure is not a failure of AI technology, but a failure of OpenAI's specific business structure. Just like a restaurant closing doesn't mean the whole catering industry is doomed—it just means that specific restaurant's rent was too high, its ingredients too expensive, and it lacked enough customers.
To understand this game of chess, you must first recognize the "class role" of each player in the food chain: where their computing power comes from, how their content is distributed, and from whose pockets the money is drawn.
II. The Class Positioning of the Five Major Players
🏠 OpenAI: The Hard-Pressed "Super Tenant"
Structural Predicament: OpenAI is the most famous AI company globally, but its position in the industry's food chain is extremely fragile—it does not own any underlying infrastructure.
- Computing Power: Has no self-developed chips; all model training and inference rely on Microsoft Azure's GPU clusters. Every time ChatGPT answers a question or Sora generates a video, it pays rent to Microsoft.
- Distribution: Has no native consumer-level platform (unlike Google's YouTube or ByteDance's TikTok). The ChatGPT website and app are its only entry points, but these entry points lack "ecosystem stickiness"—users can switch to Claude or Gemini at any time.
Latest Financial Data (as of March 2026):
| Metric | Figure | Source |
|---|---|---|
| Annual Recurring Revenue (ARR) | Over $20 Billion (End of 2025) | TradingKey, MediaPost |
| Estimated Annual Loss (2026) | ~$14 Billion | RDWorldOnline, Reddit Internal Docs |
| Paid Subscribers | ~50 Million | OpenAI Official |
| Weekly Active Users (WAU) | ~900 Million | OpenAI Official |
| Latest Valuation | ~$850 Billion (March 2026 funding) | TechFundingNews, Benzinga |
Key Contradiction: An ARR of over $20 billion sounds astonishing, but the estimated loss for 2026 is as high as $14 billion. That implies roughly $34 billion in total costs, so for every $1 earned, OpenAI spends about $1.70, with rented computing power the dominant expense. The axing of Sora is a direct result of this structural contradiction: video generation demands orders of magnitude more computing than text, and OpenAI simply cannot afford it.
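The arithmetic behind this contradiction is easy to check. A minimal sketch, using the figures from the table above and the simplifying assumption that the annual loss equals total costs minus revenue (ignoring non-operating items):

```python
# Back-of-the-envelope unit economics for OpenAI,
# based on the cited figures (illustrative, not audited numbers).
arr_billion = 20.0        # annual recurring revenue, $B (end of 2025)
est_loss_billion = 14.0   # estimated annual loss, $B (2026)

# Assumption: loss = total costs - revenue,
# so implied total costs = revenue + loss.
total_costs_billion = arr_billion + est_loss_billion

# Cost incurred per dollar of revenue.
cost_per_dollar = total_costs_billion / arr_billion

print(f"Implied total costs: ${total_costs_billion:.0f}B")   # $34B
print(f"Spent per $1 earned: ${cost_per_dollar:.2f}")        # $1.70
```

The same ratio explains why a structural "landlord" with near-cost compute can run the identical workload profitably while a "tenant" paying marked-up cloud rates cannot.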
🏰 Google: The Self-Sufficient "Computing Landlord"
Structural Advantage: Google holds the most solid position in the food chain because it possesses three key assets simultaneously:
- Self-Developed Chips (TPU v7): Google has been developing TPUs since 2015, iterating to the 7th generation by 2026. Ten years of accumulation mean Google's computing costs are far lower than those of the "tenants" who must buy GPUs from Nvidia. Google's marginal cost of training Gemini might be only a fraction of OpenAI's cost for GPT.
- Super Distribution Platform (YouTube + Search): Google owns the world's largest video platform, YouTube (over 2 billion MAU), and the dominant search engine (over 90% global market share). Any AI product integrated into these two platforms immediately reaches billions of users. Google's Veo video generation tool is being integrated into YouTube Shorts—something OpenAI's Sora could never do.
- Cross-Subsidization Capability: Search and YouTube ads generate over $300 billion in revenue annually for Google. These staggering profits can endlessly subsidize AI R&D losses. Google could afford to lose money on AI for ten years, and its search/ad profits would still hold it up.
🏗️ Amazon: The Low-Profile "Infrastructure Contractor"
Amazon's AWS is the world's largest cloud provider (~31% market share). Its role in the AI food chain is the "infrastructure contractor": it doesn't sell AI products directly to consumers, but provides the computing foundation for all AI companies.
Self-Developed Chips: Amazon's Trainium3 uses a 3nm process, optimized specifically for AI training. Even OpenAI has started using AWS Trainium to reduce its sole dependence on Azure and Nvidia.
Investment Layout: Amazon invested $50 billion into OpenAI in 2025-2026 ($35 billion of which is conditional), while also being the largest investor in Anthropic (Claude's developer, with an ~$8 billion investment). Amazon's strategy is "hedging its bets": whether OpenAI or Anthropic wins, Amazon profits as the seller of shovels.
🚗 Musk (Tesla / xAI): The "Industrial Player" on a Rugged Path
Musk's xAI (developer of Grok) took a completely different path: his AI isn't meant to be sold to enterprises or consumers directly, but to serve Tesla's Full Self-Driving (FSD) and the Optimus humanoid robot.
Chip Dilemma: Musk made a high-profile announcement of the Dojo supercomputer custom chip plan in 2021, but disbanded the team in August 2025, calling it an "evolutionary dead end." He is now pivoting to AI5/AI6 chips plus a $16.5B foundry contract with Samsung, but still relies heavily on Nvidia GPUs.
Unique Monetization Logic: If FSD and Optimus truly achieve massive commercialization, Tesla can absorb the computing cost into the price tag of every car and robot sold. This is theoretically the strongest "closed loop"—but it hinges on whether these products can scale commercially, which remains unknown.
🃏 Meta: The Calculating "Ecosystem Dealer"
Apparent Generosity: Meta spends $70 billion annually building AI infrastructure, owns over 1.5 million GPUs, and open-sources its flagship Llama series. It looks like the philanthropist of the AI world.
Actual Calculation: "Open source" is Meta's shrewdest strategic weapon. By letting global developers use Llama for free, Meta achieved three things:
- Built a developer ecosystem around Llama, making it the de facto standard for open-source AI.
- Weakened the pricing power of OpenAI and Google's closed-source models.
- Accumulated massive developer usage data and feedback.
But "forever free" is an illusion—the next-generation flagship model Avocado is rumored to switch to closed-source. Meta is also developing its MTIA custom chips. The strategy is clear: establish hegemony through open source first, then harvest at the right time.
III. The Three Keys That Determine Life and Death
Line up the five major players and a clear survival law emerges:
🔑 Key 1: Can Afford It (Custom Chips or Ultra-Low Cost Computing)
| Player | Custom Chip | Computing Cost | Verdict |
|---|---|---|---|
| Google | TPU v7 (7th Gen) | Near factory price | ✅ Can afford |
| Amazon | Trainium3 (3nm) | Near factory price | ✅ Can afford |
| Meta | MTIA (in dev) | Still relies on Nvidia, but has 1.5m+ GPUs | ⚠️ Transitioning |
| xAI/Tesla | AI5/AI6 (in dev) | Still heavily relies on Nvidia | ⚠️ Transitioning |
| OpenAI | Titan (1st Gen, unreleased) | Fully relies on Azure/Nvidia | ❌ Cannot afford |
🔑 Key 2: Can Distribute It (Owned Distribution Channels)
| Player | Distribution Channel | User Scale | Verdict |
|---|---|---|---|
| Google | YouTube + Search | Billions MAU | ✅ Can distribute |
| Meta | FB + IG + WhatsApp | Billions MAU | ✅ Can distribute |
| ByteDance | TikTok + Douyin | 1B+ MAU | ✅ Can distribute |
| xAI/Tesla | X + Tesla In-Car | Hundreds of millions | ⚠️ Has channels, but non-mainstream |
| OpenAI | ChatGPT Site/App | 900M WAU but no ecosystem stickiness | ⚠️ Has entry points, lacks moat |
🔑 Key 3: Can Monetize It (Clear Business Model)
| Player | Business Model | Sustainability | Verdict |
|---|---|---|---|
| Google | Ad Cross-Subsidy + Cloud APIs | Extremely High | ✅ |
| Amazon | AWS Cloud Services | Extremely High | ✅ |
| Meta | Ads + Developer Ecosystem | High | ✅ |
| xAI/Tesla | FSD + Optimus Hardware Sales | Unverified | ⚠️ |
| OpenAI | Subscriptions + APIs, but gross margins of only ~33% | Unsustainable | ❌ |
IV. The Overlooked "Special Class": ByteDance
ByteDance is a fascinating counter-example. Many think of it as a "landlord," but in fact, ByteDance plans to spend ~$14 billion buying Nvidia chips in 2026; its custom chip (partnered with Broadcom) is still in the development phase. Essentially, it is also a "tenant."
But ByteDance has something OpenAI will never have: TikTok / Douyin, a super distribution platform with a billion-plus users.
ByteDance's AI video tool, Seedance 2.0, has already been integrated into CapCut and is rolling out in markets like Brazil, Indonesia, Mexico, the Philippines, Thailand, and Vietnam. Kuaishou's Kling similarly followed the "AI + Short Video Platform" route; by the end of 2025, its ARR exceeded $240 million with over 60 million users.
This illustrates one thing: "Can afford it" and "Can distribute it"—as long as you hold one of these structural advantages, you have a chance to survive. If you have neither, you become Sora.
V. Conclusion: Class Determines Fate
What AI makes obsolete is not technology, but business models that fail to balance their ledgers. The exit of Sora does not mean AI is finished; it is simply a natural cleansing of the food chain—"pure technology companies" without structural advantages are destined to be knocked out first in this brutal class game.
AI is still racing forward; it's just being operated by a new batch of players with far better structural advantages.
The next question is: Why is Microsoft, the seemingly most advantaged "landlord," actually in a far more dangerous position than anyone imagines?
👉 Theme 02: The Dangerous Dance Between Microsoft and OpenAI