The scale of the problem is worth understanding in precise terms.
An average NFL game has approximately 150 plays. Each play can resolve into one of roughly 4-5 distinct outcome categories (run, pass, turnover, score, punt). Before any pruning based on game constraints, even the low end gives on the order of 4^150 ≈ 10^90 possible play sequences.
The number of possible ways a single NFL game could unfold is larger than the estimated number of protons in the observable universe (roughly 10^80).
Even after pruning for impossibilities (can't score after the clock expires, can't have negative points, etc.), the search space remains incomprehensibly vast.
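A quick back-of-the-envelope check of the unpruned figure (a sketch in Python; the ~150 plays and the choice of four coarse outcome categories are the rough assumptions quoted above, not exact NFL statistics):

```python
# Rough count of unconstrained play sequences in one NFL game.
plays = 150            # approximate plays per game
outcomes_per_play = 4  # coarse categories: run, pass, turnover, score

sequences = outcomes_per_play ** plays
print(f"4^150 has {len(str(sequences))} digits, i.e. roughly 10^{len(str(sequences)) - 1}")
# -> 4^150 has 91 digits, i.e. roughly 10^90
```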
The Opacity Problem
This is why traditional sportsbook SGP pricing is necessarily opaque. No centralized model can transparently compute and display all possible paths. The house uses proprietary algorithms to estimate correlations and sets prices accordingly. Bettors either trust the house's math or don't bet.
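A toy illustration of why the house can't simply multiply leg prices together (all numbers below are invented; the two legs are hypothetical examples of positively correlated SGP outcomes):

```python
# Two same-game-parlay legs that tend to hit together (made-up numbers):
# A = "QB throws for 300+ yards", B = "team scores 30+ points".
p_a, p_b = 0.30, 0.40
p_a_and_b = 0.22   # true joint probability, higher than independence implies

naive_joint = p_a * p_b   # 0.12 if the legs were independent
print(f"independent pricing: {naive_joint:.2f}, correlated joint: {p_a_and_b:.2f}")
# Paying out at the odds implied by 0.12 when the true joint is 0.22 would
# systematically overpay bettors, hence the house's opaque correlation models.
```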
But markets don't need to compute everything in advance.
The Fundamental Insight
This is the fundamental insight: conditional markets with real-time resolution handle the combinatorial explosion naturally through pruning. As each play resolves, entire branches of the possibility tree disappear. What was 10^90 possible futures becomes 10^88, then 10^86, continuously collapsing until the game ends.
The market only needs to price live possibilities, not hypothetical ones. Each layer of conditional tokens enforces this pruning automatically: you can't hold tokens for impossible game states.
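A minimal sketch of that pruning, under the same rough assumptions as above (~150 plays, four coarse outcomes per play); it only counts the still-live futures and is not a market implementation:

```python
PLAYS = 150
OUTCOMES = 4  # coarse outcome categories per play

def live_futures(plays_resolved: int) -> int:
    """Futures that remain priceable once `plays_resolved` plays have settled."""
    return OUTCOMES ** (PLAYS - plays_resolved)

for k in (0, 5, 75, 145, 150):
    magnitude = len(str(live_futures(k))) - 1
    print(f"after {k:3d} plays: ~10^{magnitude} possible futures remain")
```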
From Sports to Everything
This combinatorial impossibility isn't unique to football. It's inherent to any complex system with cascading dependencies:
Climate and Energy Markets
- "If global temperature rises by 1.5°C by 2030, what's the probability of Arctic summer ice disappearing by 2035?"
- "If natural gas prices exceed $5/MMBtu, what's the adoption rate of heat pumps in the Northeast?"
The dependency tree: emissions → temperature change → regional climate effects → adaptation technology adoption. Each node has multiple possible outcomes, creating exponential branching.
Technology Deployment
- "If GPT-5 is released in Q1 2026, what's the probability it passes the bar exam?"
- "If Tesla achieves Level 4 autonomy, what's the urban parking real estate market value in 2028?"
The tree: technology breakthrough → capability demonstration → regulatory approval → market adoption. Traditional forecasting struggles with these conditional chains.
Geopolitics
- "If a Republican wins the 2024 presidency, what's the probability of specific cabinet appointments?"
- "If Ukraine recaptures Crimea, what's the probability of Russian government change?"
The tree: election outcome → policy stance → diplomatic response → regional stability. These aren't independent events; they're causally linked.
Macroeconomics
- "If the Fed raises rates by 75bp, what's the probability of recession within 6 months?"
- "If unemployment exceeds 5%, what's the probability of rate cuts by Q3?"
The tree: monetary policy → economic indicators → policy response. Current prediction markets treat these as separate questions, missing the dependencies.
Hard Tech Commercialization
- "If Commonwealth Fusion achieves Q>1, what's the probability of commercial deployment by 2030?"
- "If Terraform Industries produces 1,000 tons of syngas, what's their production cost per ton?"
The tree: technical demonstration → pilot facility → commercial scale → unit economics. Stock markets price the company; prediction markets can price the milestones.
In each case, the answer to the first question fundamentally changes the probability distribution of the second question. You can't price them independently without losing critical information.
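A toy numerical version of that point, using the Fed example above (every probability here is invented for illustration): the standalone price of the second question is a blend of both branches of the first, so it necessarily moves as the first question resolves.

```python
# Q1 = "Fed raises rates by 75bp", Q2 = "recession within 6 months".
p_hike = 0.40
p_recession_given_hike = 0.55
p_recession_given_no_hike = 0.20

# Law of total probability: the standalone Q2 price blends both branches.
p_recession = (p_hike * p_recession_given_hike
               + (1 - p_hike) * p_recession_given_no_hike)
print(f"standalone P(recession) = {p_recession:.2f}")                  # 0.34
print(f"P(recession | hike)     = {p_recession_given_hike:.2f}")       # 0.55
print(f"P(recession | no hike)  = {p_recession_given_no_hike:.2f}")    # 0.20
# Pricing Q2 at a flat 0.34 throws away exactly the information
# (the 0.55 vs 0.20 split) that the conditional market captures.
```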
Why Traditional Markets Can't Handle This
Stock markets price companies, not conditional milestones. Equity in Commonwealth Fusion reflects their ability to raise capital, attract talent, and generate narrative momentum. It doesn't directly price "probability they achieve Q>1 by 2027."
Traditional prediction markets offer binary outcomes: "Will Commonwealth Fusion achieve Q>1 by 2027?" But they miss the follow-on questions: "If yes, how soon until commercial deployment?" "What are the unit economics?" "Which applications get built first?"
The information exists dispersed across thousands of domain experts, but there's no mechanism to aggregate it across conditional chains. Each expert knows their piece: fusion physicists understand Q>1 probability, power engineers understand grid integration timelines, economists understand energy market dynamics. Yet no single model synthesizes the full dependency tree.
This is exactly what Vitalik describes in his info finance thesis. Financial mechanisms should be tools for information elicitation, not just capital allocation. Conditional markets force decomposition of complex questions, price each dependency explicitly, and allow arbitrage across the entire structure.
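A sketch of the consistency condition that cross-structure arbitrage enforces (the prices and fee buffer are hypothetical; a real check would net out actual fees and slippage):

```python
# Chain rule: a coherent structure must satisfy P(A and B) = P(A) * P(B | A).
p_a = 0.40          # market 1: "A happens"
p_b_given_a = 0.55  # market 2: "B happens, conditional on A"
p_a_and_b = 0.26    # market 3: the joint "A and B both happen"

implied_joint = p_a * p_b_given_a   # 0.22
edge = p_a_and_b - implied_joint    # +0.04 mispricing
FEE_BUFFER = 0.02                   # assumed round-trip trading cost

if abs(edge) > FEE_BUFFER:
    side = "sell the joint, buy the legs" if edge > 0 else "buy the joint, sell the legs"
    print(f"mispricing of {edge:+.2f}: {side}")
```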
The Bootstrap Path
The infrastructure to build these markets at scale doesn't exist yet. But the bootstrap path is clear:
Live in-game conditional markets provide the revenue, the volume, and the testing ground. Sports generate billions in betting volume, have clear resolution sources, and attract sophisticated traders.
This builds the infrastructure:
- Real-time oracles
- Conditional token architecture (sketched after this list)
- Dispute resolution for dependencies
- UI/UX for complex trees
- Market maker tools for hierarchical liquidity provision
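A hypothetical sketch of what the conditional token tree could look like (every name and field is illustrative; this is not a description of any existing protocol or of the eventual on-chain design):

```python
from dataclasses import dataclass, field

@dataclass
class ConditionalMarket:
    question: str                                    # e.g. "Team A scores first?"
    outcomes: list[str]                              # e.g. ["yes", "no"]
    children: dict[str, list["ConditionalMarket"]] = field(default_factory=dict)
    resolved: str | None = None                      # set by the oracle on settlement

    def resolve(self, outcome: str) -> None:
        """Oracle callback: settle this node and prune branches that can no longer pay out."""
        assert outcome in self.outcomes
        self.resolved = outcome
        # Tokens conditioned on any other outcome can never redeem,
        # so those subtrees simply stop existing as live markets.
        self.children = {outcome: self.children.get(outcome, [])}

# Two layers of a (hypothetical) in-game tree:
root = ConditionalMarket("Team A scores first?", ["yes", "no"])
root.children["yes"] = [ConditionalMarket("Team A wins by >7?", ["yes", "no"])]
root.children["no"] = [ConditionalMarket("Team B leads at halftime?", ["yes", "no"])]

root.resolve("yes")   # the entire "no" subtree is pruned automatically
```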
Once the infrastructure is proven, expand to crypto prices, macroeconomics, and other domains with frequent resolution and natural liquidity. These markets have some conditional structure (crypto correlations, macro indicators) but simpler trees than sports.
The final form: conditional markets for technology deployment, climate outcomes, geopolitical scenarios, and scientific breakthroughs. These are lower frequency, longer duration, and more complex, but they're also where the information value is highest.
Sports isn't a distraction from "serious" prediction markets; it's the bootstrap path to them.
The Economic Alignment
This progression is both technically sound and economically inevitable.
Sports betting generates massive revenue now. A platform that launches conditional markets for in-game betting captures value immediately. That revenue funds the engineering effort to improve oracles, reduce gas costs, refine dispute resolution, and build better UI.
By the time you want to launch "If Commonwealth Fusion achieves Q>1, what's commercial deployment probability by 2030?" you've already battle-tested the infrastructure on millions of football plays, basketball possessions, and soccer matches. The conditional token architecture is proven. The oracle system handles real-time resolution. The market makers understand hierarchical liquidity provision.
And critically: you've trained a user base to think in conditional probabilities. The sports bettor who learned to price "If Team A scores first, what's the probability they win by >7?" can transfer that reasoning to "If fusion achieves Q>1, what's the probability of commercial deployment by 2030?"
The cognitive framework is the same. The dependency structure is the same. The conditional token mechanism is the same.
This is how prediction markets evolve from entertainment to infrastructure.
The Vision
The final vision (Vitalik's info finance world where financial mechanisms systematically elicit distributed knowledge about complex, conditional questions) isn't built by starting with fusion timelines and climate scenarios. It's built by starting with football games and basketball matches, generating revenue, proving the architecture, and then expanding to everything else.
We need infrastructure that can handle this scale. Live sports conditional markets are how we build it.