Updated: Jan 27, 2026
Week in progress — summary will be written after the week ends.
The limiting factor on AI buildout is no longer semiconductor supply but power and datacenter capacity. Google recognized this first and is aggressively locking up 10+ GW of 2028-2029 capacity, acquiring Intersect Power for $4.75B and backstopping Neoclouds' cost overruns to build a near-monopoly on powered land. This creates a critical strategic asymmetry: Google is power-rich but chip-constrained (few long-term manufacturing agreements), while Nvidia is chip-rich but power-constrained. Global AI capex is set to exceed $900B in 2026, rising to $1.5T in 2027 and requiring 34 GW of additional power. (SemiAnalysis)
Investment implication: Nvidia's strategic response matters more than its next chip announcement. Watch for Nvidia to backstop utilities/land deals 2+ years out — this would be a bullish signal that it's playing the long game. If Nvidia doesn't act decisively, Google's integrated stack becomes increasingly dominant. Power infrastructure plays (utilities with AI datacenter exposure, datacenter REITs with power assets) may have longer runways than pure-play chip names.
TSMC CEO C.C. Wei admitted the company is "very behind" in meeting demand, with silicon, not power, as the bottleneck. Yet despite Q4 net profit up 35% and a capex increase to $52-56B (+27-37% YoY), TSMC remains conservative about building capacity for fear of "holding the bag" if an AI bubble pops. New fab capacity takes 2-3 years to come online, meaning 2026-2027 supply is largely fixed and meaningful new capacity arrives only in 2028-2029. Wei explicitly noted TSMC isn't concerned about Intel competition, which allows it to stay prudent. (Stratechery)
Investment implication: Real AI acceleration requires Intel becoming a credible foundry alternative — this is the only way to force TSMC to invest more aggressively. The risk is being borne by hyperscalers who face foregone revenue from chip shortages. Watch Intel 18A progress; if credible, TSMC may respond with higher capex in future quarters. TSMC's conservatism creates an implicit ceiling on AI infrastructure buildout regardless of capital availability or demand signals.
Three major agent launches in one week — Claude Cowork, Qwen Task Assistant (100M+ MAUs in 2 months), and MiniMax Agent 2.0 — signal that 2026 is when AI moves from "chat with experts" to "work for everyone." The critical shift: agents now operate on local files and across applications, not just within a browser sandbox. MiniMax's desktop app enables local-plus-cloud execution with "Expert Agents" that inject custom knowledge and SOPs, claiming a reliability improvement from ~70 to 95+. Notably, ~100% of MiniMax employees now use Agent as an "intern" in daily work, creating a rapid improvement feedback loop. (机器之心, Citi, Weighty Thoughts)
Investment implication: This is the "ChatGPT moment" for agents — accessibility creates adoption creates economic impact. Q3 2025 BLS data (+4.9% productivity, +5.4% output, +0.5% hours worked) may be the first statistical evidence of AI-driven gains. Watch Doubao's 300M MAU trajectory vs. legacy apps — if agents capture the traffic gateway, it's existential for incumbents' moats.
Aaron Levie argues that in a world of 100X more AI agents than people, the value of systems of record goes UP, not down. Software provides the guardrails on which agents operate, and deterministic systems (ERP, CRM, security) remain essential even as non-deterministic AI handles the creative work. Critically, the budget constraint is changing: software was limited to 3-7% of revenue (IT budget), but agents "bring the work with the software," meaning software now competes for total work spend. Example: legal software TAM was constrained by attorney headcount, but AI agents processing contracts compete for the $400B US legal services market. (Aaron Levie)
Investment implication: Don't assume AI disrupts all SaaS equally. Companies with strong data moats, complex workflows, and human-in-the-loop requirements (e.g., compliance, auditing) are better positioned. Business model evolution from per-seat to consumption-based is inevitable — watch for incumbents that successfully make this transition. The "AI eating software" narrative may be overblown for systems of record.
UBS's 30 customer/partner checks showed 4Q25 demand even stronger than 3Q25, with nearly universal bullishness. AI + data-readiness investments are pulling along cloud spend broadly. Azure continues taking share, but GCP is the standout this quarter on data/AI strength. Pre-leasing remains very strong despite rising capital intensity ($15M/MW becoming common). (UBS)
Investment implication: Hyperscaler capex guidance is likely to beat. Revisit GOOG positioning — GCP gains are often underappreciated relative to Cloud's contribution to the overall business. Growth bogeys look achievable: Azure 39%, AWS 22-23%, GCP 36%.
OpenAI launched ChatGPT ads ~1 month after internal "Code Red" warnings, despite Sam Altman previously calling ads "the last resort" (May '24) and putting "other projects on hold." This signals that monetization pressure is winning over product vision. Year-1 ad revenue estimates range from $20-40B (ARPU method) to $36-72B (Google comparison). However, the move drags OpenAI back onto a familiar battleground where Google is the undisputed master — suggesting Google may not be as worried as the market assumed. (知乎专栏)
Investment implication: Google's integrated stack (TPUs + models + ads distribution) creates durable competitive advantage. The ad pivot reduces OpenAI's differentiation and plays to Google's strengths. Google remains the only company owning the full AI stack end-to-end.
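As a sanity check on the ARPU method behind the $20-40B estimate, a back-of-envelope sketch. The ~800M monetizable user base and the $25-50 yearly ARPU band are illustrative assumptions, not figures from the source:

```python
# Back-of-envelope for the ARPU method behind the $20-40B year-1 estimate.
# User count and ARPU band are illustrative assumptions, not sourced.

def ad_revenue_bn(users_m: float, arpu_usd: float) -> float:
    """Annual ad revenue in $B given users (millions) and yearly ARPU ($)."""
    return users_m * arpu_usd / 1_000

users_m = 800  # assumed monetizable user base, millions
low = ad_revenue_bn(users_m, 25)   # conservative ARPU
high = ad_revenue_bn(users_m, 50)  # aggressive ARPU
print(f"${low:.0f}B-${high:.0f}B")  # -> $20B-$40B
```

Any ARPU-method estimate is just users times dollars per user, so the whole $20-40B spread reduces to a view on those two inputs.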
The cost efficiency frontier dropped 99.7% (GPT-4: $37.50/M tokens in Mar '23 → $0.10/M in Aug '24). OpenAI compute margins improved from 35% to 70% (Jan '24 → Oct '25), but effective margin may be ~28% after utilization adjustments. OpenAI losses could reach $143B cumulative by 2029. The key question: if only hyperscalers can achieve profitable utilization through customer aggregation, does AI infrastructure consolidate to few dominant players? Google is uniquely positioned — the only company using its own products across the entire AI stack (TPUs, models, distribution). (Les Barclays)
Investment implication: Pure-play AI companies (Cohere, Mistral, smaller startups) face the most precarious positions, with mounting pressure from VC treadmill economics. Morgan Stanley strategists note that "companies that use AI to accelerate business performance will likely see stocks hold up better than pure-play AI firms." Watch for consolidation via acquihires (e.g., Nvidia-Groq at 40x revenue).
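One way to see how a 70% headline compute margin can shrink toward the ~28% effective figure is a utilization adjustment. The sketch below assumes compute cost is fixed while revenue scales with utilization; the ~42% utilization input is chosen purely to illustrate and is not a sourced number:

```python
# If compute cost is fixed but revenue scales with utilization, effective
# margin degrades quickly. The utilization input is an illustrative assumption.

def effective_margin(headline_margin: float, utilization: float) -> float:
    """Margin when revenue scales with utilization but cost does not."""
    cost_share = 1.0 - headline_margin      # cost per unit at full utilization
    return 1.0 - cost_share / utilization   # margin at partial utilization

# At ~42% utilization, a 70% headline margin lands near the cited ~28%.
print(f"{effective_margin(0.70, 0.42):.0%}")  # -> 29%
```

This is why customer aggregation matters: whoever keeps the fleet busiest keeps the most of the headline margin, which is the consolidation argument in the paragraph above.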
UBS VC Summit showed sentiment split: improved for AI model/infra companies, deteriorated for incumbent apps. VCs praised Claude Opus 4.5 for coding — Sequoia declared "AGI for coding." Key shift from last year: consensus now sees "disruptive change coming fast for SaaS." However, VCs pushed back on OpenAI skepticism, expecting GPT-6 and more funding rounds. (UBS)
Investment implication: Incumbent app vendors face rising disruption risk, especially in coding (threat to TEAM, GTLB). The beneficiaries may be AI-native startups in specific verticals rather than horizontal platforms. Position for disruption in apps while maintaining exposure to AI infra buildout.
No articles this week
AI labs are currently CPU-constrained for training, not just GPU-constrained. The shift to reinforcement learning is driving this: RL environments are increasingly complex (code compilation, tool use, computer use, simulation environments), requiring massive CPU capacity alongside GPUs. OpenAI's Fairwater blueprint shows the ratio: a 295MW GPU building paired with a 48MW CPU/storage building (a 16.3% "CPU attach" on a MW basis). Taiwan ODM orders for general-purpose servers are being revised up materially for 2026 deployment. ~70% of the installed server base is older-generation, creating a substantial refresh opportunity. (SemiAnalysis)
Investment implication: The AI Training CPU business is growing from ~$500M (Q4'25) to ~$2.3B (Q4'26) annualized for Azure and AWS — a ~1.5% contribution to revenue growth. AWS and Azure are the key winners (most power-rich); GCP benefits less due to power constraints. Long server CPU refresh plays: INTC (Oak Stream), Aspeed (5274 TT), Lotes (3533 TT), Wiwynn (6669 TT). Low-to-mid-teens server growth expected vs. Street estimates of mid-to-high single digits.
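The two headline ratios above are easy to verify from the quoted figures (all inputs below come straight from the summary):

```python
# Verifying the Fairwater "CPU attach" ratio and the run-rate growth above.
gpu_mw, cpu_mw = 295, 48                   # GPU building vs CPU/storage building
attach = cpu_mw / gpu_mw                   # MW-basis attach rate
run_rate_q4_25, run_rate_q4_26 = 0.5, 2.3  # annualized revenue, $B
growth = run_rate_q4_26 / run_rate_q4_25
print(f"{attach:.1%} attach, {growth:.1f}x run-rate growth")
# -> 16.3% attach, 4.6x run-rate growth
```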
Apple's multi-year Gemini partnership represents a fundamental strategic decision: Apple is exiting pre-training and treating model-makers as interchangeable suppliers. The deal covers foundational models (not a replacement for the ChatGPT product integration), meaning Gemini will power Siri's core inference while Apple retains control of post-training, UX, and branding. Google is charging ~$1B/year, hoping this leads to deeper product integration (making Gemini available as a ChatGPT alternative on Apple devices). The risk: if AI disrupts the smartphone paradigm or becomes the primary UI, Apple has committed itself to third-party AI dependence. (Stratechery)
Investment implication: Apple is positioning itself as an "AI Aggregator" — leveraging platform control to commoditize model suppliers. Watch for Google Universal Commerce Protocol (UCP) as classic "tear-down-walls" strategy to make everything universally accessible, giving Gemini an advantage in an open ecosystem. Apple retains optionality to switch providers, but realistically won't invest enough to catch up to Google/Anthropic on pre-training.
TD Cowen's 2026 survey shows +9.3% software budget growth (vs. +8.7% in '25) with 90% net-positive spending intentions. Top spending intentions: SNOW (84% increase), MSFT, TEAM, SAP, CRM, DDOG. Encouragingly, respondents prefer "buy over build" for AI capabilities, suggesting incumbents are capturing AI value. MSFT leads AI budget capture (76%), followed by GCP (66%) and AWS (59%). Data/analytics budgets are expected to increase most in '26, with Databricks, CRM/MuleSoft, and SNOW as top vendors. However, Morgan Stanley's CIO survey was more muted: 2026 IT budgets up only +3.4% (down from +3.8% in the 3Q survey) — GenAI excitement is not translating into exploding budgets broadly. (TD Cowen, Morgan Stanley)
Investment implication: AI projects are graduating from pilots ($250-500K) to production ($2.5-5M), with SNOW/DBRX bills increasing rapidly but customers seeing value. CRM PT raised to $325. Signs of sanity returning — focus shifting to company-specific opportunities. GenAI monetization emerging specifically in Consolidators, Data Mgmt, and Security categories.
Morgan Stanley argues China's AI trajectory is structurally healthier than the US: less capex on raw infrastructure, more focused investment on applications. Post-DeepSeek, supply is improving (H200 imports, domestic chips maturing) while demand is inflecting. This represents the first positive enterprise spending inflection since 2H21. International expansion is becoming important — already 10%+ of revenue from overseas for major platforms. (Morgan Stanley)
Investment implication: Overweight: Tencent, BABA, PDD, TME. The application-focused approach may deliver better ROIC than the US infrastructure arms race. Watch for enterprise AI adoption metrics as the leading indicator.
Anthropic's CPO Mike Krieger called MDB "really key" for agentic coding at the Wolfe SF conference, citing "hundreds of times data storage growth" potential. MDB's stack maps 1:1 to NVIDIA's CES 2025 agentic architecture vision. Survey data is compelling: 80% of respondents expect MDB spend to increase (0% decrease vs. 9% last year), and 76% are now exploring/piloting AI workloads on MDB vs. 44% last year. (Wolfe)
Investment implication: MDB is positioned well for the AI data layer — the company has gone from "exploration" to "adoption" phase for AI workloads. The Anthropic endorsement is particularly meaningful given Claude's leading position in coding agents.
The cost frontier has dropped 99.7% for GPT-3.5-level performance: from $37.50/M tokens (GPT-4, Mar '23) to $0.10/M (GPT-4 Turbo, Aug '24). However, test-time compute (reasoning models like o1/o3) may increase cost per query even as cost per token decreases. The ~375-fold cost drop implied by those prices is for achieving GPT-3.5-level performance, not frontier-model performance — an important distinction that gets lost in headlines. (Les Barclays)
Investment implication: Don't extrapolate commodity model pricing to frontier use cases. Reasoning models require significantly more compute per query, creating strain on unit economics. Watch the distinction between "tokens generated" and "value delivered" — they may diverge as test-time compute becomes more important.
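The headline figures follow directly from the two quoted prices; nothing else is assumed:

```python
# Fold change and percentage drop implied by the quoted token prices.
p_start, p_end = 37.50, 0.10    # $/M tokens, Mar '23 vs Aug '24
fold = p_start / p_end          # how many times cheaper
drop = 1 - p_end / p_start      # fractional price decline
print(f"{fold:.0f}x cheaper, {drop:.1%} drop")  # -> 375x cheaper, 99.7% drop
```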
No articles this week
Both Goldman Sachs and Morgan Stanley published comprehensive 2026 outlooks identifying the same central tension: legacy internet platforms must simultaneously accelerate To-C AI investments while defending their core positioning against ByteDance. The competitive dynamics are intensifying — Doubao (ByteDance's AI assistant) is already at 300M MAU, creating an existential threat to established traffic moats. Post-DeepSeek, AI supply is improving (H200 imports continuing, domestic chips maturing), while demand is finally inflecting — representing the first positive enterprise spending inflection since 2H21. (Goldman Sachs, Morgan Stanley)
Investment implication: Stock picking should focus on EPS delivery, narrative changes, and shareholder returns. Morgan Stanley is Overweight on Tencent, BABA, PDD, and TME. The key variable is whether legacy platforms can successfully integrate AI before ByteDance captures the traffic gateway — watch MAU/DAU metrics for AI assistants closely.
No articles this week
No articles this week
No articles this week
No articles this week
Avenir's analysis captures a stark divergence: horizontal SaaS is down 49% while Nasdaq is up 50%, representing massive relative underperformance. The strategic choice for software vendors is binary: (1) accept maturity and financialize through buybacks, dividends, and margin optimization, OR (2) embrace AI and evolve into a "system of context" that agents rely on. The good news: early AI monetization is already visible — CRM Agentforce ~$540M ARR, NOW Assist >$500M ARR, Intercom Fin $100M+. And 63% of enterprise buyers prefer existing vendors for GenAI capabilities, suggesting incumbents have a path forward. (Avenir)
Investment implication: The "AI eating software" narrative may be overblown for companies that successfully evolve. Watch for signals of strategic choice: aggressive AI R&D spend signals the "system of context" path; aggressive buybacks signal the financialization path. Both can work, but investors need to know which game management is playing.
Jefferies recommends staying underweight software overall, citing AI monetization pushed to late '26/'27 and continued growth deceleration. But the playbook has a clear temporal structure: in 1H26, favor consumption-based infrastructure plays (MSFT, ORCL, SNOW, CRWV) that benefit from the AI workload ramp; in 2H26, rotate into selective apps ahead of the AI monetization tailwind. Within the barbell, overweight large-caps with AI positioning that have the capital, talent, and data/distribution advantages. (Jefferies)
Investment implication: Jefferies picks: Mega (MSFT, META), Large (INTU, TEAM, ORCL), Mid (PCOR, U, WIX), Small (UPWK). The key timing signal will be when AI revenue contribution starts appearing in guidance — that's the catalyst for the apps rotation.
No articles this week
No articles this week
No articles this week
No articles this week
No articles this week
No articles this month
No articles this month
No articles this month
No articles this month
No articles this month
No articles this month
No articles this month
No articles this month
No articles this month
No articles this month
No articles this month
No articles this month
No articles this month
No articles this month