Wanism’s Newsletter
What happened in tech that actually mattered, and what did it mean?
The AI titan NVIDIA released its FY25 Q3 (Aug-Oct 2024) earnings after the bell on November 20th. The company beat Wall Street consensus on both revenue and earnings. Next quarter’s guidance, however, fell short of more bullish buy-side expectations, sending the stock down roughly 2.5% in after-hours trading.
Riding the explosive demand wave for Generative AI and Large Language Models (LLMs), NVIDIA’s data center division hit an all-time high, surging 112% YoY and 17% QoQ. This reflects robust demand growth across customer segments. Cloud Service Providers (CSPs), aggressively building out AI infrastructure, accounted for approximately 50% of data center revenue (up from ~45% last quarter); CSP-related revenue more than doubled YoY, with the remainder coming from enterprise and consumer internet applications.
The data center segment’s two sub-divisions maintained growth momentum. Core computing revenue reached $27.6B, up 132% YoY and 22% QoQ. NVIDIA disclosed stellar demand for the Hopper architecture, with H200 sales ramping dramatically to over $10B, the fastest product ramp in the company’s history. Furthermore, even as customers accelerate Hopper adoption, they are simultaneously preparing for the Blackwell series rollout.
In networking, revenue grew 20% YoY to $3.1B, primarily driven by Ethernet demand, including the Spectrum-X platform optimized for AI workloads. Although overall networking growth slowed YoY and revenue fell 15% QoQ, NVIDIA reported continued growth across InfiniBand, Ethernet switches, SmartNICs, and BlueField DPUs, and expects growth momentum to resume next quarter.
NVIDIA confirmed a successful design adjustment to improve Blackwell series yields, with 13,000 chip samples shipped to customers. The company reaffirmed that Blackwell production shipments begin next quarter and ramp through FY26 (calendar 2025), with next quarter’s Blackwell revenue contribution potentially exceeding the billions previously forecast, thanks to better-than-expected delivery progress.
Market reports emerged about Blackwell chips overheating when 72 of them are linked in a single server rack, prompting NVIDIA to ask suppliers to modify server cooling system designs. This raised concerns among cloud giants like Google, Microsoft, and Meta about potential impacts on new data center preparations and operations.
Supply chain sources indicate delivery schedules remain intact, with small-volume shipments expected by year-end and a volume ramp in Q1 next year. Market consensus holds that, even with slight Blackwell delays, demand is merely deferred rather than lost, given robust AI industry demand amid severe supply constraints.
Notably, NVIDIA’s previous-generation Hopper series didn’t drive widespread Direct Liquid Cooling (DLC) adoption, with air cooling remaining dominant through 2024-2025. However, since Blackwell’s announcement, customers, including major CSPs, have largely modified specifications to prepare for liquid cooling systems. The cooling market is expected to pivot toward direct liquid cooling from 2026.
The gaming segment posted 15% YoY and 14% QoQ growth, marking six consecutive quarters of YoY growth after a year-long decline. Sales of RTX 40 series graphics cards and console chips (for the Nintendo Switch) increased YoY and QoQ in Q3, and revenue is expected to maintain growth momentum as the PC replacement cycle approaches.
IDC’s latest survey shows global PC shipments at 68.8 million units in Q3 2024, down 2.4% YoY. However, the proliferation of AI PC processors is expected to drive growth in the PC market next year. According to DigiTimes, NVIDIA and MediaTek’s joint AI PC chip development is scheduled for launch in Q3 2025.
Professional Visualization grew 17% YoY and 7% QoQ, driven by strong Ada architecture workstation sales. Automotive revenue jumped 72% YoY and 30% QoQ, breaking $400M for the first time at $449M, boosted by rising demand for autonomous driving platforms and AI cockpits.
NVIDIA CEO Jensen Huang emphasized that the next-gen AI chip Blackwell has entered “full production,” with 13,000 samples shipped to customers. However, the company acknowledged supply constraints for both Blackwell and the existing Hopper series, with Blackwell demand expected to exceed supply through early 2025.
CFO Colette Kress noted that all major partners have received Blackwell chips and are proceeding with data center buildouts. “Every customer is racing to market,” she said, characterizing the demand environment. The company expects to book “billions” in Blackwell revenue next quarter, while existing H200 chip sales showed “significant growth” this quarter.
NVIDIA projects next quarter’s revenue to hit $37.5B, representing 70% YoY and 7% QoQ growth and potentially setting another record high. While this beats Wall Street’s consensus of $37.08B, the magnitude of outperformance is the smallest in two years, and the forecast notably falls short of bullish buy-side expectations of $39B to $41B. Both YoY and QoQ revenue growth rates continue to decelerate.
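As a quick sanity check on those guidance figures, here is a minimal back-of-the-envelope sketch in Python; the “implied” baselines are simply derived from the stated growth rates rather than taken from the report.

```python
# Back-of-the-envelope check on the quoted Q4 FY25 guidance.
# The "implied" baselines are derived from the stated growth rates;
# they are not reported figures.

guidance = 37.5           # projected next-quarter revenue, $B
yoy_growth = 0.70         # ~70% YoY
qoq_growth = 0.07         # ~7% QoQ
consensus = 37.08         # Wall Street consensus, $B

implied_current_quarter = guidance / (1 + qoq_growth)   # ~$35.0B
implied_year_ago_quarter = guidance / (1 + yoy_growth)  # ~$22.1B
beat_vs_consensus = (guidance / consensus - 1) * 100    # ~1.1%

print(f"Implied current-quarter base: ${implied_current_quarter:.1f}B")
print(f"Implied year-ago base:        ${implied_year_ago_quarter:.1f}B")
print(f"Guidance vs. consensus:       +{beat_vs_consensus:.1f}%")
```

That roughly 1% spread over consensus is what makes this guide the smallest beat in two years.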
Gross margin guidance points to further compression, at 73.5%. The CFO attributed the decline to the accelerated market introduction of the Blackwell platform, projecting temporary compression to 71-73%. However, margins are expected to rebound as Blackwell production scales: Blackwell’s revenue contribution is anticipated to surpass Hopper’s by mid-next year, and management expressed confidence in a recovery to approximately 75% margins by H2 FY2026.
In assessing NVIDIA’s trajectory, the crucial question centers on the growth path of its AI business. Jensen Huang provided an insightful perspective: even within the LLM space alone, each phase from pre-training to post-training to inference follows its own scaling law.
More significantly, this excludes other AI model types. While base effects will naturally moderate growth rates, maintaining 25%+ CAGR over the next decade remains a reasonable target for NVIDIA.
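To put that target in perspective, here is a short compounding sketch in Python; the 25% rate is this article’s assumption, and the starting base is a hypothetical placeholder rather than guidance.

```python
# Illustrative compounding of a 25% CAGR over ten years.
# The starting base below is a hypothetical placeholder, not a forecast.

cagr = 0.25
years = 10
multiple = (1 + cagr) ** years       # ~9.3x the starting base

base_revenue_bn = 100.0              # hypothetical starting revenue, $B
print(f"Growth multiple over {years} years: {multiple:.1f}x")
print(f"A ${base_revenue_bn:.0f}B base compounds to roughly ${base_revenue_bn * multiple:.0f}B")
```

In other words, a sustained 25% CAGR implies roughly a ninefold expansion over the decade.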
Power supply constraints have emerged as a strategic advantage. In an era where energy efficiency dominates data center operations, major cloud providers must continuously upgrade to higher performance/watt chips, creating a natural product refresh cycle.
Despite tech giants like Microsoft actively cultivating AMD as an alternative, NVIDIA’s ability to maintain 71-75% gross margins suggests no fundamental near-term competitive shift. This quasi-monopolistic market position likely persists for at least 1-2 years.
While NVIDIA’s GPUs command premium prices, their versatility creates unique value propositions. The ability to transition GPUs from training to inference enhances investment returns, attracting large data centers and internet companies.
The quarterly decline in data center networking revenue primarily reflects the transition period between late-cycle Hopper and pre-volume Blackwell. Market concerns about Blackwell thermal issues represent short-term technical adjustments rather than roadmap disruption.
The $2B software revenue milestone carries strategic significance: though modest compared with hardware, software revenue now exceeds professional visualization revenue, demonstrating progress in the company’s transformation.
Gaming and professional visualization, while showing slower growth, maintain healthy performance driven by small-scale AI computing demand, indicating successful AI tailwind capture across traditional business lines.
As global rate cycles turn dovish, NVIDIA’s automotive business shows strong recovery momentum. This timing warrants attention as it may signal emerging growth opportunities.
NVIDIA’s substantial R&D investment increase communicates its commitment to future technical advantages. This aggressive investment stance and AI computing leadership create a virtuous cycle reinforcing market position.
Based on recent CSP earnings reports, cloud infrastructure buildout shows no signs of slowing. With robust profitability and cash flow, major data center operators plan to increase capital expenditure in 2024-2025, heavily focused on AI investments.
Amazon’s capital expenditure skyrocketed to $22.6 billion in Q3 2024, a staggering 81% YoY increase. This massive investment was allocated to Amazon Web Services (AWS) data center expansion and in-house AI chip development to meet surging demand for cloud AI services. As an e-commerce behemoth, Amazon also continues to pour capital into logistics and warehouse infrastructure.
CEO Andy Jassy characterized the AI revolution as a “once-in-a-generation opportunity,” justifying the company’s aggressive investment strategy. The tech giant projects full-year 2024 capex to hit $75 billion, with further increases anticipated in 2025.
Meta continues to deploy substantial resources into generative AI and virtual reality initiatives, with Q3 capex reaching $9.2 billion, up 36% YoY. The company has revised its full-year capex guidance from $37-40 billion to $38-40 billion, with further growth projected for 2025. Meta’s Q3 revenue and earnings outperformed market expectations, with AI-driven improvements in user experience and ad targeting boosting core advertising revenue by 19% YoY. However, Reality Labs remains a drag on profitability, posting a loss of more than $4 billion in Q3 2024, underscoring ongoing challenges in mass adoption and monetization. Additionally, decelerating Daily Active People (DAP) growth across Meta’s social platforms has dampened investor sentiment.
Driven by AI product adoption, Google Cloud revenue surged 35% YoY to $11.35 billion in Q3. CEO Sundar Pichai highlighted that Google’s full-stack AI offerings have achieved scale, reaching billions of users and creating a “virtuous cycle.” AI-powered features like AI Overviews and Circle to Search are seeing widespread adoption, while AI enhancements have boosted ad targeting efficiency. Internally, Google leverages AI for code generation, with approximately 25% of new code being AI-generated, potentially reducing development costs and accelerating innovation while reinforcing Google’s market position in advertising and cloud services.
Alphabet’s Q3 capex jumped 62% YoY to $13.1 billion, with 60% allocated to servers and 40% to data centers and network infrastructure. Capex is expected to remain roughly flat QoQ next quarter, while AI infrastructure spending will continue to rise in 2025, albeit more slowly than in 2023-2024.
Microsoft’s FY25 Q1 results exceeded expectations, demonstrating robust operational efficiency. Microsoft 365 Copilot continues to gain enterprise traction, while AI services drove Azure revenue up 34% YoY. However, supply constraints are expected to slow Azure’s growth further in the upcoming quarter. Quarterly capex reached $20 billion (including finance leases), up from $19 billion in the previous quarter, with cash PP&E purchases totaling $14.9 billion. Q2 spending will increase further, primarily targeting AI and cloud infrastructure. However, gross margin contracted by 200 basis points to 69%, while operating margin declined by 100 basis points to 47%, reflecting investment-related pressure.
The company maintains that AI demand significantly outstrips available capacity, with digital transformation trends pointing to long-term growth opportunities in AI and cloud infrastructure. Goldman Sachs analysts note that despite a projected 75% increase in FY2024 capex, Microsoft’s anticipated $77 billion in free cash flow over the next twelve months should support continued AI and cloud investments while managing near-term cost pressures.
In contrast to its tech peers, Apple has adopted a more measured investment approach, focusing capex primarily on product upgrades and service expansion. Despite relatively modest spending (approximately $10 billion annually), Apple’s external partnership strategy has maintained its competitive position in the generative AI wave. CFO Luca Maestri reaffirmed the company’s “hybrid” strategy during the earnings call, indicating a flexible approach that combines proprietary data centers with third-party providers. This year, the company has focused on rolling out Apple Intelligence services, though the impact has yet to materialize in quarterly results.
From a strategic perspective, NVIDIA has constructed a self-reinforcing ecosystem. This transcends hardware performance advantages to encompass AI computing infrastructure dominance. Similar to Windows’ PC-era platform monopoly, NVIDIA has built an even more formidable moat in the AI era.
Traditional valuation metrics make NVIDIA’s stock price appear daunting. However, viewed through a platform economy framework, this seemingly steep valuation may prove justified. The key lies in understanding that NVIDIA isn’t merely a semiconductor company but an AI-era infrastructure provider. From this perspective, no clear growth ceiling is visible; the greater risk may be underestimating the true scale of the AI revolution’s computing demands.