Google Earnings, Microsoft Earnings, AI Leverage

Good morning,

The NBA season tipped off last night, which is a great opportunity to remind you that Stratechery Plus subscribers have access to what I think is the best NBA podcast: Greatest of All Talk. The most recent episode, which is nominally a season preview, provides an excellent example right up top of why this is an NBA podcast like no other — and why the slogan on the website is “Basketball, Life, and National Parks.”

On to the Update:

Google Earnings

From the Wall Street Journal:

Google reported its strongest business growth in more than a year but disappointed investors with relatively weak cloud-computing sales, delivering a mixed picture as it continues to wrestle with competitors developing artificial-intelligence tools. Google’s parent company Alphabet reported third-quarter revenue of $77 billion Tuesday, up 11% from the same period last year. The results marked the third consecutive quarter of accelerating growth for the search giant following an economic slowdown that briefly caused a rare drop in the company’s advertising sales.

Sales growth in Google’s cloud division, which oversees the servers powering the company’s AI programs, slowed to 22% from the third quarter last year, coming in below Wall Street’s expectations. The business brought in $8.4 billion of revenue and reported its third straight quarter of operating profit, making $266 million by that metric.

Google’s advertising results were indeed very good; notably, YouTube advertising revenue actually grew faster than Search revenue, confirming that YouTube has, like Meta, fully pulled out of its post-ATT slump.

The cloud results, meanwhile, were indeed discouraging. Here is an up-to-date version of Google Cloud’s financials that I first posted a year ago:

Google Cloud    Revenue ($M)    YoY Growth    Op. Income ($M)    Margin
Q4 2019         $2,614                        $(1,194)           -46%
Q1 2020         $2,777                        $(1,730)           -62%
Q2 2020         $3,007                        $(1,426)           -47%
Q3 2020         $3,444                        $(1,208)           -35%
Q4 2020         $3,831          47%           $(1,243)           -32%
Q1 2021         $4,047          46%           $(974)             -24%
Q2 2021         $4,628          54%           $(591)             -13%
Q3 2021         $4,990          45%           $(644)             -13%
Q4 2021         $5,541          45%           $(890)             -16%
Q1 2022         $5,821          44%           $(931)             -16%
Q2 2022         $6,276          36%           $(858)             -14%
Q3 2022         $6,868          38%           $(699)             -10%
Q4 2022         $7,315          32%           $(186)             -3%
Q1 2023         $7,454          28%           $191               3%
Q2 2023         $8,031          28%           $395               5%
Q3 2023         $8,411          22%           $266               3%
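
The margin column is simply operating income divided by revenue; here is a minimal sketch of that arithmetic for the most recent quarters, with figures in millions taken from the table above:

```python
# Operating margin = operating income / revenue, using the Google Cloud
# figures from the table above (all dollar amounts in millions).
quarters = {
    "Q4 2022": (7_315, -186),
    "Q1 2023": (7_454, 191),
    "Q2 2023": (8_031, 395),
    "Q3 2023": (8_411, 266),
}

for quarter, (revenue, operating_income) in quarters.items():
    print(f"{quarter}: {operating_income / revenue:+.1%}")

# Q3 2023 works out to roughly +3.2%, down from roughly +4.9% in Q2 2023.
```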

The number that jumps out is margin: yes, Google Cloud continues to (finally) be profitable, but this is the first time margin has decreased in a while. Moreover, the previous decline in margin, in Q4 2021, was caused by a step up in capital expenditures; this slip was caused by disappointing revenue growth. Google CEO Sundar Pichai, in a remarkably substance-free earnings call, had the same explanation and reason for optimism that AWS and Azure have had for slowing revenue growth over the last year:

On Cloud, maybe what I would say is, overall, we had definitely started seeing customers looking to optimize spend. We leaned into it to help customers given some of the challenges they were facing. And so that was a factor. But we are definitely seeing a lot of interest in AI. There are many, many projects underway now, just on Vertex alone, the number of projects grew over 7x. And so we see signs of stabilization, and I’m optimistic about what’s ahead.

There are two problems with this response: first, GCP’s margin profile only makes sense if it is still catching up to AWS and Azure; dealing with customer optimizations is a very different matter when your cloud business is already large and, most importantly, profitable. GCP isn’t nearly as big or as profitable, so it’s not a great sign that it is citing the same problems as the big clouds.

The other problem is AI.

Microsoft Earnings

Again from the Wall Street Journal:

Microsoft’s sales growth accelerated last quarter as demand for its cloud computing services heated up amid growing enthusiasm about artificial intelligence. The company reported Tuesday that its revenue grew 13% to $56.5 billion for the quarter through September. That was above analysts’ expectations and a step up from growth of 11% during the year-earlier period. The growth rate in Microsoft’s Azure cloud business was 29%. While that was below the pace that Microsoft posted in the same quarter last year, it was above the preceding quarter and analyst expectations. It gained around 3 percentage points from demand for AI services.

CEO Satya Nadella has, over the last year, been providing comments very similar to Pichai’s about slowing growth on Azure: we’re optimizing for customers, and AI is coming. The difference is that for Microsoft, AI is here, and it’s having a meaningful impact on results. From CFO Amy Hood’s prepared remarks:

Next, the Intelligent Cloud segment. Revenue was $24.3 billion, increasing 19% and ahead of expectations, with better-than-expected results across all businesses. Overall, server products and cloud services revenue grew 21%. Azure and other cloud services revenue grew 29% and 28% in constant currency, including roughly 3 points from AI Services. While the trends from prior quarter continued, growth was ahead of expectations, primarily driven by increased GPU capacity and better-than-expected GPU utilization of our AI services, as well as slightly higher-than-expected growth in our per-user business.
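
As a rough gauge of what “roughly 3 points from AI Services” means in revenue terms, here is a back-of-the-envelope sketch; Microsoft does not disclose Azure revenue in dollars, so everything is expressed relative to the undisclosed year-ago base, and it assumes AI services revenue was negligible in the year-ago quarter:

```python
# Back-of-the-envelope sketch of the AI contribution Hood cites, expressed
# relative to the undisclosed year-ago Azure revenue base (normalized to 1.0).
# Assumes AI services revenue was negligible in the year-ago quarter.
year_ago_base = 1.0
total_growth = 0.29        # Azure grew 29% year-over-year
ai_growth_points = 0.03    # roughly 3 of those points came from AI services

this_quarter_revenue = year_ago_base * (1 + total_growth)
ai_services_revenue = year_ago_base * ai_growth_points

share = ai_services_revenue / this_quarter_revenue
print(f"AI services as a share of this quarter's Azure revenue: ~{share:.1%}")
# Roughly 2.3% of this quarter's Azure revenue, under these assumptions.
```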

There are three interesting points on this growth — and not all of them are about AI. First, in the decidedly non-AI category, Nadella was clearly excited about Microsoft’s new partnership with Oracle to host Oracle databases; he said near the beginning of his prepared remarks:

We are the only other cloud provider to run Oracle’s database services, making it simpler for customers to migrate their on-prem Oracle databases to our cloud. Customers like PepsiCo and Vodafone will have access to a seamless fully integrated experience for deploying, managing, and using Oracle database instances on Azure. And we are the cloud of choice for customers’ SAP workloads too. Companies like Brother Industries, Hanes, ZEISS, and ZF Group all run SAP on Azure.

He further called out the partnership while explaining Azure’s growth opportunities:

If you just take Azure and try to characterize, where’s the growth for Azure coming from or what sort of drivers for Azure numbers, there are three things all happening in parallel. Like, for example, take cloud migrations. A good reminder of where we are and even the core cloud migration story is the new Oracle announcement. Once we announced that the Oracle databases are going to be available on Azure, we saw a bunch of unlock from new customers who have significant Oracle estates that have not yet moved to the cloud because they needed to rendezvous with the rest of the app estate in one single cloud. And so we’re excited about that. So in some sense, even the financial services sector, for example, is a good place where there’s a lot of Oracle that still needs to move to the cloud.

Imagine telling the Ben Thompson of 2003 that Microsoft would be trumpeting the growth opportunity entailed in hosting Oracle databases! That speaks, though, to Microsoft’s shift away from an integrated Windows-centric stack — not just on PCs but also in the datacenter — to being a pure infrastructure provider, differentiated by its willingness and unique ability to meet its corporate customers wherever they might be in their tech modernization journey. This is the exact approach that undergirded Azure’s growth in the first place.

Nadella’s second growth point was about lapping the optimizations that accelerated a year ago; the third was AI. Part of this is certainly serving those same corporate customers that are Microsoft’s bread-and-butter. It’s not just that, though; again from Nadella’s opening remarks:

Azure AI provides access to best-in-class frontier models from OpenAI and open-source models, including our own, as well as from Meta and Hugging Face, which customers can use to build their own AI apps while meeting specific cost, latency, and performance needs. Because of our overall differentiation, more than 18,000 organizations now use Azure OpenAI services, including new to Azure customers. And we are expanding our reach with digital-first companies with OpenAI APIs as leading AI start-ups use OpenAI to power their AI solutions, therefore, making them Azure customers as well.

There you go: Azure is finally breaking through with startups. Of course that breakthrough is intermediated by OpenAI, which means it’s lower margin and not necessarily as additive to Azure as a whole (the data is probably less likely to be stored on Azure, for example), but it’s something!

AI Leverage

Speaking of OpenAI, the most interesting part of the call was this answer from Nadella about the better-than-expected GPU utilization mentioned by Hood above:

It is true that the approach we have taken is a full stack approach all the way from whether it’s ChatGPT or Bing Chat or all our Copilots, all share the same model. So in some sense, one of the things that we do have is very, very high leverage of the one model that we used, which we trained, and then the one model that we are doing inferencing at scale. And that advantage sort of trickles down all the way to both utilization internally, utilization of third parties, and also over time, you can see the sort of stack optimization all the way to the silicon, because the abstraction layer to which the developers are riding is much higher up than low-level kernels, if you will.

So, therefore, I think there is a fundamental approach we took, which was a technical approach of saying we’ll have Copilots and Copilot stack all available. That doesn’t mean we don’t have people doing training for open source models or proprietary models. We also have a bunch of open source models. We have a bunch of fine-tuning happening, a bunch of RLHF happening. So there’s all kinds of ways people use it. But the thing is, we have scale leverage of one large model that was trained and one large model that’s being used for inference across all our first-party SaaS apps, as well as our API in our Azure AI service…

The lesson learned from the cloud side is — we’re not running a conglomerate of different businesses, it’s all one tech stack up and down Microsoft’s portfolio, and that, I think, is going to be very important because that discipline, given what the spend like — it will look like for this AI transition any business that’s not disciplined about their capital spend accruing across all their businesses could run into trouble.

I noted above that Microsoft had transitioned from an integrated Windows-centric approach to being a generic infrastructure provider focused on giving its customers whatever they needed; AI, though, via the OpenAI partnership, is providing a new opportunity for integration. In this case the benefits aren’t realized via a common API or anything necessarily end-user facing, but rather through infrastructure optimization.

Think about all of the AI growth opportunities littered throughout this call (which Nadella and Hood were clearly excited to talk about, in contrast to what felt like compelled talking points from Pichai and Alphabet CFO Ruth Porat):

  • Copilots will monetize via subscription, and run on an OpenAI model
  • Bing monetizes via ads, and runs on an OpenAI model
  • The Azure API monetizes on a usage basis, and runs OpenAI models
  • OpenAI’s consumer business monetizes via subscription, some share of which goes to Azure, and runs an OpenAI model
  • OpenAI’s API runs on Azure, and runs OpenAI models

Nadella and Hood’s point is that because everything is basically running on the same stack, Microsoft can both optimize that stack and make sure it is fully utilized in a way that competitors seeking to be more general purpose providers cannot. That’s pretty compelling (and note the bit about investing in silicon: whatever Microsoft-built AI chip finally emerges will obviously be custom-built to run OpenAI’s models).
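
To make the utilization point concrete, here is a toy sketch with purely hypothetical load numbers (nothing below comes from Microsoft): workloads whose demand peaks at different times can share one model and one GPU fleet, so a shared fleet ends up both smaller and busier than per-product fleets each sized for its own peak.

```python
# Toy illustration of the "one model, one fleet" leverage argument.
# The workload numbers are hypothetical and purely illustrative; the point is
# that products whose demand peaks at different times can share one GPU fleet,
# while dedicated per-product fleets must each be sized for their own peak.
hourly_demand = {
    # hypothetical GPU demand (arbitrary units) over a four-hour window
    "consumer chat":      [90, 40, 30, 80],
    "search chat":        [30, 70, 60, 20],
    "enterprise copilot": [20, 85, 90, 25],
}

# Dedicated fleets: each product gets capacity equal to its own peak.
dedicated_capacity = sum(max(series) for series in hourly_demand.values())

# Shared fleet: capacity only needs to cover the combined peak across products.
combined = [sum(hour) for hour in zip(*hourly_demand.values())]
shared_capacity = max(combined)

total_work = sum(combined)
print(f"dedicated fleets: {dedicated_capacity} GPU-units, "
      f"average utilization {total_work / (dedicated_capacity * len(combined)):.0%}")
print(f"shared fleet:     {shared_capacity} GPU-units, "
      f"average utilization {total_work / (shared_capacity * len(combined)):.0%}")
```

That, in miniature, is the leverage Nadella is describing: the same capacity serves consumer, enterprise, and API demand, so less of it sits idle.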

This isn’t an approach without risk: Microsoft is obviously pot-committed to OpenAI, and is further making a bet that OpenAI’s model will be one of the biggest winners; it’s possible that a decade from now open source and company-specific models will be a much larger market, and a more general purpose solution like AWS will end up being the big winner. That would fit the Christensen theory of integration and modularization, which posits that integrated products win at the beginning, when solutions are not good enough, but ultimately lose out to modularized products that are more customizable and cost-effective. Even then, though, Microsoft’s cost advantage during the AI buildout, thanks to its ability to maximize utilization across its mix of consumer, enterprise, and developer products, could mean it is decently positioned to simply win on scale.

All of that noted, the combination of Google and Microsoft’s results makes Amazon’s earnings tomorrow very interesting indeed. If AWS looks like Google then it suggests that Microsoft has a meaningful advantage in AI infrastructure; if AWS is done talking about optimizations and is ready to talk about realized AI growth, then Google’s results will deserve even more scrutiny.


This Update will be available as a podcast later today. To receive it in your podcast player, visit Stratechery.

The Stratechery Update is intended for a single recipient, but occasional forwarding is totally fine! If you would like to order multiple subscriptions for your team with a group discount (minimum 5), please contact me directly.

Thanks for being a subscriber, and have a great day!