The Tech Sales Newsletter #92: Focus is everything in cloud infra
Source: Clouded Judgement
This week we are taking a look at the latest earnings reports from the cloud hyperscalers (AWS, Azure, GCP). This is an important quarterly check on the pulse of the industry and gives us a strong glimpse at the shape of things to come.
The key takeaway
For tech sales: The Azure sales team continues to outperform and they clearly have a lot of momentum, both in terms of market opportunity and the right product mix. GCP is getting a boost by the Wiz acquisition and the launch of Gemini 2.5 Pro, which together with the already strong Workspace product and agentic integrations is actually a pretty good value prop. AWS's performance is not improving - the mood internally is not great and something else needs to change.
For investors: For those looking for stock performance driven by AI workloads, Microsoft remains the strongest bet, with Alphabet as a close second. With the tariffs situation unresolved and AWS continuing to underwhelm expectations, Amazon is a riskier bet. Not financial advice of course, my AI-gains hungry investor fren.
Azure
Amy Hood: In Azure and other cloud services, revenue grew 33% and 35% in constant currency, including 16 points from AI services. Focused execution drove non-AI services results where we saw accelerated growth in our enterprise customer segment, as well as some improvement in our scale motions.
And in Azure AI services, we brought capacity online faster than expected. In our on-premises server business, revenue decreased 6% and 4% in constant currency, slightly below expectations driven by renewals with lower in-period revenue recognition from the mix of contracts.
While both AWS and GCP posted their slowest year-over-year (YoY) growth in a year, the Azure team delivered an outstanding result, with almost half of that growth driven by AI-related products.
Satya Nadella: Model capabilities are doubling in performance every six months, thanks to multiple compounding scaling laws. We continue to optimize and drive efficiencies across every layer, from DC design to hardware and silicon to system software to model optimization, all towards lowering costs and increasing performance. You see this in our supply chain, where we have reduced dock-to-lead times for new GPUs by nearly 20% across our blended fleet, where we have increased AI performance by nearly 30% iso-power, and where our cost per token has more than halved.
When it comes to cloud migrations, we saw accelerating demand with customers in every industry, from Abercrombie & Fitch, to Coca-Cola and ServiceNow expanding their footprints on Azure.
The figures coming out from Azure for Q1 are exceptional across all parameters - both growth and efficiency.
Satya Nadella: This adds to other copilot agents like Autofix, which helps developers remediate vulnerabilities, as well as the code review agent, which has already reviewed over eight million pull requests. And we are previewing a first-of-its-kind SWE agent capable of asynchronously executing developer tasks. All up, we now have over 15 million GitHub Copilot users, up over 4x year-over-year, and both digital natives like Twilio and enterprises like Cisco, HPE, Skyscanner and Target continue to choose GitHub Copilot to equip their developers with AI throughout the entire dev life cycle. With Visual Studio and VS Code, we have the world's most popular editor with over 50 million monthly active users.
And with Power Platform, we have the leading low-code platform for AI makers, too. We now have 56 million monthly active Power Platform users, up 27% year-over-year, who increasingly use our AI features to build apps and automate processes. Now on to the future of work. Microsoft 365 Copilot is built to facilitate human-agent collaboration. Hundreds of thousands of customers across geographies and industries now use Copilot, up 3 times year-over-year.
Our overall deal size continues to grow. In this quarter, we saw a record number of customers returning to buy more seats. And we're going further. Just last week, we announced a major update, bringing together agents, notebooks, search and create into a new scaffolding for work. Our new researcher and analyst deep reasoning agents analyze vast amounts of web and enterprise data to deliver highly skilled expertise on demand directly within Copilot.
Beyond horizontal knowledge work, we are introducing agents for every role and business process. Our sales agent turns contacts into qualified leads and with sales chat reps can quickly get up to speed on new accounts, and our customer service agent is deflecting customer inquiries and helping service reps resolve issues faster. With Copilot Studio, customers can extend Copilot and build their own agents with no code, low code. More than 230,000 organizations, including 90% of the Fortune 500 have already used Copilot Studio.
The speed with which Microsoft has translated LLM innovations into sellable products for their Enterprise customers is staggering. It's particularly interesting how much "under the radar" their innovation and deal volume has been around GitHub. While on X you'll mostly hear about Cursor and Windsurf, the reality across the industry is that developers start using AI-assisted coding tools in GitHub first, and many never leave. Turning this into agentic workflows at scale would be complete dominance of arguably the strongest Enterprise use case for AI on the market.
"How To Sell AI" v1.6 is now live, focusing on the most recent research in AI Agents and the long term implications on tech sales.
This is a great time for you to learn the ins-and-outs of selling AI!
The definitive tech sales guide to selling AI (LLMs and Enterprise-grade Machine Learning).
Satya Nadella: We are now applying these learnings to deliver new innovation across our platform. Last month, along with our partners, we introduced Security Copilot agents to help defenders autonomously handle high-volume security and IT tasks informed by 84 trillion daily threat signals.
We also added new capabilities to Defender, Entra and Purview to help organizations secure and govern their AI deployments. All up, we now have 1.4 million security customers. Over 900,000 – including EY Global, Manpower Group, TriNet, Regions Bank – have 4 or more workloads, up 21% year-over-year.
My personal view is that cybersecurity agentic workflows built on single platforms will dominate the cybersecurity industry going forward. It's an obvious play at solving both a human capacity and capability issue, as well as a significant step-up in security posture due to extensive use of telemetry. Eighty-four trillion daily threat signals is just the beginning of what a unified pool of security logs and threat intel can do.
Satya Nadella: Having said that, the key thing for us is to have our builds and lease be positioned for what is the workload growth of the future. That’s what you have to go and seek to.
There’s a demand part to it. There is the shape of the workload part to it, and there is a location part to it.
You don’t want to be upside-down on having one big data center in one region, when you have a global demand footprint. You don’t want to be upside-down when the shape of demand changes, because, after all, with essentially pre-training plus test time compute, that’s a big change in terms of how you think about even what is training. Forget inferencing.
Fundamentally, given all of that, and then every time there’s great Moore’s Law, but remember, this is a compounding S-curve, which is, there’s Moore’s Law, there’s system software, there’s model architecture changes, there’s the app server efficiency. Given all of that, we just want to make sure we’re building, accounting for the latest and greatest information we have on all of that.
And that’s what you see reflected, and I feel very, very good about the pace. In fact, Amy just mentioned, we will be short power. But it’s not a blanket statement about power. I need power in specific places so that we can either lease or build at the pace at which we want. And so, that’s the plan that we’re executing.
One of the big FUD stories in Q1 was "Microsoft is pulling out server capacity and stopping further investment". Keeping in mind that all 3 hyperscalers have repeatedly commented on the fact that they are compute-limited, it was a weird development which fed into the narrative that "AI adoption is all fake".
The practical reality of course is the opposite - getting the balance right between driving a strong sales execution and then having to meet those obligations in the most cost-effective manner is really, really difficult.
Satya Nadella: I think at a macro level, I think the way to think about this is, you can ask the question, what’s the difference between a hosting business and a hyperscale business?
It’s software. That’s, I think, the gist of it.
Yes, for sure, it’s a capital intensive business, but capital efficiency comes from that system wide software optimization. And that’s what makes the hyperscale business attractive, and that’s what we want to just keep executing super well on.
What most still don't understand about the difference between the 3 big hyperscalers and data center players like Oracle is that the value gap is the software stack that comes on top of the infrastructure. It's the same capital investment in principle (Oracle can also buy NVIDIA GPUs), but significantly more efficient and valuable to the end customers, who benefit from both IT procurement consolidation and access to S-tier native software services.
Satya Nadella: I mean, it’s kind of like virtualization. What is the difference between servers and, again, client server with virtualization? It was efficiency. What is the difference between virtualization and cloud? It was efficiency. What is the difference between this generation of cloud and AI? It’s efficiency. The more you can kind of continue to think about software driving that efficiency is what drives demand, ultimately.
Let’s dig into this one a bit further.
| Era | What changed | Where the efficiency came from | Why that mattered |
|---|---|---|---|
| Physical servers → Virtualization (mid-2000s) | One operating-system-per-box became many virtual machines on the same box. | Hypervisors multiplexed CPU, RAM & I/O so hardware ran at much higher utilisation. | A single server did the work of many. |
| Virtualization → Public cloud (2010s) | Virtual machines moved into hyperscale data centres you rent by the hour. | Sophisticated orchestration, multi-tenancy, global load-balancing, pay-per-use pricing. | A warehouse of servers looked like one elastic computer. |
| Cloud 1.0 → AI-centred cloud (now) | GPUs/NPUs + large models become a native layer of the cloud stack. | Model compression, compiler optimisations, low-precision maths, distributed training/inference pipelines. | Each GPU-hour or watt goes several times further. |
The core of the value prop with each technology shift is that software, not hardware, is the real leverage:
- Virtualization software made a single server do the work of many.
- Cloud control-plane software made a warehouse of servers look like one elastic computer.
- AI optimization software (new model architectures, quantization, better compilers, routing systems, etc.) is now making each GPU-hour or watt go several times further.
Because every new layer slashes the unit cost of compute, companies end up doing more of it—the Jevons paradox effect. That rising demand (for VMs in the 2000s, for cloud services in the 2010s, for AI tokens today) is what ultimately grows Microsoft's revenue, even though the immediate story is "efficiency."
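To make the Jevons point concrete, here is a toy model in Python. The unit-cost cuts and the elasticity value are illustrative assumptions on my part, not disclosed figures; the only claim is directional: when demand is elastic enough, cheaper compute means more total spend, not less.

```python
# Toy Jevons-paradox sketch. All numbers are illustrative assumptions,
# not real pricing data. Demand responds to the unit cost of compute
# with a constant elasticity; spend is simply cost times demand.

def total_spend(unit_cost: float, base_demand: float, elasticity: float) -> float:
    """Demand scales as (1 / unit_cost) ** elasticity; spend = cost * demand."""
    demand = base_demand * (1.0 / unit_cost) ** elasticity
    return unit_cost * demand

# Normalize the "physical servers" era to unit cost 1.0 and demand 1.0,
# then assume each shift cuts the unit cost of a workload further.
eras = [
    ("physical servers", 1.00),
    ("virtualization", 0.50),        # assumed ~2x cheaper per workload
    ("public cloud", 0.10),          # assumed ~10x cheaper
    ("AI-era optimization", 0.02),   # assumed ~50x cheaper
]
for era, cost in eras:
    spend = total_spend(cost, base_demand=1.0, elasticity=1.3)
    print(f"{era:<22} unit cost {cost:>5.2f} -> total spend {spend:.2f}x")
```

With any elasticity above 1, each cost cut increases total spend; at exactly 1 spend is flat, and below 1 efficiency really would shrink the market. The hyperscaler bet is that AI demand sits firmly in the first regime.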
AWS
Andy Jassy: AWS grew 17% year-over-year in Q1 and now sits at a $117 billion annualized revenue run rate. We continue to help organizations of all sizes accelerate their move to the cloud, helping to modernize their infrastructure, lower costs and speed up innovation. We signed new AWS agreements with companies including Adobe, Uber, Nasdaq, Ericsson, Fujitsu, Cargill, Mitsubishi Electric Corporation, General Dynamics Information Technology, GE Vernova, Booz Allen Hamilton, NextEra Energy, Publicis Sapient, Elastic, Netsmart, and many others. It's useful to remember that more than 85% of the global IT spend is still on premises, so not in the Cloud yet.
It seems pretty straightforward to me that this equation will flip in the next 10 to 20 years. Before this generation of AI, we thought AWS had the chance to ultimately be a multi $100 billion revenue run rate business. We now think it could be even larger. If you believe your mission is to make customers' lives easier and better every day and you believe that every customer experience will be reinvented with AI, you're going to invest very aggressively in AI and that's what we're doing.
AWS sits in a peculiar place. At the end of the day, they are the largest business of the 3 and arguably the best platform to build on top of. The philosophy of providing "the building blocks" of compute and software, at the best possible price level and best security foundation in the business, is a very important value proposition.
The problem is that right now Azure is on pace to actually catch up with AWS in 3-4 years. This was unthinkable even six months ago, but it's starting to get hard to argue with the obvious significant differences in the quality of execution, both in terms of sales performance and product value proposition. Being distracted on a corporate level by the tariffs' impact on their core business at a time when AWS needs to be their single biggest priority is not helping either.
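That "3-4 years" claim can be sanity-checked with a constant-growth sketch. AWS's $117B run rate and the 17% / 33% growth figures come from the earnings calls; Microsoft does not break out an Azure run rate, so the $75B below is my own rough estimate and should be treated as an assumption.

```python
import math

# Back-of-envelope crossover: if Azure and AWS each keep growing at their
# latest reported YoY rates, when does Azure's run rate pass AWS's?
aws_run_rate = 117.0     # $B annualized, reported by Amazon
azure_run_rate = 75.0    # $B annualized, rough ESTIMATE (not disclosed)
aws_growth = 0.17        # reported YoY growth
azure_growth = 0.33      # reported YoY growth

# Solve azure * (1 + g_az)^n == aws * (1 + g_aws)^n for n years.
n = math.log(aws_run_rate / azure_run_rate) / math.log((1 + azure_growth) / (1 + aws_growth))
print(f"Crossover in roughly {n:.1f} years")  # ~3.5 years under these assumptions
```

Obviously, sustaining 33% growth at Azure's scale is a heroic assumption, but even haircutting it leaves a crossover inside the decade, which is why the execution gap matters so much.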
Andy Jassy: Well, the first thing I would say is when we've historically said that revenue can be lumpy, we've been saying this well before what's happened with AI over the last couple of years. And the reason for that is the sales cycle, particularly for enterprises. It's true for startups. What you really want is you want to have the type of capabilities where startups want to primarily choose to run on top of your platform.
And that's true. If you look in the startup space, the vast majority of successful startups over the last 10 to 15 years have run on top of AWS. And it's unpredictable when those startups are going to find product-market fit and grow substantially. And it's hard for them to predict and even harder for us to predict. And the same thing goes on the enterprise side, but in a different way. The sales cycle on the enterprise side is that you spend time trying to convince people that they should move from on-premises to the cloud, and then that you have the right solution for them.
And then, you pick a set of projects that you get experience on. And sometimes they use systems integrators. Sometimes they use our own professional services. Sometimes they're doing it themselves. Then there's the next tranche of migrations. And those migrations just take time. And some companies get through them really quickly and some companies take longer to get through them. And what happens a lot of times too, is that they get excited and enthusiastic about the cost advantages and the speed-of-innovation advantages they get moving to the cloud.
And what was supposed to be a smaller next tranche turns into a much larger next tranche. And all of that has been true for a long time. It's very hard for us to predict because it really is contingent on what enterprises, how they want to sequence it and resource it. Then you throw in AI, which has its own very fast growth cycle, particularly in certain types of use cases. And those change. I mean, I'll give you just some examples.
In the early days -- the earliest days, I should say -- of AI, what you've seen the most of has been initiatives that get you productivity and cost avoidance. And we've seen that from so many of our AWS customers. And we're doing a lot of it ourselves inside of Amazon using AI. And then, you've also seen, I would say, large-scale training, with a lot of those workloads running on top of us as well, as Anthropic is building its next few models on top of our Trainium 2 chip on AWS.
And then, you've seen a couple of really big chatbots. And then, what you've seen just in the last few months is really kind of the explosion of coding agents. And if you just look at the growth of these coding agents, the last few months, these are companies like Cursor or Vercel, both of them run significantly on AWS, but just look at the growth of that over the last few months. You just couldn't have predicted that sort of growth.
And so, that's why it's lumpy. Sometimes you'll have very significant increases that you didn't predict and you couldn't forecast. And then, they'll grow at a good rate, but maybe not the same rate before the next big kind of explosive growth.
This is Andy's pitch on why "it's fine, bulieveee me". The lumpy workloads will depend on customers figuring it all out, and if we are lucky, the next Cursor will help us drive higher growth.
At the end of the day, this is not what an Enterprise strategy is. The Azure team is demonstrating in real time what a very opinionated sales team can do with software optimized for Enterprise adoption and tied to tangible results.
Keeping in mind the recent shift in sales leadership, I'm not sure that the challenge at this stage is limited to the quality of sales execution. There is a question to be asked about whether the right resources are being put on the product side, to offer "pre-built" building blocks that customers can put into an effective implementation immediately.
Andy Jassy: And so, we've worked hard on that in Alexa+. We've been -- we started rolling out over the last several weeks. It's now with over 100,000 users, with more rolling out in the coming months. And so far, the response from our customers has been very, very positive.
People are excited about it. I think that it does a lot more things than what Alexa did before. And we're very fortunate in that we have over a half billion devices out there in people's homes and offices and cars.
So, we have a lot of distribution already, but to some degree there will be a little bit of rewiring for people on what they can do, because you get used to patterns. I mean, even the simple thing of not having to speak Alexa-speak anymore, where we're all used to saying Alexa before we want every action to happen. And what you find is you really only have to do it the first time, and then the conversation is ongoing where you don't have to say Alexa anymore. And I've been lucky enough to have the alpha and the beta that I've been playing with for several months.
And it took me a little bit of time to realize I didn't have to keep saying Alexa. And it's very freeing when you don't have to do that. And then, I think it's just experience in trying things. So, you can do things like you have guests coming over on a Saturday night for dinner, and you can say Alexa, please open the shades, put the lights on in the driveway and on the porch, increase the temperature five degrees, and pick music that would be great for dinner that's mellow.
And she just does it. And like, when you have those types of experiences, it makes you want to do more of it. When I was in New York, when we were announcing -- we did the event way downtown -- I asked her what the great Italian restaurants or pizza restaurants were. She gave me a list, and then she asked me if I wanted her to make a reservation. I said yes, and she made the reservation and confirmed the time.
The highlight of the earnings call was the launch of the AI-powered Alexa assistant. While this is important progress for their hardware retail business, it also serves as a contrast to their Enterprise adoption strategy.
Where is Alexa the cybersecurity agent? Alexa the coding assistant? Alexa the customer support advisor?
Plug-and-play, ready to go, powered by Trainium instances and a team focused on outcomes. Ideally all powered by the latest Claude models, brought in-house following an acquisition.
GCP
Sundar Pichai: We provide leading cost, performance, and reliability for AI training and inference. This enables us to deliver the best value for AI leaders like Anyscale and Contextual AI, as well as global brands like Verizon. And for highly sensitive data and regulatory requirements, Google Distributed Cloud and our sovereign AI offerings make Gemini available on-premises or in-country. Our Vertex AI platform makes over 200 foundation models available, helping customers like Lowe's integrate AI.
We offer industry-leading models, including Gemini 2.5 Pro, 2.5 Flash, Imagen 3, Veo 2, Chirp, and Lyria, plus open-source and third-party models like Llama 4 and models from Anthropic. We are the leading cloud solution for companies looking toward the new era of AI agents.
This is a big opportunity. Our Agent Development Kit is a new open-source framework to simplify the process of building sophisticated AI agents and multi-agent systems.
Agent Designer is a low-code tool to build AI agents and automate tasks in over 100 enterprise applications and systems. We are putting AI agents in the hands of employees at major global companies like KPMG. With Google Agentspace, employees can find and synthesize information from within their organization, converse with AI agents, and take action with their enterprise applications. It combines enterprise search, conversational AI or chat, and access to Gemini and third-party agents.
We also offer prepackaged agents across customer engagement, coding, creativity, and more, that are helping to provide conversational customer experiences, accelerate software development, and improve decision-making. And of course, Google Workspace delivers more than 2 billion AI assists monthly, including summarizing Gmail and refining Docs.
Lastly, our cybersecurity products are helping organizations detect, investigate, and respond to cybersecurity threats. Our expertise, coupled with integrated Gemini AI advances, detects malware, prioritizes threats, and speeds up investigative workflows.
This was an interesting quarter for GCP. Their growth is the slowest it's been in a year, something that was mostly ignored on the call. Alphabet is stuck in nasty litigation around Chrome, and multiple new entrants (and approaches) are pushing into the advertising space.
On the other hand, the new Gemini models were finally useful. The integration with the Workspace portfolio remains poor, but it's presentable to customers. My own company ended up adopting the Gemini assistant and rolling it out within Workspace, so they are getting the pricing and pitch right.
The enterprise focus on pitching agentic implementations is also resonating (similar to Azure's approach). The acquisition of Wiz is also clearly signaling that they want to build aggressively on their cybersecurity portfolio.
Philipp Schindler: Throughout 2024, we launched several features that leverage LLMs to enhance advertiser value, and we're seeing this work pay off. The combination of these launches now allows us to match ads to more relevant search queries. And this helps advertisers reach customers in searches where we would not previously have shown their ads. Focusing on our customers, we continue to solve advertisers' pain points and find opportunities to help them create, distribute, and measure more performant ads.
We are infusing AI at every step of the marketing process. On audience insights, we released new asset audience recommendations, which tell businesses the themes that resonate most with their top audiences. On creatives, advertisers can now generate a broader variety of lifestyle imagery customized to their business to better engage their customers, and use it across PMax, Demand Gen, Display, and App campaigns.
Additionally, in PMax, advertisers can automatically source images from their landing pages and crop them, increasing the variety of their assets. On media buying, advertisers continue to see how AI campaigns help them find new customers. In Demand Gen, advertisers can more precisely manage ad placements across YouTube, Gmail, Discover, and the Google Display Network globally, and understand which assets work best at a channel level.
Thanks to dozens of AI-powered improvements launched in 2024, businesses using Demand Gen now see an average 26% year-on-year increase in conversions per dollar spent for goals like purchases and leads. And when using Demand Gen with a product feed, on average, they see more than double the conversions per dollar spent year over year. As an example, Royal Canin combined Demand Gen and PMax campaigns to find more customers for its cat and dog food products. The integration resulted in a 2.7 times higher conversion rate, a 70% lower cost per acquisition for purchases, and an 8% increase in value per user.
Most of the call was focused on how they are scaling AI features within Google Search and the deeper integrations/productivity tools offered to advertisers.
Similar to AWS, there is a question to be asked about focus. How much of the engineering work and "deep thinking" is going towards their primary products versus the value proposition and capabilities offered by GCP?
Sundar Pichai: Look, internally, I mean, there has been an extraordinary amount of focus and excitement, both because the early use cases have been transformative in nature and because it still feels like early days with a long way to go. Obviously, I had mentioned a few months ago, in terms of how we are using AI for coding, we are continuing to make a lot of progress there in terms of people using coding suggestions. I think the last time I had said the number was, like, 25% of code that's checked in involves people accepting AI-suggested solutions.
That number is well over 30% now. But more importantly, we have deployed deeper flows, and particularly with the newer models, we are working on early agentic workflows and how we can get those coding experiences to be much deeper. We are deploying it across all parts of the company. Our customer service teams are leading the way there; we've both dramatically enhanced our user experience and made it much more efficient to do so.
And we are actually bringing all our learnings and expertise in our solutions through cloud to our other customers. But beyond that, it goes all the way from the finance team preparing for this earnings call to everything. It's deeply embedded in everything we do, but I still see it as early days, and there's going to be a lot more to do.
If anything, the earnings call indicated a bigger focus on internal productivity initiatives than on turning those into marketable products that the GCP sales team can drive with customers.
Sundar Pichai: And, Mark, thanks. I think this is probably the first question I've got on our earnings call on Waymo. So thank you. And I think it's a sign of its progress. Look, the thing that excites me is I think we've been laser-focused and we'll continue to be on building the world's best driver. And I think doing that well really gives you a variety of optionality and business models across geographies, etcetera.
It'll also require a successful ecosystem of partners, and, you know, we can possibly do it all ourselves. And so I'm excited about the progress the teams have made through a variety of partnerships. Obviously, highlight of it is a partnership with Uber. We are very pleased with what we are already seeing in Austin in terms of rider satisfaction. We look forward to offering the first paid rides in Atlanta via Uber later this year.
But we are also building up a network of partners, for example, for maintaining fleets of vehicles and doing all the operations related to that, with the recently announced partnership with Moove in Phoenix and Miami, and obviously partnerships with OEMs. There is future optionality around personal ownership as well. So we are exploring widely, but at the same time clearly staying focused and making progress, both in terms of safety and the driver experience, and on the business model and operationally scaling it up.
Interestingly enough, the highlight of the call was the scaling of Waymo in new cities. In my deep dive on Tesla, I highlighted that the long-term value proposition for the company is tied to scaling their vision for real-world AI.
In my article on ASI (artificial superintelligence), I argued that all paths to such a technical leap include "embodying" AI in the physical realm. We are seeing this already with drones; the next step is autonomous cars.
Sundar's point about Waymo being "the world's best driver" is an important one. Based on the reality on the ground, they don't seem far off from getting there, even if it's a title shared with Tesla.