The Tech Sales Newsletter #100: The state of cloud infrastructure software in 2025 (part 1)

We did it, sales anon!

One newsletter dedicated to tech sales, every week, for a hundred weeks. Over time, this attracted a mixed audience of sales reps, GTM leaders, VCs, investors, and industry insiders who were curious about applying the mental models of tech sales to a diverse set of challenges within the industry. This works because in order to be both compelling and insightful in tech sales, you need to develop a multidisciplinary approach and be open to feedback. There is always something new to be learned, a different way to position things, a new reason why what you thought was going to happen didn't. The hottest fires forge the strongest steel, and this is profoundly true for tech sales, now more so than ever before.

The next one hundred issues of this newsletter will be written against the backdrop of a very turbulent time in tech. While we don't know how things will play out, there are some directional guesses that we can make:

  1. Cloud infrastructure software will be considered the only real form of Enterprise software that's monetizable. Everything else will be either open source or AI-generated.

  2. Sales headcount will be stagnant or shrink. The minimum bar to enter and stay in the game will continue to increase. Most companies are not prepared for this.

  3. Your ability to use and sell AI will become synonymous with doing the job.

These are directional bets rooted in the reality we see today. While you can always take the opposite side of the bet, I think that would be a strategic mistake at this stage. Now let's take a deep dive into the state of cloud infrastructure software as of today.

The key takeaway

For tech sales: Cloud infrastructure software represents the biggest opportunity in tech sales today. Whether the opportunity is with an established incumbent or a well-funded startup, it can be lucrative for those with technical affinity and the right set of skills.

For investors: If value continues to accrue at the bottom of the stack, then investing in products that make it easier to build the software of tomorrow is a logical step. Currently the most interesting new category is focused on making AI agents perform well in production, with a variety of companies trying to bring new tooling to market to accelerate that process.

Cloud infrastructure software will eat the world

Over the next two weeks, we will explore the core thesis behind the "InfraRed" report by Redpoint Ventures.

When we talk about cloud infrastructure software, what we are referring to is the core thesis of this newsletter.

My strong conviction bet is that computing power is the most important resource, alongside oil. Everything that we build in the worlds of atoms and bits depends on it. What are the most important attributes of computing power?

Obtaining it (The art of designing and printing silicon)

Utilizing it (Building applications powered by AI)

Keeping it running (Orchestrating and observing resilient systems)

Protecting it (Securing computing power through software, people, and processes)

When I refer to AI, I mean both cutting-edge LLMs and ML at enterprise scale.

I believe that the highest value will accrue at the bottom of the stack, which means the chip makers, the cloud hyperscalers and their marketplaces, AI model makers, and select ISVs across data platforms, observability, and cybersecurity.


This week at "How to sell AI" we explore the business case at Care Fertility Clinics, where they were able to significantly improve the efficiency and accuracy of embryo selection due to adopting AI at scale. This is a great example of how this technology is having much bigger impact on companies than beyond simple cost savings.


Since InfraRed focuses on the software side, the report skips the topic of "obtaining" computing power and concentrates on the key ways to utilize it, run it, and secure it. From Redpoint's point of view, AI is becoming a foundational layer of how we will interact with and run software, with data platforms, DevOps-oriented products (i.e. anything that touches CI/CD pipelines), and cybersecurity being the key opportunities for ISVs.

This is directionally correct, but I think it doesn't account for the complex role that the hyperscalers play as the primary providers of both compute and the most widely adopted tools to build, secure, and run software. Ultimately, Redpoint's point of view is that there has never been a better time to build in infrastructure software.

Let's start with the AI section of the report. As with any firm based near Silicon Valley, the "San Francisco Consensus" is an important part of their outlook. As a reminder, there is currently a strong expectation of AGI within 1 to 3 years, followed by "superintelligence" within 5 to 10 years.

Structurally, that means we need to see a step beyond automation and reasoning before we can conclude that we've reached general intelligence. The biggest topic of today is driving autonomous behavior through AI agents, which I covered in #99. The big directional bet being made today is that agents will not only be able to conduct actions and create content based on their training data, but will also be able to discover new things. If that becomes a reality, then the idea of running a large group of agents essentially independently (i.e. AI business units or even whole companies) becomes a real possibility.

While much of the focus over the last year has been on model capabilities, the falling cost of inference has, practically speaking, significantly outpaced any other wave of computing adoption historically. There are several reasons for this: NVIDIA making massive jumps in performance on an annual basis, the hyperscalers and select players such as Musk deploying massive amounts of that compute quickly, and developers and model makers innovating with almost every model release.

Still, the growth has also been absolutely off the charts. The hyperscalers (Azure/AWS/GCP) were generating around 50T tokens in Q1'24. By Q1'25, we were looking at 1,540T, roughly a 31x increase in demand.

All of this inference is going into building out both new applications and implementing agentic workflows within existing solutions.

As venture funding has exploded over the last two years, a lot of liquidity has been made available for a variety of ideas, many of them with reasonable product-market fit.

While those new companies represent an opportunity for both sales reps and investors to play in new categories, structurally there are very few established "playbooks," and what many might expect to be a working product from day one is closer to a patchwork of tools that may or may not deliver the vision the founders sold.

This is reflected in enterprise adoption, where the majority of new AI tools in use are either Copilot-style products or changes to existing solutions those companies were already using. AI agents are slowly becoming part of the enterprise workflow, but until we see faster progress in building and deploying well-functioning, resilient, and safe AI agents, large corporations will be slow to push PoCs into production.

This leads us to the interesting part, of course: how can we help developers build better AI agents? There are a number of companies now debuting in the space, offering new cloud infrastructure software tools that solve problems in different parts of the production pipeline.

These are all worth researching, even if it's just to see what potential features existing incumbents might start to copy (or acquire) in order to better support the development of agentic workflows.

This is becoming an increasingly important topic as MCP adoption increases and there is genuine interest across many segments in testing what LLMs can do with a variety of data sets. For many, MCP opens up the possibility of driving workflow automation at scale. This is also why it's becoming increasingly important to have some technical interest and affinity if you want to sell cloud infrastructure software. Many of the LLM-powered applications you might have used rely on system prompts or a structure of specific prompts in order to "make the thing, do the thing." This often results in deviations, which users just chalk up to "hallucinations," but which are often simply the system messing up the workflow.
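To make that concrete, here is a minimal, illustrative sketch of how many of these applications are wired today: the entire workflow lives in a free-form system prompt, so nothing enforces that the model actually follows the steps or returns parseable output. The model name, the prompt, and the assess_account helper are my own assumptions for illustration, not something taken from the report.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# The entire "workflow" lives in free-form text: a system prompt plus a
# templated user prompt. Nothing enforces that the model follows the three
# steps or returns valid JSON.
SYSTEM_PROMPT = """You are a renewal-risk assistant.
Step 1: Read the account notes.
Step 2: Classify churn risk as low, medium, or high.
Step 3: Respond ONLY with JSON: {"risk": "...", "reason": "..."}"""

def assess_account(notes: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any chat-capable model works here
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"Account notes:\n{notes}"},
        ],
        temperature=0,
    )
    # If the model drifts from the instructions (extra prose, malformed JSON),
    # downstream parsing breaks.
    return response.choices[0].message.content
```

When the model adds extra commentary or malformed JSON, the downstream step fails, and the failure gets labeled a "hallucination" even though the real culprit is the fragile, prompt-only workflow design.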

One recent trend is DSPy, a framework for programming LLMs to follow a defined workflow, replacing hand-written natural-language instructions. While not a magic bullet, it's one example of how practitioners are experimenting and solving problems in a very dynamic space, where a new cloud infra startup today could easily be obsolete tomorrow.
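For flavor, here is a minimal sketch of the DSPy style, continuing the same hypothetical churn-risk example: the desired behavior is declared as a typed signature, and the framework generates and manages the underlying prompt. The model string and field names are my assumptions; check the current DSPy documentation for the exact APIs.

```python
import dspy

# Configure a language model backend (model string is an assumption).
dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))

# Declare the desired behavior as a typed signature instead of a prose prompt.
class AssessChurnRisk(dspy.Signature):
    """Classify the churn risk of an account from its notes."""
    account_notes: str = dspy.InputField()
    risk: str = dspy.OutputField(desc="one of: low, medium, high")
    reason: str = dspy.OutputField()

# DSPy builds and manages the actual prompt (and can optimize it later).
assess = dspy.ChainOfThought(AssessChurnRisk)

result = assess(account_notes="Usage down 40% QoQ, champion left the company.")
print(result.risk, "-", result.reason)
```

The point is the shift in who owns the prompt: the framework rather than a hand-tuned block of natural language, which makes the workflow easier to test, version, and optimize.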

Picking individual winners and losers in a new category will be very difficult without a deeper understanding of the technology and a strong grasp of the key players and the directions they are taking. Just this week we had the "Zuck is hiring all researchers" drama, as Meta pushed hard to build a superintelligence team competitive with the other research labs. Money and talent are moving quickly as the pace of technological change accelerates.

The obvious directional bet here is for agentic workflows to be implemented at scale not just for software automation but also in real-world usage with robotics, drones and self-driving cars. Thus, software tech sales essentially means selling cloud infrastructure software. Everything important that happens next will be done on the foundation of tools that we build and sell today.

The Deal Director

Cloud Infrastructure Software • Enterprise AI • Cybersecurity

https://x.com/thedealdirector