Snowflake Inc (SNOW) Q1 2025 Earnings Call Transcript Highlights: Strong Revenue Growth and AI Advancements Amid Margin Pressures

Snowflake Inc (SNOW) reports robust Q1 performance with significant AI progress, despite facing margin challenges due to increased GPU costs.


Release Date: May 22, 2024

For the complete transcript of the earnings call, please refer to the full earnings call transcript.

Positive Points

  • Snowflake Inc (SNOW) reported a strong Q1 with product revenue of $790 million, up 34% year-over-year.
  • The company saw a significant increase in remaining performance obligations, totaling $5 billion, with a year-over-year growth of 46%.
  • Snowflake Inc (SNOW) is making substantial progress in AI, with over 750 customers using its AI capabilities, including the newly launched Cortex AI.
  • The company has a robust partner ecosystem, including collaborations with major firms like EY, Deloitte, and S&P Global, which amplify its platform's power.
  • Snowflake Inc (SNOW) is expanding its product offerings with new features like Iceberg, Snowpark container services, and hybrid tables, which are expected to drive future revenue growth.

Negative Points

  • Non-GAAP product gross margin decreased slightly year-over-year to 76.9%, impacted by GPU-related costs for AI initiatives.
  • The company is lowering its full-year margin guidance due to increased GPU-related costs associated with AI investments.
  • Deferred revenue was down more sequentially than in prior years, partly due to some large deals being structured with monthly payments in arrears.
  • April saw moderated growth in usage, attributed to seasonal factors like holidays in Europe, which impacted daily consumption.
  • The introduction of tiered storage pricing impacted revenue by $6 million to $8 million in the quarter, affecting margins.

Q & A Highlights

Q: Looking at the front page of the investor relations deck, 5 billion queries. It looks like your query volume is actually accelerating again. Can you walk us through some of the drivers of that acceleration? Is it new products driving it, or the easing of optimization headwinds, or just a better demand environment? And then, on the other side of that equation, it looks like there is still pressure on the price per query. Any indication on whether that pressure is coming more from the compute side of the equation or the storage side? Any color there would be super helpful.
A: Thank you. Overall, as both Mike and I said, our core business is very strong, and growth is coming from both new customers and expansion from existing customers. As we gain more and different kinds of workloads, for example AI and data engineering, which are increasing quite nicely, they all contribute to additional credit growth. The relationship between credit growth and cost per query is not a simple, straightforward one. We look for broad growth across the different categories of workloads we handle, and they have all been doing really well.

Q: Sridhar, you trained Arctic LLM with pretty amazing efficiency. Could you walk us through the architectural differences that might allow it to run more efficiently than other products in the market? And Mike, is there any directional change to the $50 million target for GPU spend this year, considering the launch of Cortex and Arctic LLM and, it sounds like, some Snowpark traction? Should we think of that trending a little higher?
A: Thank you. So absolutely, we did train Arctic in a remarkably short period of time, a little over three months, on a remarkably small amount of GPU compute. A lot of the training efficiency of these models does come from architecture. We used a rather unique mixture-of-experts architecture; these are increasingly the architectures driving impressive gains for all of the other leading AI companies. But what also went into it was an amazing amount of prior experimentation to figure out things like what the right data sets are, what order they should be fed in, and how to make sure we are optimizing for enterprise metrics, the kinds of things our customers care about, such as whether these models are really good at generating SQL queries so they can talk to data. So we are very much taking the view of how we make AI much better in an enterprise context, because naturally that is where we have the most value to add, and our AI budgets are modest in the scheme of things. Being creative in how we develop these models is something the team has come to expect, and I think that kind of discipline and scarcity, to be honest, produces a lot of innovation. I think that is what you are seeing. In terms of investments, I will hand over to Mike in a second, but I am comfortable with the amount of investment we are making. Part of what we gain at Snowflake from the ability to fast-follow on a number of fronts is the ability to optimize against the metrics we care about, rather than producing, say, the latest, greatest, biggest model for image generation. Having that kind of focus lets us operate on a relatively modest budget pretty efficiently. So the focus now is very much on how we take all of the products we have released into production. We have over 750 customers busy developing against our AI platform.
This is a fast-moving space, but we are very comfortable with the pace, the investments, and the choices we are making to make AI effective for Snowflake.
Michael Scarpelli - CFO: And I will add that, yes, we think we may be spending a little bit more on GPUs, but it is also the people we are hiring, specifically in AI. We talked about the acquisition of TruEra; those people all fall into that organization. And as I mentioned, the world of AI is rapidly evolving, and we are investing in it because we do think there is a massive opportunity for Snowflake to play there, and it will have a meaningful impact on future revenues.

Q: Sridhar, can you just talk a little bit about how we should think about your customers' time to value with Cortex, meaning how long do you think it takes them to start using the technology before it can start to translate into somewhat faster consumption patterns? And then just one for Mike. Mike, can you talk a little bit about deferred revenue? This quarter it was down perhaps a little more sequentially than we have seen in prior years. I don't know if there's anything onetime in nature there, but if you could touch upon that, that would be great.
A: Thank you. One of the cool things about Cortex AI, and our AI products in general, in the context of the consumption model, is that our customers don't have to make big investments to see what value they are going to get, because they don't have to commit to how many GPUs they are going to rent, for example. They just use Cortex AI directly from SQL, which is very easy to do without a pre-commitment. This means they can focus very much on value creation. The structure of Cortex AI is also such that anybody who can write SQL can begin to do really interesting things, for example, look at how often a particular product was mentioned in an earnings transcript, or go from other kinds of unstructured information, whether text or images, to structured information, which Document AI, our AI product there, does. So we very much want to structure all of these efforts as ones in which our customers can iterate very quickly, take things to production, get value out of them, and then make bigger commitments on top. That is one of the benefits you get from making the technology super easy to adopt: there is no massive learning curve, and neither is there a GPU commitment or other software engineering that needs to happen in order to use AI with Snowflake.
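The SQL-first pattern described above can be sketched in a short query. The Cortex function names below (SNOWFLAKE.CORTEX.SENTIMENT and SNOWFLAKE.CORTEX.EXTRACT_ANSWER) are functions Snowflake documents for Cortex AI, but the table and column names here are purely illustrative, not anything mentioned on the call:

```sql
-- Hypothetical table of transcript chunks; table and column names
-- are illustrative assumptions, not from the earnings call.
SELECT
    call_date,
    -- Score the tone of each chunk of transcript text
    SNOWFLAKE.CORTEX.SENTIMENT(chunk_text) AS tone,
    -- Ask a natural-language question directly against the text
    SNOWFLAKE.CORTEX.EXTRACT_ANSWER(
        chunk_text,
        'Which product is being discussed?'
    ) AS product_mentioned
FROM earnings_transcripts
WHERE call_date >= DATEADD(year, -1, CURRENT_DATE());
```

The point of the consumption model is visible here: there is no cluster or GPU provisioning step, just a SQL function call billed as it runs.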
Michael Scarpelli - CFO: Yes. On your question on deferred revenue, Kirk: Q4 is always a very big billing quarter, and Q1 is not as big a billing quarter, so you have that flowing through into deferred revenue. However, RPO, as Sridhar mentioned, is up 46% year-over-year. And we did, for instance, sign a $100 million deal this quarter with a customer who pays us monthly in arrears, so it doesn't show up in deferred revenue. We have signed a number of deals with big companies that pay us monthly in arrears; those don't show up in deferred revenue, but they are in RPO.
