SaaS Backwards - Reverse Engineering SaaS Success

Ep. 193 - SaaS AI Readiness: Why Most GTM Teams Aren’t Ready for Agents

Ken Lempit Season 5 Episode 10


Guest: Cliff Simon, CEO & Founder of Polaris Ops  

--  

AI may be everywhere in SaaS right now, but most go-to-market teams still are not ready to operationalize it.

In this episode of SaaS Backwards, Ken Lempit talks with Cliff Simon, CEO and founder of Polaris Ops, about what it really takes to make AI useful inside revenue operations. Cliff explains why pressure from boards, CEOs, and private equity firms is pushing companies to adopt AI faster than their systems can support it.

They dig into the real blockers to AI readiness, including poor CRM hygiene, undocumented business processes, disconnected data, and weak visibility into the customer lifecycle. Cliff also shares how leading teams are measuring AI impact through time saved, operational leverage, and even revenue contribution from RevOps.

The conversation also explores agentic AI in SaaS go-to-market, from lead routing and TAM analysis to signal detection and workflow automation. Along the way, Cliff highlights the security risks, vendor dependencies, and build-versus-buy decisions SaaS leaders need to think through before moving too fast.

Key takeaways:

  • Most SaaS GTM teams are being told to use AI before they have the operational foundation to support it
  • AI readiness starts with clean data, documented workflows, and clear success metrics
  • RevOps teams can become revenue contributors when AI is applied strategically
  • Agentic AI can improve routing, targeting, and automation, but only with human oversight
  • SaaS leaders need to weigh speed, security, and long-term ownership before deploying AI tools

---

Stalled pipeline? Lost deals? Diagnose your GTM gaps with a free, actionable checkup. 

🔗 Get your free SaaS GTM Checkup 

Why Most SaaS Teams Aren’t Ready for AI

SPEAKER_00

Welcome to the SaaS Backwards Podcast, where we reverse engineer the success of fast-growing SaaS firms and explore strategies CMOs and CEOs are using to drive their businesses forward.

SPEAKER_02

Welcome to SaaS Backwards, a podcast that helps SaaS and AI CEOs and go-to-market leaders accelerate growth and enhance profitability. Our guest today is Cliff Simon, CEO and founder of Polaris Ops, a go-to-market advisory firm that helps SaaS companies and private equity operating partners evaluate AI readiness and deploy agentic AI capabilities across their revenue operations stack. Welcome to the podcast, Cliff.

SPEAKER_01

Thanks for having me, Ken. How are you?

SPEAKER_02

I'm great and excited to dig in with you. But before we jump into this episode, could you please tell us a little bit about yourself and your company, Polaris Ops?

SPEAKER_01

Yeah, I've been in go-to-market for going on two decades now. I enjoyed being on the bigger side of things at a Fortune 20, and then spent the majority of my career helping companies go from zero to 10, zero to 15, but also on the consulting side, having helped companies anywhere from, I don't know, 10 or 20 million all the way up to, you know, however many billion. So I've gotten to see a lot over the last few years. As far as what I and my team do, we focus mostly on helping companies understand where they are in their AI journey, and help them deploy it within their systems and within their customer lifecycle, whether that's through the myriad of systems they own today or by helping them build custom AI solutions.

SPEAKER_02

Yeah, it's super important to be thoughtful and detailed about how you're going to deploy AI and take advantage of those capabilities, right? There are a lot of gains to be made. So when we did our prep, you mentioned that AI is really becoming a mandate from boards and private equity operating partners. But you also said that most go-to-market teams don't actually know how to operationalize AI. Where do you see the biggest gap between the pressure to quote unquote do AI and what companies can realistically execute today?

SPEAKER_01

There's a ton of pressure, not just from the board, but from the CEO down, right? We have to use it, we have to use it, we have to use it. But the problem is most companies aren't putting that into a programmatic fashion. There isn't a set way or motion or enablement around how we're going to be using AI. It's very much been: go play with it, go figure it out, and sort of report back what you've been able to do. The easiest level of entry, until the last couple of months, was using ChatGPT, Perplexity, the baseline stuff, right? Now with Claude Cowork coming out, I'm seeing a ton of folks really drive into that. We're starting to see more and more platforms come in with an MCP capability, and that's driving a lot of the user adoption. But again, the folks who are starting to jump over the chasm on these kinds of things are the people who are already putting in a significant amount of time off the clock, because this is the next step in their career or the next step in how we operate as an industry. So yeah, we're seeing so many people who are really interested in this putting in three, four, five, 10 hours a week outside of their nine to five, right?

SPEAKER_02

Yeah, it's amazing. We actually have a once and probably future client, a guy named David Gabriel at a small software company called Rumbics. And he's built an entire agentic product marketing capability, kind of as his own project, building it in the open. So I think you're exactly on where leaders and thoughtful people are going. And by the way, David's not even a programmer, has no technical background, and is building a pretty robust go-to-market enhancement on his own, basically.

SPEAKER_01

And those things are really cool. The challenge becomes, if you're not someone who's familiar with the different regulations, with best practices around security and those protocols, you might very well be building something that opens up a whole host of challenges, Pandora's box, if you will. So you really want to make sure that you're testing. One of the guys who works for us built a pen testing tool for these vibe-coded solutions. He tested it with a friend of his, just looking at the URL, and from that exposed URL alone he was able to go in and find private information, billing information, credit card numbers, things like that on the Stripe account. So you have to be really careful about what you're actually making visible without recognizing it.

SPEAKER_02

Yeah, there's the exposure of the data, and there's also exposure to your internal systems, right? So building in isolation is probably a really good idea. Also, interestingly, just back to David: he implemented a way for people to engage with this system. They're using a Trello board to task the agents, so that there's sort of an arm's length between the agent and the rest of the organization. So there's a lot to think about in terms of the security vulnerabilities that you might be creating. Are there ways that people are reporting up and measuring this "we're doing AI" stuff? Like, what are the milestones that you see people committing to and achieving?

SPEAKER_01

So I think the most common one on the engineering side of the house is lines of code being written with AI. Within operating teams, and more specifically RevOps and BizOps teams, we're seeing hours saved. So you would typically take a set of tasks and break them down into a level of effort the way we would do it, small, medium, large, right? Zero to five, five to 10, 10 to 15 hours for a task, and then break that specific task down into other tickets. We're seeing people measure in time saved. So if a typical task used to take you 40 hours, but now because of AI you're able to do it in five, you've gained 35 hours, right? And they're cataloging that across their team over the course of a month or a quarter.

SPEAKER_02

Oh, so those are real opportunities to demonstrate the return on investment.

SPEAKER_01

That's right. Yeah. And then if you want to, you multiply that against the average salary of that team member plus their fully burdened costs. And now you can roll that up and say, this is what we would have paid to do this thing, but now we can do it this much quicker.
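The rollup Cliff describes (hours saved per task, converted to dollars via a fully burdened rate) can be sketched roughly as below. All numbers, the 30% burden rate, and the 2,080-hour work year are illustrative assumptions, not figures from the episode.

```python
# Illustrative sketch of a time-saved ROI rollup.
# Assumed: 30% burden rate on salary, 2080 working hours per year.

def hours_saved(baseline_hours: float, actual_hours: float) -> float:
    """Hours reclaimed on a task once AI assistance is applied."""
    return max(baseline_hours - actual_hours, 0.0)

def dollar_value(saved_hours: float, salary: float, burden_rate: float = 0.30,
                 work_hours_per_year: float = 2080) -> float:
    """Convert saved hours to dollars using a fully burdened hourly rate."""
    hourly = salary * (1 + burden_rate) / work_hours_per_year
    return saved_hours * hourly

# (baseline hours, with-AI hours) per task, cataloged over a quarter
tasks = [(40, 5), (12, 2), (8, 1)]
total_saved = sum(hours_saved(b, a) for b, a in tasks)
print(total_saved)                                   # 52.0 hours reclaimed
print(round(dollar_value(total_saved, salary=120_000), 2))  # 3900.0 dollars
```

Multiplying the team-wide total by each member's burdened rate, as Cliff suggests, gives the quarterly "what we would have paid" number.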

SPEAKER_02

It's really interesting, this kind of metering of effort in go-to-market. That's something that's maybe coming more fully, right? People are going to be much more able to measure what it takes to do things.

SPEAKER_01

I think the other way we can measure is, especially in a PLG motion, we're starting to see RevOps teams no longer be a cost center. We're seeing RevOps teams create automations and workflows and capabilities where they can start generating revenue. We don't need to have a salesperson in the loop in some of those areas. And if the RevOps team can architect the right customer journey and put the right systems in place to activate on that journey and turn it into revenue, now they're covering their costs, right? It's a completely different paradigm than what we've thought about for the last 15 years.

SPEAKER_02

That is pretty amazing and would obviously spur more investment in those areas, right? So very positive. I want to start with where you begin an engagement with clients. You said to me that you usually start with an AI readiness assessment before even touching the tech stack. Can you talk to us about what you're looking at there? What are the signals that indicate a company might be ready to operationalize AI in go-to-market? And if not, what usually needs to be done?

SPEAKER_01

It's all the foundational components, right? We've spent so much time as an industry over the last 15 years growing as quickly as we wanted to, as quickly as we could, without ever really putting in the foundations, data hygiene, definitions within an organization, understanding how the customer journey actually moves from stage to stage to stage, both lead lifecycle, contact lifecycle, account lifecycle. We need to have all that metadata accurate. It has to be correct. Because if it's not, and we're not able to tell the AI what it's supposed to be doing, how could it ever get there? We need to have accuracy and we need to have clear direction and clear metrics of what success looks like. And we have to teach it that. Once you give it that, great. But if you don't have those pieces in place, it's never going to work the way you want it to.

SPEAKER_02

Can you maybe dig into recent engagements and talk about things you've discovered and remediations that were needed to be able to make next steps?

SPEAKER_01

Again, foundationally, things need to be entered into the CRM. They can't live in a hundred different Google Sheets, right? It's nearly impossible to pull that all into one place. It's very time-consuming and very manual, and we've gone through that process now with one of our customers. But in order to help them get to a place where they can actualize their dream of leveraging AI, which they're starting to do with Claude directly inside of Salesforce now, there are so many other components that need to happen: business processes that needed to be documented, levels of maturity that needed to be created. So that is a very large-scale effort. But once you get those pieces in place, you can start looking for the correlations and the data patterns that will help you actually make decisions on where to drive the business forward.

SPEAKER_02

So if you're working within the CRM, right, the source of truth, and we get everybody to a level of hygiene where we can actually put AI against it, is the resulting workflow creating better data? For our salespeople now, is it easier for them to do their part of getting the data in? And is that helping us?

SPEAKER_01

Yeah. So if you're using a good call recording technology, you make sure that you're driving enablement around it, that people are asking questions in the right way so that it's grabbing that information. And then the AI on the back end is filling all that in. This is the case of: the pool was dirty, we cleaned all the crap out of the pool, put it where it had to go, put a new filter in. Now it's working, but we still have to make sure people aren't just throwing random crap into the pool manually, right?

SPEAKER_02

I love the idea of farming the recordings to do the work that the salespeople just by nature don't want to do, right?

SPEAKER_01

It's not that they don't want to do it, even. It's low-quality work for them to be doing. If I'm paying an enterprise seller a $300, $350K OTE, I want that person talking to people. I don't want to be spending a quarter of their time paying them to type into a system. It's stupid. It's poor stewardship of capital.

SPEAKER_02

I could see a guy like Brian Burns saying those exact words: why should you be forcing your people to do that when we can get the AI to do that work for you, right? Even if you only get 15% back from those high-performing salespeople, that's a lot. I want to dig a little deeper into where AI agents do operational work. Can they effectively do total addressable market analysis? Can they do lead routing? And what are the signals they're able to create within the GTM world?

SPEAKER_01

Yes. You can have it look at all of your existing customer base, ideally the customers who are paying you the most, who are buying more of your services year over year, who are consistently renewing, right? We want to create that archetype and then have the AI go out and look into the open field for anything else that looks like it. I think that's pretty straightforward and something that AI is very good at. But again, it needs a human in the loop to make sure it's checking all those different parameters and that it's being refined over time. The other component there, and this goes across all of AI, is that we as executives need to understand that we are not buying an implementation the way we've always thought of it. This is not me buying something off the shelf that gets implemented on a two-week to two-month to one-quarter timeline and then is set-it-and-forget-it, done. These AI solutions are often tied into multiple LLMs. You're tied into Anthropic, you're tied into OpenAI, you're tied into Perplexity, you're tied into Grok, and all of those have multiple models that are constantly being updated. The API maintenance here, and the maintenance of how those LLMs interact with the components you've built, is a daily task. It is a very different level of intensity when it comes to managing it, and the cost of ownership is wildly different than traditional SaaS. So that's something to think about. And then also credit usage, et cetera. With that caveat said, yes, you can go out and build lead routing. HubSpot's a great example, right? We're partners with LeanData and with Chili Piper. Both of those do a great job of lead routing into Salesforce. There's not really anything that does lead routing into HubSpot, so we went out and built it. And HubSpot's great because they've got awesome documentation out there.
They have a very robust and open API that we can access directly from an MCP server. Boom, boom, boom, send things back and forth. And you know, we can have very defined territories, very defined routing guidance based off any of the firmographic information. So I think that's really straightforward. Salesforce is a little bit more touchy at the moment, and I think that's why there's still a bit of a moat there. But we'll see how that comes around over the next six to 12 months. When it comes to the custom signal component, you can get really, really interesting. Anything that you want to get your hands on, you probably can, as long as you can find it publicly on the internet somewhere.
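The rules-based routing Cliff describes (defined territories, firmographic criteria, first match wins) can be sketched as below. The territory names, field names, and thresholds are invented for illustration; a real build would read these fields from the CRM via its API rather than from an in-memory object.

```python
# Minimal sketch of firmographic lead routing: ordered rules, first match wins.
# Territory names and thresholds are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Lead:
    company: str
    country: str
    employee_count: int

# Ordered routing rules: (owner/queue, predicate). Earlier rules win.
RULES = [
    ("enterprise_ae", lambda l: l.employee_count >= 1000),
    ("emea_sdr",      lambda l: l.country in {"DE", "FR", "UK"}),
    ("midmarket_ae",  lambda l: l.employee_count >= 100),
]

def route(lead: Lead, fallback: str = "round_robin_queue") -> str:
    """Return the owner or queue for a lead based on firmographics."""
    for owner, predicate in RULES:
        if predicate(lead):
            return owner
    return fallback

print(route(Lead("Acme GmbH", "DE", 50)))   # emea_sdr
print(route(Lead("Tiny Co", "US", 10)))     # round_robin_queue
```

Keeping the rules as ordered data rather than nested conditionals is what makes "very defined routing guidance" easy to audit and change.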

SPEAKER_02

Signal identification and using those trigger events are a big part of what we see as helping sellers be more efficient, right? We're pointing them in the right direction. Let's say you've got 5,000 companies you might ever sell to and 10 sales executives. They really need some way to prioritize, other than the obvious: somebody comes to the website and asks for help.

SPEAKER_01

Yeah, yeah. So in that case, what you're doing is carving together all the right pieces. We have the right firmographic information. We know they're a company that's a potential client for us. We know they're big enough, that they typically have the right employee count, whatever it might be, and that they have the problem our thing solves. What other things are we looking for? Yes, we're looking for new job postings in our typical personas. We're looking at new people in seat over the last 12 months. We're looking for a growing team. We're looking for changes in pricing on their website. One of my customers is in AP/AR software, so we're looking to see if there's a payments link anywhere on their website. Your typical fundraising information. But most of those are lagging indicators, right? We want to look for indicators that matter, so we can find the people who are most likely to become a customer of ours in the future, not just the three to five percent of people who are in a buying motion today. So one that a lot of our customers who happen to sell into legacy industries are finding very interesting is what we've called a succession planning capability, or succession planning signal. We're finding companies where there's a boomer parent with a Gen X or geriatric millennial child in the business in an operations or finance role. And we're pairing that together alongside all those other signals, and it's leading to some really great results. It's so interesting because you're trying to get ahead of them before they go through this process. They haven't digitized yet. Now you tie all that together and you've got a really good package of: okay, a business where you know the business owner is going to be transacting out in the next few years, just based on demographics. Have you thought through this problem yet?
And it leads to a very different type of conversation than if you're just trying to sell somebody something, you know?
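Combining those signals into a ranked account list can be sketched roughly like this. The signal names and weights are invented for the sketch; in practice each signal would come from a scraper, enrichment vendor, or CRM field, and the weights would be tuned against closed-won history.

```python
# Illustrative composite-signal scoring over accounts.
# Signal names and weights are hypothetical, not from the episode.

SIGNAL_WEIGHTS = {
    "hiring_target_persona": 2.0,  # new job postings in buyer personas
    "new_exec_in_seat": 1.5,       # leadership change in the last 12 months
    "pricing_page_change": 1.0,    # lagging indicator, weighted lower
    "succession_risk": 3.0,        # boomer owner + next-gen child in ops/finance
}

def score(account_signals: set[str]) -> float:
    """Sum the weights of whichever signals fired for an account."""
    return sum(SIGNAL_WEIGHTS.get(s, 0.0) for s in account_signals)

accounts = {
    "Legacy Mfg Co": {"succession_risk", "new_exec_in_seat"},
    "Hot Startup":   {"hiring_target_persona", "pricing_page_change"},
}
ranked = sorted(accounts, key=lambda a: score(accounts[a]), reverse=True)
print(ranked)  # ['Legacy Mfg Co', 'Hot Startup']
```

Weighting the leading "succession risk" signal above the lagging ones is what surfaces accounts before they enter an active buying motion.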

SPEAKER_02

Yeah. So figuring out the forward-looking insights is a way to use these now pretty commercially available tools, right? Now you guys also do some of your own research. I think you mentioned that many of the AI tools in our go-to-market stack depend on other platforms and data sources, something you alluded to earlier in the conversation. Can you highlight a couple of the risks that creates? Again, I think you touched on it, but for these SaaS companies that are trying to adopt tools quickly, what are the top risks to be aware of?

SPEAKER_01

Yeah. So 80% of what we've seen to be the agentic go-to-market market is built on the other 80% of the market. People who have really good contact-level data are going and getting account-level data from other sources, and vice versa. The majority of the industry is still using either Smartlead or Instantly as the sending capability, to turn their platform into a system of action so that it can actually go out and do those outbounds. The challenge with any of those is if a component fails or a company gets acquired or shut down, because most of these companies are fairly early, still sub-$5 million in revenue, now you've got a gaping hole that that company has to go fill. And for whatever period of time, you may not be able to use the software the way you thought you were going to, which means your go-to-market motion is now at risk. Because if you're not generating the pipeline and doing the things today that you know you need to be doing, you're going to have a gap three, six months down the road. It's a significant risk. And the folks who came out on top from a technical perspective, within the research that we did, all scored as high as they did because they had proprietary technology built in, with redundancies, that allowed them to continue to operate without external risk pressures from other vendors.

SPEAKER_02

I want to jump to your own research. You've been researching agentic AI across go-to-market, and you've started to publish your findings. And for listeners who don't know, this is a new firm, right? Polaris, a little less than a year old. I'm wondering if there's a place where you publish or plan to publish a full research report, and if you want to share what's included and when it would be available.

SPEAKER_01

Yeah. So this year we did a 42-vendor diagnostic. We ranked them as systems of action, systems of record, and systems of information. All of that's going to be available on our website, which is PolarisOps.com. And we'll also put it out on our newsletter and on LinkedIn. But yeah, we'll share that with everybody.

SPEAKER_02

Jumping back to one last topic, when you do complete a readiness assessment, a major decision that you told us about is whether to build AI capabilities internally or continue this process of stitching things together with the inherent opportunities and risks. This is like an age-old problem, right? Build versus buy, and especially at your larger customers or larger clients of your firm, they have the capability to build, right? So how are we making those decisions? Is it the same build versus buy decision we've always made? Core capability, core competitive set capability, or is it often better to buy, but buyer beware?

SPEAKER_01

I think that particular conversation has shifted dramatically, because the old cost of building something meant that I was taking developers or engineers off of my product team and pushing them to something that wasn't customer-facing anymore, something on the go-to-market side, right? It wasn't building for the consumer; I was building for my internal team. And it used to take a really long time. Now you can build products in, like, a weekend. Something that gets the job done. I'm not saying it's a finalized thing, I'm not saying it's going to pass all the security requirements, but the speed at which you can build and the cost at which you can build are incredible. Yes, I know there's an ever-increasing cost when it comes to credit spend, et cetera, but it's so much faster. It's incredible how fast you can build. I'm really nervous for a majority of the tech companies that are of that 2013 to 2021 vintage, right? There's a lot of tech today that can be rebuilt so much faster, so much cheaper, at a lower back-end cost, by a single person, that it used to take teams of 20 to 50 people to build. So I think that's a very different thing. At those larger firms, it's not about whether they have the capability, it's whether they have the bandwidth. You've got smart folks, but they're probably already doing other things. How much time can you peel off and allow them to go out and experiment? I hear from people day in and day out that they feel like they can't keep up with the pace of change. I hear that from my team, and they're the ones doing the research and looking at all these tools, and they're constantly like, holy crap. So it's very difficult, I think, for anyone to stay on top of it. You sort of have to pick a lane, see what you're interested in, and continue to build.
If you're a company and you're thinking about build versus buy today, I'd really try to assess whether you have a team internally that could go and tackle this for two weeks and see if you get to a product that's close enough. Because there are things that were not on the market three months ago that exist today that you could just buy, which is wild. But you can go out and build it in probably half the time. I just keep being asked to build things, and then something else pops up, and something else pops up. It's very interesting to see both sides of the market.

SPEAKER_02

I agree with you. You know, early in my career I built stuff for IBM and Citibank, and they were person-year projects, right? We would have eight people work for two years on one big push. And today, you know, you have eight people work for a week and you have a product. It's a really different life.

SPEAKER_01

My buddy Phil, who was on my podcast most recently, was working with Verizon. He told me experimentation there was: we take, you know, a PM, an engineer, a product manager, and maybe a fourth resource if necessary, and we toss them at a problem, and toss, like, I don't know, seven or eight million dollars at it in overall cost and spending. And that's an experiment over a quarter, and they can generate tens, if not hundreds, of millions of dollars. It's crazy that things are driving to that point, you know.

SPEAKER_02

It's like a yin and yang, right? There's like immense positives possible out of all this technology and you know, some scary stuff either for existing software companies or people in career transition as a result of the tech. Lots of opportunity, lots of potential risk. And if people want to navigate that more effectively, Cliff, you probably could help them take a look at their plans. Please let listeners know how they might reach you and your new company, PolarisOps.

SPEAKER_01

Easiest way, you can go to PolarisOps.com or you can reach out to me directly on LinkedIn. I'm the Cliff Simon with the cloud in front of his name.

SPEAKER_02

My LinkedIn is LinkedIn slash in slash Ken Lempit. If you want to learn more about our advertising and go-to-market agency for SaaS and AI companies, we're at AustinLawrence.com. And if you haven't subscribed to the SaaS Backwards Podcast yet, please do so wherever podcasts are distributed. Cliff Simon, thanks so much for joining us again on the SaaS Backwards Podcast.

SPEAKER_00

Thanks for listening to the SaaS Backwards Podcast, brought to you by Austin Lawrence Group. We're a growth marketing agency that helps SaaS firms reduce churn, accelerate sales, and generate demand. Learn more about us at www.austinlawrence.com. You can email Ken Lempit at kl@austinlawrence.com about any SaaS marketing or customer retention subject. We hope you'll subscribe, and thanks again for listening.