Women in Customer Success Podcast
Women in Customer Success Podcast is the first women-only podcast for Customer Success professionals, where remarkable ladies of Customer Success connect, inspire and champion each other. In each episode, podcast creator and host Marija Skobe-Pilley brings you a conversation with a role model from across industries, who shares her inspirational story and practical tools to help you succeed and make an impact. You're going to hear from women who are on their own journeys and want to share their learnings and strategies with us. You're going to be inspired.
[WiCS PowerUp Masterclass S2:E4] The Future of CS: Biggest Takeaways from Pulse Europe
Text us your questions and thoughts!
What's Next for Customer Success in 2026? That’s the question we’re answering today.
Welcome back to another PowerUp Masterclass in partnership with our friends at Gainsight, where we're going beyond the Pulse Europe conference recap and into what really matters: where Customer Success is heading in 2026.
Together with Tori Jeffcoat, Director of Product Marketing at Gainsight, we're unpacking the biggest trends and ‘aha’ moments from Europe's leading CS event, and sharing our predictions for the year ahead.
You'll walk away knowing:
- Why AI adoption is lagging and how teams are gearing up for agentic success
- How CS is becoming a revenue driver through better post-sale visibility
- What's actually working in digital-scale strategies right now
- What CS, Product, and Revenue leaders should prioritise next
This is your chance to turn Pulse's key takeaways into your 2026 game plan.
Grab your notebook and tune in.
Featuring
Tori Jeffcoat, Director of Product Marketing at Gainsight
Marija Skobe-Pilley, Founder, Women In Customer Success
👉 Follow Tori on LinkedIn: https://www.linkedin.com/in/torijeffcoat/
👉 Follow Marija on LinkedIn: https://www.linkedin.com/in/mspilley/
👉 Learn more about Gainsight: https://www.gainsight.com/
__________________________________________________
About Women in Customer Success Podcast:
Women in Customer Success Podcast is the first women-only podcast for Customer Success professionals, where remarkable ladies of Customer Success connect, inspire and champion each other.
Follow:
Women in Customer Success
- Website - https://www.womenincs.co/podcast
- LinkedIn - linkedin.com/company/womenincs
- Instagram: https://www.instagram.com/womenincs.co/
Host Marija Skobe-Pilley
- LinkedIn - https://www.linkedin.com/in/mspilley/
Check out our Courses:
- The Revenue CSM - https://womenincs.co/the-revenue-csm
What's next for customer success in 2026? That's the question we are answering today. Welcome back to the PowerUp Masterclass in partnership with our friends at Gainsight. Today we are going beyond the recap of Pulse Europe, and we are focusing on what really matters: trends, predictions and directions for customer success in 2026. Together with Tori Jeffcoat, Director of Product Marketing at Gainsight, we are unpacking the latest trends and aha moments from Pulse Europe and we are sharing our predictions for the year ahead.
SPEAKER_00:Let's dive in. Hey everyone, so excited for you to be joining us today for our webinar with Women in Customer Success and Marija, the founder of Women in Customer Success. We'll be talking today about some of the key trends that we saw at our recent Pulse Europe event and that we're seeing across the customer success industry. In today's webinar, we will be covering again some key topics and themes we saw from our Pulse Europe event. We'll be touching on how that's impacting the CS industry and some of the key data points that we're seeing from our recent CS Index report, and how customer success teams are bringing a lot of these topics and themes to life within their own organizations. And hopefully you'll have a good couple of takeaways as we're heading into 2026, where some of these data points can impact your teams and how you can leverage them moving forward. So excited to get today's content kicked off. It was great seeing you in person at our recent Pulse Europe event, Marija. We had so many great insights, conversations, learnings from attendees, both on our keynote stage, in our sessions, and in general conversations. Our Pulse theme was Wicked this year. The Wicked for Good movie just came out, so well timed. But we talked about how the magic of AI, of agents, of all of these great tools is really influencing customer success as an industry. I had the opportunity to be on the keynote stage for the first time, which was both nerve-wracking and very exciting, um, and to do an actual demo of our renewal agent, which was really, really cool to do on stage, and really cool to see everybody's excitement about the impact that AI and agents can make in customer success teams. So it was great to see everybody in person at Pulse Europe, and I'm really excited to share some of the key topics and trends that we saw from attendees and again from our sessions at Pulse. And then I think we wanted to do a quick poll, right, before we jumped into revealing what those key themes and topics were.
SPEAKER_01:Absolutely. I'm really sure that everybody is interested. It would be nice to know also who managed to attend. Uh, but while people are responding to the poll, I would also say it was an awesome Pulse in Dublin. I've been coming ever since 2019 in London. Pulse Europe was also one of the reasons why I started the Women in Customer Success podcast, just based on one particular session during that year. So it's always super exciting to just be there and see all the folks. And I'm interested to see what people think about the main topics.
SPEAKER_00:I'll go ahead and end the poll so that we can share what we actually saw. So, the number one topic, if you didn't guess this, uh, I feel like this is probably most of the answers. Number one topic was AI. It was so top of mind for everyone. I feel like it's been in every conversation, every webinar, everything that everyone's done for the last year. So probably no surprise. Um, I did find it super interesting. We did some analysis of our key topics and sessions post-Pulse. And AI-related topics came up four times more than the other closest topic. So not only was it the most common topic, but it was really clearly dominating almost every conversation and on every track as well. Um, Pulse usually has a couple different tracks for all the key themes. And while we did have an AI track, it came up, I think, in every other track as well, which just shows how prevalent it is in how we're solving the CS problems and challenges we're all facing today. A couple quick things that I'll highlight on AI as a topic. We did see it repeated, not just for the manual work, the time savings, the efficiencies that CS teams are gaining, um, but also for some of those deeper use cases, using AI to coach and train teammates, using it to help eliminate key workflows and maybe do things completely differently than CS has traditionally done it. But what's really exciting is how many people are seeing real results from it. Not just a buzzword, but something that we've kind of seen actually evolve into a true time savings, a true value add to multiple CS teams. One thing that I will note, um, and I know we'll talk a little bit more about this later with some of our CS Index data, but there are still a lot of folks that are still in the early stages of adoption when it comes to AI. So if you're not using it for everything in your CS org, that's okay, you're not alone. A lot of teams are still figuring out where they can really find the most value from AI in customer success. How do they use it really effectively and deploy it across the organization? I think one thing that's super interesting is a lot of early use cases are really individual productivity as opposed to like organization-wide change, right? And new processes and new systems. So excited to talk a little bit more about that, but I thought that was uh just a key call-out: you're not alone if you're not AI everything always, but it is still really top of mind for CS teams. The next big topic that we saw was community. Community came up as a really big topic this year and a key way that a lot of teams are actually leaning into digital scale, self-serve motions, creating those scalable avenues for their customers to engage and eliminating some of the common FAQs and questions that I think teams are facing. A lot of sessions touched on how community is that continuous source of behavioral signals as well. So not only is it a great way for you to scale how you engage your customers, it's also a great way to get information back from them. What are the topics they're talking about? What are the ways that they're engaging? How do you get that peer-to-peer networking element, right, of your customer base that helps make your customers more sticky and achieve a lot of value from the organization? The next topic, also probably no surprise to everyone, was better visibility and measurement.
CS teams, I think, are being pushed across the board to more frequently own the number, whether that's renewals or expansion, but really being tied to revenue a little bit more prescriptively than I think the customer success industry has been, right, in the past. And so we're getting a lot more granular with what we need to track, how we track it, how we're measuring customers' experiences and making sure that we're keeping a really prescriptive pulse on where our customers are at. Also, really important when it comes to stronger risk and churn signals. I think we're seeing a lot of CS teams where maybe they were just measuring high-level metrics in the past. They're getting a lot more granular. So we can all get better about identifying that risk early and taking action to save our customers ahead of time. And then the last topic, before I stop talking for a second, was product adoption. Not a topic that's necessarily always been top of mind for CS teams. So it was super interesting, looking at all the sessions, how this came up at Pulse: a lot more CS teams are getting a little bit more prescriptive with how they measure not just those risk signals and the renewal aspect, but also how they measure product adoption. It's not just about, you know, license utilization, but now it's really, are they activating the right features? Are they using the sticky features? And how are we really prescriptively understanding where adoption is failing so we can engage and encourage folks to adopt our products better? If you're a B2B SaaS organization, I'm sure product has always been top of mind, but it's interesting to see how CS teams are tying themselves a little bit more closely to this metric so that they can get more granular in that attention and drive adoption, thus influencing retention and expansion. Um, one last call-out on this one. There was a lot of great uh content and great conversations around how in-app signals can really be used to push expansion, upsell and cross-sell as well. So, not just for retaining our customers, but using those signals to identify when someone's a really good fit for expansion. Um, and potentially even do that in-app with a dialogue or uh an in-app guide that's pushing someone to engage with that upsell opportunity potentially. So, really, really strong topics. Um, a couple other things that came up, of course, were learning and leveraging changes in AI and changes in how we operate to influence how we train our teams, right? There was a lot of good conversation around the change management side too, not just how are we delivering customer success, but how are we preparing our teams and upskilling them to be able to deliver that better as well? So, really, really great topics, I think, across the board.
SPEAKER_01:Super interesting topics, and thanks so much for sharing some of those themes. It's interesting to see. Probably we could have predicted that AI would be the most dominant topic. But tell me, were you surprised by this level of interest, or did it reflect what you're hearing in conversations with people and on the floor?
SPEAKER_00:Yeah, I think AI is one of those things that everybody's talking about. What's different is how well we're actually leveraging it. So half the conversation was we're doing AI everything, this is how we're using it, these are the results we're seeing. And the other half of it was, I know I should be using it, but I have no idea what I'm supposed to be doing. So it was a really interesting mix, I think, of those two viewpoints. Not at all surprised that it was common. The four times every other topic, I think, was even more than I expected. But it was just really interesting to see how divided the industry is almost, right? On where we're supposed to be using this and how we think about it.
SPEAKER_01:Absolutely. I had my session in, I believe, the leading-with-empathy type of track. And even there, there was just loads of AI everywhere in terms of how do we lead teams in the AI world, how do we help them adopt, how do we do everything with AI? Like we can't escape it now, it's absolutely prevalent. So we have to be talking about it. I'm also really super happy to hear more insights into that product adoption and expansion with in-app signals. Thinking as a user, well, yes, of course, I would like to be approached that way. It's so much easier than trying to start a conversation with somebody by picking up the phone. So those are really great and super interesting themes. Thanks so much for that. Yeah. I think we can also dive into more details of customer success trends. So I do believe this is really helpful for us to understand what people are talking about, and the Pulse themes are the result of it. But what has been happening with the CS Index report? Because I do believe we will see loads of overlaps with the actual state of customer success currently. Would you like to share some of those results with us?
SPEAKER_00:Yeah, absolutely. I think exactly what you just said, there are so many great themes and topics that we saw at Pulse Europe that were also reflected in our CS Index report. So we use a lot of these data points at Pulse. And I've kind of pulled them out just to kind of reference for this webinar, but also just for the broader CS industry to understand where we're seeing these trends actually come to life and how that's showing up in the data. So we do our annual CS Index report from Gainsight every year. We just got the results for this year's. So it hasn't yet been published. So you all get to be the first look, the first preview of some of these numbers, but I want to share what we're kind of seeing ahead of this report being published, again, because it uh is such a good representation of those same topics and things we were seeing at Pulse. So the first point is just investment in digital and investment in customer success is not going away. Um, I know there's a lot going on globally, of course, in tech right now, but we're continuing to see investment either holding or increasing in customer success. And interestingly, European companies, uh, we do this report globally, but we do pull out North American respondents versus European respondents. And we saw that European companies are actually investing twice the amount in CS as a percentage of revenue as North American teams. So not only is investment increasing, but particularly in Europe, we're seeing a lot of strong investment in CS overall. A lot of that investment is actually going into digital channels, which is the chart on the um right side of the slide here. But a lot of that investment is going into digital channels in order to increase engagement in tools like community, digital learning, and in-app guides as well. And another really interesting data point from the CS Index is that more European companies deliver the experience digitally compared to North America. Um, so we're seeing a lot of increased digital across all of these great channels, but also in delivering the customer journey. If you think about onboarding, adoption, renewal, uh, and expansion and advocacy as well across all those different journey stages, there was a much stronger digital pull when it comes to our European respondents there. So we're seeing a lot of investment. We're seeing a lot of that investment increasing, and again, particularly in digital, as well as AI and agents, which I'll talk about in a minute as well.
SPEAKER_01:I'm really excited to hear that kind of great data, that Europe is adopting more digital solutions and a digital customer journey. Now, I wonder, what do you think is the reason for that? Is it more of a strategic choice or more of a necessity, given our markets and localization challenges and conditions?
SPEAKER_00:Yeah, it's a really great question. I think, um, historically, if we look at North American versus European responses to the CS Index, like the first time we did this report four or five years ago, it was very much skewed the other way, where North America saw a lot of increased investment and general CS programs were a little bit stronger in that area. I think we're seeing Europe catch up. So there's a lot of increased investment in terms of where we're seeing it grow and thrive there, not just in the North American space compared to that historical data point, right? Um, but I think that maturity curve for Europe has also resulted in starting CS programs with a more digital mindset. I think if you think about the historical, you know, CS programs that have been around for 10, 15 years, those have historically not been as digital, as digital forward, right, as some of the newer programs. So I think it's a combination of the maturity, kind of starting with digital in mind. And to your point of the localization, the you know, broader global space necessitates being able to reach your customers digitally and having those digital tools to help translate, work across languages, right? Be able to engage with customers in that way as well.
SPEAKER_01:That's kind of one of the results that I'm seeing talking to leaders, how everybody is so excited about AI translating everything now, because previously it was such laborious work translating websites and onboarding journeys and so many other things. Now it's basically done with an agent in a minute. Another trend that I'm seeing: learning management systems and in-app messaging seem to be growing in popularity from year to year. Maybe that's a sign that self-service adoption is increasing, at least given these graphs. What do those trends suggest about user behavior and their needs, and how they want to be communicated with?
SPEAKER_00:Yeah, absolutely. I think a couple of years ago, when digital really became kind of front and center for customer success teams, we talked a lot about digital being a strategy, not a segment. Um, I think a lot of teams, when they initially rolled out digital or sometimes called tech touch programs, it was really focused on like the long tail, right? The customers you couldn't reach at all, or didn't really want to engage with a human headcount, right? Just based on the type of customer, ARR, things of that nature. But we're really seeing digital kind of be embraced front and center now across every segment and all teams, which means there's a lot more investment in it because it becomes a much broader way for you to engage with your customers. I think it's also a way for us to scale customer success teams. So even for, you know, high-touch segments where maybe you had a smaller account ratio for customer success teammates, we're seeing a lot of teams try and scale a little bit more prescriptively and use those digital channels to help make that experience still a really positive one, right? For customers, still engaging, still thriving in terms of their usage of the product and how they engage with the company, just shifting some of those modes of engagement to some of these digital channels essentially.
SPEAKER_01:That's a great data point. Let's check in with the audience again. I do believe we have another poll. We'd just like to get to know a bit about where your organization is in its AI adoption journey for customer success.
SPEAKER_00:And while we're getting answers to that question, one other data point I did want to share is just the other metrics that we're seeing customer success teams kind of score their organizations on, or things that they're measuring and being goaled on. Things like GRR, NRR, and of course renewal rate, which is of course the number one revenue-related metric. But a couple other things that we're seeing are teams really focused on expansion. So CSQLs or CS qualified leads are pretty strong. The red line here, also just to note on the chart, is the 50% line. So everything above that line has more than 50% of CS orgs measuring that data point. So the things that we saw were CSQLs, which again reflects some of the expansion ownership we talked about earlier that CS teams are increasingly being tied to. NPS is, of course, a strong one still, but we actually saw a pretty big dip in NPS over time. So it's becoming less of a metric. Um, and I think in large part people are starting to measure things like sentiment instead. So it gives you more of a continuous look at how your customers are feeling on a day-to-day basis, as opposed to NPS, which is more of an every-six-months or maybe even annual type of measure. And then the last couple of things that I'll note: health scores saw a huge jump. That's something I think we've all measured. But again, going back to how prescriptive everyone needs to get about measuring those so we can engage with risk earlier and find that risk a little bit more effectively. Uh, not surprised to see that kind of jump up as well. And then product usage, again, just to go back to the adoption conversation, is a key metric I think CS teams are increasingly being tied to. So just a couple other call-outs that I'll give there. Um, it does look like we have a really great response rate to our poll, though. So we can go ahead and end that poll and potentially share results with everyone. I think our view is different from everyone else's view. So I'm not totally sure if all of the attendees can see the answers here. But if not, we'll follow up with this webinar with a couple key call-outs for where everyone kind of stands and what those results are. Um, but I'll jump into what we saw from our CS Index report and you all can kind of compare and contrast where you stand. One thing that I think uh was super interesting for me looking at our CS Index data, particularly considering how big and how prevalent of a topic AI was at Pulse Europe and in general in conversations, is that we're actually not seeing a huge jump in AI adoption year over year among CS teams. So in our global responses from last October, about 44% of CS teams were leveraging AI for CS workflows. I'll just add a quick caveat here. That's for CS workflows. So if you personally use AI to help you write an email faster, recap a conversation, those are more like individual use cases. This is more measuring AI for use in, for example, renewal workflows or onboarding workflows, things that are a little bit more CS specific there. So it was 44% last year. This year we jumped up to 52%, which is an increase, but only an eight-point increase. I think with the number of conversations, the amount of focus we're having on AI, that was a little bit lower than I expected this number to be. So I think we're still, again, lagging in figuring out how we use it, all excited about it, just looking for a little bit more direction there.
Um, one call-out on our European versus North American respondents as well is that the AI adoption in Europe was just a tiny bit lower, 48% versus that global number of 52%. Among that 52%, I also thought it was super interesting that most folks are still in the initial rollout phase. So of those that are adopting it, 63% are still in that initial, limited use case uh bucket for the most part. 10% of respondents had broad adoption across their org, meaning everyone is really leveraging AI in those ways prescriptively. And only 2%, the top 2% here, had AI fully embedded across their CS teams and workflows. So we're getting a little bit better at it. We're getting a little bit more prescriptive with it. I think we just still have a great opportunity ahead of us to leverage AI more effectively in CS.
SPEAKER_01:I'm super envious of those 2% that already have AI fully embedded. It would be a dream, obviously. And it would be just great to see what all of those different use cases are that they are using it for, and how they got there. As you're saying, it seems that a large majority of people, 63%, have done some initial use cases. They're experimenting, they're trying it out, they're excited about it, but then maybe don't know exactly how to move beyond that first use case, or how to prove the ROI to the business so that they can go to wider adoption. Do we have data on what we are seeing is preventing people from getting to fully embedded AI solutions?
SPEAKER_00:Indeed, that's a great question. So the top barriers that we're seeing to AI adoption, why we're only at that 52%: we asked the same question last year and this year, and I think the shift in these answers is super interesting to look at. So the top barriers that we've asked poll respondents to give us their input on are whether it's lack of adoption, internal integration complexities, data privacy concerns, output reliability, or resistance from teams. Those are kind of the top barriers that we're seeing across folks. The answers have changed dramatically, I think, from last year to this year in terms of what those top barriers were. Last year, the top barrier that we saw, by a pretty wide margin, was integration complexities. We've kind of figured out the technical side now, so that's no longer as big of a barrier. It had a 17% drop year over year. Similarly, data privacy concerns have actually gone down a lot as well. So I think there was a lot of initial concern around the data part. What is it using? How is it using it? What does that look like? We started to get a little bit more comfortable with that, or at least have a better understanding of how it's working and that it is still respecting those privacy needs for the most part, or depending on what tool you use, I guess. But through that, we've seen a big dip in those two. We've seen an increase, surprisingly, in lack of internal expertise. So more teams don't know how to use AI than they did last year. I think this is a really interesting data point. And there are two, I guess, possible reasons for it, or things to think about. We've seen so many different AI models over the last year. Between October of last year and October of uh this year, um, when I was looking at the data to put together for Pulse Europe, uh, there were, I think, 24 different new AI models released and 31 different AI tools from major vendors like Google and OpenAI. So not only have we been trying to figure it out from what we had a year ago, we've introduced all these new ways for people to use AI and think about it and new models for us to understand and be able to deploy effectively. So year over year, not only have we seen a ton of new stuff in the market for us to be able to understand and leverage and deploy in our organizations, but we also haven't gotten super prescriptive with the change management part. And I think that's a really common barrier that we're seeing: people still don't know how we leverage it, how we deploy it, how we get adoption of it effectively. And that's, I think, part of that 3% increase in lack of internal expertise as well. One other thing that I'll call out on this data is output reliability too. So we've seen a big increase there, not in the data privacy part, but in concerns about the reliability of the output. And I think a lot of that is really related to AI slop, which is, if you're not familiar with the term, just AI outputs that are really not value add in any way, shape, or form, right? So the email that you ask it to craft that actually includes none of the data points you wanted it to, right? Or the summary of a meeting that invents stuff, right? Um, it hallucinates information and isn't really a value add for what you're trying to get it to do. I think we've seen a lot of this across our organizations, right?
Where people are using it because it's faster, because it's easier, because it's the shiny new thing they want to use, but they haven't really understood how to use it. And so it's creating all this content, but extra content that actually needs to be reworked seven times before it's what you actually need it to be, right? So a lot of that AI slop, I think, is behind that increase in output reliability concerns over time. So the short version of all this is we don't know how to use AI well yet, and we don't yet totally trust it based on our experiences over the last year.
SPEAKER_01:Yeah, and that's really understandable. As you mentioned, we have seen so many new, different AI models and tools coming into place. I think that we are still far from fully adopting it, because there are way too many things to start playing with and start experimenting with. And every few weeks there is something new that you can try with your email or any other tool. So just the sheer number of different things to play with and experiment with is a lot to work through: okay, what are we actually doing, what are we deploying, how do we do change management around it? I think that everybody is still very comfortable being in that situation of, okay, we're talking a lot about it, we're experimenting, we're playing with it, and then we need some time to figure out what to actually take further and properly deploy.
SPEAKER_00:Yep, exactly. We do have a couple good data points on key use cases for AI as well. So while we are still working through a lot of those challenges, right, uh overcoming some of those barriers and understanding how we want to deploy it, we do still see teams adopting it and again getting a lot of value out of it. A couple of key use cases that we're seeing for CS teams, either adopting or exploring. So the two bar charts here are those that are adopting AI today, and then the light blue is those exploring it over the next year. The ones that have been most adopted today, no surprise probably, are the basic use cases like auto-summarization or drafting email responses and follow-ups, right? Pretty typical, I think, today. Those were some of the early use cases of AI. They still add a ton of value. Uh, don't want to discount the number of hours you can save using AI for some of these things. Um, still a big impact on CS teams, but again, a little bit more basic, not as CS specific. Where we're seeing a lot of interest and exploration, and where people are learning how they can use AI and thinking about it in that way, is things like churn or risk prediction, sentiment analysis, next best action recommendations as well, which are a little bit more CS specific and a little bit more prescriptive to our needs and our use case, which is actually a great opportunity for us to think about AI. Ultimately, AI and agents are just another way of delivering the value and solving the challenges that CS has faced over time. So a great opportunity for us to think about what our core challenges are, how we can optimize the human element for where it's going to add the most value, where relationships are still and always will be really critical in customer success, and leverage AI and digital and automation to solve kind of the rest of those points, right? So I think that's a great opportunity for CS overall, just kind of going into next year: what are the main challenges we're trying to solve? How can we reimagine or reinvent how we're delivering this using AI and using agents and using these tools that are now uh available to us, right?
SPEAKER_01:It would almost be interesting to start predicting what the results would look like a year from now. We don't have to go that far. But I kind of have a feeling that lots of those light blue options that we are experimenting with will become less experimentation and more adoption, as we get so much better at understanding the use cases. Uh, but overall, yes, there is still a lot to do with AI in terms of getting us trained, getting us to understand what to use, how to use it for which use cases, and how to trust it better, right? Especially as you mentioned about the emails. Yeah, it happens very often. You have 15 different tools to help you with emails, and then after you read all 15 versions, you sometimes have to rewrite it all by yourself. But okay, that's fine. We will leave it at that. Uh, maybe we don't trust it fully yet and don't know how to use it well yet. To wrap it up before we jump into the Q&A, I wonder what your practical advice would be for CS leaders who are trying to build that trust around AI with their team and just trying to get them to be a bit more adoptive of AI?
SPEAKER_00:Yeah, it's a great question. I think a lot of it is the training and the change management part of it. Um, I know we have one question that came in that I'm excited to answer in a second around AI adoption training as well. Like, how do we recommend CS teams get comfortable with it? A lot of it is getting really prescriptive with what your use case is. So understanding what you want AI to be able to deliver and creating a really strong foundation of who you're going to test it with, what are the success metrics that you're going to be measuring? So knowing the before scenario and knowing the after, what are you actually trying to influence, right? A really key point, I think, of getting internal buy-in and alignment on it is knowing what you're trying to do with it, how you want to make that impact. And then really being able to showcase what that impact has been. A lot of teams, I think, are very, very receptive, I hope, to: how do I save hours? How do I do my job more effectively, right? And not have to do all these manual things that are real pain points for me. So getting a lot of that buy-in, I think, is being able to showcase that impact, that you were able to save X hours, that you were able to make things more efficient and more effective. I think the other piece too is letting your teams kind of be your change management champions. So bringing everyone in to be part of that rollout process, having them be part of the scoping exercise, right? The picking of the tool that you want to use or understanding how you want to deploy it yourselves. Really great ways to get that initial buy-in and then have those folks be your champions internally to be able to share with their teammates, with everyone else in the org, why AI has been such a great value add for that particular use case, right, that you're deploying it for. So a lot of it's knowing what you're trying to achieve and being able to measure it, and then also getting their buy-in and support across the process. One other thing I'll just add to answer that question is one thing that Gainsight's done to help encourage our own internal adoption of AI is every Friday we have this AI for All program where folks can come in and share what use case they're exploring, or hey, here's a challenge or a thing that I want to use, I have no idea how to figure it out, help me. So they have a great opportunity to showcase what they're doing independently, have that individual support and engagement with the rest of the team on AI. And some of those use cases are great ones that we've started to codify as well. Um, it's great being our own customer zero, seeing how our own CS teams are leveraging AI and thinking about how we can put that into our own products or make that more prescriptive. So it's a great opportunity for our teams to get comfortable, explore, chat about AI, learn what's new from others, um, and be able to deploy that more effectively themselves.
SPEAKER_01:That's such a good call-out and story. I've been hearing from some of the European folks at Gainsight how they really love those sessions on Friday, because they learn tons. And it's a great reminder of how there are just so many people around us who love to play with different tools. I know every time I'm meeting with a team, I get 10 different tools that I've never heard of, almost on a weekly basis, that I can start exploring. So it's a really great call-out to try to involve the team and ask them what they have been playing with recently. And we have a last poll that we are launching just before our Q&A. Uh, we would like to hear from you: what's your number one priority for the next six months in the new year? So feel free to take some time to respond to it. And I guess it's a good time to jump into the Q&A, Tori. Could you please recommend any AI adoption training for CS teams?
SPEAKER_00:Yeah, it's a great question. Um, it depends what tool you're using, what adoption training would be the most um effective and valuable for you. I think there are a lot of great internal programs you can run that are tool agnostic, things like AI hackathons, right? Um, just to get people exploring and engaging with how they can leverage AI. So there's a couple of ways you can think about that for holistic, just general AI learning. Depending on what tool you're using, most providers, um, just speaking from Gainsight, we do this for our tools, um, but most providers have training courses or in-app guides and videos you can watch to understand how to leverage that particular tool a little bit more effectively for your teams. So a lot of it is just the internal programs and processes you can stand up. The other half is understanding how you're leveraging specific tools and making sure people get the training and the support they need to adopt those most effectively. One other thing I'll share on this one is we have a lot of different metrics for adoption of our own features. Um, so just as an interesting data point for everyone: tools that are within workflows, like things where you now have a summary provided by AI, or now you can click a button and get XYZ, you know, output for you, are a lot easier for CS teams to adopt than when you're moving to a totally different workflow. Tools like ChatGPT, for example, right? Where you have to know how to ask a good prompt, you have to understand what that output is and be able to refine and reflect on that a little bit to get the output you're looking for, are usually a lot harder for teams to adopt because it requires a little bit more upfront knowledge on their end. So just thinking about what types of features you're using and how that AI is showing up for your teammates should also be a big factor in how you're doing that adoption training.
SPEAKER_01:Thank you for that, Tori. Question from Callum. Do you feel AI recording devices will impact the openness and candidness of customers now that there are so many recording devices that record and analyze? I often find customers give more when there is no recording. What do you think?
SPEAKER_00:No, it's a great question. There are kind of like two, I guess, schools of thought here. I feel like everything we do nowadays is recorded in some form or fashion somewhere, right? So it's become a lot more uh broadly accepted, I think, across customers, across ourselves when we're doing our own work, right, as employees for different companies. But I think there is always a great opportunity to turn recording off, like you making the call on whether something should be recorded or should not be recorded. There are also a lot of, I think, opportunities to analyze, but not necessarily record the whole session. So sometimes you can just get the transcript, right, versus doing a video recording, if that makes customers more comfortable. So a couple of different ways you can think about it. I feel like in general, we haven't seen a huge impact to date. Because again, I think people have just come to accept that recording is a thing we do for the most part across most channels. Um, but definitely a good call-out to be mindful of when it comes to customers.
SPEAKER_01:Next question from Haley. How are Gainsight customers utilizing Staircase AI? We're just onboarding Gainsight back into our CS org, but are excited by this new offering and cannot wait to get to use Staircase. Great to hear the excitement.
SPEAKER_00:Yeah, Staircase, I'm obviously biased, but it's fantastic. I think it's a great tool for getting some of the data points I think CS has historically missed. Um, so the way that Staircase works is it analyzes customer conversations, emails, phone calls, and recordings if you're recording those for meetings, um, as well as things like support tickets, uh, Slack conversations if you Slack with your customers, right? And it takes all of those data points in to give you a really good sense of customer sentiment, customer health, where they're trending, where their risks and opportunities are, in a way that we've never been able to do for CS teams, when we've typically only been able to look at really structured data points, right? So this gives us that unstructured part of the puzzle. We've seen some customers, actually, they did an analysis, they're a longtime CS customer, just added Staircase, and did an analysis of, if they looked at their historical data, when Staircase would have identified uh risk opportunities or risk um signals before their historical CS programs would. And they found that with Staircase, they actually saw risk three months earlier than they would have otherwise, because now they have those data signals and because it's doing that automated agentic analysis of where those risks are coming from and flagging those. So really exciting output, particularly for CS teams that are really focused on risk. Um, you know, a whole quarter to save a customer is a lot, and it gives you a much better opportunity to engage early on. So we're seeing a lot of really great results from Staircase. Um, it is fully embedded in our CS program. So, just for Haley, if you're a Gainsight CS org, you can access Staircase directly inside Gainsight CS as a product and use them really prescriptively together as well. I'm really excited about Staircase. That's all I'll say on that. Sorry. But um, really great tool for us to get, again, those additional signals.
SPEAKER_01:Question from A. How can you become proactive and then predictive without having access to useful product data?
SPEAKER_00:It's really tough. Product data is, I think, one of the biggest aspects of getting really good risk analysis of a customer, as long as you sell a B2B SaaS type product, right? So knowing what they're using, how they're using it, how effectively they're adopting that particular tool and feature set is really important, I think, for CS teams, which is again why we saw that trend coming out of Pulse Europe. It's really important for CS teams to be able to effectively manage their customers. It's not impossible, right? There are other data sets that collectively you can kind of bring together to fill in some of the gaps there. But uh, I would definitely recommend product data as a really important part of the puzzle for CS teams. So that's a great question.
SPEAKER_01:And can you share an example of CS best practices for technology integrator and reseller companies that work across different tech brands and tech silos? Do you have any partner network for it as well?
SPEAKER_00:So I think for CS teams that are working with resellers, um like third-party partner programs, right, that deliver customer success motions, something like ThroughPartner CS, which is basically where you take all of those different resellers, bring that customer data into your CS system so you can help manage, understand, flag different risks, and really make all those resellers work on the same page, right, with where your customers are is probably a good recommendation there. Happy to follow up, Christian, with some some ThroughPartner CS best practices on that one as well that can help.
SPEAKER_01:And it looks like this is the last question for today. How can we use AI to re-engage low users? Can you give some examples?
SPEAKER_00:So I think there's a couple of different ways you can leverage AI there, depending on what technologies you have, right? Um, one great best practice, if you have some sort of in-app engagement tool where you're able to surface in-product recommendations to customers, right, is you have an opportunity to re-engage them based on what their current usage pattern is and trigger those engagements, right, when you need to, um, when you're trying to drive adoption of something back up. One challenge is always that if they're not in your product because they have low adoption, in-app engagements aren't necessarily super effective. So triggering things like emails to those customers, um, making sure you're referencing and pulling them back into the product, is a really great way to re-engage those low-usage type customers. I think uh another key thing to use AI for is the risk signals there. So the earlier you know that a customer is starting to drop off in that adoption curve, the better you can engage them while they're still engageable, right? Versus a customer that has totally disconnected and isn't really interested, right, in re-engaging with your product. So using AI to get those early risk signals, I think, is a key way to solve that.
SPEAKER_01:Tori, thank you so much for sharing all of your wonderful insights today. Thank you to our participants for great questions. And I guess that's all for today.
SPEAKER_00:Thank you. Thanks, everyone, for joining us today. Great to chat with you all, great to chat with you, Marija, and really appreciate everyone joining us. So have a great rest of the day, everyone.