Kissmetrics Case Study

PagerDuty’s Analytics Stack for Optimizing Onboarding
David Shackelford is a Product Manager at PagerDuty, an operations performance platform that helps companies get powerful visibility, super reliable alerts, and faster incident resolution times. This is David’s story of how PagerDuty used Kissmetrics to improve its trial experience.
At PagerDuty, reliability is our bread and butter, and we believe that starts with the first time the user interacts with our platform — our trial onboarding. We’d heard from customers that our service was simple to set up and get started with, but we also knew there was room for improvement. Occasionally users would finish their trials without seeing all of our features, or would leave their account setup unfinished, which caused them to miss alerts. We wanted to help all of our customers reach success, and we knew that if we did this well, we’d also boost our trial-to-paid conversion rate — a nice added benefit. We talked to customers, our support team, and our UX team before diving into the onboarding redesign, but we also knew we had to complement the qualitative feedback with quantitative data. We wanted to use telemetry to understand exactly what users were doing in our product and where they were getting blocked, so we could deliver an even better first experience.
Before we started making changes, we needed to establish a baseline: how were users moving through the app? Did their paths match our expectations? Did they match the onboarding flow we were trying to guide them through? After researching the user analytics space, we found that Mixpanel and Kissmetrics had the best on-paper fit for answering these types of questions. However, the investment (in both money and implementation time) to adopt these kinds of tools was significant — so much so that we wanted to test both to make sure we picked the right one. But the only way to comprehensively test tools like this is to run them against live data. That’s where Segment came in. We were excited to find a tool that would let us make a single tracking call and view the data, simultaneously, in multiple downstream tools. Segment made it easy to test both platforms at the same time, and with more departments looking at additional analytics tools, the prospect of fast future integrations also excited us.
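To make the mechanics concrete, here is a minimal sketch (not PagerDuty’s actual instrumentation; the write key, event names, and properties are placeholders) of how a single call through Segment’s browser library fans out to every destination enabled for the source, and how a call can optionally be scoped to specific destinations:

```typescript
// Minimal sketch of Segment's "one call, many destinations" model.
// Assumes Kissmetrics and Mixpanel are enabled as destinations for
// this source in the Segment dashboard.
import { AnalyticsBrowser } from "@segment/analytics-next";

const analytics = AnalyticsBrowser.load({ writeKey: "YOUR_WRITE_KEY" });

// This single call is delivered to every enabled destination --
// no per-tool instrumentation required.
analytics.track("Signup Completed", { plan: "trial", source: "web" });

// Calls can also be scoped to specific destinations via the
// `integrations` option, using the destination names Segment defines.
analytics.track(
  "Signup Completed",
  { plan: "trial" },
  { integrations: { All: false, Mixpanel: true, KISSmetrics: true } }
);
```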
We used our questions about user flow and conversion funnels to test whether Kissmetrics and Mixpanel could help us understand the current state of affairs. Using Segment, we tracked every page in our web app and as many events as we could think of (excluding a few super-high-volume events such as incident triggering). Then our UX, Product, and Product Marketing teams dove into the tools to evaluate how well each could answer our usage questions. After spending a few weeks with both tools, we went with Kissmetrics. To be honest, they’re both great, but we liked the funnel calculations in Kissmetrics a bit more. Kissmetrics also offered multiple methods for side-loading and backfilling out-of-band data like sales touches, which sealed the decision. Throughout this process, we also learned that we should have tracked far fewer events. If we were to do it again, we’d only collect events that signaled a very important user behavior and contributed to our metrics for this project. It gets noisy when you track too many events.
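As a rough illustration of that lesson, a thin wrapper around the tracking calls makes it easy to be deliberate about which events get recorded. This is a hypothetical sketch, not PagerDuty’s code; the event names are invented:

```typescript
// Hypothetical wrapper illustrating "track broadly, but exclude the
// super-high-volume events" -- and, in hindsight, the better policy
// of only tracking events tied to the project's metrics.
import { AnalyticsBrowser } from "@segment/analytics-next";

const analytics = AnalyticsBrowser.load({ writeKey: "YOUR_WRITE_KEY" });

// Events too voluminous to be useful in funnel analysis.
const HIGH_VOLUME_EVENTS = new Set(["Incident Triggered"]);

export function trackPage(name: string): void {
  analytics.page(name); // record every page view in the web app
}

export function trackEvent(
  name: string,
  properties: Record<string, unknown> = {}
): void {
  if (HIGH_VOLUME_EVENTS.has(name)) return; // skip the noisy ones
  analytics.track(name, properties);
}
```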
After combing through the data, our user experience team had a lot of ideas for a new onboarding flow. We mocked up several approaches, vetted them with both customers and people inside the company, and tried out the one we thought would best communicate the value of PagerDuty and help customers get set up quickly. Our approach included a full-page takeover wizard right after signup, as well as a “nurture bar” afterwards showing the steps remaining to complete setup. After implementation, we tracked customer fall-off at each stage of the wizard to see how we were doing. We also measured user performance against “Trial Interaction,” an internal calculation for when an account hits a particular state that shows they’re doing well in their trial — what others call “the aha moment.” Kissmetrics made it very easy to measure how the new design was working.
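For illustration, measuring wizard fall-off can be as simple as firing one event per completed step, which a funnel report can then break down stage by stage. This is a hypothetical sketch; the step names and event name are invented, not PagerDuty’s:

```typescript
// Hypothetical wizard instrumentation: one event per completed step,
// so a funnel report shows exactly where users drop off.
import { AnalyticsBrowser } from "@segment/analytics-next";

const analytics = AnalyticsBrowser.load({ writeKey: "YOUR_WRITE_KEY" });

const WIZARD_STEPS = [
  "Create Service",
  "Set Notification Rules",
  "Invite Team",
] as const;

export function onWizardStepCompleted(stepIndex: number): void {
  analytics.track("Onboarding Step Completed", {
    step: stepIndex + 1,
    stepName: WIZARD_STEPS[stepIndex],
    totalSteps: WIZARD_STEPS.length,
  });
}
```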
After shipping the new experience, we saw a 25% boost in the “Interaction” metric mentioned earlier, measured using Kissmetrics Cohort Reports. Kissmetrics showed us that with the new wizard, new users were actively using the product earlier in their trial and using more of its features. In addition, far fewer new users were ending up in “fail states,” such as being on-call without notification rules set up. Since then we’ve run various experiments on the trial funnel, and the Cohort Report is really helpful for looking at the effects of those experiments and determining whether our changes are actually helping users be more successful in trial. Qualitative feedback is also important to get a full picture of how changes to a product affect the user experience. We’re pretty low-tech in this regard: I dump a list of users pulled from Kissmetrics into a CSV, then send them a quick micro-survey to see if anything was unclear. We also gathered internal feedback from our sales and support teams to confirm that customers were finding our new onboarding easy to use and understand.
Since our initial work on the wizard, we’ve expanded our use of analytics to look at the behavior of both trial accounts and active customers. Whenever I’m about to start work in a particular area of the product, I’ll use Kissmetrics to pull a list of highly active users in that area, and then reach out to them to understand how they’re using our features and what their pain points and goals are. We implemented mobile tracking as well, because some of our customers mainly use our service through our mobile app, and we also installed the Ruby gem for code-level tracking. There are plenty of improvements we’d like to make to our onboarding, but since it’s doing pretty well right now, our next project is going to be investigating simultaneous A/B testing. We move fast, and if we’re running a sales or marketing initiative alongside product changes, sometimes it’s tough to sort out what impacted what. Split-testing trial experiences should give us cleaner data about how our onboarding work is improving our trial users’ experience, and ultimately help us make better decisions about the ideal trial experience.
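The Ruby gem mentioned above enables the same kind of tracking from server-side code. Segment’s server-side libraries all follow the same shape; here is the idea sketched with the Node library in TypeScript, to stay consistent with the earlier examples (the user ID, event, and properties are placeholders):

```typescript
// Sketch of server-side ("code-level") tracking with Segment's Node
// library -- the same pattern the Ruby gem provides. Server-side
// calls must supply the user identifier explicitly.
import Analytics from "analytics-node";

const analytics = new Analytics("YOUR_WRITE_KEY");

analytics.track({
  userId: "user_123", // placeholder for a real account/user identifier
  event: "Notification Rule Created",
  properties: { channel: "sms" },
});
```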
Like any new initiative, implementing analytics for the first time teaches you a lot. Here are some of our takeaways — hopefully they’re helpful to you as well.