What happened when we turned off self-serve sales for a week

How a radical experiment pitting self-serve sign-ups against demos settled an internal debate on our team

When we first turned on paid plans for Slite, it was a no-brainer to make sales self-serve. Teams could simply visit our website, pick a plan, and sign up.

But self-serve gives us less control over activation and actually makes it harder for Sales to get in touch. As with many freemium products, leads had access to most of the product right away, so it was difficult for the Sales and Customer Success teams to step in and show people how to structure their Slite and communicate better within it.

In early 2021, we had a hunch that a more personalized sales model could work better, but the only way to find out was to test it. We often discussed piloting a direct sales strategy in leadership meetings, but we needed to show that it was worth it. More specifically, we needed to prove two things:

  1. whether onboarding calls increased activation
  2. whether having calls with every qualified lead was scalable with our business model

So we set up a radical experiment to measure it:

For one week, all leads would have to book a demo to access the product.

Simple, right? But there was a bit of a bottleneck.

  • At the time, about 150 qualified leads (out of roughly 1,000 total leads) were signing up per week.
  • We also only had a single salesperson (*record scratch, freeze frame* Hey, that's me, Brieuc Sebillote. I may have tried to help you make the most out of Slite at some point, and if not, I probably will soon. Maybe even at the end of this article.)

There was no way I could handle all the potential demos alone. I needed some help.

How we set up the experiment

We followed a modified version of the scientific method to gather our sales data. Here's what we did:

  • Built a new Sales squad internally
  • Narrowed down the sample size to get the most accurate data
  • Set parameters to measure results
  • Created a Sales experiment doc using a Slite template to share learnings with the rest of the team

Each step played a vital role in the experiment's success.

Our new sales squad

For the experiment, we wanted to draw on the product experts on our team, so the CEO and members of the Customer Support & Product teams all pitched in. Sales demos took time away from their regular duties, so the experiment was a real luxury (and not one we could easily replicate in the future).

Validating the data

Now that we had assembled a squad, we needed to make sure the experiment's results were as informative and accurate as possible.

First, the time frame: while it would have been nice to run the experiment for a month, we honestly didn't have the resources, so we decided to run it for one week.

Then, the sample size. Slite is open to everyone, but we segment the teams who use it by size. We decided to filter out individual users and focus on teams of more than 10 people, which would benefit most from white-glove onboarding.

Example from a random week:

This would provide enough data to give us statistically significant results.

Measurement

Since we'd disrupted 5 team members' work for a week, proving impact was crucial. We decided to measure it via a few key variables:

  • # of signups (self-serve vs. demo)
  • % of teams activated
  • % converted to paid
  • % change in revenue (measured over 60 days)

Our activation criteria: 2 or more team members signing up for Slite and each using the editor more than twice. Activation was a big challenge for us, because we wanted to find ways to add value for our newest customers, and demos seemed like the best way to do it. To verify whether this new onboarding model could scale, we measured revenue over a 60-day window, in USD.
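
To make those criteria concrete, here's a minimal sketch of how an activation check like this could be computed. The data shape and field names are illustrative, not our actual schema:

```python
from dataclasses import dataclass

@dataclass
class Member:
    signed_up: bool
    editor_sessions: int  # times this member used the editor

def team_is_activated(members: list[Member]) -> bool:
    """A team counts as activated when 2+ members signed up
    and each of them used the editor more than twice."""
    engaged = [m for m in members if m.signed_up and m.editor_sessions > 2]
    return len(engaged) >= 2

# Example: two engaged members out of three is enough.
team = [Member(True, 5), Member(True, 3), Member(True, 1)]
print(team_is_activated(team))  # True
```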

Results

In the end, we booked 46 demos for the week. Here's how the results turned out, compared to adjacent weeks in the same month.

Quantitative results

The activation and conversion rates were much higher with demos, but there was also a bias: leads who booked a demo were likely to have greater purchase intent to begin with. That didn't affect the second goal, though:

The Average Revenue per Account (ARPA) was higher, but the revenue gain wasn't big enough: we expected at least a 2x increase for the model to start making sense. Once we factored in the resources it consumed, it wasn't scalable; the demo approach only becomes relevant when your ARPA is much higher.
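
As a back-of-envelope illustration of that trade-off (the numbers below are made up for the sketch, not our real figures): every demo costs sales time, so the extra revenue per account has to cover the demos it took to win it before the model scales.

```python
# Illustrative break-even sketch: at what ARPA does a demo-first model
# pay for itself? All numbers are hypothetical placeholders.
demo_cost_per_lead = 50.0   # fully loaded cost of prep + a demo call (USD)
conversion_rate = 0.20      # share of demoed leads that convert to paid
self_serve_arpa = 100.0     # revenue per converted account without demos (USD, 60 days)

# Each converted account must absorb the demo cost of all the leads
# it took to win it (1 / conversion_rate demos per conversion).
required_uplift = demo_cost_per_lead / conversion_rate
break_even_arpa = self_serve_arpa + required_uplift
print(f"Demos only make sense above ~${break_even_arpa:.0f} ARPA")  # ~$350
```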

Qualitative results

We also collected data we hadn't planned to measure. For instance, we got to fill in gaps in our customer journey (more detailed customer profiles, buying triggers, behaviour, and more).

We also found out from talking to leads that the biggest blocker was understanding how to organize docs and communication in Slite.

We grew more empathetic as a team, since 5 people from different teams were exposed to the Sales process.

Final thoughts

In short, a radical experiment can bring new insights in a way safe bets can't. Our sales experiment showed us the effectiveness of personalized onboarding, even though it isn't feasible for our team right now. There were also several unexpected benefits: experts with different specialities got to learn the Sales process, and our customers got to meet different team members personally and see the care and thought that goes into the product (we also do this with All Hands Support!). And we learned that while this might not be the right moment for white-glove sales and onboarding, it can be an effective tactic for Slite if and when we significantly increase ARPA by focusing on larger customers or raising prices.

Lastly, our experiment inspired the rest of the team to question their own processes and try to break them to learn something new.

If you're interested in learning more about Slite, try it out (and feel free to reach out to me about booking a demo).