Maximize Your Revenue with PQLs: The Guide to SaaS Success

Sylvain Giuliani · October 22, 2020

Syl is the Head of Growth & Operations at Census. He’s a revenue leader and mentor with a decade of experience building go-to-market strategies for developer tools.

This is part one of our Product Data to Revenue series, exploring how you can turn your product data into more revenue. In each part of this series, we're revealing actionable tactics to help you generate more revenue by leveraging your existing product usage data.

More than half of SaaS companies now offer some kind of free trial or demo to potential customers. As Jason McClelland, CMO of Domino Data Lab, explained, free trials and freemium services help “generate interest amongst developers and data scientists, who typically don’t care about flashy marketing initiatives — they just want to try the product out.” Industry folks often refer to this strategy as “product-led.” We just call it good business. And it leads to a whole new kind of sales qualification: the PQL.

Product qualified leads (PQLs) have made their predecessors—marketing qualified leads (MQLs) and sales qualified leads (SQLs)—obsolete.

Our friend, Francis Brero, co-founder and CRO of MadKudu, explained PQLs’ superiority, saying:

“Instead of trying to persuade someone to purchase a product they’ve never used, you’re helping them get more value from a product they already love. Instead of guessing their needs from a contact form submission or the pages they’ve visited on your website, you can actually see how they’re using your product and shape your sales process to match.”

PQLs are the data-driven future of SaaS industry sales, and if you’re not leveraging them, you’re ignoring your ideal customers. Here’s why you need them and how to use them.

But first, WTF is a PQL?

PQLs are potential customers who have already had a positive interaction with your software. They’re users of your freemium version or free trial who have completed a specific set of qualifying actions designed to identify the users most likely to buy.

A Slack free user, for instance, becomes a PQL when their team reaches 2,000 sent messages. Slack founder Stewart Butterfield explained:

“Based on experience of which companies stuck with us and which didn’t, we decided that any team that has exchanged 2,000 messages in its history has tried Slack—really tried it.” And, perhaps most importantly, “93% of those customers are still using Slack today.”

WorkOS, described as a provider of “developer APIs/SDKs for enterprise-ready features like Single Sign-On (SSO),” assesses external connections to identify product qualified leads. As its CEO, Michael Grinich, explained, “Our PQL could be defined as number of external connections. External connections are defined as companies that integrate SSO or directory sync with our customers’ applications using WorkOS.”

PQLs Are the MVP of Sales Leads

PQLs are the most valuable type of sales lead for two main reasons:

  1. PQLs convert at a higher rate. According to Kieran Flanagan, the VP of growth at HubSpot, “In B2B SaaS, PQLs (product-qualified leads) will often convert at 5x to 6x that of MQLs (marketing-qualified leads).”
  2. PQLs are less likely to churn once they’ve become paying customers. Bonjoro reduced its customer churn by over 60% just by switching its focus from MQLs to PQLs.

So, if you want to acquire more customers with a high lifetime value, you need to focus your marketing efforts on generating PQLs and your sales teams on converting them—and say goodbye to MQLs and SQLs.

What Goes Into a PQL Scoring Model

To create a PQL scoring model, begin by identifying your business’s most important metrics. You likely have an intuitive understanding of what these are: the metrics you most often report on, like the number of active users, number of shares, or adoption of enterprise features such as SSO.

The best way to score those metrics and identify your PQLs is to use logistic regression. Your objective is to identify an action or series of actions that frequently lead to a free user becoming a paying customer.

First, form a hypothesis by considering what these actions could be, based on your unique software. Slack, for instance, included the number of messages sent as a possible indicator of a qualified lead, but they could have also included the number of emojis sent or even co-workers invited to a shared channel. If your SaaS company provides route optimization services, you may want to look at the number of stops routed per driver or the number of drivers added to an account during their free trial.

Next, review your historical data using a logistic regression model to see if your hypothesis was correct. If there’s a positive correlation between your action and conversion, your graph will look something like this:

Based on this graph, when a free user performs the action in question five times, they become more likely to convert. Thus, five actions (such as messages sent or stops routed) would be your PQL threshold. On the other hand, if your graph looks more like the one below, you can rule out this action as a potential PQL.

As you can see, this particular action has no correlation with conversions. It’s also important to note that, for some companies, PQLs are actually two separate actions completed by the same user. Slack, for instance, could have found that users who had sent 1,500 messages and started at least two shared channels were most likely to see value in the product.
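
To make the modeling step concrete, here’s a minimal sketch in Python using scikit-learn. It assumes a hypothetical export of historical free users (free_users.csv) with one usage metric (messages_sent) and a converted flag, and it finds the lowest action count at which the predicted conversion probability crosses 50%:

```python
# Minimal sketch of PQL threshold discovery with logistic regression.
# Assumes a hypothetical CSV of historical free users with a usage
# metric (messages_sent) and whether they later converted (0 or 1).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("free_users.csv")  # hypothetical export
X = df[["messages_sent"]].to_numpy()
y = df["converted"].to_numpy()

model = LogisticRegression()
model.fit(X, y)

# Find the lowest action count where predicted conversion probability
# crosses 50% -- a candidate PQL threshold to validate with your team.
counts = np.arange(0, int(X.max()) + 1).reshape(-1, 1)
probs = model.predict_proba(counts)[:, 1]
qualified = probs >= 0.5
threshold = int(counts[qualified.argmax()][0]) if qualified.any() else None
print(f"Candidate PQL threshold: {threshold} actions")
```

The 50% cutoff is just a starting point; you may prefer a higher bar (like Slack’s 93% retention signal) depending on how much sales capacity you have.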

How to Use PQLs in 4 Steps

Here are the essential steps to leverage PQLs:

1. Make sure you have one “source of truth” for your data

Your data architecture needs a single “source of truth” so that you’re deriving your PQLs from quality data. Use a data warehouse like Snowflake and a hub-and-spoke model to synchronize your data without running into consistency and reliability issues.
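
As an illustration, here’s a hedged sketch of reading unified product usage out of a Snowflake warehouse with the snowflake-connector-python library. The table and column names (message_events, account_id) are placeholders, not a prescribed schema:

```python
# Illustrative sketch: querying one "source of truth" table in
# Snowflake for the usage metric behind your PQL model.
# Table and column names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="ANALYTICS_WH",
    database="PRODUCT",
    schema="EVENTS",
)

cur = conn.cursor()
cur.execute(
    """
    SELECT account_id, COUNT(*) AS messages_sent
    FROM message_events
    GROUP BY account_id
    """
)
usage_by_account = {account_id: count for account_id, count in cur}
cur.close()
conn.close()
```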

2. Create a score on your unified set of accounts

Separate your statistical data by buyer personas and account types. Organizational context is extremely important for understanding your leads, as different types of customers may have different qualifying actions. Enterprise accounts, for example, may need to perform a certain action 100 times before they become qualified, whereas smaller teams and accounts may only need to perform an action 25 times.

Next, set up a lead scoring model that automatically identifies leads who are most likely to convert to sales. Read more about lead scoring strategies in The Modern Guide to Lead Qualification.
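
As a sketch of what segment-aware scoring might look like, the snippet below encodes the example thresholds from above (100 actions for enterprise accounts, 25 for smaller teams). The segment names and numbers are illustrative, not recommendations:

```python
# Illustrative per-segment qualifying thresholds, reflecting the example
# above: enterprise accounts need more usage to signal buying intent.
PQL_THRESHOLDS = {"enterprise": 100, "smb": 25}  # example values only

def is_pql(account_type: str, action_count: int) -> bool:
    """Return True once an account crosses its segment's qualifying bar."""
    return action_count >= PQL_THRESHOLDS.get(account_type, 25)

print(is_pql("enterprise", 100))  # True
print(is_pql("smb", 10))          # False
```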

3. Route the accounts to the right place with the context provided

Apply your PQL scoring model (as explained in the previous section) to identify which free users your sales team should be reaching out to (and why). Once you’ve identified your PQLs and refined your scoring model (so you know which actions to focus on), use Census to send those PQLs to your CRM, such as Salesforce, where you can then route them to the right sales reps and trigger notifications. You can also use Census to send PQLs to your MAP (marketing automation platform), such as Marketo, where you can automatically start email campaigns or sequences.
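
Census handles this sync declaratively, but to make the outcome concrete, here’s a hedged sketch of the equivalent manual update using the simple_salesforce library. The PQL_Score__c custom field, credentials, and lead record are all hypothetical:

```python
# Illustration only: what landing a PQL score in Salesforce looks like.
# Census performs this sync for you; this manual sketch uses the
# simple_salesforce library. PQL_Score__c is a hypothetical custom field.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="you@example.com",
    password="your_password",
    security_token="your_token",
)

pql = {"email": "jane@example.com", "score": 87}  # hypothetical PQL record

result = sf.query(
    f"SELECT Id FROM Lead WHERE Email = '{pql['email']}'"
)
for record in result["records"]:
    sf.Lead.update(record["Id"], {"PQL_Score__c": pql["score"]})
```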

4. Tailor sales methods for each type of PQL

Now for the really fun part: Reach out to your PQLs and engage with users who already value your product in a meaningful way. Build an engagement strategy for your PQLs based on each of your buyer personas.

Consider the problem your PQL is trying to solve (the thing they’re using your software to tackle) and write messaging that speaks to that solution. Show them how or why your paid version will deliver an even better experience. Since you know how they’re using your product right now, it’s easy to craft personalized messages and convert them into customers.

Get Data Where and How You Need It With Census

To leverage PQLs, you’ll need to be able to access your user data. At Census, we help companies of all sizes sync data from their warehouse to their business tools to get their valuable user insights where and when they need them.

Book a free demo, and we’ll show you how to leverage your existing product data to build a PQL scoring model and find the gold in your existing freemium user base.

👉 Want to see some real-world PQL use cases from the best PLG companies? Check out part two of our Product Data to Revenue series!
