The State of Product-Led Growth: Onboarding Benchmarks

This article is an excerpt of “The State of Product-Led Growth” to date. It covers the key onboarding benchmarks used today in both a Customer Success and a self-serve onboarding strategy. Authored by Despina Exadaktylou, founder of Product-Led Growth Hub, the world's #1 PLG academy.

Research Scope

To uncover the challenges and competing strengths of Product-Led organizations, we embarked on a mission to define product experience in the SaaS landscape. To achieve this, we surveyed 40 SaaS organizations and interviewed 50 executives who own or lead Product Management, Customer Success, Marketing, and Sales activations.

The goal of our effort is to map, through real-world examples, the Product-Led techniques in use and estimate how they affect critical aspects of the customer journey. We believe this data sample and thorough analysis will enable SaaS organizations to align their internal teams and optimize product delivery. Our commitment is to help reshape the SaaS growth mindset through the lens of Product-Led practices. The survey's results directly guide those efforts, and this extensive report reflects them throughout.

Onboarding Benchmarks

  • SaaS organizations use an array of metrics - a blend of business (51%), marketing (15%), and product (34%).
  • While every organization needs to know the ROI of its business and marketing activities, those KPIs are lagging indicators with little connection to product experience. Negative changes in those metrics do not appear instantly, but only once their effects (churn) have already taken hold.
  • Organizations should emphasize product metrics to draw the right conclusions about user behavior. Continuous onboarding delivery forces product managers to identify, from in-app user behavior, the actions that reinforce or undermine the process. This is how product experience becomes an intentional effort across an entire organization.

Personalization Factors

Personalization, the double-edged sword of any onboarding tactic, implicates Sales and Customer Success in Human Assisted onboarding, increasing acquisition and retention costs. In Self Serve onboarding, it is meant to be established via scalable engagement practices tailored to users' needs.

Key Takeaways

  • It is evident that Self Serve adopters strive to balance the involvement of the Sales (40%)* and Customer Success (63%) organizations in the effort to activate and retain an account.
  • On Human Assisted onboarding, deploying Sales (69%)* during acquisition and a dedicated CSM (93%) later on is the first choice that comes to mind. The nature and needs of Mid-Market and Enterprise customers make the sales process complex and customer satisfaction a key priority.
  • Email practices serve as a personalization tactic, with 84% (Self Serve) and 60% (Human Assisted) preference.
  • Education practices (free training) affect the buyer-seller relationship, with 68% preference on Self Serve and 60% on Human Assisted.
  • On-site/in-app messages have a similar impact, with 84% and 53% preference on Self Serve and Human Assisted, respectively.
  • Social communities play an essential role in establishing a personal connection, with an almost equal level of preference between Self Serve (73%) and Human Assisted (66%).
  • Social media is a prominent personalization tactic on Self Serve (57%) and far less preferred on Human Assisted (20%).

User Behavioral Metrics/Per Strategy

Data Analysis

  • On Human Assisted onboarding, top benchmarks are Monthly Churn Rate (100%), NPS (Net Promoter Score) (85%) and Key Features Adoption (65%).
  • On Self Serve onboarding, top behavioral metrics are Monthly Churn Rate (65%), Website Traffic (65%), NPS (45%) and Key Features Adoption (45%).
  • 40% of the metrics the research uncovered are product-related, 40% marketing-related, and 15% business-related.
  • PQL measurements (PQL Scoring) are used by 35% of participants across both strategies.
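For reference, the two benchmarks that top both strategies have simple, standard formulas. A minimal sketch; the function names and sample figures are illustrative, not survey data:

```python
# Standard formulas for the two top benchmarks above.
def nps(promoters: int, passives: int, detractors: int) -> float:
    """Net Promoter Score: % of promoters (9-10) minus % of detractors (0-6)."""
    total = promoters + passives + detractors
    return 100.0 * (promoters - detractors) / total

def monthly_churn_rate(customers_at_start: int, customers_lost: int) -> float:
    """Share of customers present at the start of the month that churned during it."""
    return 100.0 * customers_lost / customers_at_start

print(nps(70, 20, 10))             # → 60.0
print(monthly_churn_rate(200, 6))  # → 3.0
```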

User Behavioral Metrics/Per Department

Data Analysis

  • The top behavioral metrics are NPS, Monthly Churn Rate, and Website Traffic, with mean preferences of 75%, 72%, and 60% respectively.
  • Paradoxically, Marketing (46%) favors PQL Scoring the most, followed by Customer Success (36%) and Product Management (33%).
  • Key Feature Adoption is monitored mostly by Customer Success (63%), followed by Product Management (51%) and Marketing (46%).
  • Despite the increased consideration of product metrics (e.g. PQL Scoring), Marketing assesses Lead Scoring equally (46%).
  • Time to Invite Team Members (Breadth of Use) is mostly monitored by Customer Success (33%), while Product Management ranks second (17%) and Marketing (9%) last.
  • The ROI onboarding yields (Conversions) is monitored mostly by Customer Success (55%) and Marketing (46%), and less by Product Management (21%).
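PQL (Product Qualified Lead) scoring, referenced above, typically weights in-app usage signals and flags accounts above a threshold as product-qualified. A minimal sketch, where the signals, weights, and threshold are hypothetical assumptions, not findings from the report:

```python
# Hedged sketch of PQL scoring: weight in-app usage signals and flag
# accounts above a threshold as product-qualified. The signals, weights,
# and threshold are illustrative assumptions.
WEIGHTS = {
    "key_features_used": 10,    # depth of use
    "team_members_invited": 8,  # breadth of use
    "sessions_last_7d": 2,      # frequency of use
}
PQL_THRESHOLD = 40

def pql_score(usage: dict) -> int:
    """Weighted sum of the usage signals the model knows about."""
    return sum(WEIGHTS[k] * usage.get(k, 0) for k in WEIGHTS)

def is_pql(usage: dict) -> bool:
    return pql_score(usage) >= PQL_THRESHOLD

account = {"key_features_used": 3, "team_members_invited": 1, "sessions_last_7d": 4}
print(pql_score(account), is_pql(account))  # → 46 True
```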

Key Takeaways

Product Metrics

Emerging product metrics (PQL Scoring) redefine the evaluation of customer satisfaction by considering product usage.

Marketing Metrics

Marketing metrics still play a significant role in evaluating user behavior. Sitting outside product activity, however, these evaluations assess acquisition practices rather than in-product behavior.

Product Management

Product managers' failure to adopt product usage KPIs to their full extent suggests there is room for improvement in evaluating user behavior. Product performance and customer feedback, in conjunction with product data analysis, should be the top three priorities driving product management decisions.

Data Analysis

  • SaaS organizations invest in software solutions for customer engagement (live chat software), account-based management (CRM), email practices (ESPs), and marketing automation (MA).
  • Investment in solutions that go hand in hand with product experience evaluation (Product Experience Software) stands at 20% and comes only from Human Assisted adopters.
  • Qualitative data also point to organizations' investment (50%) in data enrichment and data storage solutions.

Key Takeaways

Current analytics programs

The adopted services measure product analytics along multiple dimensions, such as feature-level or account/user-level measurements. If usage and UI instrumentation measurements do not supplement their analysis, they provide a short-sighted view of user behavior.

Advanced Analytics programs

Organizations in favor of advanced analytics programs (20%), put together quantitative and qualitative measurements to derive the right insights. They invest in core product analytics and augment data by focusing equally on the breadth, depth, efficiency, and frequency of use.
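The breadth/depth/frequency framing above can be made concrete. A minimal sketch, assuming a hypothetical event log with account, user, feature, and timestamp fields and an arbitrary 7-day frequency window:

```python
# Hedged sketch: compute breadth, depth, and frequency of use per account
# from a raw event log. The event schema and the 7-day window are
# illustrative assumptions.
from datetime import datetime, timedelta

events = [
    # (account, user, feature, timestamp)
    ("acme", "ann", "reports", datetime(2019, 5, 1)),
    ("acme", "bob", "reports", datetime(2019, 5, 2)),
    ("acme", "ann", "exports", datetime(2019, 5, 3)),
]

def usage_profile(events, account, now, window=timedelta(days=7)):
    users, features, recent = set(), set(), 0
    for acct, user, feature, ts in events:
        if acct != account:
            continue
        users.add(user)          # breadth: distinct active users
        features.add(feature)    # depth: distinct features touched
        if now - ts <= window:
            recent += 1          # frequency: events inside the window
    return {"breadth": len(users), "depth": len(features), "frequency": recent}

print(usage_profile(events, "acme", datetime(2019, 5, 4)))
# → {'breadth': 2, 'depth': 2, 'frequency': 3}
```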

Advanced Analytics Programs Characteristics

User Mapping: Advanced analytics programs can track users' progress at both the micro and the macro level, showing whether or not an account is moving toward retention and expansion.

Customer Feedback Capitalization: By investing in product engagement actions, customer feedback comes to complement Sales or Customer Success activations.

Heavy Experimentation: Experimentation delivers insights into users' in-app behavior and enables organizations to experiment with product engagement practices.

Human Assisted Onboarding

Part one: Key Performance Indicators

Data Analysis

  • The top KPIs are NPS (50%), Activation (42%), Time to Initial Value (42%), Key Feature Adoption (33%), Usage (25%), and Team Activation (25%).
  • Product measurements are critical, despite the limited involvement of Product Management (38%)*, thanks to Customer Success's continuous monitoring of the customer journey.

Key Takeaways

Customer advocacy: Customer advocacy (NPS) should consider additional usage metrics; otherwise it acts as a lagging indicator and does not deliver objective measurements.

Retention: Product engagement performance indicators are intrinsically related to Retention on Self Serve onboarding, but on Human Assisted, additional parameters come into play. Being subject to Customer Success activations, Retention serves as a KPI for only 8%, while Usage and Key Feature Adoption (Depth of Use) have 33% and 25% preference, respectively.

Breadth of use: Activation indicates how successfully users reach the product's value and should be measured through the lens of team activation (Breadth of Use). While most participants attested that getting internal buy-in from end users is the number one challenge, Team Activation is considered important by only 25%.

Expansion: While Retention is neglected, its offspring, Expansion, is measured by 25%. Expansion is also subject to CS practices. Qualitative data reaffirm that consistently meeting buyers' needs via one-on-one interactions is the number one criterion for an account's long-term prosperity.

High touch vs High Tech: Despite the need to provide an exceptional customer experience based on personal engagement, harmonization between high-touch and high-tech techniques is essential. Targeted in-application training is not meant to replace one-on-one interactions. Used in conjunction with product data, it supplements Customer Success activations and allows CSMs to focus more on customers' needs. In addition, harmonization among practices reduces onboarding costs and increases adoption rates.

High Touch Vs. High Tech use cases

Use Case: Pendo

Pendo, the popular product experience software, is living proof that this balance can be achieved. When new customers are onboarded, Customer Success sets up one-on-one calls and leverages the software's own features to train users at the same time.

As an organization ingrained in product data analysis, Pendo always considers POE metrics when delivering training at scale, providing context behind users' usage. These practices yield positive outcomes for Customer Success, whose members have the luxury to focus on business objectives by dedicating only 25-30% of their airtime to in-application onboarding.

Use Case: Userlane

Userlane, the popular onboarding software serving mostly enterprise customers, launched a while ago an alternative asynchronous onboarding track in addition to its Human Assisted strategy. The vendor uses its own software's features to self-serve end users all the way through, without compromising the decision making required between the two parties. The “Inception project”, as it is known internally, improved the delivery of the onboarding process and increased adoption (+48%). Targeted walkthroughs within the product enabled a thoroughly personalized, albeit automated, onboarding process that did not require the active participation of different units. Time to initial value also decreased, and after launch buyers could go through the setup autonomously.

Part two: A/B Constituents

The optimization of product engagement practices is a rather big discussion, but mostly an outcome of extensive experimentation. On Self Serve onboarding, where product engagement prevails, experimentation is taken as a given.

On the flip side, when targeting Mid-Market and Enterprise customers, customization follows any onboarding practice. The secret, in both cases, is deploying scalable practices cautiously, focusing on the context behind usage, always in conjunction with historical data analysis.

  • 16% do not experiment on any aspect of the onboarding strategy.
  • Role-Based Onboarding, Activation, UI/UX, Product Content optimization, Hotspots, and Sales Outreach each have 8% preference.

Part Three: Scalable Tactics

Data Analysis

Moving forward to the scalable automation techniques:

  • Email Marketing Campaigns, NPS, and Walkthroughs are top of mind with 73%, 66%, and 60% preference respectively. 
  • Product Tours, Live Chat Flows, and Tooltips are preferred by half (53%) of the participants.
  • Customer feedback (In-App Surveys, 20%) and in-app training (In-Product Tutorials, 33%) are not among the most preferred tactics.

Key Takeaways

Role Based Onboarding: Comparing the A/B constituents with the automation tactics, we realize that onboarding owners have multiple in-app engagement practices at hand. Limited investment in Role-Based Onboarding (8%) indicates that in-app practices follow a predisposed route and do not consider the context behind usage.

Breadth: Organizations that need to onboard hundreds of end users should optimize product experience by focusing on the performance of POE metrics (e.g. Breadth). Otherwise, Sales and Customer Success will act as product champions who miss the context behind users' behavior.

Self Serve Onboarding

Part one: Key Performance Indicators

Data Analysis

  • The main onboarding KPIs are Activation (61%), CRO Techniques (57%), and Retention (33%).
  • Retention, which follows from usage measurements, is considered an important indicator by 33%, and Expansion by 14%.
  • Activation (61%) is top of mind for most participants, but its constituents, Time to Initial Value and Team Activation, fall behind with 5% and 10% preference respectively. The increased preference for Activation and the neglect of Retention and Expansion indicate that onboarding's prevalence ends after the activation stage.
  • CRO techniques rank among the top three KPIs for 51%, but LTV (5%) is neglected.
  • 16% do not set any KPIs to monitor onboarding.
  • Product Management has increased involvement in the onboarding process (81%), but critical indicators like Product Usage (5%), In-App Engagement Measurement (5%), and Educational Content Monitoring (5%) are neglected. The only exception to the rule is Feature Adoption (28%), which, given the performance of product measurements, is not evaluated efficiently.
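Time to Initial Value, one of the underused indicators above, is simply the elapsed time from signup to a user's first value moment. A minimal sketch, where the timestamps and the definition of a "value event" are hypothetical assumptions:

```python
# Hedged sketch: Time to Initial Value (TTIV) as the gap between signup
# and the first "value" event; what counts as a value event is an assumption.
from datetime import datetime

def time_to_initial_value(signup_at, value_events):
    """Return the delay to the earliest value event, or None if none occurred."""
    after = [t for t in value_events if t >= signup_at]
    return min(after) - signup_at if after else None

signup = datetime(2019, 5, 1, 9, 0)
events = [datetime(2019, 5, 1, 9, 45), datetime(2019, 5, 2, 11, 0)]
print(time_to_initial_value(signup, events))  # → 0:45:00
```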

Part Two: A/B Constituents

Data Analysis

  • Email practices (out-of-app onboarding activations) take the lead with 47% preference, followed by In-App Chat Flows (33%), testing of the sign-up experience (20%), and UI/UX practices (20%).
  • CRO Techniques (57%)* are vital for onboarding evaluations, but their constituent, the sign-up experience, is optimized by only 20% of the participants.
  • Low levels of experimentation on Activation (5%) lead to the conclusion that acquisition practices (e.g. the sign-up experience) prevail.

Key Takeaways

  • Increased preference for UI/UX experimentation (20%), combined with minimal investment in User Type & Proficiency and Activation experimentation (5%), makes product experience subject to design practices.
  • Speed to implement (Trial Length), the number one factor associated with Self Serve activation, has only 5% preference.
  • 20% of the participants do not test any aspect of their onboarding process.

Part Three: Scalable Tactics

Data Analysis

  • The most preferred automation tactics are Email Campaigns (84%), followed by In-App Tutorials and Walkthroughs, both with 73% preference.
  • Equally important are Live Chat Flows, Product Tours, and Tooltips with 70% preference.

Part Four: Experimentation vs. Scalability

The table below maps the corresponding onboarding tactics for the top five A/B constituents:

From the top five experimentation constituents, we exclude the Sign-up Experience, since it falls under Marketing practices and does not rely on in-app activations.

Key Takeaways

  • Content: Content affects many scalable practices but is optimized by a limited 14%.
  • Email Campaigns: Despite the prevalence of Email Campaigns (84%), only 47% of the participants iterate on them constantly.
  • In-App Engagements: In-App Tutorials (73%), Live Chat Flows (68%), Welcome Messages (68%), and In-App Sales Demos (57%) have high preference but are iterated on by only 33%.
  • User Type & Proficiency: Limited investment in User Type & Proficiency experimentation (5%) implies that in-app messages are thrown randomly in front of users, neglecting the context of usage. Displaying messages this way indicates a predisposed route that overlooks the learning curve, role, or proficiency level. Only product data analysis and historical usage can indicate efficiently where and why product engagement practices should exist. That investment does not mean moving tooltips or product guides around for the sake of experimentation; any product engagement iteration should follow usage and be executed only when the onboarding team can validate the why behind the change. Random displacements lead to confusion and friction while keeping conversion and retention rates low.

---------------------------------------------------------------------------------------------------------------

Originally published at ReinventGrowth's site on May 16th, 2019. Claim a free copy of the extensive report here

At ReinventGrowth we are convinced that Product-Led practices are the future of SaaS. Keen on making the transition to a Product-Led GTM model? Contact us today and let us know how we can help your organization get there.
