The State of Pricing, Demo, and Case Study Pages
When you think about B2B SaaS websites, regardless of how disruptive the product is, there are always some “sacred pages” that most companies maintain. These pages are so ingrained in the fabric of the site that their existence is rarely, if ever, questioned.
Let’s think about pricing pages: if the pricing isn’t transparent, does a company really need a pricing page? If so, is saying “submit the form to find out about pricing” enough? This was the initial question. The next question was about case studies. Many companies aim to feature as many case studies as possible on their websites, but to what end? Do website visitors actually engage with these pages? Do these pages influence conversions? Do companies even track these pages? These questions sparked our curiosity; unsure whether these pages exist merely because of the status quo or because they’re proven best practices, we set out to myth-bust them.
Methodology
MQL: High-intent demo, pricing page, and contact-us submissions – basically every hand-raiser on the website. Ebook form submissions, other lead-gen content, and webinar registrations were not counted as MQLs. In the first Labs report, some people got a bit heated with us for saying MQLs, but I won’t be changing this simply because I like the sound of it.
Demo Pages: Any page that says “request a demo”, “book a call”, “contact sales” etc. including PPC landing pages.
Pricing Pages: Any page that says “pricing”, “quote” etc.
Sample Size: Over 31M unique website visitors from 80 companies that have both pricing and demo pages on their websites, all B2B SaaS.
37.5% of the dataset had non-transparent pricing compared to 62.5% with transparent pricing.
Sample Description: From $5M ARR to $1B ARR; average ACV from $5K to $120K.
Caveats: Since we’re not looking at the entire funnel, we didn’t apply a specific time period filter for this analysis. If a new customer implemented HockeyStack in December, we included their data starting in January (rather than February, so as not to miss any pipeline data). As a result, some companies contribute a single month of data, while others contribute up to 18 months.
TLDR – site-wide averages for B2B SaaS companies:
Bounce Rate: 59%
Avg. Session Duration: 2 minutes 35 seconds
Pages per Session: 2.62
Part I – Transparent vs. Non-Transparent Pricing
Let’s start with the website metrics. Our dataset shows that, on average, the bounce rate for pricing pages is 39%, significantly lower than the average website bounce rate of 59%. We see an average bounce rate of 35% when there’s no transparent pricing, and 42% when there’s transparent pricing.
I think the reason could be that if there’s transparent pricing and that pricing doesn’t align with the buyer’s budget, they bounce right away; but if there’s no pricing, they stick around longer, perhaps to discover or estimate the potential cost. For example, if they don’t see pricing, these visitors might look for customer logos or case studies – since if there are enterprise logos, one can safely assume the cost will be higher. We need to look at additional metrics before making more assumptions.
When pricing is not transparent, website visitors visit an average of 2.57 pages, compared to 4.26 pages when pricing is transparent. This suggests that transparent pricing pages might have a higher bounce rate, but the visitors who don’t bounce end up viewing 65% more pages and spending an average of 3 minutes and 25 seconds on the website, compared to 2 minutes and 31 seconds when pricing is not transparent.
So what we’re seeing here is that when there’s no transparent pricing, the bounce rate is lower; however, website visitors end up visiting fewer pages and spending less time on the site. As suggested above, maybe they’re just trying to estimate the pricing by looking at other pages.
Then what do we see on the conversion side? On average, pricing pages have a 3.8% conversion rate. When there’s transparent pricing, this conversion rate drops to 2.8%; when there’s no transparent pricing, it jumps to 4.6%. I believe the main reason for this is curiosity: the lower conversion rate on transparent pricing pages indicates that visitors only convert once they know they can afford the tool, while the higher conversion rate on non-transparent pricing pages indicates that visitors might just want to learn about the initial cost.
Rarely, if ever, do B2B companies actually differentiate between pricing and demo calls. Thinking about why forms exist on non-transparent pricing pages in the first place, it’s possible that pricing pages seem more approachable, more relaxed in a way, tempting buyers who aren’t fully ready to submit the pricing form instead of a demo request. But if they’re not fully ready, are they truly the conversions you want?
Let’s look at the pipeline data now. Non-transparent pricing pages have an average submit:pipeline conversion rate of 10.31%; for transparent pricing pages, this rate is 17.50%.
Although non-transparent pricing pages have better form submission rates (4.6% vs. 2.8%) and, on paper, generate more MQLs, on the pipeline side we’re seeing that transparent pricing page MQLs convert into pipeline 1.7x better.
This is very interesting: on form submission rates, non-transparent pages convert 1.64x better, while on submit:pipeline conversion rates, transparent pages convert 1.7x better. This data suggests that transparent pricing pages generate better pipeline than non-transparent pages.
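To see how those two ratios net out per visitor, here’s a quick back-of-envelope calculation in Python using the figures above. The assumption that the form-submit and submit:pipeline rates simply compound is mine; the dataset doesn’t model it this way.

    # Back-of-envelope: pipeline opportunities per 1,000 pricing page visitors,
    # using the averages reported above. The compounding assumption is mine.
    visitors = 1_000

    rates = {
        "transparent":     {"form_rate": 0.028, "submit_to_pipeline": 0.1750},
        "non-transparent": {"form_rate": 0.046, "submit_to_pipeline": 0.1031},
    }

    for label, r in rates.items():
        submits = visitors * r["form_rate"]
        pipeline = submits * r["submit_to_pipeline"]
        print(f"{label}: {submits:.0f} submits -> {pipeline:.1f} pipeline opportunities")

    # transparent:     28 submits -> 4.9 pipeline opportunities
    # non-transparent: 46 submits -> 4.7 pipeline opportunities

Under that simplification, the transparent path edges out the non-transparent one on pipeline per visitor, while producing far fewer, better-qualified hand-raisers.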
What about demo submissions? How do transparent pricing pages impact demo form submissions compared to non-transparent ones? For this, we’ll use Lift Reports, where we can zoom in on specific marketing activities and measure their incremental contribution using control and treatment groups.
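To make the control/treatment idea concrete, here’s a minimal sketch of how such a lift comparison can be computed in Python with pandas. The column names and example numbers are hypothetical and for illustration only; this is not how Lift Reports are implemented.

    # Minimal lift comparison sketch: demo form submission rates for visitors
    # exposed to transparent vs. non-transparent pricing (hypothetical data,
    # one row per unique visitor).
    import pandas as pd

    visits = pd.DataFrame({
        "pricing_variant": ["transparent"] * 1000 + ["non_transparent"] * 1000,
        "submitted_demo":  [1] * 55 + [0] * 945 + [1] * 50 + [0] * 950,
    })

    rates = visits.groupby("pricing_variant")["submitted_demo"].mean()
    relative_diff = rates["non_transparent"] / rates["transparent"] - 1

    print(rates)
    print(f"Non-transparent vs. transparent: {relative_diff:+.1%}")
    # A value around -9% would mean visitors who saw non-transparent pricing
    # were that much less likely to submit the demo form.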
When we run the lift analysis, our dataset shows that when there’s non-transparent pricing, users are 9.5% less likely to submit the demo form compared to transparent pricing (0.92x vs. 0.84x relative to the control group).
Hence, transparent pricing pages not only bring in better pipeline but also influence demo form submission rates more than non-transparent pricing pages do.
Part II – Pricing Pages vs. Demo Pages
I’ve started including “if you have any ideas, please let me know” at the end of each report. Surprisingly, five people brought up the same question: what are the conversion benchmarks for pricing vs. demo pages?
From my experience with previous clients and companies, although the metrics varied, pricing pages consistently showed lower conversion rates than demo pages. This discrepancy was generally accepted, underpinned by the belief that if someone doesn’t convert from the demo, they might still convert from the pricing page—effectively “doubling” the chance for a conversion.
Let’s start with bounce rates. While the average bounce rate for pricing pages is 39%, for demo pages it jumps to 70%.
At first glance, the significant difference between the bounce rates of pricing and demo pages might seem alarming, but it’s important to consider other factors here.
My first hypothesis was that maybe marketers direct most of their paid traffic to the demo pages regardless of whether it’s high-intent traffic or a cold audience, and this was causing the high bounce rate. I refuted this hypothesis by looking at the total unique users difference. According to our dataset, pricing pages have 13x more unique visitors than the demo pages, so it’s not anything like “demo pages get lots of visitors and these visitors bounce.” This situation left me with three other hypotheses.
The first one is simply human psychology. When you think about the nature of a pricing page, it needs to build interest and create curiosity in a way. Even if there’s a transparent pricing page, when users see the prices, they are likely to visit other pages to justify the pricing and understand features better. If there’s no transparent pricing, I don’t think this situation will massively change because in this case, users will probably look at your other pages to understand how much it might cost – as discussed above, for example, if your homepage is full of enterprise logos and you don’t have a PLG motion, then it’s not hard to guess that your product won’t cost just $100.
My second idea revolves around landing page jails. Landing page jails are landing pages used mostly in PPC campaigns, where the user has no option to navigate other than submitting the demo form; the user doesn’t see the header, footer, or anything else that could take them to another page. The user has only one choice: submitting the form – there’s literally no other option. However, I don’t think this hypothesis is a strong one, because landing page jails aren’t that common, so there’s only a slight chance they could have skewed the overall data. (I don’t know the origin of this term, but I first heard it from Gaetano.)
My third hypothesis is about the general PPC experience. When it comes to PPC, marketers, including myself, tend to be very conversion-focused and direct all of their PPC traffic to demo pages. This is not necessarily a bad thing, but it depends on the keyword intent. I see no problem with directing “HockeyStack competitors” traffic to a demo page where I compare my product with others; but I also know that I only target high-intent keywords. In every account I’ve audited or managed, there’s always that one campaign with low intent but high volume that marketers are too afraid to pause, because pausing it would certainly hurt their MQL numbers. I don’t and can’t blame them; I understand the pressure they face to generate more and more leads – but this would also explain the high bounce rates on demo pages. These low-intent, high-volume keywords send users to demo landing pages. Since most of these users are unqualified, they bounce right away. The not-so-unqualified ones submit the forms, and a small percentage of them actually end up moving through the funnel.
Another interesting data point to look at alongside bounce rate is the average time on page. For demo pages, the average time on page is 1 minute and 51 seconds, whereas for pricing pages it’s 1.8x longer (roughly 3 minutes and 20 seconds) – more than a minute’s difference. This suggests that people who visit pricing pages not only bounce less but also spend more time on the page, regardless of whether pricing is transparent or not.
Let’s summarize what we discussed:
- Pricing pages get 13x more visitors than demo pages.
- Pricing page visitors spend 1.8x more time on the page than demo page visitors.
- Pricing page visitors bounce far less often (39% vs. 70%).
On the conversion side, though, things change. As mentioned above, pricing pages have an average conversion rate of 3.8%; for demo pages, this rate jumps to 5.5%.
Regardless of whether the pricing is transparent or not, demo pages on average have a 1.44x better conversion rate than the pricing pages.
Apart from the pricing and demo page conversion rates, we found that the average website conversion rate per unique visitor is 1.1%, meaning that for every ~91 unique visitors, a B2B SaaS company gets one form submission on average.
Part III – Do Case Studies Work?
This was initially not going to be in this report, but it’s something we’ve been discussing internally. Companies spend a lot of resources convincing their customers to participate in case studies; they create videos, write long articles, and more, but:
-Do they know if these pages get visitors?
-Do they follow the metrics of these pages?
-Do they know if these pages actually convert?
Let’s start with the share of website visitors. On average, case studies get less than 1% of all website traffic – 0.76%, to be exact. To compare, pricing pages get 16.5% and demo pages get 0.91% of website traffic. This suggests that the traditional website journey funnel – homepage, then product pages, then case studies, then pricing, then demo – is broken (at least for now).
On the bounce rate side, case studies have an average bounce rate of 53%, which is slightly better than the site-wide average of 59%. But I think we need to consider this from a different perspective. First, let’s think about the definition of bounce rate: “Bounce rate is a metric that measures the percentage of visitors to a website who navigate away from the site after viewing only one page.”
Case studies, by nature, aim to build trust and should help improve the overall user journey with fewer friction points. If we think about the traditional marketing funnel, case study pages fall under the consideration phase, so the mid-funnel stage. Hence, seeing 1 out of 2 visitors bouncing in the mid-funnel stage is indeed concerning.
What about the people who stick around? This time, we’re seeing a similar pattern to what we saw on the transparent pricing side. On average, case study visitors visit 4.37 pages and spend 3 minutes and 55 seconds. Considering that the average was 2.62 pages and 2 minutes and 35 seconds, this suggests that visitors who don’t bounce engage with the website better.
And how does this impact form submissions and pipeline data?
Once again, we’re using lift analysis here to measure the influence of case studies. According to the dataset, website visitors who visit case study pages without bouncing are actually 8% less likely to submit the demo form, and 22% less likely to submit the pricing form. This is crazy. This data suggests that case study pages actually do more harm than good for most B2B SaaS companies when it comes to form submissions. (This by no means implies that logos don’t work. Perhaps what visitors need is simply to see the customer logos rather than read through full case studies.)
Things get even more interesting at the pipeline level; although case studies negatively influence form submissions, that isn’t the case for pipeline. We see that website visitors who read case studies before submitting any forms are 18% more likely to become an opportunity than visitors who didn’t read case studies. So although case studies don’t have a positive impact on generating form submissions, they actually help increase the opportunity conversion rate.
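As a rough sanity check on how those two effects might net out, here’s a back-of-envelope calculation. Treating the two lift figures as independent, multiplicative effects on the demo path is my own simplification, not something the lift analysis itself claims:

    # Net effect per case-study reader on the demo path (rough simplification):
    # ~8% fewer demo form submissions, but ~18% better odds of becoming an opportunity.
    form_factor = 1 - 0.08         # relative demo form submission rate for readers
    opportunity_factor = 1 + 0.18  # relative opportunity conversion rate for readers

    net = form_factor * opportunity_factor - 1
    print(f"Net opportunity effect per visitor: {net:+.1%}")  # roughly +8.6%

Under that simplification, case study readers would still end up generating slightly more opportunities per visitor, which is consistent with case studies hurting form submissions but helping at the pipeline level.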
Conclusion
At the beginning of this report, we had three main questions:
– How metrics change when pricing is transparent vs. when it’s not
– What are the average metrics for pricing pages vs. demo pages
– Are case studies really necessary?
It seems that although non-transparent pricing pages have better form submission rates, transparent pricing pages generate more pipeline. As mentioned, I think the main reason is that when pricing is hidden, the intent behind the submission becomes learning the price without the buyer knowing whether they can afford it; whereas when pricing is transparent, the submissions are more qualified, since the user already knows they can afford it – there’s a clear buyer intent to proceed.
When it comes to demo pages, we’re seeing much better form submission rates than on pricing pages, even though the bounce rate is almost two times higher. Considering the intent levels of these pages, this wasn’t surprising.
For me, the most surprising thing was the state of case study pages and how they actually do more harm than good when it comes to form submissions. But still, this doesn’t mean that these pages aren’t working because we can clearly see that they still influence the pipeline positively.
I think we busted one myth: the data says pricing pages should be transparent.
And we almost busted another: the traditional marketing funnel. It was already a widely accepted idea that the funnel was broken, and now we can back this up with more data. Website visitors submit pricing forms and demo forms before ever checking the case studies; maybe we need to change the way we position case study pages, or the case studies themselves. Maybe this is a sign that we need to test different kinds of pages with different kinds of content.
I’m going to think about this, and we’ll launch a case study page experiment in the next few weeks. You can subscribe to the Labs Report to learn about the results of this experiment.
I hope you find this report insightful. Please feel free to reach out with any questions or requests for future reports.