4-in-1 Case Study – Top Benchmarking Recommendations

I recently signed my 30th benchmarking project, and in my first discussion with this client they asked for examples of “consequential” recommendations I’d made to previous clients. Below are the first four that came to mind. My client got the short versions; you get the extra context.

MISSING TECHNOLOGY

Many of my clients are service-based CROs that enable or supplement their services with technology offerings. One CRO client had me benchmark their full unit pricing, and shortly after signing asked if I could also benchmark what they were paying vendors for technology. In reviewing their request, I looked at their unit grid and algorithms and didn’t see any technology being priced out to their customers, beyond what was handled as pure pass-through costs.

Their approach was that if a technology wasn’t a pass-through, they didn’t charge clients for it. I had them list which technologies fell into this category, then listed back which of those were commonly priced out by competitors and, where applicable, the prevailing market ranges for that pricing.

They assumed tech fees would be far harder to justify since there wasn’t a “cost basis” for those fees. I said the exact opposite was true: tech fees tend to get far less pricing pushback than service fees, for that exact reason! I’d estimate this company now generates high six figures of incremental tech fee revenue per year.

For more on how to choose whether to charge for a technology you use, read my post from February.

PRICE SYNERGY

I was building a pricing tool for a client, and towards the end of the project they acquired a small company to help expand their presence in a strategically important region. My client’s rates and the acquired company’s rates were quite far apart for several identical roles in this region, so we signed a side project for me to quickly benchmark these conflicting rates against prevailing market ranges and then recommend go-forward rates.

The results showed the acquired company’s rates were well below market. I recommended the increases they should target and, importantly, a staggered approach for implementing these increases for existing clients. An added bonus was recommending my client increase certain of their own existing rates they didn’t realize were below market or at the very low end of the market range.

Folks, in my world, that’s as good as it gets.

I call this an M&A “price synergy”, where you generate additional ROI from an acquisition through simply increasing below-market prices up towards market ranges (which ideally is where yours are). This ROI is typically unexpected at the deal stage, but when acted upon generates both short- and long-term improvements to revenue, gross margin, and EBITDA (the “trifecta” of financial improvements).
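
To make that “trifecta” concrete, here is a minimal sketch with hypothetical numbers (the hours, cost, and rates below are invented purely for illustration) showing how raising a below-market rate, with delivery costs unchanged, flows straight through to revenue, gross margin, and EBITDA:

```python
# Hypothetical illustration of an M&A "price synergy": raising an acquired
# company's below-market bill rate toward the benchmarked market range.
# All figures are made up for the example.

billable_hours = 20_000   # annual billable hours for one role
cost_per_hour = 60.0      # fully loaded delivery cost (unchanged by pricing)
current_rate = 80.0       # acquired company's below-market rate
target_rate = 95.0        # low end of the benchmarked market range

for label, rate in [("Before", current_rate), ("After", target_rate)]:
    revenue = billable_hours * rate
    gross_margin = revenue - billable_hours * cost_per_hour
    print(f"{label}: revenue ${revenue:,.0f}, "
          f"gross margin ${gross_margin:,.0f} ({gross_margin / revenue:.0%})")

# Because delivery costs do not change, every incremental dollar of rate
# drops straight to gross margin and, ultimately, EBITDA.
```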

Check out this post and a recent podcast appearance for more on price synergies.

EXPENSIVE BUG

When I’m scoped to benchmark a CRO’s level-of-effort algorithms, I’m often digging through the CRO’s tool to determine what those algorithms are in the first place. Last year I was reviewing one such tool and couldn’t figure out why a certain important algorithm was coming out 3x lower at “High” complexity than at “Low”.

Turns out this was a bug, in place for at least a year, in an algorithm where “High” was the default and rarely changed. The result was persistent undercharging equivalent to 0.5%-1% of the entire service fee budget. The bug had never been flagged because this particular algo table sat on its own hidden tab, and if anyone ever noticed it, they never called it out.

To avoid bugs like this in your pricing tool, try to store the algorithm info close to where users select the complexity. Example:
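
Here’s a minimal sketch, assuming the pricing tool were modeled in Python rather than a spreadsheet; the hours-per-unit values and check are hypothetical, not the client’s actual algorithm:

```python
# Keep the complexity table next to the logic (or worksheet) where complexity
# is selected, and sanity-check that higher complexity never yields a lower
# level of effort. The hours-per-unit values below are hypothetical.

HOURS_PER_UNIT = {
    "Low": 2.0,
    "Medium": 4.0,
    "High": 6.0,  # if this were accidentally entered as 0.67, the check below would fail
}

def effort_hours(units: int, complexity: str) -> float:
    """Level of effort for a task at the selected complexity."""
    return units * HOURS_PER_UNIT[complexity]

# Sanity check: effort must be non-decreasing from Low to High, so an inverted
# table gets flagged the first time the tool loads rather than a year later.
assert HOURS_PER_UNIT["Low"] <= HOURS_PER_UNIT["Medium"] <= HOURS_PER_UNIT["High"], \
    "Complexity table is inverted: High should never cost less than Low"
```

In a spreadsheet, the equivalent is keeping the algorithm table on the same visible sheet as the complexity dropdown, with a check cell that flags an inverted table.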

Also, make your complexity labels super intuitive to anyone using the tool. “High, Medium, Low” are far more intuitive than “1, 2, 3”. I have seen situations where a “Level 5” means complexity is lower than “Level 1”, but you would only know that if you had memorized the definitions and algorithms.

MISSING SERVICE AREA

I once helped a startup-ish CRO get a few templates in place and validate that the services they were charging for were being priced and presented appropriately. I noticed an entire service area appeared to be missing from their pricing worksheet. I assumed they used a separate worksheet for it, since the service area was extremely common among CROs and the company’s own website and proposals described their capabilities for those services.

When I asked about it, the response was that they didn’t think it was appropriate to charge for that service area, because they didn’t have any dedicated staff for it yet and were developing a solution to completely automate it. I said they needed to charge for it, both right now and even after they’ve automated it someday (that “automation someday” typically never comes).

I still tease the owner about it to this day. They now have a small team focused on that service area, and I’d like to think those jobs exist because the company is now actually getting paid to provide the service.


I specialize in pricing and financial strategies for service and technology providers, and provide external benchmarking services and pricing tool validation for clinical research providers. Contact me to discuss solutions for your organization.
