Why Managed Triage Is A Poor Economic Proposition: A DarkHorse Security Research Paper
Grant McCracken
Making proactive security accessible and affordable for organizations of all sizes and budgets. Founder @ DarkHorse. Senior executive, author, technical leader, and a few other things.
Managed triage is a poor economic proposition.
In this writeup I’ll explain why I believe managed triage (as it’s being sold and delivered in the market today) is a poor economic proposition. Doing so will also inherently explain why DarkHorse has chosen not to offer it.
Given my background, saying “managed triage isn’t worth it” is not something you’d expect to hear from me. I started my career in crowdsourced security as a triager, and spent the last decade at Bugcrowd promoting its benefits. I even oversaw the triage group for a number of those years. If anyone could argue for the benefits of managed triage, it’d be me.
And yet as I’ve gained perspective after leaving Bugcrowd, I’ve come more and more to the conclusion that the way that bug bounty / VDP providers sell managed triage today is fundamentally disadvantageous to the buyers.
I’ll explain.
First, let’s set some base assumptions:
- The average annual contract size for a bug bounty / VDP program is $40k.
- The average program receives 190 reports per year.
(Both of these numbers are, if anything, favorable to the providers.)
As a quick aside, for those unaware of what I’m talking about when I say “managed triage”, I’ll explain quickly. It works like this: the provider’s triage team receives every incoming vulnerability report, filters out the noise (duplicates, invalid submissions, out-of-scope findings, and so on), validates and reproduces what remains, and forwards only the valid, unique reports to you, the customer.
On the surface, this sounds like a pretty great and straightforward thing; however, as given away by the title of this report, I’ll dig into why this isn’t actually the case.
Again, using these extremely favorable numbers, we have an average of 190 reports across an average contract size of $40k. This gives us an average cost per report of $210. And that’s not including rewards (if one is running a bounty).
Before we go any further, it goes without saying that these assumptions are imperfect. For instance, a single organization may run multiple programs under a single contract. However, this is why I’ve been extremely generous in my assumptions. Some organizations may run multiple programs under a single contract, BUT there are also many contracts that go for well under $40k (such as for pentesting or attack surface management) that likely drag down the average contract value across all contracts. AND there are likely significantly more private programs than public ones (which would drag down the average number of reports per program, while simultaneously raising the average cost per report). All things considered, I’d wager a decent amount on the belief that the average $40k contract gets significantly fewer than 190 reports per year. If there’s data to support this being incorrect, I’ll happily update my position - but it’s unlikely anyone is going to get transparent here. Said differently: the real average contract size for VDP/bounty is likely higher than $40k, and gets fewer than 190 reports - which is to say the real average cost per report is also significantly higher than $210.
Since the existing providers bundle everything together, in order for us to get into triage costs (so as to establish whether or not it makes sense financially), we first have to make a quick n’ dirty assumption around how much we think the platform should cost.
Fundamentally, the platform (paid for by whatever costs we don’t associate with triage) allows one to receive and manage vulnerability reports, along with a few other things such as integrations, crowd management, and so on. Going on gut, how much is this functionality worth per report? $50 per report feels like a good number. But keep in mind that ~80% of all reports are “noise”, and you pay that $50 per report whether it’s noise or not - which means that, accounting for noise, you’re actually paying approximately $250 per valid report. That seems palatable as a starting point.
Given our starting cost of $210, this leaves $160 left to cover triage costs per report.
NOTE: you can also do this math for your own program / contract - just divide your contract cost by the number of reports you got for the year. I think you’ll be surprised to see how much you’re paying per report when you start looking at things objectively.
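As a quick sketch of that math in Python - the figures below are this article’s illustrative averages, not data from any particular vendor, so plug in your own contract cost and report count:

```python
# Quick cost-per-report check. All numbers are illustrative averages
# from this article - replace them with your own program's figures.
annual_contract_cost = 40_000   # total yearly contract spend ($)
reports_per_year = 190          # all reports received, valid or not
valid_report_rate = 0.20        # ~20% of reports end up valid

cost_per_report = annual_contract_cost / reports_per_year
cost_per_valid_report = cost_per_report / valid_report_rate

print(f"Cost per report:       ${cost_per_report:,.2f}")
print(f"Cost per valid report: ${cost_per_valid_report:,.2f}")
```

With the article’s averages this works out to roughly $210 per report, or over $1,000 per valid report once noise is accounted for.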
So, is $160 for triage per report a deal… or no? Let’s see.
Across 100 reports, let’s make these generalizations: 20 are valid and unique, while the other 80 are noise of one kind or another (duplicates, invalid, not applicable, out of scope, and so on).
Loosely speaking, I think these numbers are directionally accurate. As always, I’m happy to update assumptions based on any official data, should anyone be willing to provide it.
Now, let’s give some weighting to these reports in terms of effort. One effort point is how much work it takes to triage a single valid, unique vulnerability report.
So, as an average across 100 reports, we get a total effort score of 47 points.
In terms of what gets through triage, you (as an organization) should only see the 20 valid, unique reports, plus, we’ll say, another 10% of total reports where triage needs you to weigh in on NA/NR/OOS calls and so on. In my experience, most program owners will agree that this 10% number is probably too low - it’s fairly common for triage to ask the client for input on a large number of reports, since they don’t have your business context - e.g. asking things like “is this a valid attack in your threat model?” We’ll go low and call this an extra 2 effort points in total.
Now, with everything that makes it through triage (valid issues, false positives, etc), you still need to review, reproduce, process, and give input on those reports… and while it’s possible / probable that some of that effort may be reduced as a function of good triage notes, we also need to take into account the presence of false positives and false negatives - which we’ll say is ~5%. On average, we’ll say that any speed gained from good triage notes is offset by the presence of false positives / negatives.
In all, after triage has done their job you’ve got 22 effort points to cover on your own.
Out of an initial total of 47.
Said differently: even after paying 100% of the cost for triage to look at your findings, you still end up having to perform 47% of the work or more (22 is 47% of 47… just in case all the 47s start to get confusing).
I’ll say it again for emphasis: you pay 100% the cost of triage, while having to re-perform nearly 50% of the effort. And again, all of this is based on fairly generous and favorable assumptions. It's highly possible that in many cases the level of effort exceeds 50%.
I’m unsure where you draw the line as to when something becomes inefficient, but if you’ve only got 50% output for 100% input, that’s inefficient no matter how you cut it. If I got out 50 cents for every dollar I put into something, I’d stop doing whatever that was real quick.
But inefficient can sometimes be ok - if you can pay ten people that do the work of three for the price of two, you’re still coming out ahead, regardless of how inefficient it is in the aggregate. Is this true for triage?
Earlier we said the average dollars per report that (ostensibly) go toward triage is $160. Maybe that’s still a good deal? Let’s see how much it’d cost you to triage…
The average triager can process 30-40 reports per day. But one of the big arguments made in favor of managed triage is that the managed team is significantly faster to triage. So we’ll again be generous and say that you can probably only process half as many as a trained triager: 15-20 per day. So, if you had someone on your team do the same amount of work, assuming you’re paying a security engineer $50/hr (which is ~$100k/yr), your average cost per report with someone processing 20 per day would be: $400 (daily cost) /20 (number of reports processed in that day)…
Wait.
That’s $20 per report. That can’t be right.
What’d we miss?
Well, for starters, you have to pay benefits to your employee, and there are other costs (assuming you don’t just 1099 offshore contractors, which would cut costs by well over half…). ChatGPT tells me that employees actually cost 1.25-1.4x their salary. Let’s go with the high end of 1.4x. Additionally, let’s increase that salary to $120k. Now where do things sit?
As a note, considering that performing triage is generally considered an entry-level position, paying $120k in salary for someone who performs triage is pretty darn expensive in 90% of the United States, and 99% of the world. Anyways, let’s proceed with this number. Said differently: on average, this is a generous salary for triage.
$120k base × 1.4 = $168k/yr, or $84/hr (assuming ~2,000 working hours per year)
Surely managed triage will be a veritable steal now that our costs have nearly doubled. Let’s see!
Across 20 reports processed per day ($672 per day / 20), that’s $33.60 per report.
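Sketching that fully loaded cost in Python - same deliberately generous assumptions as above ($120k base salary, 1.4x overhead loading, ~2,000 working hours per year, 20 reports per day), all illustrative:

```python
# Fully loaded in-house triage cost per report, using the article's
# generous assumptions. Swap in your own salary and throughput.
base_salary = 120_000        # annual salary ($)
overhead_multiplier = 1.4    # benefits/overhead loading (high end)
hours_per_year = 2_000       # ~50 weeks x 40 hours
hours_per_day = 8
reports_per_day = 20         # half a trained triager's throughput

hourly_rate = base_salary * overhead_multiplier / hours_per_year  # $84/hr
daily_cost = hourly_rate * hours_per_day                          # $672/day
cost_per_report = daily_cost / reports_per_day

print(f"${cost_per_report:.2f} per report")
```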
Huh.
And again, all of this assumes you’re paying a pretty decent, high-cost-of-living-area wage to the person doing the triage (especially considering that triage is often an entry-level role). If this role were offshored (which is what you’re typically getting when paying for managed triage), your costs would be halved (or less).
That can’t be right. You can’t be paying effectively $160 per report (not including the effort that you have to re-do!), when the real cost is $34 to do it on your own. What if we cut it to 15 reports per day? Does that fix it?
Not really... now it's just ~$45 per report ($672 / 15).
For reference, when I was a triager there was no platform tooling to support, and when I finally worked my way up to making $3 per fully processed report, I was making anywhere from $12-15 per hour. Said differently: I was a part-time triager who certainly wasn’t the most brilliant or prolific triager in history, and I averaged 4-5 reports per hour. If I could work through 30+ reports in 8 hours on a 10” tablet with 2 GB of RAM (where most of my time was spent waiting for my VM to respond), I’m absolutely confident someone on your team (or even an entry-level intern) can process 15 in a day at a whole lot less than $120k in salary (again, a mere ten years ago, I and a bunch of other people were doing this for effectively $15 an hour with no benefits).
But wait, the math gets even worse.
Since you have to expend ~50% of the effort, no matter what (even if you're paying for managed triage), that's going to cost you ~$1700 per 100 reports (50% of $34 per report).
So, your base cost, again, no matter what, is ~$1700 per 100 reports.
To do the other 50% of the work across those 100 reports, it's another $1700.
How much is it for managed triage?
At $160 per report, to do the other 50% of the work, that'd cost $16,000.
Said differently: managed triage is nearly TEN TIMES the cost.
Ten. Times.
Shoot.
It doesn’t take an MBA to see the better option between spending an extra $1700 to triage in-house, or paying $16,000 for managed triage (though it does take an MBA to spin the narrative that managed triage is somehow a good deal in light of all this).
Managed triage isn’t just 3-5x more expensive (as it would appear on the surface), it’s TEN TIMES more expensive.
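For anyone who wants to check that arithmetic, here’s the whole comparison in one place (all figures are the ones derived above):

```python
# In-house vs. managed triage across 100 reports, using the article's
# figures: ~$34/report in-house, $160/report implied for managed
# triage, and ~50% of the triage effort falling on you either way.
reports = 100
in_house_per_report = 34     # fully loaded in-house cost ($, rounded)
managed_per_report = 160     # implied managed-triage cost ($)
your_share_of_effort = 0.5   # work you re-do no matter what

# Your unavoidable half of the work (paid in-house in both scenarios):
base_cost = reports * in_house_per_report * your_share_of_effort

# Covering the *other* half of the work, each way:
in_house_option = reports * in_house_per_report * your_share_of_effort
managed_option = reports * managed_per_report

print(f"Base cost either way:  ${base_cost:,.0f}")
print(f"Other half, in-house:  ${in_house_option:,.0f}")
print(f"Other half, managed:   ${managed_option:,.0f}")
print(f"Managed is ~{managed_option / in_house_option:.1f}x the cost")
```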
There’s certainly something to be said for the ease of not having to manage other people and what not… it most certainly is easier just to let someone else handle it all. BUT remember, you still have to do approximately 50% of the work, no matter what. And instead of handling the other 50% yourself, paying someone else to do it will cost you ten times what it could cost you to pay someone on your own team to finish the job. When you start framing things in context, it starts to make sense to just do the other 50% in-house.
So, what’s happening here? And why?
Fundamentally, these businesses market themselves as SaaS businesses, but they carry a huge amount of overhead in their triage teams, which are effectively services groups. However, valuations for services companies aren’t all that great (much smaller multiples), so they need to make SaaS margins (~80%) across the business, even on services (which more typically have a much smaller gross margin of 20-30%).
We can see this by doing the math… Assuming the cost of performing triage is approximately the same for the organizations selling it, if we take our $34 per-report cost (at 20 reports/day) and price it for an 80% gross margin ($34 / 0.2), we get $170. Not all that far off from our initial estimate of $160.
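That back-calculation, spelled out (the 80% margin target and ~$34 underlying cost are the assumptions from above, not vendor-disclosed figures):

```python
# If the vendor's underlying cost per report is similar to our
# in-house figure (~$34), pricing it for a SaaS-style 80% gross
# margin implies roughly the per-report triage price we estimated.
raw_cost_per_report = 34.0   # assumed vendor cost per report ($)
gross_margin = 0.80          # typical SaaS gross-margin target

# price * (1 - margin) = cost  =>  price = cost / (1 - margin)
implied_price = raw_cost_per_report / (1 - gross_margin)

print(f"Implied price per report: ${implied_price:.0f}")
```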
On principle, I don’t love inefficiency - paying 100% of the cost for 50% of the value is already a tough proposition to swallow. Add on spending $16,000 on something I could do myself for roughly 1/10th of that cost, and this just seems like bad economics.
It’s up to you what you do with this information, but I know this isn’t something I’d pay for with my budget.
Maybe it’s a deal for you, in which case, I applaud your deep budget and how much you must pay your employees for this to make economical sense.
It’s also possible to argue that one is paying more for the platform than the $50 per report we estimated earlier. But that becomes a tough pill to swallow once you’re paying the platform $100 per report just for existing. Sure, there’s R&D and features n’ such, but is it worth that much per report? Keep in mind that only 20% of reports are valid - so you’re paying $100 apiece for every duplicate, invalid, and not-applicable report as well. Does that feel like a good deal? Only you can make that call.
And even if the platform gets $100 of the $210 per report, you’re still paying $110 per report for triage. As we’ve established previously, across 100 reports that’d effectively be $11,000 vs. doing it in-house for ~$1,700 - still roughly 6.5 times the cost. AND you still have to find a way to justify paying $100 to the platform for simply existing.
All things considered, this just doesn’t sit right with me at this point in time.
In fact, I feel so strongly about the massive pricing disparity here that I built a platform where the per-report cost is literally $7 (which, you’ll note is a lot-lot-lot less than $50 or $100 per report). I won’t try to sell you on managed triage either. Just a low cost platform that puts you back in control with 80-90% of the features, for a fraction of the cost. You can learn more at https://darkhorse.sh, or shoot me a message and I’d love to help you out.
Of course, this review wouldn’t be complete without going over some of the common reasons given for why managed triage is better and/or required. Let’s go through a few of them…
- “Isn’t this why people pay for managed services in the first place? Nearly anything is cheaper if you do it yourself… people pay premiums all the time to have a qualified expert do things for them. Why is this any different?”
- “Managed triage is faster than doing it yourself.”
- “Managed triage is more competent than doing it yourself. They see more bugs and do a better job as a function of it.”
- “Managed triage ensures a better experience for hackers by enforcing tight SLAs.”
All the above being said, my (and DarkHorse’s) view on all of this is:
- You / program owners are more capable than they’re given credit for.
- You / program owners benefit enormously from directly interacting with the crowd.
- The platform should support and guide you / program owners.
- Managed triage is expensive.
- Managed triage is inefficient.
- Doing triage yourself can save you a lot of money.
- Despite the stories, managed triage isn’t significantly better than doing it yourself.
- DarkHorse is an affordable alternative.
Hopefully this guide is helpful. What you do with it is up to you; as they say, knowing is half the battle.
As always, if you're looking to save money while being more secure, send me a message and I’d love to help you out!