Builds don't have to suck

A build is how you go from an idea to a product, from a problem to a solution. It's the middle step that always has to happen to get from point A to point B.

Yet builds are wildly inefficient across the board.

They are fundamental in any development cycle, whether it be a developer building an app locally or a CI system building it to deploy to production.

However, they are also the biggest bottleneck in the delivery pipeline from A to B. When build performance degrades, all iteration eventually comes to a halt.

Not to mention that slow builds drive developers up the wall, straining team morale as well.

Developers can optimize the hell out of a build, no doubt.

But build performance doesn't usually surface to the top of a product team's priority queue.

You typically pay for slow builds at least twice: once in local development and again in CI.

When we first started Depot, we were annoyed by 20-minute builds in GitHub Actions. When we tried to add multi-platform Docker image builds using QEMU emulation, we saw our builds go over an hour in GitHub Actions and locally.

Any time we wanted to build locally or in CI, it would take over an hour to finish the build in the worst case. That sucks: it wastes time and forces constant context switching.
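For context, the shape of the build we were running looks roughly like this. The image name and platforms are illustrative, and the command is echoed rather than executed so the sketch doesn't require a Docker daemon:

```shell
# A multi-platform image build through QEMU emulation: buildx emulates the
# non-native architecture in software, which is what pushed builds like ours
# past an hour. The image name here is hypothetical.
platforms="linux/amd64,linux/arm64"
image="example/app:latest"

echo "docker buildx build --platform $platforms -t $image ."
```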

It gets worse: that pain is multiplied across every developer on the team. But wait, developers don't just wait around for a build to complete, right? True. Developers are great at multitasking.

But multitasking comes at a cost. You have to switch to another task, load the context for that task, start working on it, and then switch back to the other task when the build finishes.

It's like CPU thrashing on a developer's brain.

Faster builds mean faster iteration cycles

It's not hard to see that build performance and developer velocity are intertwined. The faster builds are, the faster developers can move, and the faster the business can iterate.

A delivery pipeline leveraging Depot looks quite a bit different.

[Image: Delivery pipeline with Depot]

The two disconnected and inefficient build pipelines are merged into a centralized Depot platform where builds are exponentially faster.

The build that once took over an hour now takes less than 3 minutes. Results from a local build are automatically shared and available to CI, and vice versa.

We do it by tailoring the underlying platform to the specific build tools running on top of it.

We started with Docker image builds and a few core principles:

  • Run builds on native CPUs because emulation is painfully slow
  • Orchestrate the layer cache on fast NVMe SSDs automatically
  • Give the build an entire instance for ultimate security sandboxing
  • Make it simple to drop in Depot by swapping docker build for depot build
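As a sketch of that last point, the drop-in swap keeps the same arguments and changes only the front-end binary. The image name and flags are illustrative, and the commands are echoed so the sketch runs without either CLI installed; a real depot build additionally assumes the Depot CLI is set up and a project is configured (e.g. via a depot.json in the repository):

```shell
# Same build flags, different front end: swap the binary, keep everything else.
build_flags="--platform linux/amd64,linux/arm64 -t example/app:latest ."

echo "docker buildx build $build_flags"  # before: emulated, local cache only
echo "depot build $build_flags"          # after: native CPUs, shared remote cache
```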

It turns out that building a platform on those four principles can make any given Docker image build up to 40x faster, sometimes even more.

The best part? All that performance is available and shared across your team, anywhere you're building Docker images.

The benefits of this kind of build performance are massive:

  • Developers are no longer stuck waiting hours for a build, reclaiming time in CI and hardware resources when running locally
  • Developers don't have to context switch to another task; the build is so fast that they don't have time to switch to anything else
  • Developers don't have to wait for the slow build in their CI environment to get their new feature, bug fix, or security response into production

But after 18 months of building Depot, we've learned that those benefits are just the tip of the iceberg. The real benefits are what's unlocked once those barriers are removed:

  • Businesses building container-related tools can just call Depot as an API and get instantly faster builds for their own customers, way cheaper and faster than building their own system
  • Once impossible, packaging AI and LLM workloads into containers is now not only possible but fast with projects like depot.ai
  • By merging the two pipelines, local builds and CI, entire teams get exponentially faster by reusing each other's work

The future: near-instant builds

Many would say that we're entering a new era of technology. AI is on the rise, new LLMs appear daily, APIs are becoming the new internet, connecting services from OpenAI to Salesforce, and developers are in the middle of it all.

The faster we can make developers, the faster we can accelerate across the board. We already see this with AI today; Copilot looks to help developers write code faster. It's not perfect, but it's going to get a lot better.

We believe that Depot can do for the entire build and delivery pipeline what Copilot does for developers in their IDE.

We started with Docker image builds, but we now have three different products that look to accelerate builds in a variety of places:

  • Docker image builds are our original product: depot.dev
  • Depot Build API puts the entire acceleration of container image builds behind an API so you can build any container from your own code instantly: depot.dev/docs/api/overview
  • Depot-managed GitHub Actions Runners bring the build performance of our first product to the entire workflow: depot.dev/products/github-actions



