Concurrency is Not Parallelism: Insights/Notes

Okay, picture this. You’re in the middle of making chai. Water’s boiling, tea leaves are steeping, and you’ve got just enough time to toast some bread while you wait. That’s concurrency—you’re handling multiple tasks but not doing them all at once. Now, imagine you’ve got a sibling who steps in to toast the bread while you focus on the chai. That’s parallelism—getting things done faster with extra hands (or cores). Cool, right?

Concurrency vs. parallelism. If you’ve spent even a little time around software systems, you’ve probably heard these terms tossed around like confetti—sometimes interchangeably, sometimes like they’re long-lost enemies. And if you’re still confused, don’t worry, you’re in good company. Rob Pike, one of the brains behind Go, gave an absolute gem of a talk titled "Concurrency is Not Parallelism," and here I’m going to share a few of my key takeaways from it.

Concurrency ≠ Parallelism (Yes, Really)

As Rob Pike puts it, concurrency is about dealing with lots of things, while parallelism is about doing lots of things. They’re related, but not the same—and getting this wrong can turn your system into a dumpster fire.

Concurrency: Designing systems to handle multiple tasks efficiently

Parallelism: Executing multiple tasks simultaneously
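
To make that concrete, here is a minimal Go sketch (my own illustration, not something from Pike’s talk). The goroutines give the program its concurrent structure; whether they actually execute in parallel depends on the threads and cores the runtime is allowed to use.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

func main() {
	// Concurrency is the structure: three independent tasks, each in its own goroutine.
	// That structure exists even on a single core.
	// Parallelism is the execution: with one OS thread allowed, the tasks interleave;
	// with more threads and cores, they can truly overlap.
	runtime.GOMAXPROCS(1) // force a single thread: concurrent, but not parallel

	var wg sync.WaitGroup
	for i := 1; i <= 3; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			fmt.Printf("task %d doing its slice of work\n", id)
		}(i)
	}
	wg.Wait()
}
```

Drop the GOMAXPROCS(1) line (the default since Go 1.5 is the number of CPUs) and the very same structure can run in parallel; nothing about the program’s design has to change.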

You might ask, "This is simple, why are you writing an article on this, Shivam?"

Because that’s just the definition. What got me thinking after the talk, and sent me down a small reading rabbit hole, was a different question: why, where, and how does knowing the difference between concurrency and parallelism actually help? And since I was reading through a bunch of material anyway, I figured, why not share it here?

The Critical Distinctions

  1. Concurrency is in Your Code: It’s about how you design your program to handle tasks independently. You can have concurrency even on a single-core machine because it’s about structure, not speed.
  2. Context Switching Isn’t Free: Overloaded concurrency can backfire if context switching costs overshadow its benefits. Design with intention, not chaos.
  3. Beware of Deadlocks: Even the most elegant concurrency model can fall victim to poorly designed task dependencies. Avoid situations where tasks are stuck waiting for each other by keeping inter-dependencies minimal and communication explicit (there’s a small sketch of this right after the list).
  4. Concurrency Adds Predictability: Properly designed concurrency can make systems more predictable under load by isolating independent tasks. This predictability is critical for user-facing applications where responsiveness is key.
  5. Good Concurrency Aids Debugging: Concurrency done right—using clear task boundaries and well-defined communication channels—makes debugging easier. Debugging parallelism, on the other hand, is like finding a needle in a stack of needles.
  6. Parallelism is in Your Hardware: Adding more cores or threads to process tasks in parallel only works if your program is designed to handle it. Concurrency enables parallelism, but parallelism can’t magically make a non-concurrent program faster.
  7. Get the Balance Wrong, and You’ll Suffer: If you aim for parallelism without designing for concurrency, you’re looking at bottlenecks, race conditions, and headaches for days.
  8. Measure Before You Optimize: It’s tempting to throw threads and parallelism at a problem, but always measure your system’s behavior first. Sometimes the bottleneck isn’t in execution but in I/O or other external factors.
  9. Parallelism is a Power Boost: If you’ve got a well-structured, concurrent program, adding parallelism is like strapping a rocket to your bicycle. Without that structure, you’re just pedaling harder.
  10. Get this balance right, and you’ve got the recipe for systems that scale, respond well, and don’t collapse under load. Get it wrong, and, well… enjoy debugging those race conditions.
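
Point 3 deserves a tiny illustration; the sketch below is my own toy example, not something from the talk. In Go, the idiom is to let data flow one way over channels instead of having tasks wait on each other’s locks. With no circular waits, there is nothing to deadlock on.

```go
package main

import "fmt"

// A tiny pipeline: jobs flow one way (main -> worker), results flow one way
// (worker -> main). Because neither side ever waits on the other in a cycle,
// there is no circular dependency and hence no deadlock.
func worker(jobs <-chan int, results chan<- int) {
	for j := range jobs {
		results <- j * j // pretend this is real work
	}
	close(results) // signal that no more results are coming
}

func main() {
	jobs := make(chan int)
	results := make(chan int)

	go worker(jobs, results)

	go func() {
		for i := 1; i <= 5; i++ {
			jobs <- i
		}
		close(jobs) // lets the worker's range loop finish
	}()

	for r := range results {
		fmt.Println("result:", r)
	}
}
```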

Real World Examples

Web Servers

Your favorite website is likely a masterpiece of both concurrency and parallelism:

  • Concurrency: A web server like Nginx can handle thousands of simultaneous user requests by interleaving tasks—reading from disk, querying a database, and sending responses—all without waiting for one task to complete.
  • Parallelism: Behind the scenes, the server might use multiple CPUs to process parts of these requests in parallel, splitting the workload for better performance.
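
Go’s standard net/http server is a concrete version of the first bullet: it handles every incoming connection in its own goroutine, so one slow handler never stalls the rest, and the runtime spreads that work across cores when they are available. A minimal sketch (the routes and port are placeholders of mine):

```go
package main

import (
	"fmt"
	"log"
	"net/http"
	"time"
)

func main() {
	// net/http handles each incoming connection in its own goroutine,
	// so a slow handler never blocks the fast ones: concurrency by default.
	http.HandleFunc("/slow", func(w http.ResponseWriter, r *http.Request) {
		time.Sleep(2 * time.Second) // stand-in for slow I/O (disk, DB, upstream call)
		fmt.Fprintln(w, "finally done")
	})
	http.HandleFunc("/fast", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "instant reply")
	})
	// Whether those goroutines also run in parallel is up to the available cores.
	log.Fatal(http.ListenAndServe(":8080", nil)) // port is a placeholder
}
```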

Database Queries

When querying a database:

  • Concurrency: A database connection pool lets multiple applications send queries. Even though the queries are processed one at a time per connection, the pool ensures all clients make progress without waiting indefinitely.
  • Parallelism: A distributed database runs queries across multiple nodes, splitting the work among them so that different parts of the data are processed simultaneously.
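
On the concurrency side, Go’s database/sql package maintains exactly this kind of pool behind a single *sql.DB. A rough sketch of how it is typically configured (the Postgres driver and connection string below are placeholders for illustration, not a recommendation):

```go
package main

import (
	"database/sql"
	"log"

	_ "github.com/lib/pq" // Postgres driver chosen purely for illustration
)

func main() {
	// database/sql keeps a pool of connections behind one *sql.DB value.
	// Many goroutines can call Query/Exec concurrently; the pool hands out
	// connections as they free up, so every caller keeps making progress.
	db, err := sql.Open("postgres", "postgres://user:pass@localhost/app?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	db.SetMaxOpenConns(10) // cap concurrent connections to protect the database
	db.SetMaxIdleConns(5)  // keep a few warm connections ready for reuse
}
```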

Gaming: Player Actions vs. World Updates

In video games, concurrency and parallelism play distinct roles:

  • Concurrency: When a player moves, the game engine updates the player’s position, calculates interactions with the environment, and renders the next frame. These tasks are queued and managed to happen in sequence without blocking the player’s experience.
  • Parallelism: Physics calculations (like collisions) or AI behaviors (for multiple NPCs) can run simultaneously on different threads to make the game world more dynamic and responsive.
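
For the parallel side, a common pattern is to fan independent per-entity updates out across goroutines and wait for all of them before the frame continues. The sketch below is a deliberately toy version with a made-up NPC type; a real engine is far more elaborate.

```go
package main

import (
	"fmt"
	"sync"
)

// NPC is a deliberately simplified stand-in for a game entity.
type NPC struct {
	ID int
	X  float64
}

// updateAll fans the per-NPC work out across goroutines. Each NPC is
// independent, so the updates can run in parallel on a multi-core machine.
func updateAll(npcs []NPC, dt float64) {
	var wg sync.WaitGroup
	for i := range npcs {
		wg.Add(1)
		go func(n *NPC) {
			defer wg.Done()
			n.X += 1.5 * dt // pretend pathfinding / AI happens here
		}(&npcs[i])
	}
	wg.Wait() // the frame continues only once every NPC is updated
}

func main() {
	npcs := []NPC{{ID: 1}, {ID: 2}, {ID: 3}}
	updateAll(npcs, 0.016) // one ~60 FPS frame step
	fmt.Println(npcs)
}
```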

Wrapping Up

Concurrency and parallelism aren’t just technical concepts—they’re tools to help us build smarter, more efficient systems. They teach us how to:

  • Think intelligently, balancing tasks with care.
  • Stay responsive, always keeping users in mind.
  • Scale gracefully, ready for whatever comes next.
  • Be efficient, making the most of our resources.

At the end of the day, it’s not about perfection; it’s about progress. Managing this balance might feel tricky, but with patience and practice, we can learn to design systems that truly shine. After all, every great coder starts somewhere—and even the most elegant solutions grow from a few messy experiments. Let's keep learning, keep experimenting, and keep improving. We've got this.


While researching for the article, I found this point of view as well:

"To me parallelism is one form of concurrency and no amount of blog posts can convince me otherwise."

And honestly, you can subscribe to that school of thought too; as long as we’re building great software, who cares? Right?
