What Are Concurrency & Parallelism? A Developer's Guide
Can your application handle multiple tasks at once? Understand the difference between concurrency and parallelism to write performant, thread-safe applications.
Definitions
Concurrency
Multiple tasks making progress in overlapping time periods. They don't need to run simultaneously — they interleave.
Parallelism
Multiple tasks running literally at the same time on multiple CPU cores.
| Aspect | Concurrency | Parallelism |
|--------|-------------|-------------|
| Goal | Manage multiple tasks | Speed up computation |
| CPU cores | One is enough | Multiple required |
| Example | Web server | Video encoding |
Async/Await
Concurrency without threads:
```js
// Sequential: each request waits for the previous one,
// so total latency is the sum of both calls.
const user = await getUser(42);
const orders = await getOrders(42);

// Concurrent: both requests are in flight at once,
// so total latency is roughly that of the slower call.
const [user, orders] = await Promise.all([
  getUser(42),
  getOrders(42)
]);
```
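The same pattern carries over to Python's asyncio. Below is a minimal sketch in which get_user and get_orders are hypothetical coroutines that simulate I/O with a sleep; asyncio.gather plays the role of Promise.all.

```python
import asyncio

async def get_user(user_id):       # hypothetical coroutine simulating an I/O call
    await asyncio.sleep(1)
    return {"id": user_id}

async def get_orders(user_id):     # hypothetical coroutine simulating an I/O call
    await asyncio.sleep(1)
    return ["order-1", "order-2"]

async def main():
    # Sequential: ~2 s, because the second await starts only after the first finishes
    user = await get_user(42)
    orders = await get_orders(42)

    # Concurrent: ~1 s, both coroutines make progress on the same event loop
    user, orders = await asyncio.gather(get_user(42), get_orders(42))

asyncio.run(main())
```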
Race Condition
When multiple threads read and modify shared data without synchronization, the final result depends on thread timing:
```text
Thread A: read balance (1000) → 1000 + 500 = 1500 → write (1500)
Thread B: read balance (1000) → 1000 - 200 = 800  → write (800)
Result: 800 (the +500 deposit vanished; the correct result is 1300)
```
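A minimal sketch of this lost update in Python; the update helper and the short sleep are illustrative, there only to widen the race window so the bug shows up reliably.

```python
import threading
import time

balance = 1000

def update(amount):
    global balance
    # Non-atomic read-modify-write: another thread can interleave here.
    current = balance           # read
    time.sleep(0.01)            # widen the race window for demonstration
    balance = current + amount  # write back a possibly stale value

a = threading.Thread(target=update, args=(500,))   # deposit 500
b = threading.Thread(target=update, args=(-200,))  # withdraw 200
a.start(); b.start()
a.join(); b.join()

print(balance)  # expected 1300, but one of the two updates is lost
```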
Solution: Mutex
```python
import threading

lock = threading.Lock()

with lock:               # only one thread can execute this block at a time
    balance += amount    # the read-modify-write can no longer interleave
```
Deadlock
Two threads wait for each other's locks forever.
Solutions: Lock ordering, timeouts, lock-free algorithms.
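As a sketch of lock ordering (the two locks and the task functions are illustrative): if every thread acquires locks in the same global order, the circular wait behind a deadlock can never form.

```python
import threading

lock_a = threading.Lock()
lock_b = threading.Lock()

# Deadlock-prone: thread 1 runs task_1, thread 2 runs task_2, and each can
# end up holding one lock while waiting forever for the other.
def task_1():
    with lock_a:
        with lock_b:
            ...

def task_2():
    with lock_b:        # opposite order: this is the bug
        with lock_a:
            ...

# Fix: agree on a single global order (always lock_a, then lock_b),
# so a circular wait cannot occur.
def task_2_fixed():
    with lock_a:
        with lock_b:
            ...
```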
Concurrency Models
| Model | Language | Key Feature |
|-------|----------|-------------|
| Multi-threading | Java, Python | Shared memory |
| Event Loop | Node.js | Single thread, non-blocking I/O |
| Actor Model | Elixir, Erlang | No shared memory, message passing |
| CSP | Go | Goroutines + channels |
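The last two rows share one idea: components communicate by sending messages instead of sharing memory. A rough Python approximation uses queue.Queue as the channel; the producer and consumer names are illustrative.

```python
import queue
import threading

channel = queue.Queue()  # plays the role of a Go channel / actor mailbox

def producer():
    for i in range(3):
        channel.put(i)       # send a message; no shared mutable state
    channel.put(None)        # sentinel: no more work

def consumer():
    while True:
        msg = channel.get()  # blocks until a message arrives
        if msg is None:
            break
        print("got", msg)

threading.Thread(target=producer).start()
consumer()
```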
Go Channels
A goroutine computes the value in the background and sends it over a channel; the receive blocks until the result arrives:

```go
ch := make(chan int)                   // unbuffered channel of int
go func() { ch <- expensiveCalc() }()  // run the calculation in a goroutine
value := <-ch                          // block until the result is sent
```
When to Use What
| Scenario | Approach |
|----------|----------|
| I/O-bound (API, DB) | Async/Await |
| CPU-bound (math) | Parallelism |
| Web server | Event Loop or Thread Pool |
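For the CPU-bound row, a hedged sketch with Python's multiprocessing.Pool, which spreads work across processes and therefore across cores; heavy is a stand-in for real computation.

```python
from multiprocessing import Pool

def heavy(n):
    # Stand-in for a CPU-bound task (hashing, image processing, math)
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool() as pool:                         # one worker process per core by default
        results = pool.map(heavy, [10**6] * 8)   # chunks run in parallel across cores
    print(len(results))
```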
Best Practices
- Avoid shared state: use immutable data where possible
- Minimize locking: hold locks only around the critical section
- Use thread pools: don't spawn a new thread per task (see the sketch after this list)
- Prefer atomic operations: compare-and-swap (CAS) is cheaper than a lock for simple updates
- Detect deadlocks: combine timeouts with a fixed lock-acquisition order
- Stress test: race conditions often surface only under heavy concurrent load
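Here is the thread-pool sketch referenced in the list above, using concurrent.futures; fetch is an illustrative stand-in for an I/O-bound call.

```python
from concurrent.futures import ThreadPoolExecutor
import time

def fetch(task_id):
    # Stand-in for an I/O-bound call (network, disk, database)
    time.sleep(0.5)
    return task_id

# Four pooled threads handle twenty tasks; no thread-per-task overhead.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(fetch, range(20)))

print(len(results))  # 20
```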
Conclusion
Concurrency and parallelism are critical for modern application performance. Choosing the right model for the workload and handling synchronization carefully are the keys to building reliable systems.