Listened to this talk by Rob Pike twice to ensure I got it.
This is my version of what he says:
- Concurrency is the ability to break a problem down into smaller chunks and handle them independently.
- Concurrency does NOT mean these chunks run simultaneously. They may or may not; the concurrent structure doesn’t care.
- Parallelism is when the chunks actually run simultaneously (e.g., on a multi-core machine).
- Concurrency enables parallelism: once a problem is broken into independently handled chunks, adding cores lets those chunks run simultaneously.
- Parallelism by itself does not enable concurrency.
That’s really it in a nutshell.
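A minimal Go sketch of the distinction (my own illustration, not from the talk): the work is *structured* concurrently as two goroutines summing separate chunks. Whether they execute in parallel depends on how many cores the runtime uses; the structure itself is the same either way.

```go
package main

import (
	"fmt"
	"sync"
)

// sum adds the numbers in one chunk and sends the partial result on out.
func sum(chunk []int, out chan<- int, wg *sync.WaitGroup) {
	defer wg.Done()
	total := 0
	for _, n := range chunk {
		total += n
	}
	out <- total
}

func main() {
	nums := []int{1, 2, 3, 4, 5, 6, 7, 8}

	// Concurrency: the problem is broken into two independent chunks.
	// Whether the goroutines run simultaneously depends on GOMAXPROCS
	// and the number of cores — the program's structure doesn't care.
	out := make(chan int, 2)
	var wg sync.WaitGroup
	wg.Add(2)
	go sum(nums[:4], out, &wg)
	go sum(nums[4:], out, &wg)
	wg.Wait()
	close(out)

	total := 0
	for partial := range out {
		total += partial
	}
	fmt.Println(total) // → 36
}
```

On a single core this still works correctly, just without parallelism. That's the "concurrency enables parallelism" point: the decomposition comes first, and extra cores are an optional bonus.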