Listened to this talk by Rob Pike twice to ensure I got it.
This is my version of what he says:
Concurrency is being able to break a problem down into smaller chunks and handle them.
Concurrency does NOT mean being able to run these chunks simultaneously. They may or may not run at the same time. It doesn’t care.
Parallelism is when you can run these chunks simultaneously (on a multi-core machine).
Concurrency enables parallelism. If you can break a problem down into smaller chunks and handle them, then given more cores you can make them run simultaneously.
Parallelism by itself does not enable concurrency.
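To make that concrete, here is a minimal Go sketch of my own (not from the talk): the two goroutines below are concurrent by construction, but whether they ever execute at the same instant is up to the Go scheduler and how many cores it is allowed to use.

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	var wg sync.WaitGroup
	// Each chunk of work gets its own goroutine: that is the concurrent
	// decomposition. Running them in parallel is optional, not required.
	for _, chunk := range []string{"chunk A", "chunk B"} {
		wg.Add(1)
		go func(c string) {
			defer wg.Done()
			fmt.Println("handling", c)
		}(chunk)
	}
	wg.Wait()
}
```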
From what I have seen, we usually run the concurrent chunks in parallel. What would be an example of not running the chunks simultaneously?
Aim for this weekend is to read Chapters 8 and 9 of The Go Programming Language.
Running them on a single-core machine. Then only one thing can run at a time. You can achieve the same effect on a multi-core machine by setting GOMAXPROCS to 1.
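A quick sketch of that, using the standard Go runtime knob: `runtime.GOMAXPROCS(1)` caps execution at one core, so the goroutines remain concurrent but interleave instead of running in parallel.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

func main() {
	// Limit the runtime to one core: the goroutines are still concurrent,
	// but only one of them can be executing at any given moment.
	runtime.GOMAXPROCS(1)

	var wg sync.WaitGroup
	for i := 0; i < 4; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			fmt.Println("goroutine", id, "ran")
		}(i)
	}
	wg.Wait()
}
```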
Doesn’t have such good ratings on Amazon. We should check if Jeff Dean recommended some books on distributed systems.