If your data-loading pipeline grows even slightly complex, then yes, you absolutely need concurrency to deliver samples to the GPU fast enough.
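A minimal sketch of the idea, using only the standard library: a background thread fills a bounded queue with ready batches so the consumer never waits on a slow producer. The `prefetch` helper and the toy generator are my own names, not from any framework; note that a plain thread overlaps I/O but not Python-level compute, because of the GIL.

```python
import queue
import threading

def prefetch(make_batches, depth=4):
    """Run a batch generator on a background thread, keeping up to
    `depth` ready batches buffered ahead of the consumer."""
    buf = queue.Queue(maxsize=depth)
    DONE = object()  # sentinel marking generator exhaustion

    def worker():
        for batch in make_batches():
            buf.put(batch)  # blocks once `depth` batches are queued
        buf.put(DONE)

    threading.Thread(target=worker, daemon=True).start()
    while True:
        batch = buf.get()
        if batch is DONE:
            break
        yield batch

# Usage: wrap any (possibly slow) batch generator.
slow_batches = lambda: (list(range(i, i + 4)) for i in range(0, 12, 4))
batches = list(prefetch(slow_batches))
```

The bounded queue is the important design choice: it applies backpressure, so a fast producer cannot buffer unbounded amounts of data ahead of the training loop.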
The current workarounds to make this happen in Python are quite ugly, IMHO. PyTorch spawns multiple Python processes and pushes data between them through shared memory, which incurs quite a bit of overhead. TensorFlow, on the other hand, requires you to stick to its tensor DSL so your code can run inside its graph engine. If native concurrency were a thing, data loading would be much more straightforward to implement without such hacks.
You end up doing a lot of things in an ML training run, some of which you can do in parallel either because they aren't urgent (e.g. saving metadata) or because you'd otherwise be resource-limited (e.g. loading data and formatting batches for training).
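For the "not important now" case, a small sketch of pushing metadata writes off the training thread with a single-worker pool from the standard library. The `save_metadata` helper and file names are hypothetical, just to illustrate the pattern.

```python
import json
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor

# One background worker is enough for occasional, non-urgent writes.
io_pool = ThreadPoolExecutor(max_workers=1)

def save_metadata(path, metadata):
    # Plain blocking write; it just runs off the training thread.
    with open(path, "w") as f:
        json.dump(metadata, f)

run_dir = tempfile.mkdtemp()
futures = []
for step in range(3):
    # ... training step would run here ...
    meta = {"step": step, "loss": 1.0 / (step + 1)}
    futures.append(
        io_pool.submit(save_metadata,
                       os.path.join(run_dir, f"meta_{step}.json"), meta)
    )

for f in futures:
    f.result()  # surface any write errors before exiting
```

Collecting the futures at the end matters: otherwise a failed write would fail silently in the background.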
Concurrency generally makes things run faster. If you test your ML methods, your tests will complete faster when those methods can take advantage of concurrency. Some people consider that useful.
substation13|2 years ago
1. Loading data
2. Running algorithms that benefit from shared memory
3. Serving the model (if it's not being output to some portable format)
There are also general benefits of using one language across a project. Because Python is weak on these things, we end up using multiple languages.
formulathree|2 years ago
Go and Elixir provide some parallelism, but the primary focus of both languages is concurrency.