WASM support for training
Support training on the wasm target. For this, we probably need to have alternative implementations of file checkpointers, file loggers, and similar components. It's not clear whether we want to support training in the browser, or if supporting wasm runtimes alone is sufficient. In the latter case, we could use an alternative file system API provided by such a runtime.
Requires #1253
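As a minimal sketch of what such an alternative could look like (hypothetical names, not burn's actual checkpointer API): the checkpointer writes through a small storage trait, and a wasm build swaps the file-system backend for an in-memory or runtime-provided one.

```rust
// Hedged sketch, not burn's actual API: abstract over the storage backend
// instead of assuming a file system. `CheckpointStore` and `MemoryStore`
// are hypothetical names used only to illustrate the idea.

use std::collections::HashMap;

/// Minimal storage abstraction a checkpointer could write through.
/// A native build would implement this with `std::fs`; a wasm build could
/// back it with memory, IndexedDB, or a file system API from the runtime.
pub trait CheckpointStore {
    fn save(&mut self, name: &str, bytes: Vec<u8>) -> Result<(), String>;
    fn load(&self, name: &str) -> Result<Vec<u8>, String>;
}

/// In-memory backend that works on any target, including wasm32-unknown-unknown.
#[derive(Default)]
pub struct MemoryStore {
    entries: HashMap<String, Vec<u8>>,
}

impl CheckpointStore for MemoryStore {
    fn save(&mut self, name: &str, bytes: Vec<u8>) -> Result<(), String> {
        self.entries.insert(name.to_string(), bytes);
        Ok(())
    }

    fn load(&self, name: &str) -> Result<Vec<u8>, String> {
        self.entries
            .get(name)
            .cloned()
            .ok_or_else(|| format!("no checkpoint named {name}"))
    }
}
```

A native build would add a file-backed implementation of the same trait, and a wasm runtime that exposes a file system API (e.g. WASI) could implement it over that API instead.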
I am personally not in favor of supporting training in the browser. It is such a niche use case, yet it would require the code base to accommodate it.
In my opinion we can support training in the browser at a later time in a new crate called burn-wasm. That way we do not have to change the other burn crates, but can instead create an independent piece of software whose goal is R&D: testing the potential of training neural networks inside a browser. So for me it is a yes, but with the label of R&D.
I think we can actually support training in the browser without having to change much of the architecture. It's similar to how we support no-std: by having primitive type stubs in burn-common. So when #1250 is done, it wouldn't be that hard to support training on the web.
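For illustration, here is a rough sketch of that stub pattern with hypothetical names, not burn-common's actual contents: the native build re-exports the std type, while the wasm build provides a stand-in with the same surface, so downstream crates compile unchanged.

```rust
// Illustrative sketch of the stub pattern, not burn-common's actual code.

#[cfg(not(target_family = "wasm"))]
mod time {
    pub use std::time::Instant;
}

#[cfg(target_family = "wasm")]
mod time {
    use core::time::Duration;

    /// Stand-in for `std::time::Instant`, which is unavailable on
    /// `wasm32-unknown-unknown`. A real implementation would bind to the
    /// runtime's clock (e.g. `performance.now()` in the browser).
    #[derive(Clone, Copy, Debug)]
    pub struct Instant;

    impl Instant {
        pub fn now() -> Self {
            Instant
        }

        pub fn elapsed(&self) -> Duration {
            // Placeholder: without a host clock binding, elapsed time
            // cannot be measured, so timing metrics would read as zero.
            Duration::ZERO
        }
    }
}

pub use time::Instant;
```

The same cfg-gating could cover the file-based checkpointers and loggers mentioned in the issue body.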
We also have to keep in mind that supporting wasm can be beneficial as a deployment format, used similarly to Docker, so it can be useful beyond the browser.
Also just to chime in - I imagine training massive LLMs or even big convnets in the browser is niche. However, for models like NeRF and Gaussian splatting, training in a browser could be totally fine and useful. Generally, being able to use ML as "just" good numerical optimization is nice! It's not all about massive models.
Tbf, those types of models might not need much from burn-train anyway.