SPFlow
Add high-level backend specific gradient based optimization procedures
We want to provide high-level gradient-descent based optimization procedures. The idea is that a user constructs some model structure with parameters $\theta$, has a dataset $\mathcal{D}$, and can then maximize $p(\mathcal{D} \mid \theta)$ with any backend. An exemplary method could look like the following:
```python
def optimize(model, data, optimizer, epochs, batch_size, ...):
    ...
```
which then uses `optimizer` to maximize the data likelihood batch-wise for `epochs` epochs.
This is not supposed to be very flexible, but it should give users a simple way to train a model in a specific backend. More advanced users will most likely write their own optimization procedure.
Since tensorly does not provide a dataset/optimizer abstraction of its own, this needs to be implemented separately for each supported backend.
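To illustrate the intended shape of such a procedure, here is a minimal, backend-agnostic sketch in plain NumPy. Everything in it is hypothetical: `GaussianLeaf`, its `log_likelihood`/`grad_mean` methods, and the `optimize` signature stand in for whatever model interface the real implementation would expose, and the hand-written gradient replaces the autodiff that an actual backend (e.g. PyTorch or TensorFlow) would provide.

```python
import numpy as np


class GaussianLeaf:
    """Toy model with one trainable parameter (the mean).

    This is NOT the SPFlow API; it is a stand-in so the training
    loop below is runnable on its own.
    """

    def __init__(self, mean=0.0, std=1.0):
        self.mean = mean
        self.std = std

    def log_likelihood(self, x):
        # log N(x | mean, std^2)
        return (-0.5 * np.log(2 * np.pi * self.std**2)
                - (x - self.mean) ** 2 / (2 * self.std**2))

    def grad_mean(self, x):
        # d/d(mean) of the log-likelihood above; a real backend
        # would obtain this via automatic differentiation.
        return (x - self.mean) / self.std**2


def optimize(model, data, lr=0.1, epochs=50, batch_size=32, seed=0):
    """Batch-wise gradient ascent on the data log-likelihood.

    Shuffles the data each epoch, iterates over mini-batches, and
    updates the model parameters in place.
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            batch = data[idx[start:start + batch_size]]
            # Ascend the average batch log-likelihood gradient.
            model.mean += lr * model.grad_mean(batch).mean()
    return model


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    samples = rng.normal(2.0, 1.0, size=500)
    fitted = optimize(GaussianLeaf(), samples)
    print(fitted.mean)  # should be close to the true mean 2.0
```

In a concrete backend, the shuffling/batching would be handled by that backend's dataset utilities (e.g. a data loader) and the parameter update by one of its optimizer objects, so the loop body shrinks to a likelihood evaluation, a backward pass, and an optimizer step.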