# Lilith.jl
## ToDo
### Activation functions
- [x] relu
- [x] elu (CPU only)
- [x] leakyrelu
- [x] tanh
- [x] sigmoid
- [x] logsigmoid
- [x] softplus
- [x] softsign
- [x] softmax
- [x] logsoftmax
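The implemented activations all follow their standard textbook definitions. As a quick reference, a few of them can be sketched in plain Python (illustrative only; Lilith implements them in Julia, element-wise over tensors, with GPU support where noted):

```python
import math

def relu(x):
    return max(0.0, x)

def leakyrelu(x, alpha=0.01):
    # small negative slope instead of a hard zero
    return x if x > 0 else alpha * x

def elu(x, alpha=1.0):
    return x if x > 0 else alpha * (math.exp(x) - 1)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def softplus(x):
    return math.log1p(math.exp(x))

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

print(relu(-2.0))             # 0.0
print(sigmoid(0.0))           # 0.5
print([round(p, 3) for p in softmax([1.0, 2.0, 3.0])])
```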
### Loss functions
- [x] MSELoss
- [x] CrossEntropyLoss
- [x] NLLLoss
- [ ] KLDivLoss
- [ ] BCELoss
- [ ] CosineEmbeddingLoss
- [ ] TripletMarginLoss
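The relationship between the three implemented losses is worth noting: CrossEntropyLoss is logsoftmax followed by NLLLoss. A plain-Python sketch of the standard definitions (not Lilith's actual API):

```python
import math

def mse_loss(pred, target):
    # mean squared error over a flat list of predictions
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

def nll_loss(log_probs, target_idx):
    # negative log likelihood: expects log-probabilities
    # (e.g. the output of logsoftmax)
    return -log_probs[target_idx]

def cross_entropy_loss(logits, target_idx):
    # logsoftmax (computed stably) + NLL
    m = max(logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    log_probs = [x - log_z for x in logits]
    return nll_loss(log_probs, target_idx)

print(mse_loss([1.0, 2.0], [1.0, 4.0]))                 # 2.0
print(round(cross_entropy_loss([0.0, 0.0, 0.0], 0), 4))  # log(3) ≈ 1.0986
```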
### General-purpose layers
- [x] Linear
- [x] Sequential
- [x] BatchNorm
- [ ] Dropout
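For reference, the core of BatchNorm is per-feature normalization followed by a learned scale and shift. A minimal sketch of that step over one batch of a single feature (illustrative; the real layer also tracks running statistics for inference):

```python
import math

def batchnorm(xs, gamma=1.0, beta=0.0, eps=1e-5):
    # normalize to zero mean / unit variance, then scale by gamma, shift by beta
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in xs]

out = batchnorm([1.0, 2.0, 3.0, 4.0])
print([round(x, 3) for x in out])  # zero-mean, roughly unit-variance
```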
### CNN
- [x] Conv1d
- [x] Conv2d
- [x] pool2d
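pool2d covers the usual spatial downsampling step after convolutions. The common 2x2 max-pooling case can be sketched in plain Python (illustrative only; the real function operates on 4-D tensors and supports strides and window sizes):

```python
def max_pool2d(img, k=2):
    # take the max over each non-overlapping k x k window
    h, w = len(img), len(img[0])
    return [[max(img[i + di][j + dj] for di in range(k) for dj in range(k))
             for j in range(0, w - k + 1, k)]
            for i in range(0, h - k + 1, k)]

img = [[1, 2, 5, 6],
       [3, 4, 7, 8],
       [9, 1, 2, 3],
       [5, 6, 4, 0]]
print(max_pool2d(img))  # [[4, 8], [9, 4]]
```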
### RNN
- [x] Vanilla RNN
- [x] LSTM
- [x] GRU
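All three recurrent cells share the same unrolling pattern; the vanilla RNN is the simplest case, h' = tanh(Wx·x + Wh·h + b). A scalar-weight sketch of one unrolled sequence (illustrative; real cells use weight matrices and LSTM/GRU add gating):

```python
import math

def rnn_step(x, h, wx, wh, b):
    # one step of a vanilla RNN cell
    return math.tanh(wx * x + wh * h + b)

h = 0.0  # initial hidden state
for x in [1.0, 0.5, -1.0]:  # unroll over a short input sequence
    h = rnn_step(x, h, wx=0.8, wh=0.5, b=0.1)
print(h)  # final hidden state, somewhere in (-1, 1)
```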
### Optimizers
- [x] SGD
- [x] Adam
- [x] RMSprop
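The update rules behind two of the listed optimizers, shown on the 1-D toy objective f(w) = (w - 3)^2. This is a sketch of the math, not Lilith's optimizer API:

```python
import math

def grad(w):
    # gradient of f(w) = (w - 3)^2
    return 2 * (w - 3)

# SGD: w <- w - lr * g
w_sgd = 0.0
for _ in range(100):
    w_sgd -= 0.1 * grad(w_sgd)

# Adam: step scaled by bias-corrected moment estimates
w_adam, m, v = 0.0, 0.0, 0.0
lr, b1, b2, eps = 0.1, 0.9, 0.999, 1e-8
for t in range(1, 501):
    g = grad(w_adam)
    m = b1 * m + (1 - b1) * g        # first moment (running mean of grads)
    v = b2 * v + (1 - b2) * g * g    # second moment (running mean of grad^2)
    m_hat = m / (1 - b1 ** t)        # bias correction for the zero init
    v_hat = v / (1 - b2 ** t)
    w_adam -= lr * m_hat / (math.sqrt(v_hat) + eps)

print(round(w_sgd, 3), round(w_adam, 2))  # both approach the minimum at 3
```

RMSprop is the same pattern with only the second-moment scaling and no bias correction.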
### Metrics
- [x] accuracy
- [x] precision
- [x] recall
- [x] confusion matrix
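The four metrics above are all derived from the same four counts. Standard binary-classification definitions in plain Python (a sketch, not Lilith's metrics API, which also handles the multiclass case):

```python
def confusion_counts(y_true, y_pred):
    # true/false positives and negatives for binary labels
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

y_true = [1, 0, 1, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1]
tp, tn, fp, fn = confusion_counts(y_true, y_pred)
accuracy = (tp + tn) / len(y_true)   # 4/6
precision = tp / (tp + fp)           # 2/3
recall = tp / (tp + fn)              # 2/3
print(accuracy, precision, recall)
```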
### CUDA
- [x] to_device
### High-level API
- [x] fit!
### Zoo
- [x] Simple CNN
- [x] ResNet
- [x] RealNVP
- [x] VAE
### Tutorial
- [x] Plain gradient
- [x] Training a linear regression
- [ ] Simple CNN
- [ ] Simple RNN
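The two finished tutorial topics in miniature: computing a plain gradient by hand and using it to fit a linear regression y = w*x + b with gradient descent. Pure Python for illustration; the tutorial itself uses Lilith's own gradient machinery:

```python
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # generated by y = 2x + 1

w, b, lr = 0.0, 0.0, 0.05
n = len(xs)
for _ in range(2000):
    # hand-derived gradients of mean squared error w.r.t. w and b
    dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * dw
    b -= lr * db
print(round(w, 2), round(b, 2))  # approaches w = 2.0, b = 1.0
```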
### Other
- [x] Performance comparison vs. PyTorch (1 test)
- [ ] Benchmarks for tracking changes in performance
- [ ] Load pretrained models
- [ ] Docstrings for all functions and layers
- [ ] ONNX import & export