Extend activation functions
Hi everyone,
Proposal:
I would like to propose the addition of several other activation functions to the framework:
- [ ] LeakyReLU
- [ ] PReLU
- [ ] ReLU6
- [ ] Tanh
- [ ] Softplus
- [ ] Mish, etc.
I am willing to contribute these or others.
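For reference, here is a rough sketch of how a few of these could be written with `mlx.core` ops. The function names and argument names here are just placeholders following PyTorch's conventions, not a final API:

```python
import mlx.core as mx

def leaky_relu(x, negative_slope=0.01):
    # LeakyReLU(x) = max(negative_slope * x, x)
    return mx.maximum(negative_slope * x, x)

def relu6(x):
    # ReLU6(x) = min(max(x, 0), 6)
    return mx.minimum(mx.maximum(x, 0), 6)

def softplus(x):
    # Softplus(x) = log(1 + exp(x)); logaddexp(x, 0) is the numerically stable form
    return mx.logaddexp(x, 0)

def mish(x):
    # Mish(x) = x * tanh(softplus(x))
    return x * mx.tanh(softplus(x))
```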
That would be awesome; please add them (+ tests / docs) if you can. We'd love to take a PR for that.
We mostly follow the PyTorch nn API, so try to follow that for naming conventions / arguments where it makes sense.
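For example, an activation with a learnable parameter might mirror `torch.nn.PReLU` along these lines. This is only a sketch: the constructor arguments are assumed from PyTorch, and note that MLX modules implement `__call__` rather than `forward`:

```python
import mlx.core as mx
import mlx.nn as nn

class PReLU(nn.Module):
    # Sketch of a PyTorch-style module: num_parameters and init mirror
    # torch.nn.PReLU's arguments (assumed here, not a settled MLX API).
    def __init__(self, num_parameters: int = 1, init: float = 0.25):
        super().__init__()
        # Learnable per-channel slope for negative inputs
        self.weight = mx.full((num_parameters,), init)

    def __call__(self, x):
        # PReLU(x) = max(0, x) + weight * min(0, x)
        return mx.maximum(0, x) + self.weight * mx.minimum(0, x)
```

The stateless ones (LeakyReLU, ReLU6, Tanh, Softplus, Mish) could presumably just be thin `nn.Module` wrappers around the corresponding functions.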
Here are other activation functions that PyTorch implements but MLX doesn't have yet:
@awni If it would be useful, I can add them and open PRs.
Cool! I'm not opposed to adding some of these, but it's also not a big priority since they aren't used that much. If you are interested in contributing them, that would be great.