AtomicVar
Hi there, I just wanted to thank the maintainers for their hard work on the project, and to let them know that I submitted a few pull requests...
Here are some other activation functions that PyTorch implements but MLX doesn't have yet:

- [RReLU](https://pytorch.org/docs/stable/generated/torch.nn.RReLU.html#torch.nn.RReLU)
- [Threshold](https://pytorch.org/docs/stable/generated/torch.nn.Threshold.html#torch.nn.Threshold)
- [Softmin](https://pytorch.org/docs/stable/generated/torch.nn.Softmin.html#torch.nn.Softmin)
- [Hardtanh](https://pytorch.org/docs/stable/generated/torch.nn.Hardtanh.html#torch.nn.Hardtanh)
- [Tanhshrink](https://pytorch.org/docs/stable/generated/torch.nn.Tanhshrink.html#torch.nn.Tanhshrink)
- [Hardshrink](https://pytorch.org/docs/stable/generated/torch.nn.Hardshrink.html#torch.nn.Hardshrink)
- [Hardsigmoid](https://pytorch.org/docs/stable/generated/torch.nn.Hardsigmoid.html#torch.nn.Hardsigmoid)

@awni If it is...
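If it helps, here is a rough sketch of what two of these could look like in MLX, following the usual `mlx.nn.Module` pattern (the class and parameter names are just illustrative, not MLX API):

```python
import mlx.core as mx
import mlx.nn as nn


def tanhshrink(x: mx.array) -> mx.array:
    # Tanhshrink(x) = x - tanh(x), matching the PyTorch definition.
    return x - mx.tanh(x)


class Hardshrink(nn.Module):
    # Hardshrink(x) = x if |x| > lambd, else 0 (PyTorch's default lambd is 0.5).
    def __init__(self, lambd: float = 0.5):
        super().__init__()
        self.lambd = lambd

    def __call__(self, x: mx.array) -> mx.array:
        return mx.where(mx.abs(x) > self.lambd, x, mx.zeros_like(x))
```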
Installing from Conda has the same issue.
I have the same problem with version 7.0b1.
Edit the `zh-academic.css` file yourself.
Great article. One typo: 「显示运行时链接」 => 「显式运行时链接」 (it should be 显式, "explicit", not 显示, "display", i.e. "explicit run-time linking").