Update pytorch_feedforward_neuralnetwork.md
Activation saturates at 0 or 1 with gradients $\approx$ 0 should be: Activation saturates at -1 or 1 with gradients $\approx$ 0
I know this is an old pull request, but why would you want to change the range from (0, 1) to (-1, 1)?
For extremely large negative values of $x$, $e^{-x}$ approaches infinity, and hence $\theta(x)$ approaches 0.
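To see both saturation ends numerically, here is a minimal sketch in plain Python (independent of the tutorial's PyTorch code, using a hypothetical `sigmoid` helper rather than `torch.sigmoid`). It shows that the logistic sigmoid saturates at 0 for large negative inputs and at 1 for large positive inputs, with the gradient $\theta(x)(1 - \theta(x))$ vanishing at both ends:

```python
import math

def sigmoid(x):
    # Numerically stable logistic sigmoid: theta(x) = 1 / (1 + e^{-x})
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)  # avoids overflow of e^{-x} for very negative x
    return z / (1.0 + z)

def sigmoid_grad(x):
    # d/dx theta(x) = theta(x) * (1 - theta(x))
    s = sigmoid(x)
    return s * (1.0 - s)

for x in (-20.0, 0.0, 20.0):
    print(f"x={x:+.1f}  sigmoid={sigmoid(x):.10f}  grad={sigmoid_grad(x):.10f}")
```

At x = -20 the output is essentially 0, at x = +20 it is essentially 1, and in both cases the gradient is vanishingly small, which is the saturation behavior being discussed: the limits are 0 and 1, not -1 and 1 (that range belongs to tanh).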
Folks, I think this is a scam. Don't allow them access to your company's private repos, okay? It's easy to imagine the code analysis being used for hacking and ransomware against your products.
How should we report this to Github?
Thank you, I've reported the user via the report feature and marked his comment as spam to hide the contents. Thanks for alerting me to this!