
Update pytorch_feedforward_neuralnetwork.md

Open Walser52 opened this issue 3 years ago • 4 comments

The line "Activation saturates at 0 or 1 with gradients $\approx 0$" should be: "Activation saturates at -1 or 1 with gradients $\approx 0$"

Walser52 avatar Jun 07 '22 06:06 Walser52

I know this was an old pull request.

But why do you want to change the range from $(0, 1)$ to $(-1, 1)$?

For extremely large negative values of $x$, $e^{-x}$ approaches infinity. Hence the sigmoid $\theta(x) = \frac{1}{1 + e^{-x}}$ approaches 0.
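For context on the disagreement above: the sigmoid does saturate at 0 and 1, while tanh saturates at -1 and 1, and in either case the gradient vanishes at the extremes. A minimal sketch (not from the thread, plain Python rather than PyTorch) checking both:

```python
import math

def sigmoid(x):
    """Logistic sigmoid: 1 / (1 + e^{-x}), range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    """Derivative of sigmoid: s(x) * (1 - s(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_grad(x):
    """Derivative of tanh: 1 - tanh(x)^2."""
    return 1.0 - math.tanh(x) ** 2

# At large |x|, sigmoid is pinned near 0 or 1, tanh near -1 or 1,
# and both gradients are effectively zero (saturation).
for x in (-10.0, 10.0):
    print(f"x={x:+.0f}  sigmoid={sigmoid(x):.5f}  grad={sigmoid_grad(x):.2e}")
    print(f"x={x:+.0f}  tanh   ={math.tanh(x):+.5f}  grad={tanh_grad(x):.2e}")
```

So the issue's correction applies to the tanh bullet of the doc, not the sigmoid one: each function saturates over its own range, with gradients $\approx 0$ at both ends.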

ritchieng avatar Jan 03 '23 04:01 ritchieng

Folks, I think this is a scam. Don't give them access to your company's private repos, okay? It's easy to imagine the code analysis being used for hacking and ransomware against your products.

How should we report this to GitHub?

kenn avatar Feb 20 '24 01:02 kenn

> Folks, I think this is a scam. Don't click or link your Github account to them, okay?
>
> How should we report this to Github?

Thank you, I've reported the user via the report feature and marked their comment as spam to hide its contents. Thanks for alerting me to this!

ritchieng avatar Feb 20 '24 01:02 ritchieng