
[🔷 Feature request ]: Derivative of Softmax

Open matiasvlevi opened this issue 3 years ago • 0 comments

Feature

Softmax activation function.

Type

  • [x] Dann
  • [ ] Matrix
  • [ ] Layer
  • [x] Activation functions
  • [ ] Loss functions
  • [ ] Pool functions
  • [ ] Datasets
  • [ ] Documentation
  • [ ] tests & examples
  • [ ] Other

Description

Here is the softmax function I wrote not so long ago:

/**
* Softmax function
* @method softmax
* @param z An array of numbers (vector)
* @return An array of numbers (vector)
**/
function softmax(z) {
  let ans = [];
  let denom = 0;
  for (let j = 0; j < z.length; j++) {
    denom += Math.exp(z[j]);
  }
  for (let i = 0; i < z.length; i++) {
    let top = Math.exp(z[i]);
    ans.push(top / denom);
  }
  return ans;
}
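As a side note, summing Math.exp(z[j]) directly can overflow for large inputs. A common variant (standard practice, not part of the original snippet) subtracts the maximum element first, which leaves the result unchanged but keeps the exponents small:

/**
* Numerically stable softmax (sketch, same output as the function above)
* @param z An array of numbers (vector)
* @return An array of numbers (vector)
**/
function softmaxStable(z) {
  const max = Math.max(...z);                  // shift so the largest exponent is 0
  const exps = z.map((v) => Math.exp(v - max));
  const denom = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / denom);
}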

This function is not implemented in the repository yet.

For this function to work in a neural network, we would need to write its derivative. This might be a difficult task since the function takes in and outputs vectors (represented as arrays), so the derivative is not a single number but a Jacobian matrix of partial derivatives.
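For reference, the Jacobian of softmax is J[i][j] = a[i] * (1 - a[j]) on the diagonal and -a[i] * a[j] off the diagonal, where a = softmax(z). Below is a minimal sketch of what that could look like; the name softmaxJacobian and the array-of-arrays representation are my own choices here, not an existing Dann API:

/**
* Jacobian of the softmax function (sketch)
* @param z An array of numbers (vector)
* @return A 2D array where entry [i][j] = d softmax(z)[i] / d z[j]
**/
function softmaxJacobian(z) {
  const a = softmax(z);
  const jac = [];
  for (let i = 0; i < a.length; i++) {
    const row = [];
    for (let j = 0; j < a.length; j++) {
      // a[i] * (1 - a[j]) on the diagonal, -a[i] * a[j] off-diagonal
      row.push(i === j ? a[i] * (1 - a[j]) : -a[i] * a[j]);
    }
    jac.push(row);
  }
  return jac;
}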

These two functions would need to be implemented in src/core/functions/actfuncs.js.

For this function to work with a Dann model, we would also need to change how activations are handled, since softmax expects a whole vector instead of a single number value. I could work on that once the derivative is implemented.
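To illustrate what vector-valued handling could look like during backpropagation (purely hypothetical, not how Dann currently structures its activations): the gradient with respect to the pre-activations z comes from multiplying the loss gradient by the Jacobian via the chain rule.

// Hypothetical backward step through a softmax layer (sketch).
// dLossdA is the gradient of the loss w.r.t. the softmax outputs.
function softmaxBackward(z, dLossdA) {
  const jac = softmaxJacobian(z);
  const dLossdZ = new Array(z.length).fill(0);
  for (let j = 0; j < z.length; j++) {
    for (let i = 0; i < z.length; i++) {
      // chain rule: dL/dz[j] = sum_i dL/da[i] * da[i]/dz[j]
      dLossdZ[j] += dLossdA[i] * jac[i][j];
    }
  }
  return dLossdZ;
}

When softmax is paired with a cross-entropy loss, this product simplifies to output minus target, which is often the easier route in practice.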

matiasvlevi · Jun 26 '21 22:06