
Request for `softmax` with `axis` overloads

Open · taless474 opened this issue on Mar 08, 2019 · 0 comments

Blaze supports the softmax operation for matrices, including its rowwise and columnwise overloads (ref). I would like to request that blaze_tensor support this operation for tensors as well. As an example:

#include <iostream>
#include <blaze/Math.h>
#include <blaze_tensor/Math.h>   // for blaze::StaticTensor
int main()
{
    blaze::StaticTensor<double, 2UL, 3UL, 3UL> A{ { {  1.0, 2.0, 3.0 }
                                                   , {  4.0, 1.0, 2.0 }
                                                   , {  3.0, 4.0, 1.0 } }
                                                 , { {  3.0, 6.0, 2.0 }
                                                   , { -2.0, 2.0, 0.0 }
                                                   , {  1.0, 1.0, 3.0 } } };
    blaze::StaticTensor<double, 2UL, 3UL, 3UL> B;
    B = blaze::softmax<blaze::columnwise>(A);     
    std::cout << B << "\n";
    return 0;
}

results in

[[[ 0.09003057,  0.24472847,  0.66524096],
[ 0.84379473,  0.04201007,  0.1141952 ],
[ 0.25949646,  0.70538451,  0.03511903]],

[[ 0.04661262,  0.93623955,  0.01714783],
[ 0.01587624,  0.86681333,  0.11731043],
[ 0.10650698,  0.10650698,  0.78698604]]]

An example of a possible Python implementation is:

import numpy as np

def softmax(x, axis=-1):
    # shift by the maximum along `axis` for numerical stability
    y = np.exp(x - np.max(x, axis, keepdims=True))
    # normalize so the entries along `axis` sum to one
    return y / np.sum(y, axis, keepdims=True)
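
For instance, applying this function with the default axis=-1 to the tensor from the C++ example above reproduces the values listed in the expected output (each row of each page sums to one):

A = np.array([[[ 1.0, 2.0, 3.0],
               [ 4.0, 1.0, 2.0],
               [ 3.0, 4.0, 1.0]],
              [[ 3.0, 6.0, 2.0],
               [-2.0, 2.0, 0.0],
               [ 1.0, 1.0, 3.0]]])

print(softmax(A))   # matches the expected output shown above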

However, the Blaze implementation would map onto the above function with softmax<rowwise> corresponding to axis=1 and softmax<columnwise> corresponding to axis=0.
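
As a rough sketch of how such an overload might be built on top of the existing matrix support: assuming blaze_tensor's pageslice() view and the existing 2D blaze::softmax<rowwise>() overload can be combined, applying the matrix softmax page by page would produce the per-row normalization shown in the expected output above. The helper name softmax_rowwise_pages is hypothetical and not part of Blaze or blaze_tensor:

#include <cstddef>
#include <blaze/Math.h>
#include <blaze_tensor/Math.h>

// Hypothetical helper: applies the existing 2D rowwise softmax to every page
// of a rank-3 dense tensor by reusing the matrix overload on pageslice views.
template< typename TensorType >
TensorType softmax_rowwise_pages( const TensorType& t )
{
    TensorType result( t );
    for( std::size_t k = 0UL; k < t.pages(); ++k ) {
        auto page = blaze::pageslice( result, k );        // matrix view of page k
        page = blaze::softmax<blaze::rowwise>( page );    // reuse the matrix softmax
    }
    return result;
}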
