Bijectors.jl
Question on simplex bijector implementation
Hi,
It appears that torch.probability simply uses softmax for the simplex bijector.
Is there a reason our simplex transform is much more complicated?
I was also thinking about a GPU-friendly implementation, which seems hard to do with the current implementation.
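For context, here is a minimal NumPy sketch (an illustration, not Bijectors.jl or torch.probability code) of the softmax parameterization being asked about. One caveat it makes visible: softmax on all of R^K is shift-invariant, so it is many-to-one and not a bijection onto the (K-1)-simplex; a common fix is to take K-1 free inputs and pin one logit to 0.

```python
import numpy as np

def softmax(x):
    # Subtract the max for numerical stability before exponentiating.
    z = x - np.max(x)
    e = np.exp(z)
    return e / e.sum()

def simplex_from_unconstrained(y):
    # Map K-1 unconstrained values to the K-simplex by fixing the
    # last logit to 0, which removes the shift degeneracy of softmax.
    return softmax(np.append(y, 0.0))

x = np.array([0.5, -1.0, 2.0])
p = softmax(x)            # on the simplex: nonnegative, sums to 1
p_shifted = softmax(x + 3.0)  # identical output: softmax is shift-invariant

q = simplex_from_unconstrained(np.array([0.5, -1.0]))  # length-3 simplex point
```

The shift invariance (`p == p_shifted` above) is exactly why a plain softmax over K inputs cannot serve as a bijector without some such pivoting convention.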