
Document that tfp.distributions.MixtureSameFamily uses softmax on the fractions

Open wiso opened this issue 1 year ago • 0 comments

Hello, it took me a while to understand this:

import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

# 3 mixture components, 1-dimensional events: the 9 parameters are
# 3 mixture fractions followed by (mean, std) for each component.
my_pdf = tfp.layers.MixtureNormal(3, 1)(np.array([1/3, 0/3, 2/3,
                                 4,  # mean1
                                 5,  # std1
                                 6,  # mean2
                                 7,  # std2
                                 8,  # mean3
                                 9   # std3
                                 ], dtype=float))
print(my_pdf.mixture_distribution.probs_parameter())
print(my_pdf.components_distribution.mean().numpy())
print(my_pdf.components_distribution.variance().numpy())
tf.Tensor([0.3213219  0.23023722 0.44844085], shape=(3,), dtype=float32)
[[4.]
 [6.]
 [8.]]
[[25.067198]
 [49.012764]
 [81.00221 ]]

Where are the first three numbers coming from? Why are they not 1/3, 0/3, 2/3? Then I looked at the code and found this:

https://github.com/tensorflow/probability/blob/main/tensorflow_probability/python/distributions/mixture_same_family.py#LL343C20-L343C39

Why is a softmax applied to the fractions? E.g.

np.exp(tf.math.log_softmax([1/3, 0/3, 2/3]))
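Indeed, evaluating that expression (a quick check using only the values already shown above) reproduces the mixture probabilities printed by `probs_parameter()`:

```python
import numpy as np
import tensorflow as tf

# Manually applying softmax to the raw fractions [1/3, 0, 2/3]
# yields the same numbers printed by probs_parameter().
print(np.exp(tf.math.log_softmax([1/3, 0/3, 2/3])))
# -> [0.3213219  0.23023722 0.44844085]
```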

I am not sure why this is done, but can you please document it? I understand you want the numbers to sum to 1, but I guess this could instead be required of the caller.
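As a workaround (a minimal sketch, not an official recommendation; the parameter ordering is taken from the example above), the softmax can be undone by passing log-probabilities as the first three parameters, since softmax(log p) = p when p sums to 1. The exactly-zero weight needs a small floor because log(0) is -inf:

```python
import numpy as np
import tensorflow_probability as tfp

# Desired mixture weights; the exact zero gets a tiny floor, since log(0) = -inf.
target_probs = np.array([1/3, 1e-9, 2/3])
logits = np.log(target_probs)  # softmax(logits) == target_probs (up to normalization)

params = np.concatenate([logits, [4, 5, 6, 7, 8, 9]]).astype(float)
my_pdf = tfp.layers.MixtureNormal(3, 1)(params)
print(my_pdf.mixture_distribution.probs_parameter())  # ~[0.333, 0.0, 0.667]
```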

In addition, I have recently discovered that a softplus is also applied to the stds.
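For what it's worth, the variances printed above are consistent with a softplus applied to the raw scale entries, and `tfp.math.softplus_inverse` can be used to pre-transform a desired std (again just a sketch based on the numbers in this issue):

```python
import tensorflow as tf
import tensorflow_probability as tfp

# The printed variances match softplus(raw_scale)**2 ...
raw_scales = tf.constant([5., 7., 9.])
print(tf.math.softplus(raw_scales) ** 2)   # -> [25.067198, 49.012764, 81.00221]

# ... so a desired std can be fed through softplus_inverse first.
desired_std = 5.0
print(tf.math.softplus(tfp.math.softplus_inverse(desired_std)))  # -> 5.0
```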

wiso · Jun 02 '23 22:06