experiment: add Gumbel-Softmax
This PR experiments with replacing the regular softmax in MoE gating with a Gumbel-Softmax function.
The paper *Approximating Two-Layer Feedforward Networks for Efficient Transformers* suggests that softmax may not be ideal for the gating mechanism, arguing that it is too competitive and can lead to some experts being used disproportionately often during selection.
The goal is to see whether Gumbel-Softmax helps diversify expert selection: by adding Gumbel noise to the logits, the softmax probabilities should fluctuate enough that an expert with a slightly higher logit than another is not chosen every single time.
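As a rough sketch of what the gate change looks like (the function name `gumbel_softmax_gate`, the `top_k` routing, and the clamp epsilon are illustrative, not the exact code in this PR):

```python
import torch
import torch.nn.functional as F

def gumbel_softmax_gate(logits: torch.Tensor, temperature: float, top_k: int = 2):
    """Noisy gating: perturb router logits with Gumbel(0, 1) noise,
    then take a temperature-scaled softmax over the experts."""
    # Sample Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1).
    u = torch.rand_like(logits).clamp_(1e-9, 1.0 - 1e-9)  # keep log() finite
    gumbel = -torch.log(-torch.log(u))
    # Higher temperature flattens the distribution, i.e. more randomness.
    probs = F.softmax((logits + gumbel) / temperature, dim=-1)
    # Route each token to its top_k most probable experts, as usual.
    weights, experts = torch.topk(probs, top_k, dim=-1)
    return weights, experts
```

PyTorch also ships a built-in `torch.nn.functional.gumbel_softmax(logits, tau=temperature)` that performs the same noisy-softmax sampling.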
I've added a new `temperature` parameter to control the amount of randomness; it needs to be set in the model's config.json.
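For illustration, loading the parameter might look like this (the key name `temperature` and the fallback of `1.0` are assumptions here, not necessarily what this PR uses):

```python
import json

# Read the gating temperature from the model's config.json.
with open("config.json") as f:
    config = json.load(f)

temperature = config.get("temperature", 1.0)  # 1.0 ~ standard Gumbel-Softmax
```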