David Liu

5 comments by David Liu

Hi, thanks for your suggestions! The code modifications are not that big, but the usage will change slightly. What I have coded up requires the input to be higher-dimensional...

Yes, I agree with what you wrote there. So in practice, I can have a go at modifying the kernel module. However, this also involves modifying the conditional Gaussian function...

What about @property decorators? I.e. define getters and setters that apply the appropriate softplus transformation whenever the non-private variable alpha is accessed, rather than writing alpha = softplus(_alpha) by hand everywhere. I got this from...

Yes, exactly, so from the outside it just looks like alpha is automatically constrained. prms probably doesn't need the decorator, though; what was the reason to add it there? So...
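A minimal sketch of the @property pattern discussed above, in plain Python so it stands alone (the class name Kernel and the attribute names are illustrative, not the actual module's API):

```python
import math


def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)).
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)


def inv_softplus(y):
    # Inverse of softplus, valid for y > 0: log(exp(y) - 1).
    return math.log(math.expm1(y))


class Kernel:
    def __init__(self, alpha=1.0):
        # Only the unconstrained value is stored; alpha is never stored directly.
        self._alpha = inv_softplus(alpha)

    @property
    def alpha(self):
        # Reading .alpha always returns the positive, constrained value.
        return softplus(self._alpha)

    @alpha.setter
    def alpha(self, value):
        # Setting .alpha maps the value back to the unconstrained parametrisation.
        self._alpha = inv_softplus(value)
```

From the outside, `k.alpha = 2.0` followed by `k.alpha` round-trips through the transformation transparently, which is exactly the "looks automatically constrained" behaviour described above.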

Ah, OK, that makes sense. Yeah, in the PyTorch distributions they just label some of the parameters with a constraint, and internally this transformation is applied. Looks pretty clean...
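The "label a parameter with a constraint" idea can be mimicked in a few lines of plain Python with a descriptor (a simplified sketch of the mechanism, not PyTorch's actual implementation; the registry, the "positive" label, and the ConstrainedParam name are all illustrative):

```python
import math

# Registry mapping a constraint label to (to_constrained, to_unconstrained).
TRANSFORMS = {
    "positive": (
        lambda x: math.log1p(math.exp(-abs(x))) + max(x, 0.0),  # softplus
        lambda y: math.log(math.expm1(y)),                      # inverse softplus
    ),
}


class ConstrainedParam:
    """Descriptor: declare a parameter with a constraint label once;
    the transform is then applied automatically on every access."""

    def __init__(self, constraint):
        self.fwd, self.inv = TRANSFORMS[constraint]

    def __set_name__(self, owner, name):
        # Back the public attribute with a private unconstrained slot.
        self.private = "_" + name

    def __get__(self, obj, objtype=None):
        return self.fwd(getattr(obj, self.private))

    def __set__(self, obj, value):
        setattr(obj, self.private, self.inv(value))


class Kernel:
    alpha = ConstrainedParam("positive")

    def __init__(self, alpha=1.0):
        self.alpha = alpha  # stored internally as the unconstrained value
```

This keeps the per-parameter code down to a single declaration, which is roughly why labelling constraints reads so cleanly in torch.distributions.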