TabPFN
Bag prior weights
Hi! I wanted to check in about the weights in the bag prior. It seems they are 0.961 for the MLP and 0.038 for the GP. Does that seem right? That's what I got after MCMC'ing out the double sampling.
That can very well be. The GP prior is not very helpful for most datasets, and the MLP prior does include the SCM setting if I remember correctly, so that would check out. And what do you mean by MCMC'ing out? Shouldn't it just be the softmax over this line? https://github.com/automl/TabPFN/blob/5805a9a1481c10502909b0b142f978c580ca810f/tabpfn/scripts/model_builder.py#L225
i.e. softmax(2,1)[0] = .73?
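(For reference, that value can be checked numerically with plain Python, no TabPFN dependencies needed:)

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Weight of the first component for fixed logits (2, 1):
# e^2 / (e^2 + e^1) = e / (e + 1)
print(round(softmax([2.0, 1.0])[0], 3))  # → 0.731
```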
Hm I don't think that line is used, because there's this:
https://github.com/automl/TabPFN/blob/5805a9a1481c10502909b0b142f978c580ca810f/tabpfn/priors/prior_bag.py#L15
which overwrites the parameter in the chaining of get_batch. So you're first sampling a number x between 2 and 10, and then you're sampling from softmax(1, x).
That is the double sampling I ran the estimate over (so actually MC'ing, not MCMC'ing, sorry).
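A minimal sketch of that Monte Carlo estimate, assuming x is drawn uniformly from [2, 10] and that the MLP gets logit x while the GP gets logit 1 (the exact sampling inside prior_bag.py may differ):

```python
import math
import random

random.seed(0)

def first_weight(a, b):
    # Two-way softmax; returns the weight of the first logit.
    m = max(a, b)
    return math.exp(a - m) / (math.exp(a - m) + math.exp(b - m))

# Double sampling: draw x ~ U(2, 10), take softmax(x, 1)[0] as the
# MLP weight, and average to marginalize x out.
n = 200_000
mlp_weight = sum(first_weight(random.uniform(2, 10), 1.0) for _ in range(n)) / n
print(round(mlp_weight, 3))  # ≈ 0.961, matching the quoted value
```

Under these assumptions the expectation also has a closed form, (ln(e^10 + e) − ln(e^2 + e)) / 8 ≈ 0.9609, which agrees with the 0.961 figure above.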