
About the randomness introduced during training

Open mhy9989 opened this issue 10 months ago • 3 comments

During my training, I found that even if I fixed all random seeds and used a deterministic algorithm, I still got similar but different results after several epochs of training. Which part of the algorithm might be introducing randomness?
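
One way to pinpoint the source (a minimal sketch, not specific to efficient-kan): tell PyTorch to raise an error whenever an operation without a deterministic implementation runs. The RuntimeError names the offending op, which is the part introducing the randomness.

import torch

# Raise a RuntimeError as soon as a kernel without a deterministic
# implementation is executed; the error message names the operation.
torch.use_deterministic_algorithms(True)

# ...then build the model and run one training step as usual.
# If nothing raises, the remaining nondeterminism usually comes from
# cuDNN autotuning (torch.backends.cudnn.benchmark) or unseeded
# DataLoader worker processes.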

mhy9989 avatar Feb 25 '25 12:02 mhy9989

I have the same issue. Did you find a solution to this problem? If you have figured out how to handle it, please let me know. Thank you.

During my training, I found that even if I fixed all random seeds and used a deterministic algorithm, I still got similar but different results after several epochs of training. Which part of the algorithm might be introducing randomness?

Bernardor006 avatar Apr 05 '25 16:04 Bernardor006

I'm sorry, I still don't have a good way to resolve the randomness of KAN

mhy9989 avatar Apr 05 '25 16:04 mhy9989

I'm sorry, I still don't have a good way to resolve the randomness of KAN

I found a way to settle it. You need to add the following code:

import random

import numpy as np
import torch

def setup_seed(seed):
    # Fix every random seed (PyTorch CPU, all CUDA devices, NumPy, Python)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    np.random.seed(seed)
    random.seed(seed)
    # Disable the cuDNN autotuner and force deterministic kernels
    torch.backends.cudnn.benchmark = False
    torch.backends.cudnn.deterministic = True
    torch.use_deterministic_algorithms(True)

The last line, torch.use_deterministic_algorithms(True), is the one that handles it. But the results become dramatically worse than before!
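
For reference, a minimal usage sketch (the seed value 42 and the script layout are assumptions, not part of the original code): call setup_seed once before building the model. Note that on CUDA, torch.use_deterministic_algorithms(True) also requires the CUBLAS_WORKSPACE_CONFIG environment variable to be set, otherwise deterministic cuBLAS matmuls will raise an error.

import os

# Required for deterministic cuBLAS (CUDA >= 10.2); set before any CUDA work.
os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"

setup_seed(42)  # fix all seeds and enable deterministic kernels (see above)

# Build the KAN model, optimizer, and dataloaders afterwards as usual, e.g.
# model = KAN([28 * 28, 64, 10])  # hypothetical layer sizes, for illustration only

If you use a DataLoader with num_workers > 0, the workers also need seeding (via worker_init_fn and a seeded generator), otherwise they remain a source of run-to-run variation.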

Bernardor006 avatar Apr 07 '25 13:04 Bernardor006