
MultKAN.__init__ sets global seeds

lfrommelt opened this issue 5 months ago · 0 comments

In the current implementation, the constructor of MultKAN sets the seeds for the torch, numpy and random modules at the global level. This can cause side effects when randomization is used between model initializations. It can also make you question your sanity when, for instance, completely different batchings lead to exactly the same training losses, or when you seed multiple KANs with seemingly random but in fact deterministic seeds (see #351).
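To illustrate the side effect (a minimal sketch, not actual pykan code; the class name here is hypothetical): a constructor that calls the global seeding function silently resets all subsequent "unrelated" randomness.

```python
import numpy as np

class GloballySeededModel:
    """Hypothetical stand-in for a constructor that seeds globally,
    as MultKAN.__init__ currently does."""
    def __init__(self, seed=0):
        np.random.seed(seed)  # global side effect

a = GloballySeededModel(seed=0)
x = np.random.rand(3)        # draw *outside* the model after first init
b = GloballySeededModel(seed=0)
y = np.random.rand(3)        # draw after second init

print(np.allclose(x, y))     # True: two "independent" draws are identical
```

Any code that draws random numbers between two model constructions gets the same values both times, which is exactly the deterministic-batching surprise described above.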

Proposed Solution

numpy

Initialize a custom RNG:

self.np_gen = np.random.default_rng(random_seed)

and replace all `np.random` calls with calls on `self.np_gen`.
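The replacement could look like this (a minimal sketch; the class and method names are hypothetical, not the real MultKAN internals):

```python
import numpy as np

class MultKANSketch:
    """Hypothetical sketch of the proposed change: a per-instance
    numpy Generator instead of seeding the global np.random state."""
    def __init__(self, seed=0):
        self.np_gen = np.random.default_rng(seed)  # no global side effect

    def init_noise(self, shape):
        # was: np.random.normal(size=shape)
        return self.np_gen.normal(size=shape)

m = MultKANSketch(seed=42)
noise = m.init_noise((2, 3))
```

Two instances constructed with the same seed still produce identical initializations, but the global `np.random` state is never touched.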

torch

# Important: instantiate a fresh Generator. Do not use
# self.torch_gen = torch.manual_seed(seed), because torch.manual_seed
# seeds and returns the *global* default generator.
self.torch_gen = torch.Generator().manual_seed(seed)

and pass the keyword argument `generator=self.torch_gen` to all torch random-sampling functions (e.g. `torch.randn`, `torch.rand`).
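A small sketch of what that looks like in isolation (assuming torch is installed; this is not pykan code):

```python
import torch

# Per-instance generators: the same seed gives reproducible draws
# without touching torch's global RNG state.
g1 = torch.Generator().manual_seed(0)
g2 = torch.Generator().manual_seed(0)

a = torch.randn(3, generator=g1)
b = torch.randn(3, generator=g2)
print(torch.equal(a, b))  # True: identical seeds, identical samples
```

Since `torch.manual_seed` is never called here, any randomness elsewhere in the program (data shuffling, dropout in other models) remains unaffected.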

random-module

Replace its uses with np or torch equivalents; it's probably even faster that way (not tested).
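For reference, a sketch of how common `random`-module calls map onto a per-instance numpy Generator (the mapping is standard numpy API; note the off-by-one in the `randint` bounds):

```python
import numpy as np

rng = np.random.default_rng(0)

# random.random()      -> rng.random()          (float in [0, 1))
x = rng.random()
# random.randint(a, b) -> rng.integers(a, b + 1)  # randint includes both ends
n = rng.integers(0, 10)                           # here: integer in [0, 10)
# random.choice(seq)   -> rng.choice(seq)
c = rng.choice([1, 2, 3])
# random.shuffle(seq)  -> rng.shuffle(arr)      (in place, on an array)
arr = np.arange(5)
rng.shuffle(arr)
```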

In case I get permission to create branches, I would be happy to contribute a pull request for this issue (and potentially others) :)

Best, Leonard

lfrommelt · Aug 26 '24 16:08