
Randomness Control in DeepAR. Advice Needed.

Open · heaynking opened this issue 11 months ago · 0 comments

Description

I couldn't control randomness when training a DeepAR model, even with the following seeding:

import random

import numpy as np
import torch
import mxnet as mx
import pytorch_lightning as pl

def seed_everything(seed: int) -> None:
    np.random.seed(seed)
    torch.manual_seed(seed)
    pl.seed_everything(seed=seed, workers=True)
    mx.random.seed(seed)
    random.seed(seed)
    torch.cuda.manual_seed(seed)
    torch.backends.cudnn.deterministic = True
    # use_deterministic_algorithms is a function and must be called, not assigned to
    torch.use_deterministic_algorithms(True)

seed_everything(123)
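
For reference, here is a minimal, runnable sketch of a single training run with the seed applied. It assumes the PyTorch DeepAREstimator; the tiny synthetic dataset and the hyperparameters below are placeholders, not my actual configuration.

import numpy as np
from gluonts.dataset.common import ListDataset
from gluonts.torch import DeepAREstimator

# Tiny synthetic hourly series just to make the sketch self-contained.
train_data = ListDataset(
    [{"target": np.sin(np.arange(200) / 10.0), "start": "2024-01-01 00:00"}],
    freq="H",
)

def train_one(seed: int):
    # Re-seed every RNG before building and training the estimator.
    seed_everything(seed)
    estimator = DeepAREstimator(
        freq="H",
        prediction_length=24,
        trainer_kwargs={"max_epochs": 5},  # placeholder trainer settings
    )
    return estimator.train(train_data)

predictor = train_one(123)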

Since I can't make training deterministic, I'm instead training multiple models and aggregating their statistics. However, I noticed a difference in behavior between collecting the models with a plain for loop and with joblib's Parallel, roughly as sketched below: the for loop exhibits a large variance across models, whereas joblib's Parallel shows a much smaller variance, and the prediction trends also differ.
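
The two setups I'm comparing look roughly like this, reusing the hypothetical train_one helper from the sketch above; the number of models and n_jobs are arbitrary:

from joblib import Parallel, delayed

n_models = 10

# Sequential: every model is trained one after another in the same process.
loop_predictors = [train_one(123) for _ in range(n_models)]

# Parallel: joblib dispatches each training run to a separate worker process.
par_predictors = Parallel(n_jobs=4)(delayed(train_one)(123) for _ in range(n_models))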

I would appreciate any advice you may have.

Environment

  • Operating system: linux
  • python = "~3.10"
  • gluonts = {extras = ["mxnet"], version = "^0.14.4"}

heaynking · Mar 15 '24 16:03