LOBCAST
_pickle.UnpicklingError: state is not a dictionary
When I run `python -m src.main_run_fi`, I get the following output:
```
Running on server ANY
Running FI experiment on Models.MLP, with K=FI_Horizons.K5
Global seed set to 500
0.00s - Debugger warning: It seems that frozen modules are being used, which may
0.00s - make the debugger miss breakpoints. Please pass -Xfrozen_modules=off
0.00s - to python to disable frozen modules.
0.00s - Note: Debugging will proceed. Set PYDEVD_DISABLE_FILE_VALIDATION=1 to disable this validation.
Create sweep with ID: njx5clfa
Sweep URL: https://wandb.ai/quantml/LOB-CLASSIFIERS-%28FI-EXPERIMENTS%29/sweeps/njx5clfa
wandb: Agent Starting Run: 82t1jd72 with config:
wandb:   batch_size: 64
wandb:   epochs: 100
wandb:   hidden_mlp: 256
wandb:   lr: 1e-05
wandb:   num_snapshots: 100
wandb:   optimizer_name: Adam
wandb:   p_dropout: 0
0.00s - Debugger warning: It seems that frozen modules are being used, which may
0.00s - make the debugger miss breakpoints. Please pass -Xfrozen_modules=off
0.00s - to python to disable frozen modules.
0.00s - Note: Debugging will proceed. Set PYDEVD_DISABLE_FILE_VALIDATION=1 to disable this validation.
wandb: WARNING Ignored wandb.init() arg project when running a sweep.
Setting model parameters {'batch_size': 64, 'epochs': 100, 'hidden_mlp': 256, 'lr': 1e-05, 'num_snapshots': 100, 'optimizer_name': 'Adam', 'p_dropout': 0}
dataset type: DatasetType.TRAIN - normalization: NormalizationType.Z_SCORE
dataset type: DatasetType.VALIDATION - normalization: NormalizationType.Z_SCORE
dataset type: DatasetType.TEST - normalization: NormalizationType.Z_SCORE
GPU available: True (cuda), used: True
TPU available: False, using: 0 TPU cores
IPU available: False, using: 0 IPUs
HPU available: False, using: 0 HPUs
Traceback (most recent call last):
  File "
```
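The traceback is truncated above, so I cannot see which file fails to load. Assuming the error happens while unpickling some saved artifact (the path below is a placeholder for whatever file the full traceback names, not a real LOBCAST path), would a minimal check like this be a sensible way to narrow it down?

```python
import pickle

# Placeholder path: substitute the file named in the full traceback
# (e.g. a cached dataset or saved model state). Not a real LOBCAST path.
PATH = "path/to/failing_file.pkl"

with open(PATH, "rb") as f:
    try:
        obj = pickle.load(f)
        print("loaded OK:", type(obj))
    except pickle.UnpicklingError as exc:
        # "state is not a dictionary" is raised while restoring an object's
        # pickled state; it may indicate the file was written by a different
        # Python/library version or is corrupted.
        print("unpickling failed:", exc)
```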
How can I solve this?