lag-llama
Update requirements.txt
Bump the gluonts version. The previous version pulled in an old version of pydantic.
@ShaneOss thanks mate! I was thinking of upgrading to lightning instead of pytorch-lightning... what do you think?
Hey @kashif , good idea. It works, no issues. Also fixes a few legacy dependency problems.
no worries, this week is chocka for me but I'll have a look soon
Thanks @ShaneOss. I'd have to check if this causes any problems when training, since we found our version of GluonTS to be the most stable one. Will check it soon.
Regarding lightning, @kashif, that'll require a good amount of refactoring if you want to upgrade: are there any benefits?
@ShaneOss so we realized that if we require gluonts[torch] as the top-level requirement, it pulls in working versions of lightning and pytorch-lightning, so closing this PR for now.
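To illustrate the resolution described above: declaring the `torch` extra lets pip resolve compatible `lightning`/`pytorch-lightning` versions transitively, so they don't need to be pinned directly. A minimal sketch of such a requirements.txt (this is illustrative, not the project's actual pin set):

```
# Top-level requirement: the [torch] extra pulls in
# compatible versions of lightning and pytorch-lightning
# as transitive dependencies, so no explicit pins are needed.
gluonts[torch]
```

With this in place, `pip install -r requirements.txt` installs the torch-backed GluonTS stack, and pip's resolver picks lightning versions that satisfy gluonts's own constraints.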