Lorenzo Stella
Addressed in #3226 and #3227, which will be included in the upcoming 0.16 release
@suzhoum gluonts 0.16 has been released, with support for NumPy 2. Closing this for now; feel free to open any issues you may find with it. And thanks for opening the...
Hi, this is not a bug, but a breaking change listed in the [release notes](https://github.com/awslabs/gluonts/releases/tag/v0.15.0). It was introduced in #3093. The methods implementing negative log-likelihood as loss associated with parametric...
@leica2023 I'm adding a commit to enable a test of the feature, which I previously had to mark as skipped since it was broken
@abdulmeral since the models are based on PyTorch, you can refer to the PyTorch [documentation](https://pytorch.org/docs/stable/notes/randomness.html#reproducibility) about reproducibility. In particular, setting the random number generation seed with

```python
import torch

torch.manual_seed(0)
# ...
```
> @lostella I think it maybe better to use [`transformers.set_seed`](https://huggingface.co/docs/transformers/en/internal/trainer_utils#transformers.set_seed) for seeding. It seeds everything under the sun, so you will have consistent results. TIL
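A minimal sketch of what a seed-everything helper like `transformers.set_seed` does, using only the standard library so it runs anywhere; the real helper additionally seeds NumPy and PyTorch (CPU and CUDA) RNGs, which is assumed here rather than shown:

```python
import random


def set_seed_sketch(seed: int) -> None:
    """Hedged sketch of a seed-everything helper.

    The actual transformers.set_seed also calls numpy.random.seed(seed)
    and torch.manual_seed(seed); only the stdlib RNG is seeded here to
    keep the example self-contained.
    """
    random.seed(seed)


# Seeding with the same value makes subsequent draws reproducible.
set_seed_sketch(0)
first_run = [random.random() for _ in range(3)]

set_seed_sketch(0)
second_run = [random.random() for _ in range(3)]

assert first_run == second_run  # identical draws after reseeding
```

The point is that reproducibility only holds if *every* RNG in play is seeded, which is why a single helper that covers all of them is convenient.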
Hi @corneliusroemer, thanks for opening the issue. The warning is generated here https://github.com/huggingface/transformers/blob/69bc848480d5f19a537a70ce14f09816b00cd80f/src/transformers/models/t5/modeling_t5.py#L1024 which appears not to be checking for the `None` case (last `elif` clause never applies). I’m wondering...
A similar fix was applied to other models in https://github.com/huggingface/transformers/pull/33541; maybe all that's needed is a similar one for T5.
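To illustrate the guard pattern being discussed, here is a hedged sketch (names are illustrative, not taken from `modeling_t5.py`): a branch that compares an optional value must first check for `None`, otherwise the comparison either raises or the branch is unreachable:

```python
def maybe_warn(max_length, warnings_emitted):
    # Hypothetical warning logic: only warn when the optional value is
    # actually set. Without the `is not None` guard, passing None would
    # raise a TypeError on the `<` comparison in Python 3.
    if max_length is not None and max_length < 10:
        warnings_emitted.append("max_length is very small")
    return warnings_emitted


# None case is handled silently; a small explicit value triggers the warning.
assert maybe_warn(None, []) == []
assert maybe_warn(5, []) == ["max_length is very small"]
```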
@hakkelt I see that you're attempting various changes in different packages, all motivated by StructuredOptimization. I think it might be fine to move some core definitions to ProximalCore, but I...
Thanks @hakkelt, I’ll take a look at it asap